A Method to Quantify Corpus Similarity and its Application to Quantifying the Degree of Literality in a Document

Etienne Denoual
Copyright: © 2006 |Pages: 16
DOI: 10.4018/jthi.2006010104

Abstract

Comparing and quantifying corpora are key issues in corpus-based translation and corpus linguistics, for which there is still a notable lack of standards. This makes it difficult for a user to isolate, transpose, or extend the interesting features of a corpus to other NLP systems. In this work, we address the issue of measuring similarity between corpora. We define a scale between two user-chosen corpora on which any third corpus can be assigned a coefficient of similarity, based on the cross-entropy of statistical N-gram character models. A possible application of this framework is to quantify similarity in terms of literality (or, conversely, orality). To this end, we carry out experiments on several well-known corpora in both English and Japanese and show that the defined similarity coefficient is robust to variations in language and model order. Comparison with existing similarity measures shows comparable performance, while greatly extending the range of application to electronic data written in languages with no clear word segmentation. Within this framework, we further investigate the notion of homogeneity in the case of a large multilingual resource.
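The abstract describes placing a test corpus on a scale between two reference corpora using the cross-entropy of character N-gram models. The sketch below is an illustration of that general idea, not the paper's exact definition: the add-one smoothing, the assumed vocabulary size of 256, and the particular coefficient formula (the cross-entropy under one reference model, normalized by the sum of the two cross-entropies) are all assumptions made for the example.

```python
import math
from collections import Counter

def ngram_model(text, n=3):
    """Count character n-grams and their (n-1)-character contexts."""
    counts = Counter(text[i:i + n] for i in range(len(text) - n + 1))
    contexts = Counter(text[i:i + n - 1] for i in range(len(text) - n + 1))
    return counts, contexts

def cross_entropy(model, text, n=3, vocab_size=256):
    """Per-character cross-entropy (in bits) of `text` under an
    add-one-smoothed character n-gram model (smoothing is an
    assumption for this sketch)."""
    counts, contexts = model
    total, m = 0.0, 0
    for i in range(len(text) - n + 1):
        gram = text[i:i + n]
        p = (counts[gram] + 1) / (contexts[gram[:-1]] + vocab_size)
        total -= math.log2(p)
        m += 1
    return total / max(m, 1)

def similarity_coefficient(corpus_a, corpus_b, test, n=3):
    """Hypothetical coefficient placing `test` on a 0..1 scale between
    the two reference corpora: values above 0.5 mean the test text is
    modelled better (lower cross-entropy) by corpus A than by corpus B."""
    h_a = cross_entropy(ngram_model(corpus_a, n), test, n)
    h_b = cross_entropy(ngram_model(corpus_b, n), test, n)
    return h_b / (h_a + h_b)
```

Because the models operate on characters rather than words, the same code applies unchanged to languages without explicit word segmentation, such as Japanese, which is the extension the abstract highlights.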
