Is Entropy Suitable to Characterize Data and Signals for Cognitive Informatics?
Witold Kinsner (University of Manitoba, Canada)
DOI: 10.4018/jcini.2007040103

This article provides a review of Shannon and other entropy measures for evaluating the quality of materials used in perception, cognition, and learning processes. Energy-based metrics are not suitable for cognition, as energy itself does not carry information. Instead, morphological (structural and contextual) metrics as well as entropy-based multi-scale metrics should be considered in cognitive informatics. Appropriate data and signal transformation processes are defined and discussed within the perceptual framework, followed by various classes of information and entropies suitable for the characterization of data, signals, and distortion. Other entropies are also described, including the Rényi generalized entropy spectrum, the Kolmogorov complexity measure, the Kolmogorov-Sinai entropy, and the Prigogine entropy for evolutionary dynamical systems. Although such entropy-based measures are suitable for many signals, they are not sufficient for scale-invariant (fractal and multifractal) signals without corresponding complementary multi-scale measures.
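As a point of orientation for the entropy measures named in the abstract, the following is a minimal sketch (not taken from the article) of how the Shannon entropy and the Rényi entropy of order q can be estimated from a probability mass function obtained by histogramming a signal; the signal, bin count, and q values are illustrative assumptions only.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H = -sum p_i log2 p_i (in bits), ignoring empty bins."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def renyi_entropy(p, q):
    """Rényi entropy of order q: H_q = log2(sum p_i^q) / (1 - q).
    Reduces to the Shannon entropy in the limit q -> 1."""
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return shannon_entropy(p)
    return np.log2(np.sum(p ** q)) / (1.0 - q)

# Illustrative example: estimate a probability mass function from a
# histogram of a synthetic signal (signal and bin count are arbitrary).
rng = np.random.default_rng(0)
signal = rng.normal(size=10_000)
counts, _ = np.histogram(signal, bins=64)
p = counts / counts.sum()

print(f"Shannon entropy:       {shannon_entropy(p):.3f} bits")
for q in (0.5, 2.0, 5.0):
    print(f"Rényi entropy (q={q}): {renyi_entropy(p, q):.3f} bits")
```

Sweeping q over a range of values yields the Rényi generalized entropy spectrum mentioned above; for scale-invariant (fractal and multifractal) signals, such single-scale estimates would need to be complemented by multi-scale measures, as the abstract notes.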
