Is Entropy Suitable to Characterize Data and Signals for Cognitive Informatics?

Witold Kinsner
DOI: 10.4018/jcini.2007040103

Abstract

This article reviews Shannon and other entropy measures for evaluating the quality of materials used in perception, cognition, and learning processes. Energy-based metrics are not suitable for cognition because energy itself does not carry information; instead, morphological (structural and contextual) metrics, as well as entropy-based multi-scale metrics, should be considered in cognitive informatics. Appropriate data and signal transformation processes are defined and discussed within the perceptual framework, followed by various classes of information and entropies suitable for characterizing data, signals, and distortion. Other entropies are also described, including the Rényi generalized entropy spectrum, the Kolmogorov complexity measure, the Kolmogorov-Sinai entropy, and the Prigogine entropy for evolutionary dynamical systems. Although such entropy-based measures are suitable for many signals, they are not sufficient for scale-invariant (fractal and multifractal) signals without corresponding complementary multi-scale measures.
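As an illustration (a sketch, not code from the article itself), the Shannon entropy and the Rényi generalized entropy of order q mentioned in the abstract can be computed for a discrete distribution as follows; the distribution `p` below is a made-up example:

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in bits) of a discrete probability distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def renyi_entropy(p, q):
    """Renyi entropy of order q (in bits); reduces to Shannon entropy as q -> 1."""
    if q == 1:
        return shannon_entropy(p)
    return math.log2(sum(pi ** q for pi in p if pi > 0)) / (1 - q)

# A dyadic example distribution: its Shannon entropy is exactly 1.75 bits.
p = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(p))    # 1.75
print(renyi_entropy(p, 0))   # 2.0 (Hartley entropy: log2 of the support size)
print(renyi_entropy(p, 2))   # ~1.54 (collision entropy)
```

Sweeping q traces out the Rényi entropy spectrum the abstract refers to; for a uniform distribution all orders coincide at log2(n), while for non-uniform distributions the entropy decreases as q grows.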
