Generalized Fuzzy Divergence Measure, Pattern Recognition, and Inequalities

Ram Naresh Saraswat, Neha Khatod
Copyright: © 2022 | Pages: 22
DOI: 10.4018/IJFSA.285983

Abstract

Many fuzzy information and divergence measures have been developed by various researchers and authors. In this paper, the authors propose a new fuzzy divergence measure using the properties of convex functions and the fuzzy concept. Applications of the proposed fuzzy divergence measure to pattern recognition are discussed together with a case study. Various new fuzzy information inequalities on fuzzy divergence measures are obtained. New relations between the proposed and existing fuzzy divergence measures are established using the new f-divergence, Jensen's inequality, properties of convex functions, and related inequalities. Finally, these results and the proposed fuzzy divergence measure are verified with a numerical example.
Article Preview

1. Introduction

Information theory lies at the intersection of mathematics, statistics, computer science, physics, neurobiology, and electrical engineering, and it treats the uncertainty arising from randomness in a precise way. Nyquist (1924, 1928) and Hartley (1928) were the first to discover the logarithmic nature of the measure of information. In his paper 'Certain Factors Affecting Telegraph Speed', Nyquist discussed the relation $W = K \log m$, where $W$ is the speed of transmission of intelligence, $m$ is the number of different voltage levels to choose from at each time step, and $K$ is a constant. The impact of information theory has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones, the development of the Internet, the study of linguistics and of human perception, the understanding of black holes, and numerous other fields. In 1948, Shannon gave a new idea in this direction, now called Shannon entropy. His seminal paper 'A Mathematical Theory of Communication' marked the beginning of a separate branch of learning, namely information theory. Shannon entropy plays a central role in information theory and in fuzzy mathematics. Entropy is referred to as a measure of uncertainty; it is defined on a probability distribution and can be shown to be a good measure of randomness or uncertainty. The applications of entropy are widely used in diverse areas such as decision making, quantum information theory, communication theory, image registration, finance, and pattern recognition. (Shannon, 1948) introduced the following measure of information:

$$H(P) = -\sum_{i=1}^{n} p_i \log p_i$$
(1.1)

where

$$\Gamma_n = \left\{ P = (p_1, p_2, \ldots, p_n) : p_i \ge 0, \ \sum_{i=1}^{n} p_i = 1 \right\} \quad \text{and} \quad \Delta_n = \left\{ P = (p_1, p_2, \ldots, p_n) : p_i \ge 0, \ \sum_{i=1}^{n} p_i \le 1 \right\}, \quad n \ge 2,$$

denote the sets of all finite discrete complete and generalized probability distributions, respectively. Equation (1.1) is Shannon's entropy. The function $H(P)$ represents the expected value of the uncertainty associated with the given probability distribution, and it is uniquely determined by some rather natural postulates.
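For concreteness, a minimal numerical sketch of equation (1.1) is given below; the function name shannon_entropy and the example distribution are illustrative assumptions, not from the paper, and terms with $p_i = 0$ are taken to contribute zero by the usual convention $0 \log 0 = 0$.

import math

def shannon_entropy(p, base=math.e):
    # H(P) = -sum_i p_i * log(p_i) for a complete discrete distribution P;
    # terms with p_i = 0 contribute 0 by the convention 0 * log 0 = 0.
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# Uniform distribution over four outcomes: entropy attains its maximum, log 4.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25], base=2))  # 2.0 bits

The choice of logarithm base only rescales the entropy (base 2 gives bits, base e gives nats), which is why it appears as a parameter rather than a fixed constant.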

A divergence measure is a distance or affinity between two probability distributions and is also interpreted as the inaccuracy of information. Divergence measures involving two discrete probability distributions were initially called discrimination functions; various authors later referred to them as cross entropy, relative information, etc. (Kullback et al., 1951) developed a very important information and divergence measure, which is given by:

$$D(P \| Q) = \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i}$$
(1.2)

where $P, Q \in \Gamma_n$. (Jain & Saraswat, 2012 and 2013) introduced a new f-divergence measure, which is given by:

$$S_f(P, Q) = \sum_{i=1}^{n} q_i \, f\!\left(\frac{p_i + q_i}{2 q_i}\right)$$
(1.3)

where $f : (0, \infty) \to \mathbb{R}$ is a convex function and $P, Q \in \Gamma_n$. The idea of 'probabilistic divergence', which in some sense assesses how 'close' two probability distributions are to one another, has been widely applied in probability, statistics, and information theory.
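As a hedged illustration of equations (1.2) and (1.3), the sketch below evaluates both measures on two toy distributions; the helper names and the particular convex choice $f(t) = t \log t$ (which satisfies $f(1) = 0$) are assumptions made here for the example, not the authors' choices. With this $f$, $S_f(P, Q)$ reduces to the Kullback-Leibler divergence from the midpoint distribution $(P + Q)/2$ to $Q$.

import math

def kl_divergence(p, q):
    # Kullback-Leibler divergence D(P || Q) = sum_i p_i * log(p_i / q_i), equation (1.2).
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def new_f_divergence(p, q, f):
    # S_f(P, Q) = sum_i q_i * f((p_i + q_i) / (2 * q_i)), equation (1.3),
    # for a convex f on (0, infinity) with f(1) = 0, so that S_f(P, P) = 0.
    return sum(qi * f((pi + qi) / (2.0 * qi)) for pi, qi in zip(p, q))

P = [0.1, 0.2, 0.3, 0.4]
Q = [0.25, 0.25, 0.25, 0.25]
print(kl_divergence(P, Q))                                # D(P || Q)
print(new_f_divergence(P, Q, lambda t: t * math.log(t)))  # S_f(P, Q) with f(t) = t log t

Choosing a different convex $f$ with $f(1) = 0$ yields a different member of the same family, which is what allows the later inequalities to relate several divergence measures through a single construction.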
