An Advanced Entropy Measure of IFSs via Similarity Measure

The entropy measure of an intuitionistic fuzzy set (IFS) plays a significant role in the decision-making sciences, for instance in medical diagnosis, pattern recognition, and criminal investigation. An inadequate entropy measure may lead to invalid results. Therefore, it is important to use an efficient entropy measure when studying decision-making problems in an IFS environment. This paper first proposes a novel similarity measure for IFSs. Based on the proposed similarity measure, an advanced entropy measure is defined with a different axiomatic approach. This axiomatic approach allows us to measure an IFS's entropy with the help of a similarity measure. To show the efficiency of the proposed similarity measure, a comparative study is performed against the existing similarity measures. Some structural linguistic variables are taken as examples to show the validity and consistency of the proposed entropy measure alongside the existing entropy measures. Finally, a multi-criteria decision-making problem is solved based on the proposed entropy measure.


INTRODUCTION
In 1986, Atanassov developed the IFS theory, which is an extension of Zadeh's fuzzy set theory. Similarity and entropy measures are two essential tools for dealing with uncertainty through the IFS theory. Different similarity and entropy measures have been proposed and applied successfully in many areas. Similarity measures defined from well-known distance measures are depicted by Szmidt et al. (2000), Wang et al. (2005), Grzegorzewski (2004), Chen (2007), and Hung et al. (2007). Li and Cheng (2002), Liang and Shi (2003), Hwang et al. (2012), Xu (2007), and Xu and Yager (2009) gave several new similarity measures for IFSs. Mitchell (2003) developed a statistical modification of Dengfeng and Chuntian's similarity measure by giving some counterintuitive cases. Ye (2011) compared the existing similarity measures and proposed new and weighted similarity measures using the cosine function. Xu and Chen (2008) developed a series of similarity measures by generalizing the weighted Hamming distance, the weighted Euclidean distance, and the weighted Hausdorff distance. Xia and Xu (2010) and Zeng and Guo (2008) worked on distance, similarity, and entropy measures and studied their relationships. A generalization of the existing entropy measures for IFSs was proposed by Wei et al. (2011). Boran and Akay (2014) introduced a new general type of similarity measure for IFSs involving two parameters, the $L_p$ norm and the level of uncertainty. Li et al. (2012) studied both the similarity measure and the entropy measure for IFSs by defining an axiomatic approach to the similarity measure. Entropy is an effective measure of the fuzziness of a fuzzy set, and many researchers have defined entropy measures. Bhandari and Pal (1993), De Luca and Termini (1972), Fan and Ma (2002), Shore and Gray (1982), Zhang and Jiang (2008), Ye (2010), Verma and Sharma (2013), Pal and Pal (1989), Wei et al. (2012), Wang and Wang (2012), Liu and Ren (2014), Song et al. (2014), Szmidt and Kacprzyk (2001), Vlachos and Sergiadis (2007), Burillo and Bustince (1996), Zeng and Li (2006), Zhang and Zhang (2009), Farhadinia (2013), Liu (1992), Zeng and Guo (2008), Li and Deng (2012), Zhang et al. (2014), Garg et al. (2011), and Hung and Yang (2006) worked on entropy measures for IFSs from different aspects. Along with the study of entropy measures, Li et al. (2010, 2011, 2014, 2015, 2016, 2017) have worked to develop many ranking processes for IFSs and interval-valued intuitionistic fuzzy sets (IVIFSs). In the field of IVIFSs, Talukdar et al. (2019) have also proposed a novel ranking method to enhance the efficiency of the process. Some novel entropy measures have been developed by Thao (2021), Verma et al. (2017), Wei et al. (2019), Zhu et al. (2016), Wei et al. (2021), Li et al. (2015), Li et al. (2002), and Li et al. (2004) from a variety of perspectives.
Although different entropy measures of IFSs have been developed in the literature, there are many situations where they do not capture the exact measure of the IFSs. Therefore, it sometimes becomes challenging to select the best choice that reflects the correct nature of the IFSs. Thus, developing efficient and reliable entropy measures from different perspectives is a fundamental and essential task.
This motivates us to study effective novel entropy measures with different approaches. This paper presents a novel similarity measure for IFSs. Based on that similarity measure, an advanced entropy measure is defined to measure the degree of fuzziness of IFSs by introducing a new axiomatic approach.

Structure of the Paper
The remainder of the paper is organised as follows. Section 2 starts with some relevant preliminary definitions. The proposed intuitionistic fuzzy similarity measure and entropy measure are presented in Sections 3 and 4, respectively. Sections 5 and 6 show the advantages and necessity of the proposed similarity and entropy measures compared with the earlier methods. Two multi-criteria decision-making problems are solved using the proposed entropy measure in Section 7. Finally, a concrete conclusion is drawn in Section 8.

Fuzzy Set
A fuzzy set is a set in which every element has a degree of membership of belonging to it. Mathematically, let $X$ be a universal set. A fuzzy subset $A$ of $X$ is defined by its membership function $\mu_A : X \to [0, 1]$, where the value $\mu_A(x)$ at $x$ gives the grade of membership of $x$ in $A$.

Intuitionistic Fuzzy Set
An intuitionistic fuzzy set $A$ on a universe of discourse $X$ is of the form $A = \{\langle x, \mu_A(x), \nu_A(x)\rangle : x \in X\}$, where $\mu_A(x)$ is called the degree of membership and $\nu_A(x)$ the degree of non-membership of $x$ in $A$, and $\mu_A(x)$ and $\nu_A(x)$ satisfy the condition $0 \le \mu_A(x) + \nu_A(x) \le 1$.

The amount $\pi_A(x) = 1 - \mu_A(x) - \nu_A(x)$ is called the hesitancy of $x$, which reflects the lack of commitment or uncertainty associated with the membership, the non-membership, or both in $A$.
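As a small computational aid, the constraints on membership, non-membership, and hesitancy above can be captured in a helper class; the class name and layout below are illustrative, not from the paper.

```python
from dataclasses import dataclass

# Minimal sketch of an intuitionistic fuzzy value; names are illustrative.
@dataclass(frozen=True)
class IFN:
    mu: float  # degree of membership
    nu: float  # degree of non-membership

    def __post_init__(self):
        # Atanassov's constraint: mu, nu >= 0 and mu + nu <= 1
        if not (0.0 <= self.mu and 0.0 <= self.nu and self.mu + self.nu <= 1.0):
            raise ValueError("invalid IFN: need mu, nu >= 0 and mu + nu <= 1")

    @property
    def pi(self) -> float:
        # hesitancy margin: pi = 1 - mu - nu
        return 1.0 - self.mu - self.nu

# e.g. the hesitancy of <x, 0.6, 0.2> is 1 - 0.6 - 0.2 = 0.2
```

Here `pi` is exactly the hesitancy $\pi_A(x)$ of the definition, so a value with $\mu + \nu = 1$ has zero hesitancy.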

Definition

For $A \in \mathrm{IFS}(X)$ and $B \in \mathrm{IFS}(X)$, some relations between them are defined as follows:
R1. $A \subseteq B$ if and only if $\mu_A(x) \le \mu_B(x)$ and $\nu_A(x) \ge \nu_B(x)$ for all $x \in X$;
R2. $A = B$ if and only if $A \subseteq B$ and $B \subseteq A$;
R3. $A^c = \{\langle x, \nu_A(x), \mu_A(x)\rangle : x \in X\}$, where $A^c$ is the complement of $A$.

PROPOSED INTUITIONISTIC FUZZY SIMILARITY MEASURE
In this section, a novel similarity measure is proposed for IFSs.
We then propose the novel similarity measure $S_p(A, B)$ as follows, where $\mu_A(x_i)$ and $\nu_A(x_i)$ are the membership degree and non-membership degree of the element $x_i \in X$.
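For orientation only, the sketch below shows how a classic distance-based IFS similarity is computed in practice. This is the well-known normalised-Hamming form, shown as a point of comparison; it is not the paper's proposed $S_p$.

```python
def hamming_similarity(A, B):
    """Similarity from the normalised Hamming distance between two IFSs,
    each given as a list of (mu, nu) pairs over the same universe.
    A standard textbook measure, shown only for comparison."""
    if len(A) != len(B):
        raise ValueError("IFSs must be defined over the same universe")
    n = len(A)
    dist = sum(abs(ma - mb) + abs(na - nb)
               for (ma, na), (mb, nb) in zip(A, B)) / (2.0 * n)
    return 1.0 - dist

# identical sets score 1; A = <0.3, 0.3> vs B = <0.4, 0.4> scores below 1
```

Distance-based measures like this one are the baseline against which the proposed $S_p$ is compared in the later comparative study.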

Theorem
The function $S_p(A, B)$ is a similarity measure between two IFSs $A$ and $B$ in $X$.

Proof sketch: verifying each axiom in turn, it is shown that the function $S_p(A, B)$ satisfies all the axioms of a similarity measure.

ENTROPY MEASURE
The entropy measure gives the degree of fuzziness associated with a fuzzy set, and the same idea carries over to IFSs. Szmidt and Kacprzyk (2001) described a geometric interpretation in which the points $A$ and $B$, corresponding to full belongingness $\langle 1, 0\rangle$ and full non-belongingness $\langle 0, 1\rangle$ respectively, represent non-fuzzy (crisp) elements and have entropy equal to zero. If we move along the line from point $A$ to point $B$, the membership values decrease and the non-membership values increase, and at the midpoint both become equal to $0.5$; at this midpoint the entropy equals $100\%$. Subsequently, it has been shown that for an IFS the degree of fuzziness is highest at all points where $\mu(x) = \nu(x)$. Since the similarity measure between two IFSs gives their degree of similarity, a higher similarity degree between an IFS $A$ and $\langle x, 0.5, 0.5\rangle$ leads to a higher entropy of $A$. This motivates us to define a new axiomatic approach and a new entropy measure based on a similarity measure.

De Luca and Termini's Axioms for the Entropy Measure of an IFS

A real-valued function $E : \mathrm{IFS}(X) \to [0, 1]$ is called an entropy measure on $\mathrm{IFS}(X)$ if, for all $A, B \in \mathrm{IFS}(X)$:
E1. $E(A) = 0$ if and only if $A$ is a crisp set;
E2. $E(A) = 1$ if and only if $\mu_A(x) = \nu_A(x)$ for all $x \in X$;
E3. $E(A) = E(A^c)$;
E4. $E(A) \le E(B)$ if $A$ is less fuzzy than $B$, that is, $\mu_A(x) \le \mu_B(x)$ and $\nu_A(x) \ge \nu_B(x)$ whenever $\mu_B(x) \le \nu_B(x)$, or $\mu_A(x) \ge \mu_B(x)$ and $\nu_A(x) \le \nu_B(x)$ whenever $\mu_B(x) \ge \nu_B(x)$.

Some Existing Entropy Measure
Burillo and Bustince (1996) defined the entropy function as $E_{BB}(A) = \sum_{i=1}^{n} \pi_A(x_i) = \sum_{i=1}^{n} \bigl(1 - \mu_A(x_i) - \nu_A(x_i)\bigr)$. Szmidt and Kacprzyk (2001) developed a ratio-based entropy measure from the geometric interpretation of IFSs, $E_{SK}(A) = \frac{1}{n} \sum_{i=1}^{n} \frac{\min\{\mu_A(x_i), \nu_A(x_i)\} + \pi_A(x_i)}{\max\{\mu_A(x_i), \nu_A(x_i)\} + \pi_A(x_i)}$. Hung and Yang (2006) defined two further entropy measures, and Zhang and Jiang (2008) defined another entropy measure.
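A minimal sketch of two of the existing measures just named, the Burillo-Bustince and Szmidt-Kacprzyk entropies, assuming each IFS is represented as a list of (mu, nu) pairs; the formulas are the standard published ones.

```python
def entropy_bb(A):
    """Burillo-Bustince (1996) entropy: the total hesitancy of the set."""
    return sum(1.0 - mu - nu for mu, nu in A)

def entropy_sk(A):
    """Szmidt-Kacprzyk (2001) ratio entropy: averages, over all elements,
    (min(mu, nu) + pi) / (max(mu, nu) + pi), with pi = 1 - mu - nu."""
    total = 0.0
    for mu, nu in A:
        pi = 1.0 - mu - nu
        total += (min(mu, nu) + pi) / (max(mu, nu) + pi)
    return total / len(A)

# a crisp element <1, 0> has SK entropy 0; <0.5, 0.5> has SK entropy 1
```

Note that $E_{BB}$ depends only on the hesitancy, so it cannot distinguish sets that differ only in how $\mu$ and $\nu$ are balanced; this is the behaviour criticised in the comparative study below.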

New Axiomatic Approach of Entropy Measure
The new axiomatic definition is developed from the observation that the IFS $\langle x, \tfrac{1}{2}, \tfrac{1}{2}\rangle$, whose hesitancy margin is zero, is the fuzziest IFS: the more similar an IFS is to $\langle x, \tfrac{1}{2}, \tfrac{1}{2}\rangle$, the higher its entropy. Let $X = \{x_1, x_2, \ldots, x_n\}$ be a universal set, let $A = \{\langle x, \mu_A(x), \nu_A(x)\rangle : x \in X\}$ be an IFS, and let $P = \{\langle x, 0.5, 0.5\rangle : x \in X\}$. Then the entropy measure is defined as $E(A) = S(A, P)$, where $S(A, P)$ is the similarity measure between the IFSs $A$ and $P$.
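The definition $E(A) = S(A, P)$ can be sketched directly. The paper uses its own $S_p$ inside this construction; the stand-in similarity below is an assumption chosen so that, measured against $P$, it still returns 0 for crisp sets and 1 when $\mu = \nu$ everywhere.

```python
def entropy_via_similarity(A):
    """Entropy of an IFS A (a list of (mu, nu) pairs) as its similarity to
    the maximally fuzzy set P = {<x, 0.5, 0.5>}. The stand-in similarity
    1 - (1/n) * sum(|mu - 0.5| + |nu - 0.5|) stays in [0, 1] against P,
    because |mu - 0.5| <= 0.5 and |nu - 0.5| <= 0.5 for every element."""
    n = len(A)
    dist = sum(abs(mu - 0.5) + abs(nu - 0.5) for mu, nu in A) / n
    return 1.0 - dist

# crisp elements give entropy 0; the point <0.5, 0.5> gives entropy 1
```

This illustrates the axiomatic idea only: any similarity measure satisfying the paper's axioms, in particular the proposed $S_p$, can be plugged into the same template.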

Theorem
The function $E(A) = S(A, P)$ defined above is an entropy measure. In particular, since each term of the left-hand side is positive, the corresponding equality holds only if $\mu_A(x_i) = \nu_A(x_i)$ for every $x_i \in X$.

A COMPARATIVE STUDY OF THE INTUITIONISTIC FUZZY SIMILARITY MEASURE WITH THE EXISTING SIMILARITY MEASURES
This section exhibits a comparative study between the proposed similarity measure and the existing similarity measures by considering some standard numerical examples of IFSs. The counterintuitive examples of IFSs, along with their similarity degrees under the various similarity measures, are depicted in Table 1. For the IFSs $A = \langle 0.3, 0.3\rangle$ and $B = \langle 0.4, 0.4\rangle$, the table shows that the degree of similarity for the similarity measures in Chen (1995), Hwang et al. (2012), and Ye (2011) is identical and equal to 1. Thus, the similarity measures in Chen (1995), Hwang et al. (2012), and Ye (2011) do not satisfy the axiom (SP2). On the other hand, the similarity measures in Hong and Kim (1999), Li et al. (2002), Mitchell (2003), and Liang et al. (2003) show equal values for different pairs $A, B$, as seen in Table 1. This situation is unsuitable for applications in fields like pattern recognition and medical decision making. The similarity measures in Hung et al. (2004) likewise take the same values for two different pairs $A, B$ in cases 5 and 6, although it is a general intuition that, for example, the pair of IFSs $A = \langle 0.4, 0.2\rangle$ and $B = \langle 0.5, 0.3\rangle$ should not be scored identically to other pairs. Therefore, it is observed that the existing similarity measures have some limitations in describing particular situations of IFSs. On the other hand, our proposed similarity measure can handle such situations properly and shows reasonable performance for each pair of IFSs; the similarity measure in Boran and Akay (2014) also shows reasonable results for each pair. Furthermore, our similarity measure has the advantage of a relatively simple expression without counterintuitive examples. For instance, the similarity measure in Boran and Akay (2014) involves two parameters, the $L_p$ norm and the level of uncertainty $t$. Compared with our similarity measure, the similarity measure in Boran and Akay (2014) faces the difficulty of determining the two parameters $p$ and $t$.

A COMPARATIVE STUDY OF THE DIFFERENT EXISTING ENTROPY MEASURES WITH THE PROPOSED ENTROPY MEASURE
Let us consider the IFSs $S_1$, $S_2$, and $S_3$. It is a general intuition that if we compare these IFSs with the IFS $\langle x, \tfrac{1}{2}, \tfrac{1}{2}\rangle$, which has the highest entropy and minimum hesitancy margin, then the set $S_3$ has the highest degree of similarity among them. Thus, from the axiom (E3.1) of the entropy measure, $S_3$ should have the highest entropy value.

The entropy values for the proposed entropy measure are $E(S_1) = 0.2782$, $E(S_2) = 0.2911$, and $E(S_3) = 0.5681$. This ordering of the entropy measure is identical to that of the Szmidt and Kacprzyk (2001) entropy measure, which they explained by giving a geometrical interpretation of IFSs. But the entropy measure $E_{BB}$ defined by Burillo and Bustince fails to reflect the above phenomenon.
Characterising the linguistic variable "large" by $A$ and using the above-defined operation $A^n$, we compute the IFSs $A^{1/2}$, $A^2$, $A^3$, and $A^4$ with their usual meanings as follows: $A^{1/2}$ may be treated as "more or less large", $A^2$ as "very large", $A^3$ as "quite very large", and $A^4$ as "very very large". For the computed sets the proposed entropy measure follows the ordering $E(A^{1/2}) > E(A) > E(A^2) > E(A^3) > E(A^4)$, and the other entropy measures except $E_{BB}$ follow the same ordering. Now we check the consistency of the proposed entropy measure for a different IFS, obtained by reducing the hesitancy degree of the middle point $x_3$. Thus the IFS $A$ becomes $A = \{\langle x_1, 0.1, 0.8\rangle, \langle x_2, 0.3, 0.5\rangle, \langle x_3, 0.5, 0.4\rangle, \langle x_4, 0.9, 0.0\rangle, \langle x_5, 1.0, 0.0\rangle\}$.
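The hedge operation $A^n$ used above, $\langle \mu^n,\, 1 - (1 - \nu)^n\rangle$, can be sketched in a few lines; it reproduces the midpoint values quoted later in this section.

```python
def ifs_power(A, n):
    """Linguistic-hedge operation on an IFS given as (mu, nu) pairs:
    A**n = <mu**n, 1 - (1 - nu)**n>. n = 2 models 'very' (concentration),
    n = 0.5 models 'more or less' (dilation)."""
    return [(mu ** n, 1.0 - (1.0 - nu) ** n) for mu, nu in A]

# dilating the midpoint <0.5, 0.4> with n = 1/2 gives approximately
# <0.7071, 0.2254>, the A_1^{1/2} midpoint used in the justification
```

Likewise, dilating $\langle 0.5, 0.5\rangle$ gives approximately $\langle 0.7071, 0.2929\rangle$, matching the $A_2^{1/2}$ midpoint discussed in this section.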

The computed entropy values for $A$, with their usual meanings, are shown in Table 3.

From the viewpoint of mathematical operations and human intuition, the entropies of these IFSs should satisfy the following requirement for the structural linguistic variable: $E(A^{1/2}) > E(A) > E(A^2) > E(A^3) > E(A^4)$. (8)

Entropy Measure for IFS
From Tables 2, 3, and 4 it is seen that the proposed entropy measure, along with the entropy measures $E_{SK}$ and $E_{ZJ}$, satisfies the above requirement (8). Furthermore, these results are reasonable from the viewpoint of structural linguistic variables.
Justification: For the IFSs $A$, $A_1$, and $A_2$, the changes in membership and non-membership values occur only at the midpoint $x_3$; for the other points the membership and non-membership values remain the same. Thus, the differences in the entropy values of these IFSs depend only on this midpoint $x_3$. Now, we change the IFS $A$ into $A_1$ by changing the midpoint $\langle x_3, 0.6, 0.2\rangle$ to $\langle x_3, 0.5, 0.4\rangle$. From the entropy measure and the geometrical interpretation of IFSs defined by Szmidt and Kacprzyk (2001), the entropy of the point $\langle x_3, 0.5, 0.4\rangle$ of $A_1$ is higher than that of the point $\langle x_3, 0.7071, 0.2254\rangle$ of $A_1^{1/2}$. As a result, the entropy measure of the IFS $A_1$ is higher than that of the IFS $A_1^{1/2}$, which is clear from Table 2. The same result is reflected by the entropy measure defined by Zhang and Jiang (2008). From the viewpoint of our new axiomatic definition of the entropy measure, if we compare the similarity of the points $\langle x_3, 0.5, 0.4\rangle$ and $\langle x_3, 0.7071, 0.2254\rangle$ with the maximally fuzzy point $\langle x_3, 0.5, 0.5\rangle$, the former is the more similar. Therefore, from the axiom (E1.3) it immediately follows that the entropy of the point $\langle x_3, 0.5, 0.4\rangle$ of the IFS $A_1$ is higher than the entropy of the point $\langle x_3, 0.7071, 0.2254\rangle$

of the IFS $A_1^{1/2}$. Hence the entropy measure of the IFS $A_1$ is higher than that of the IFS $A_1^{1/2}$, as seen from Table 2. Similarly, we can give the proper explanation for the IFSs $A$, $A^2$, $A^3$, and $A^4$. Thus our proposed entropy measure, along with the entropy measures defined by Szmidt & Kacprzyk (2001) and Zhang & Jiang (2008), follows the same ordering. Again we change the IFS $A_1$ into $A_2$ by changing $\langle x_3, 0.5, 0.4\rangle$ to $\langle x_3, 0.5, 0.5\rangle$. Then the corresponding midpoint of the IFS $A_2^{1/2}$ is $\langle x_3, 0.7071, 0.2929\rangle$. From Table 3 it is seen that the entropy measures $E_{SK}$, $E_{ZJ}$, and $E_{PP}$ follow the required ordering. For the IFS $A$, the entropy measures $E_{PP}$, $E_{ZJ}$, and $E_{SK}$ follow the same ordering, and the other entropies except $E_{BB}$ follow a different ordering. When we change the IFS $A$ into $A_1$ by changing the midpoint $\langle x_3, 0.6, 0.2\rangle$ to $\langle x_3, 0.5, 0.4\rangle$, the corresponding midpoint of the IFS $A_1^{1/2}$ becomes $\langle x_3, 0.7071, 0.2254\rangle$. We have explained above that the entropy of the point $\langle x_3, 0.5, 0.4\rangle$ of the IFS $A_1$ is higher than that of the point $\langle x_3, 0.7071, 0.2254\rangle$ of the IFS $A_1^{1/2}$. Hence for the IFSs $A_1$ and $A_1^{1/2}$ the entropy measures should satisfy the ordering $E(A_1) > E(A_1^{1/2})$; among the remaining measures, only $E_{r = 1/3}$ satisfies this requirement. Similarly, since the point $\langle x_3, 0.5, 0.5\rangle$ of the IFS $A_2$ has the highest fuzziness compared with $\langle x_3, 0.7071, 0.2929\rangle$ of the IFS $A_2^{1/2}$, the entropy measures should satisfy the ordering $E(A_2) > E(A_2^{1/2})$. Thus it is seen that the proposed axiomatic approach to the entropy measure can be taken as an alternative way of defining new entropy measures with the help of a similarity measure. From the comparative study of the different entropy measures in Tables 2, 3, and 4, it can be concluded that the proposed entropy measure is a reliable entropy measure of IFSs and possesses the proper ordering pattern from the viewpoint of structural linguistic variables.

A MULTI-CRITERIA INTUITIONISTIC FUZZY DECISION MAKING METHOD BASED ON THE PROPOSED ENTROPY MEASURE
MCDM is the most reliable approach for selecting the best alternative from a set of alternatives with respect to various criteria when uncertain or ill-defined information is involved. In this section, based on the proposed entropy measure, a multi-criteria decision-making method is discussed in which the assessments of the alternatives for the different criteria are specified by IFNs. Let $A = \{A_1, A_2, \ldots, A_n\}$ be the set of $n$ alternatives and $C = \{C_1, C_2, \ldots, C_m\}$ be the set of $m$ criteria, where each rating of an alternative against a criterion is an IFN. To select the best choice under this setting, the following procedure is applied. As different types of criteria may exist for the alternatives, if all the criteria are of the same type then no normalisation of the ratings is necessary. Otherwise, the benefit criteria can be transformed into cost criteria by using the complement-based normalisation formula. Then the aggregate entropy measures are calculated using the proposed entropy measure. It is a general fact that the less uncertain the information about an alternative with respect to the criteria, the better the alternative is. The ranking order is obtained based on this principle.
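The procedure just described can be sketched end to end. The toy entropy and data below are illustrative assumptions; the paper's actual method uses its proposed entropy measure and the decision matrices of the worked examples.

```python
def normalise(matrix, kinds):
    """Turn benefit ratings into cost type by taking the complement <nu, mu>;
    matrix[i][j] is the (mu, nu) rating of alternative i on criterion j,
    kinds[j] is 'benefit' or 'cost'."""
    return [[(nu, mu) if kind == 'benefit' else (mu, nu)
             for (mu, nu), kind in zip(row, kinds)]
            for row in matrix]

def rank_by_entropy(matrix, kinds, entropy):
    """Rank alternatives by ascending aggregate entropy: the alternative
    whose ratings carry the least uncertainty comes first."""
    scores = [entropy(row) for row in normalise(matrix, kinds)]
    return sorted(range(len(matrix)), key=scores.__getitem__), scores

# stand-in entropy (Szmidt-Kacprzyk ratio form) for the demonstration
def sk_entropy(row):
    return sum((min(m, n) + (1 - m - n)) / (max(m, n) + (1 - m - n))
               for m, n in row) / len(row)
```

For instance, `rank_by_entropy([[(0.9, 0.1), (0.8, 0.1)], [(0.5, 0.4), (0.5, 0.5)]], ['cost', 'cost'], sk_entropy)` ranks the first alternative ahead of the second, since its ratings carry far less fuzziness.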

Example
Let us consider the multi-criteria decision-making problem adapted from Nayagam (2011). Assume there exists a panel with four possible alternatives for investment: (1) $A_1$ is a car company, (2) $A_2$ is a food company, (3) $A_3$ is a computer company, and (4) $A_4$ is an arms company. The investment entity must make a decision according to the following criteria: $C_1$ (risk), $C_2$ (growth), and $C_3$ (environmental impact). Suppose the four possible alternatives are evaluated by the decision maker over the above three criteria using intuitionistic fuzzy numbers (IFNs), arranged in a decision matrix in which each entry $\langle x_{ij}, y_{ij}\rangle$ is an IFN whose components $x_{ij}$ and $y_{ij}$ indicate the degrees to which the alternative $A_i$ does and does not satisfy the criterion $C_j$, respectively. Since the less the entropy or uncertainty of the information of each alternative with respect to the three criteria, the better the investment alternative, we can rank the alternatives based on their entropy values.
Using the proposed entropy measure, the calculated aggregate entropy value for the alternative $A_1$ is $E(A_1) = 0.67729$. The aggregate entropy value of the alternative $A_2$ is the smallest, which implies that it carries the least uncertainty or fuzziness. Therefore, the decision maker can extract the most useful information from this alternative. Hence $A_2$ is the best investment choice for an investor, and the ranking order of the alternatives is $A_2 > A_4 > A_3 > A_1$, which coincides with the ranking obtained by Nayagam et al. (2011).

Example
Consider the MCDM problem adopted from Garg et al. (2017). Suppose a multinational company in India is planning its financial strategy for the next year according to its group strategy objective. Four alternatives are obtained after preliminary screening and are defined as follows: $A_1$, to invest in the Southern Asian markets; $A_2$, to invest in the Eastern Asian markets; $A_3$, to invest in the Northern Asian markets; and $A_4$, to invest in the local markets. The evaluation proceeds from four aspects, namely $G_1$, the growth analysis; $G_2$, the risk analysis; $G_3$, the social-political impact analysis; and $G_4$, the environmental impact analysis. These four alternatives $A_i$ $(i = 1, 2, 3, 4)$ are evaluated by the corresponding experts using an intuitionistic fuzzy decision matrix with entries for $i, j = 1, 2, 3, 4$. Using the proposed entropy measure, the aggregate entropy value for the alternative $A_1$ is $E(A_1) = 0.359672$. Since the decision maker can extract more information from an IFS that carries less entropy or fuzziness, the ranking order of the alternatives based on this principle is $A_4 > A_1 > A_2 > A_3$. Thus $A_4$, the local market, is the best choice for the decision maker.

CoNCLUSIoN
The entropy measure of an IFS plays an important role in decision-making problems. Although different entropy measures have been developed in the literature, many of them fail to reflect the exact nature of the IFS. In this paper, based on a new axiomatic approach, a novel entropy measure has been defined incorporating the concept of a similarity measure of IFSs. It has been proved that the proposed entropy measure satisfies all the properties of an entropy measure. The consistency and advantages of the newly defined similarity measure are discussed through a comparative study with some existing similarity measures. Also, in Section 6, the comparative study reflects the efficiency and reliability of the proposed entropy measure. Finally, in Section 7, the applicability of the proposed entropy measure is successfully exhibited through MCDM problems in an IFS environment.
In the future, to enhance the applicability of the proposed entropy measure, it can be extended to more sophisticated uncertainty environments, for instance interval-valued intuitionistic fuzzy sets and cubic Pythagorean fuzzy sets.