Measuring Given Partial Information about Intuitionistic Fuzzy Sets

Background: Measuring information and removing uncertainty are essential to human thinking and to many real-world objectives. Information is most useful when it is free from uncertainty and fuzziness. Shannon was the first to use the term entropy as a measure of uncertainty, and he gave an expression for entropy based on a probability distribution. Zadeh introduced the concept of fuzzy sets, and later Atanassov generalized fuzzy sets to intuitionistic fuzzy sets. Purpose: Sometimes we do not have complete information about a fuzzy set or an intuitionistic fuzzy set; only partial information is known, i.e., only a few values of the membership function μ_A(x_i) or the non-membership function ν_A(x_i) are known, or a relationship between them is known, or some inequalities governing these parameters are known. Kapur measured the partial information given by a fuzzy set. In this paper, we attempt to quantify the partial information given by an intuitionistic fuzzy set, considering all of these cases. Methodology: We analyze some well-known definitions and axioms used in fuzzy set theory. Principal Results: We devise methods to measure the incomplete information given about intuitionistic fuzzy sets. Major Conclusions: By devising methods of measuring partial information about an IFS, we can use this information to form an idea about the given set and thereby make better decisions.


Introduction
Information theory was developed by Shannon [1] in 1948. Shannon [1] was the first to use the term entropy for measuring information. Let X be a discrete random variable with probability distribution P = (p_1, p_2, ..., p_n) in an experiment. The information contained in this experiment is given by

H(P) = − ∑_{i=1}^{n} p_i ln(p_i),

which is well known as Shannon's entropy [1]. This notion of entropy was then extended to measure the information provided by a fuzzy set. De Luca and Termini [2] suggested the following measure of fuzzy entropy:

H(A) = − ∑_{i=1}^{n} [ μ_A(x_i) ln μ_A(x_i) + (1 − μ_A(x_i)) ln(1 − μ_A(x_i)) ],    (1)

where A is a fuzzy set defined on the universe of discourse X = {x_1, x_2, ..., x_n}. Kaufman [3] and Bhandari and Pal [4] defined further measures of fuzzy entropy, and numerous other entropy measures were subsequently developed for fuzzy sets. Fuzzy set theory has become very useful because of its wide applications. In recent times, Rana [5] studied fuzzy models in a comparative study of crop production forecasting. Radzi and Ahmed [6] worked on fuzzy soft sets and applied them in TOPSIS. Anuradha, Mehra et al. [7] worked on Archimedean fuzzy M-metric spaces and fixed point theorems. Atanassov [8] developed the concept of intuitionistic fuzzy sets (IFS), a generalization of fuzzy sets. An IFS is characterized by two functions: a membership function and a non-membership function. Atanassov [8] also defined a new term, the hesitancy degree, for intuitionistic fuzzy sets. Entropy measures were developed for intuitionistic fuzzy sets as well. Burillo and Bustince [9] were the first to characterize entropy on IFS. Szmidt and Kacprzyk [10] used the axioms of De Luca and Termini to characterize a non-probabilistic intuitionistic fuzzy entropy. Hung and Yang [11] suggested an intuitionistic fuzzy entropy based on the distance measure between IFSs.
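As a quick illustration, the two entropies above can be evaluated directly. This is a minimal sketch; the function names are ours, not from the paper:

```python
import math

def shannon_entropy(p):
    """Shannon's entropy H(P) = -sum_i p_i ln(p_i), with 0 ln 0 taken as 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def fuzzy_entropy(mu):
    """De Luca-Termini fuzzy entropy of a fuzzy set with membership values mu."""
    h = 0.0
    for m in mu:
        for t in (m, 1.0 - m):
            if t > 0:
                h -= t * math.log(t)
    return h

# The uniform distribution maximizes H at ln n, and mu = 1/2 everywhere
# maximizes the fuzzy entropy at n ln 2; crisp inputs give entropy 0.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # ln 4
print(fuzzy_entropy([0.5, 0.5, 0.5]))             # 3 ln 2
print(fuzzy_entropy([0.0, 1.0]))                  # 0.0
```

The same pattern (sum of −t ln t terms, skipping t = 0) recurs in all the entropy measures used below.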
Vlachos and Sergiadis [12] developed a mathematical connection between fuzzy entropy and intuitionistic fuzzy entropy and proposed a measure of intuitionistic fuzzy entropy. Afterwards, Zhang and Jiang [13] generalized the De Luca and Termini [2] logarithmic fuzzy entropy to define a measure of intuitionistic fuzzy entropy. Ye and Wei et al. [14,15] gave other types of intuitionistic fuzzy entropy based on trigonometric functions. Verma and Sharma [16] developed an entropy measure on IFS based on the exponential function. Chen and Li [17] proposed further types of entropies on IFSs. In 2018, Joshi and Kumar [18] devised a new parametric measure of entropy for IFS. Alusfyani and Owny [19] found an exponential intuitionistic fuzzy entropy measure and applied it to image edge detection. Yin, Yang et al. [20] used an existing fuzzy entropy measure to develop an interval-valued intuitionistic fuzzy entropy. Recently, Wei, Li et al. [21] have devised a novel generalized exponential entropy for IFS. Information theory is concerned with quantitatively measuring information, but sometimes complete information is not given and only partial knowledge about the variables is available. In 1997, Kapur [22] measured the partial information provided by a fuzzy set using the entropy measure given by De Luca and Termini.
He used the term fuzziness gap for the difference between the maximum and minimum values of the entropy. He observed that this fuzziness gap is reduced if we know some information about the fuzzy set, and so he used the fuzziness gap to quantify the partial information given by a fuzzy set. Then, in 2017, N. Singh [23] discussed various methods of measuring partial information using the entropy formulas given by De Luca and Termini, Kaufman, and Bhandari and Pal.
In this paper, we will generalize the ideas of Kapur [22] from fuzzy sets to intuitionistic fuzzy sets so as to measure the partial information given about them.

Methodology
Now, we study some well-known definitions and concepts related to fuzzy sets and intuitionistic fuzzy sets given by various researchers.

Basic Definitions and Preliminaries
Zadeh [24] introduced the concept of the fuzzy set.
Definition 1: Fuzzy set: Let X = {x_1, x_2, ..., x_n} be the universe of discourse. A fuzzy set A defined on X is A = {(x, μ_A(x)) : x ∈ X}, where μ_A : X → [0, 1] is the membership function of A.
Definition 2: Intuitionistic fuzzy set: An intuitionistic fuzzy set A on X is A = {(x, μ_A(x), ν_A(x)) : x ∈ X}, where μ_A, ν_A : X → [0, 1] are the membership and non-membership functions, with 0 ≤ μ_A(x) + ν_A(x) ≤ 1 for every x ∈ X. The quantity π_A(x) = 1 − μ_A(x) − ν_A(x) is called the hesitancy degree or the degree of indeterminacy [8].
Definition 3: A real function e : FS(X) → R⁺ is called an entropy on FS(X) if e satisfies the following properties: (i) e(A) = 0 if and only if A is a crisp set; (ii) e(A) is maximum if and only if μ_A(x_i) = 1/2 for all i; (iii) e(A) ≤ e(B) whenever A is a sharpened version of B; (iv) e(A) = e(A^c), where A^c denotes the complement of A [2].
Definition 4: A real function E : IFS(X) → R⁺ is called an entropy on IFS(X) if E satisfies the analogous properties: E(A) = 0 if and only if A is a crisp set; E(A) is maximum for the most intuitionistically fuzzy set; E(A) ≤ E(B) whenever A is sharper than B; and E(A) = E(A^c).
Lagrange's Method of Multipliers: Suppose we want to find the maximum or minimum value of a function f(x, y, z) subject to the constraint g(x, y, z) = k, where k is a real number. (i) Form a new function F(x, y, z, λ) = f(x, y, z) + λ(g(x, y, z) − k). (ii) For stationary points, put ∂F/∂x = 0, ∂F/∂y = 0, ∂F/∂z = 0. Solving these equations together with the constraint gives the values of x, y, z at which f(x, y, z) is extreme.
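The hesitancy degree and the complement operation from the definitions above can be sketched in a few lines (an illustrative sketch; the names are ours):

```python
def hesitancy(mu, nu):
    """Hesitancy degree pi = 1 - mu - nu of an IFS element (Definition 2)."""
    assert 0.0 <= mu <= 1.0 and 0.0 <= nu <= 1.0 and mu + nu <= 1.0
    return 1.0 - mu - nu

def complement(pairs):
    """Complement of an IFS: membership and non-membership are swapped,
    so the hesitancy degree of each element is unchanged."""
    return [(nu, mu) for mu, nu in pairs]

A = [(0.5, 0.25), (1.0, 0.0), (0.0, 0.0)]
print([hesitancy(mu, nu) for mu, nu in A])  # [0.25, 0.0, 1.0]
print(complement(A))                        # [(0.25, 0.5), (0.0, 1.0), (0.0, 0.0)]
```

Note that a fuzzy set is the special case ν_A(x) = 1 − μ_A(x), for which the hesitancy degree is identically zero.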

Fuzziness Gap
Hung and Yang [11] defined the entropy of an IFS A as

G(A) = − ∑_{i=1}^{n} [ μ_A(x_i) ln μ_A(x_i) + ν_A(x_i) ln ν_A(x_i) + π_A(x_i) ln π_A(x_i) ].

This entropy achieves its maximum when μ_A(x_i) = ν_A(x_i) = π_A(x_i) = 1/3 for every i. We will use the above measure of entropy in the rest of our paper.
Let A be an IFS. Suppose we have only partial information about μ_A(x) and ν_A(x), i.e., we know only a few of the μ_A(x_i)'s and ν_A(x_i)'s, or we know some relations between them. The information provided by such an IFS is incomplete. We now try to quantitatively measure the information provided about such an IFS. The entropy G of Hung and Yang given above measures the fuzziness associated with an intuitionistic fuzzy set.
Suppose we know nothing about the IFS A, i.e., none of the values of μ_A(x_i) and ν_A(x_i) are known. Then the maximum and minimum values of G are G_max = n ln 3, attained when μ_A(x_i) = ν_A(x_i) = π_A(x_i) = 1/3 for every i, and G_min = 0, attained when A is crisp.
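These two bounds can be checked numerically by evaluating G at the extreme points (a sketch; `ifs_entropy` is our own name for the Hung–Yang measure above):

```python
import math

def ifs_entropy(pairs):
    """G(A) = -sum_i [mu ln mu + nu ln nu + pi ln pi], with pi = 1 - mu - nu
    and 0 ln 0 taken as 0."""
    g = 0.0
    for mu, nu in pairs:
        for t in (mu, nu, 1.0 - mu - nu):
            if t > 0:
                g -= t * math.log(t)
    return g

n = 4
# G_max = n ln 3, attained when mu = nu = pi = 1/3 everywhere.
print(ifs_entropy([(1 / 3, 1 / 3)] * n), n * math.log(3))
# G_min = 0, attained for a crisp IFS.
print(ifs_entropy([(1.0, 0.0)] * n))  # 0.0
```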

Results
If we know something about an IFS, such as the values of some of the μ_A(x_i)'s and ν_A(x_i)'s or some relationships governing them, then that information is known as partial information about the IFS. The fuzziness gap changes depending upon the partial knowledge given. Hence,
Information given by partial knowledge = Fuzziness gap before we use the knowledge − Fuzziness gap after we use the knowledge.
We now illustrate the methods of measuring given partial information in different cases.

Methods of Measuring Given Partial Information in Different Cases
Case I: When none of the values of the μ_A(x_i)'s and ν_A(x_i)'s are known. Then the maximum and minimum values of G are G_max = n ln 3 and G_min = 0, so the fuzziness gap is n ln 3 (as discussed above).
Case II: When one or more values of (μ_A(x_i), ν_A(x_i)) are given. Suppose (μ_A(x_1), ν_A(x_1)) is known. Then the term containing μ_A(x_1) and ν_A(x_1) in G is fixed, and the remaining (n − 1) terms are completely unknown, so the fuzziness gap after using this knowledge is (n − 1) ln 3.
We have seen that the fuzziness gap before this knowledge was n ln 3. So, by knowing (μ_A(x_1), ν_A(x_1)), the fuzziness gap is reduced to (n − 1) ln 3, and therefore the information provided by (μ_A(x_1), ν_A(x_1)) is ln 3. If we know r values of (μ_A(x_i), ν_A(x_i)), then the corresponding fuzziness gap will be (n − r) ln 3 and the information provided by them will be r ln 3.
Hence, the fuzziness gap is reduced as we gain information about the IFS.
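The bookkeeping of Case II is simple enough to check directly (an illustrative sketch; the names are ours):

```python
import math

def gap_after(n, r):
    """Fuzziness gap once r of the (mu_i, nu_i) pairs are known (Case II)."""
    return (n - r) * math.log(3)

def information(n, r):
    """Gap before (n ln 3) minus gap after ((n - r) ln 3), i.e. r ln 3."""
    return n * math.log(3) - gap_after(n, r)

# Knowing 2 of 4 pairs yields information 2 ln 3; knowing all 4 pairs
# closes the fuzziness gap completely.
print(information(4, 2) / math.log(3))  # 2.0
print(gap_after(4, 4))                  # 0.0
```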
Case III: When one or more relationships among the μ_A(x_i)'s and ν_A(x_i)'s are given. Suppose the constraints are of the form ∑_{i=1}^{n} μ_A(x_i) = a_1 and ∑_{i=1}^{n} ν_A(x_i) = a_2.

If a_1 and a_2 are positive integers, then G_min = 0, since the constraints can be satisfied by a crisp IFS. Find the maximum value of G subject to the constraints; the fuzziness gap is then G_max − 0 = G_max. Subtract it from n ln 3 to get the information provided by the constraint.
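For sum constraints of the form assumed above, a crisp IFS meeting integer constraints a_1 and a_2 (with a_1 + a_2 ≤ n) is easy to exhibit, which is why G_min = 0 in this case. A sketch (names ours):

```python
import math

def ifs_entropy(pairs):
    """Hung-Yang entropy G(A), with pi = 1 - mu - nu and 0 ln 0 taken as 0."""
    g = 0.0
    for mu, nu in pairs:
        for t in (mu, nu, 1.0 - mu - nu):
            if t > 0:
                g -= t * math.log(t)
    return g

def crisp_ifs(n, a1, a2):
    """A crisp IFS with sum(mu) = a1 and sum(nu) = a2: a1 elements have mu = 1,
    a2 elements have nu = 1, and the rest have pi = 1 (full hesitancy)."""
    assert a1 + a2 <= n
    return [(1.0, 0.0)] * a1 + [(0.0, 1.0)] * a2 + [(0.0, 0.0)] * (n - a1 - a2)

A = crisp_ifs(4, 2, 1)
print(sum(mu for mu, _ in A), sum(nu for _, nu in A))  # 2.0 1.0
print(ifs_entropy(A))  # 0.0 -> G_min = 0 under integer constraints
```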

If a_1 and a_2 are not integers, then A cannot be crisp and G_min > 0. The minimum of G is obtained by setting as many of the μ_A(x_i)'s and ν_A(x_j)'s as possible equal to 0 or 1 and letting one μ_A(x_i) and one ν_A(x_j) take the fractional remainders, with i and j chosen so that the constraints are satisfied. The fuzziness gap is then G_max − G_min; subtracting it from n ln 3 gives the information provided by the constraint.

If a_1 is a positive integer but a_2 is not an integer, combine the two procedures above: the μ_A(x_i)'s can all be set to 0 or 1, while one ν_A(x_j) must take the fractional remainder of a_2. Compute the resulting fuzziness gap and subtract it from n ln 3 to get the information provided by the constraint.
Similarly, this can be done when a_2 is a positive integer but a_1 is not.
Case IV: When equality constraints are given. Suppose for an IFS we are given equality constraints on the μ_A(x_i)'s and the ν_A(x_i)'s. For finding G_max, we again use Lagrange's method of multipliers: form the Lagrangian by adding each constraint, multiplied by its own multiplier λ_1 or λ_2, to G, and set the partial derivatives with respect to each μ_A(x_i) and ν_A(x_i) to zero. Solving the resulting equations simultaneously gives μ_A(x_i) and ν_A(x_i) in terms of λ_1 and λ_2, and λ_1, λ_2 are then determined by substituting these values back into the two constraints. G_min can be found by taking n − 1 of the μ_A(x_i)'s and ν_A(x_i)'s to be zero or one, with the remaining ones determined by the constraints.
Case V: When inequality constraints are given. Suppose we are given inequality constraints on the μ_A(x_i)'s and ν_A(x_i)'s. In this case, we first find the maximum value of G subject to the constraints. For the minimum value of G, we keep one of the μ_A(x_i)'s and one of the ν_A(x_i)'s different from 0 and 1 and set all the others to zero or one; the non-zero, non-one values are chosen so that they satisfy the constraints.
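For sum constraints of the form ∑ μ_A(x_i) = a_1, ∑ ν_A(x_i) = a_2 (our assumed reading, matching the numerical example below), the Lagrange conditions on the symmetric, concave entropy G give the uniform point μ_i = a_1/n, ν_i = a_2/n as the constrained maximizer. A numerical sanity check by rejection sampling (names ours):

```python
import math
import random

def ifs_entropy(pairs):
    """Hung-Yang entropy G(A), with pi = 1 - mu - nu and 0 ln 0 taken as 0."""
    g = 0.0
    for mu, nu in pairs:
        for t in (mu, nu, 1.0 - mu - nu):
            if t > 0:
                g -= t * math.log(t)
    return g

def random_feasible(n, a1, a2):
    """A random IFS with sum(mu) = a1 and sum(nu) = a2 (rejection sampling)."""
    while True:
        w = [random.random() for _ in range(n)]
        mu = [a1 * wi / sum(w) for wi in w]
        v = [random.random() for _ in range(n)]
        nu = [a2 * vi / sum(v) for vi in v]
        if all(m <= 1.0 and x <= 1.0 and m + x <= 1.0 for m, x in zip(mu, nu)):
            return list(zip(mu, nu))

n, a1, a2 = 4, 2.3, 1.3
g_star = ifs_entropy([(a1 / n, a2 / n)] * n)  # Lagrange stationary (uniform) point

random.seed(1)
# No sampled feasible point beats the Lagrange stationary point.
assert all(ifs_entropy(random_feasible(n, a1, a2)) <= g_star for _ in range(500))
print(g_star)
```

By concavity of G, the check is guaranteed to pass: every feasible point has entropy at most that of the uniform point.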

Numerical Examples
Example: Suppose there are two companies, A and B, each assessed on n = 4 criteria, and a person wants to invest in the company that is more transparent in its policies. (a) Suppose the values of (μ(x_i), ν(x_i)) are known for two criteria of company A but for only one criterion of company B. (b) Alternatively, suppose only the sums of the membership and non-membership values are given: for company A, ∑ μ(x_i) = 2.3 and ∑ ν(x_i) = 1.3, while for company B, ∑ μ(x_i) = 2 and ∑ ν(x_i) = 1.8. Which company is more transparent in its policies?
Answer: (a) The values of the membership and non-membership functions are known for two criteria of A and for one criterion of B. By Case II, the fuzziness gap for company A after using the knowledge = (n − r) ln 3 = (4 − 2) ln 3 = 2 ln 3.
The fuzziness gap before using the knowledge for company A = 4 ln 3, so the partial information for company A given by the knowledge = 4 ln 3 − 2 ln 3 = 2 ln 3. Similarly, the fuzziness gap for company B after using the knowledge = (n − r) ln 3 = (4 − 1) ln 3 = 3 ln 3, the fuzziness gap before using the knowledge for company B = 4 ln 3, and the partial information for company B given by the knowledge = 4 ln 3 − 3 ln 3 = ln 3. As the information provided about company A is more than that provided about company B, company A is more transparent in its policies.
(b) Here, in the case of A, a_1 = 2.3, a_2 = 1.3 and n = 4. By Case III, putting the values of a_1, a_2 and n into the expression for the fuzziness gap, the fuzziness gap of A after using the information = 2.566. Therefore, the partial information provided by the constraint about company A = 4 ln 3 − 2.566 = 1.828. In the case of B, a_1 = 2, a_2 = 1.8 and n = 4. By Case III, putting the values of a_1, a_2 and n, the fuzziness gap of B after using the information = 2.9221. Therefore, the partial information provided by the constraint about company B = 4 ln 3 − 2.9221 = 1.4723. So, company A is more transparent in its policies.

Discussion
Kapur [22] gave methods to quantify the partial information given about a fuzzy set. Atanassov generalized the concept of fuzzy sets and defined intuitionistic fuzzy sets. IFSs are more useful than fuzzy sets, as they carry a membership degree, a non-membership degree, and a degree of hesitancy. So, by measuring the partial information given about an IFS, we can form an idea about the set and hence make a good decision on the basis of that information.

Conclusions
In this paper, we have developed methods to measure the partial information given by an IFS in different cases. Measuring the partial information about such sets helps in making better decisions. Earlier, Kapur [22] did this for fuzzy sets, and N. Singh [23] used different measures of entropy to calculate the same. We have generalized Kapur's methods for measuring partial information about fuzzy sets to IFSs. IFSs are more general and more efficient in handling uncertainty, and measuring the uncertainty associated with given information has always been a central concern. So, they are much better suited to this task than crisp sets and fuzzy sets.