Abstract
In this paper, a parametric generalization of Shannon entropy, the Logarithmic \(\beta\)-Norm entropy measure, is discussed. We derive the desired entropy characteristics of the new generalized entropy measure, including its positivity, expandability, extensivity and additivity, and in addition establish the much-desired scale-invariance property. This striking property is satisfied neither by Shannon entropy nor by its existing generalizations such as Renyi’s and Tsallis entropies. We then define the scale-invariant entropy measure on intuitionistic fuzzy sets and discuss several of its mathematical properties together with its validity. Thereafter, we suggest a new multi-criteria decision-making (MCDM) method using a weighted correlation coefficient-based VIKOR approach for ranking alternatives and measuring uncertainty in place of a distance measure. The proposed decision-making technique is illustrated through a numerical example based on supplier selection, solved with two approaches: the first considers partially known criteria weights, whereas the second deals with unknown criteria weights. The comparative results show the flexibility and effectiveness of the proposed approach for solving real-life problems.
1 Introduction
To cope with vague and uncertain information appropriately, Zadeh (1965) proposed the theory of fuzzy sets (FSs), extending the classical set by means of a membership degree (\(\mu\)). Before the development of FS theory, the only way to quantify uncertainty was probability. But probability measures uncertainty that must be expressed in precise numbers. Indistinct terms such as high speed, very rich, and very intelligent can be quantified by FS theory. Therefore, FSs are more useful than classical sets and probability theory for expressing uncertainty. Owing to their flexibility, FSs find applications in decision-making, risk evaluation, etc. Various valuable generalizations (Arya and Kumar 2020a) of fuzzy set theory have been proposed, but FSs are not enough to handle the hesitancy degree.
The intuitionistic fuzzy set (IFS) theory was proposed by Atanassov (1986) as a prominent extension of FSs. IFSs are characterized by two degrees, membership \((\mu \in [0,1])\) and non-membership \((\nu \in [0,1])\), satisfying \(0 \le (\mu + \nu ) \le 1\). Atanassov added a third factor to the existing framework of FSs, called the ’intuitionistic index’ \((\pi )\), with the equation \(\mu + \nu + \pi =1\). Ratika and Kumar (2020) introduced a measure to compute the degree of distance between intuitionistic fuzzy sets based on Renyi–Tsallis entropy. Thus, IFSs are more flexible in handling uncertain real-life problems. Consider an example on voting. During voting, the section of people that votes in favor of the government comprises the membership degree, and the section that votes against the government constitutes the non-membership degree. In reality, another section of people exists that is undecided whether to vote for or against the government. In the FSs proposed by Zadeh (1965), this section of people does not get due representation. To cover this section of people, Atanassov (1986) added one more component to the existing structure of FSs, the ‘intuitionistic index’ \((\pi )\). This improved the adaptability of FSs for real-world problems.
Weight determination is a key issue in fuzzy MCDM, as considered by different authors (Arya and Kumar 2020, 2020c; Fahmi et al. 2019; Joshi and Kumar 2018). Among the available methods, the entropy weight method is the most frequently employed. Thus, entropy has become a central topic in FS theory. Several researchers have considered intuitionistic fuzzy entropy (IFE) and suggested different formulas. Zhu and Li (2016) introduced a new definition and formula of entropy for IFSs. Bhandari and Pal (1975) were among the first researchers to suggest a generalized entropy. Liu and Ren (2015) suggested a cosine function-based IFE. Logarithmic function-based entropies were suggested by Mao et al. (2013), Xiong et al. (2017) and Mishra et al. (2017). Joshi and Kumar (2018) implemented a parameter-based entropy measure; by varying the parameter, the entropy measure is both flexible and consistent.
Multi-criteria decision-making (MCDM) techniques are useful for choosing the best alternative among multiple decision alternatives. Therefore, MCDM is an important area where IFSs have a wide scope of applications. Many theories and tools have been developed by different authors for solving MCDM problems (Chen and Chang 2016; Chen et al. 2016; Zeng et al. 2019; Rani et al. 2019). Opricovic (1998) introduced the VIKOR (VIsekriterijumska Optimizacija i Kompromisno Resenje) method, Benayoun et al. (1966) suggested ELECTRE (ELimination Et Choice Translating REality), and Brans and Mareschel (1984) introduced the PROMETHEE (Preference Ranking Organization METHod for Enrichment Evaluations) technique. Each method has its own advantages and drawbacks. Citing the drawbacks of PROMETHEE and ELECTRE, Opricovic introduced an extended VIKOR method. The VIKOR method (Opricovic and Tzeng 2007; Arya and Kumar 2020c) provides a compromise solution, which makes it suitable for practical applications. The VIKOR method employed by researchers so far is based on distance measures. But it has been observed that the output of distance measure-based decision-making methods may vary with the distance measure used (Joshi and Kumar 2018). For deciding the most suitable alternative, many decision-making methods use score functions or accuracy functions. However, Ye (2010) argued that accuracy functions and score functions do not provide adequate information about alternatives. To the best of our knowledge, no study has been conducted so far on a weighted correlation coefficient-based VIKOR approach. Therefore, in this paper, we propose a scale-invariant entropy together with a weighted correlation coefficient-based VIKOR approach.
Further, the criteria weight vector plays a deciding part in resolving MCDM problems. The selection of the most desirable alternative is feasible only through proper evaluation of criteria weights. Surveying the literature, Chen and Li (2010) split criteria weighting into objective and subjective methods. In subjective methods, the criteria weights are decided purely according to the preference of the decision-makers (Arya and Kumar 2020; Hwang and Lin 1987; Joshi and Kumar 2018), while in objective methods, the criteria weights are determined by solving mathematical programming models (Jamkhaneh 2018; Mahmood et al. 2018), which neglect the subjective judgment of the DMs. The entropy method is another reliable and trusted approach in the objective weight evaluation category. This paper adapts an entropy method to determine the criteria weights and to reflect both the objective and the subjective information of the DMs. Such an integrated method seems more meaningful and desirable for determining the criteria weights. This study makes the following contributions. We develop a new information measure, called the logarithmic intuitionistic fuzzy (IF) information measure, for IFSs, based on Renyi’s concept. The proposed measure has some valuable properties, which are proved to check its applicability. Next, we study a novel MCDM problem using weighted correlation coefficients where the weight information on the criteria is partially known or completely unknown. Finally, a ranking method using VIKOR is implemented to rank the alternatives.
The motivations of this paper are as follows. To compute the distance between ideal solutions and alternatives, different distance measures are generally used. Xiao and Wang (2017) proposed an intuitionistic fuzzy VIKOR method based on a distance measure, calculating the group utility and individual regret by that distance measure, whereas we evaluate the correlation between the alternatives and the positive and negative ideal solutions to decide the best alternative. We have not used a distance measure because, in some special cases, it cannot successfully decide the closeness of each alternative. In this paper, we introduce a new MCDM method using a weighted correlation coefficient-based VIKOR approach. For measuring the degree of correlation, we use the correlation coefficient for IFSs, and a scale-invariant entropy measure is further proposed for calculating the uncertainty.
Since the continuous advancement of science requires experimentation with more and more complex physical systems of nature and the analysis of the complex data structures arising from them, there has always been a quest for new, more general measures of uncertainty that could possibly explain such complex phenomena more accurately. With this view, several other one- and two-parameter generalizations of the entropy functional have been proposed in the literature, although not all of them have significant applications with experimental validity.
The major contributions of our proposed work are as follows:
-
We propose a new entropy measure that is scale invariant for complete probability distributions, whereas other entropy measures do not hold the scale-invariance property. We refer to this generalized entropy as the Logarithmic \(\beta\)-Norm Entropy (LNE).
-
We also propose an intuitionistic fuzzy scale-invariant entropy and prove its validity, whereas other intuitionistic fuzzy measures do not hold the scale-invariance property. We call this entropy the Logarithmic \(\beta\)-Norm intuitionistic fuzzy entropy.
-
Thereafter, we suggest a correlation coefficient-based VIKOR approach for ranking alternatives and measuring uncertainty in place of a distance measure. To the best of our knowledge, a correlation coefficient-based VIKOR approach has not been used to date.
-
Finally, we compare our proposed work with that of existing researchers who used distance measure-based VIKOR methods. The comparison shows that the proposed work attains the best result.
To achieve the proposed objectives, the present paper is arranged in six sections. Section 1 describes the contribution of earlier researchers in this field, the origin of the motivation, and the goals to be attained via this contribution. In Sect. 2, the existing literature related to the proposed work is reviewed and a new scale-invariant entropy measure from the probabilistic viewpoint is defined; a new intuitionistic fuzzy scale-invariant measure analogous to the well-known Renyi and Tsallis entropies is then proposed and validated. Section 3 recalls some basic definitions. In Sect. 4, we propose a scale-invariant information measure for intuitionistic fuzzy sets and prove its validity. In Sect. 5, we introduce a new MCDM technique that builds on the newly proposed measure and a weighted correlation coefficient-based VIKOR approach. The application of the suggested MCDM method to actual problems is described in Sect. 6 with the help of an example on a supplier selection problem. Finally, the last section presents the conclusions.
2 Scale-invariant generalized information measure
Let \(\Gamma _n=\{ \Omega =(\varpi _1,\varpi _2,...,\varpi _n):\varpi _i\ge 0, i= 1,2,...,n; \,\,\sum _{i=1}^{n}\varpi _i=1\},n \ge 2\) denote the set of discrete probability distributions. For any probability distribution \(\Omega =(\varpi _1,\varpi _2,...,\varpi _n) \in \Gamma _n\), Shannon defined an information measure given by
Renyi (1961) generalized Shannon’s measure as:
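The display equations for these two measures appear to have been lost in extraction; their standard forms, written in the paper's notation (labeled (1) and (2) here on the assumption that the later reference to (2) points at the Renyi measure), are:

```latex
H(\Omega ) = -\sum _{i=1}^{n}\varpi _i \log \varpi _i , \tag{1}
```

```latex
H_{\beta }(\Omega ) = \frac{1}{1-\beta }\log \left( \sum _{i=1}^{n}\varpi _i ^{\beta }\right) ,
\qquad \beta > 0,\ \beta \ne 1 . \tag{2}
```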
Remark 1
If \(\beta =2\) in (2), then (2) becomes the Renyi index or collision entropy.
2. If \(\beta \rightarrow \infty\), the Renyi’s entropy \(H_\beta\) converges to the min entropy \(H_\infty\):
However, in the literature of information theory, there exist several versions of Shannon’s (1948) entropy. We introduce a new information measure \(_\beta H_\mathrm{new}(\Omega )\) : \(\Gamma _n \rightarrow \mathfrak {R}^+\) (the set of positive real numbers); \(n \ge 2\), as follows:
where \(\Vert \Omega \Vert _{\beta }\) denotes the \(\beta\)-Norm of the function \(\Omega =\{\varpi _1,\varpi _2,...,\varpi _n\}\) defined as \(\Vert \Omega \Vert _{\beta } = \left( \sum _{i=1}^{n}\varpi _i^\beta \right) ^{\beta ^{-1}}\).
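The display defining (5) is missing here. A form consistent with every property stated below (symmetry in \((\beta ,\beta ^{-1})\), scale invariance, non-negativity for \(\beta \ne 1\), and recovery of Shannon entropy as \(\beta \rightarrow 1\)) is the norm-ratio expression, which we reconstruct as an assumption:

```latex
{}_{\beta }H_{\mathrm{new}}(\Omega )
  = \frac{1}{\beta ^{-1}-\beta }\,
    \log \frac{\Vert \Omega \Vert _{\beta }}{\Vert \Omega \Vert _{\beta ^{-1}}},
  \qquad \beta > 0,\ \beta \ne 1 . \tag{5}
```

For \(\beta >1\) we have \(\Vert \Omega \Vert _{\beta } \le 1 \le \Vert \Omega \Vert _{\beta ^{-1}}\) on \(\Gamma _n\), so the logarithm and the prefactor are both negative, giving a non-negative entropy exactly as in the proof of Theorem 2.1(a).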
3. As \(\beta \rightarrow 1\), (5) recovers the Shannon entropy.
4. \(_\beta H_\mathrm{new}(\Omega ) = _{\beta ^{-1}} H_\mathrm{new}(\Omega )\), that is, (5) is symmetric with respect to \((\beta\) , \(\beta ^{-1})\).
Several generalized versions of the existing entropies (Shannon’s, Renyi’s and Tsallis entropies) are available in the literature. Noting its similarity with the \(\beta\)-norm, we denote the generalized entropy given in (5) as the Logarithmic \(\beta\)-Norm entropy for \(\Omega \in \Gamma _n\). It is interesting to note that \(_\beta H_\mathrm{new}(\Omega )\) is symmetric in the tuning parameters \((\beta ,\beta ^{-1})\).
The major advantage of Logarithmic entropy in (5) is its scale invariance property:
\(_\beta H_\mathrm{new}(c\; \Omega ) = _\beta H_\mathrm{new}(\Omega )\) for any \(\Omega \in \Gamma _n\) and \(c , \beta > 0\). This striking property is satisfied neither by the Shannon entropy nor by its existing generalizations such as the Renyi and Tsallis entropies. Therefore, the Logarithmic \(\beta\)-Norm entropy appears to be a parametric generalization of the Shannon and Renyi entropies that is scale invariant over the complete probability distributions \(\Gamma _n\).
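These claims can be checked numerically. The sketch below assumes that (5) takes the norm-ratio form \(\log (\Vert \Omega \Vert _{\beta }/\Vert \Omega \Vert _{\beta ^{-1}})/(\beta ^{-1}-\beta )\); the function name `lne` is ours, not the paper's.

```python
import math

def lne(p, beta):
    """Logarithmic beta-Norm entropy (base-2 log), assuming the norm-ratio
    form  log(||p||_beta / ||p||_{1/beta}) / (1/beta - beta)."""
    norm = lambda b: sum(x ** b for x in p) ** (1.0 / b)
    return math.log(norm(beta) / norm(1.0 / beta), 2) / (1.0 / beta - beta)

p = [0.5, 0.3, 0.2]
q = [0.6, 0.4]

# Scale invariance: H(c * p) = H(p) for any c > 0, since ||c p|| = c ||p||.
assert abs(lne([3.7 * x for x in p], 2.0) - lne(p, 2.0)) < 1e-12

# Symmetry in (beta, 1/beta).
assert abs(lne(p, 2.0) - lne(p, 0.5)) < 1e-12

# Additivity over the independent combination p * q (norms factorize).
prod = [x * y for x in p for y in q]
assert abs(lne(prod, 2.0) - (lne(p, 2.0) + lne(q, 2.0))) < 1e-12
```

The additivity check works because \(\Vert \Omega *\Xi \Vert _{\beta } = \Vert \Omega \Vert _{\beta }\,\Vert \Xi \Vert _{\beta }\), so the logarithm splits into a sum.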
2.1 Properties of generalized measure presented in (5)
Theorem 2.1
For any \(\Omega \in {\Gamma }_n\), \(_\beta H_\mathrm{new}(\Omega )\) satisfies the following properties:
-
a
\(_\beta H_\mathrm{new}(\Omega ) \ge 0\) for all \(\beta > 0\ (\beta \ne 1)\). [Non-negativity]
-
b
\(_\beta H_\mathrm{new} (\varpi _1,\varpi _2,...,\varpi _n)\) is a symmetric function of \((\varpi _1,\varpi _2,...,\varpi _n)\).
-
c
\(_\beta H_\mathrm{new} (0,1)=0= _\beta H_\mathrm{new}(1,0)\). [Decisivity]
-
d
For any \(\Omega = (\varpi _1,\varpi _2,...,\varpi _n) \in \Gamma _n\), we have \(_\beta H_\mathrm{new}(\Omega ) = _\beta H_\mathrm{new}(\varpi _1,\varpi _2,...,\varpi _n,0)\). [Expandability]
-
e
For \(\Omega = (\varpi _1,\varpi _2,...,\varpi _n) \in \Gamma _n\) and \(\Xi = (\xi _1,\xi _2,...,\xi _m) \in \Gamma _m\), let us define their independent combination as \(\Omega * \Xi = (\varpi _i \xi _j)_{i=1,...,n;j=1,...,m.}\) Then, \(_\beta H_\mathrm{new}(\Omega *\Xi )= _\beta H_\mathrm{new}(\Omega ) + _\beta H_\mathrm{new}(\Xi )\). [Shannon additivity/ Extensivity]
-
f
\(_\beta H_\mathrm{new} (\frac{1}{2},\frac{1}{2})=1\). [Normality]
-
g
\(_\beta H_\mathrm{new}(\varpi _1,\varpi _2,.....,\varpi _n) \le _\beta H_\mathrm{new} (\frac{1}{n},\frac{1}{n},...,\frac{1}{n})=log(n)\).
-
h
\(_\beta H_\mathrm{new}(\varpi _1,\varpi _2,....,\varpi _n)\) is continuous in \(\varpi _i 's\) for all \(i=1,2,...n\) and \(\beta > 0\).
-
i
For \(\beta > 0\) such that \(\ln \Vert \Omega \Vert _{\beta }\) is convex in \(\Omega\), \(_\beta H_\mathrm{new}(\Omega )\) is concave in \(\Omega \in \Gamma _n\).
Proof
(a). The entropy \(_\beta H_\mathrm{new}(\Omega )\) is non-negative for all \(\beta >0\).
We consider the following cases:
Case (i): When \(\beta >1\), we have \(\beta ^{-1}<1\).
Taking logarithms on both sides of (6),
From (7), we get
Since, \(\beta >1\), which implies \((\beta ^{-1}-\beta )<0.\)
Therefore, from (8), we get
i.e., \(_\beta H_\mathrm{new}(\Omega )>0.\)
Case (ii): Similarly, for \(\beta <1\) we have \(\beta ^{-1} >1\), which implies \((\beta ^{-1}-\beta )>0\), and we get
Therefore, from (9), we get
i.e., \(_\beta H_\mathrm{new}(\Omega ) > 0\).
Combining cases (i) and (ii) with property (c) gives non-negativity, i.e., \(_\beta H_\mathrm{new} \ge 0\).
(b–c). Properties (b) and (c) follow directly from the definition of the Logarithmic Norm entropy.
(d). The property follows trivially from definition (5).
(e). First, we note that, for any \(\beta > 0\), we have
Therefore, for \(\beta \ne 1 (\beta > 0)\), we get
(h). The proof for the case \(\beta \ne 1\) follows directly from the continuity of the norm functionals \(\Vert \Omega \Vert _{\beta }\) and the Logarithmic function.
(i). Assume the convexity hypothesis in (i) holds and take \(\Omega _1 , \Omega _2 \in \Gamma _n\), \(\lambda \in [0,1]\). For \(\beta < 1\), by the Minkowski inequality, we have
Combining it with the monotonicity and concavity of logarithmic function, we get
On the other hand, by the convexity of \(\ln \Vert \Omega _1\Vert _{\beta ^{-1}}\), we get
Thus, together with \((\beta ^{-1}-\beta )>0\) for \(\beta < 1\), we finally get
This proves the concavity of scale-invariant entropy. \(\square\)
3 Scale-invariant intuitionistic fuzzy information measure
Definition 3.1
(Zadeh 1965) Let \(Y = (y_1,y_2,...,y_n)\) be a non-empty set. An FS \({{\tilde{S}}}\) is given by
where \(\mu _{{{\tilde{S}}}}: Y \rightarrow [0,1]\) denotes the membership function and \(\mu _{{{\tilde{S}}}}(y_i) \in [0,1]\) represents the membership degree of \(y_i \in Y\) in \({{\tilde{S}}}\).
Atanassov (1986) extended the idea of FSs by adding one more component, named the “hesitancy degree”, thus presenting a new concept called the “intuitionistic fuzzy set (IFS)”.
Definition 3.2
(Atanassov 1986) For a universe of discourse \(Y = (y_1,y_2,...,y_n)\), an IFS \({{\hat{S}}}\) is given by
where \(\mu _{{{\hat{S}}}}(y_i)\) denotes the membership degree and \(\nu _{{{\hat{S}}}}(y_i)\) the non-membership degree of \(y_i \in Y\) in \({{\hat{S}}}\), satisfying \(0 \le \mu _{{{\hat{S}}}}(y_i)+\nu _{{{\hat{S}}}}(y_i) \le 1\). The number \(\pi _{{{\hat{S}}}}(y_i) = 1- \mu _{{{\hat{S}}}}(y_i)-\nu _{{{\hat{S}}}}(y_i)\) denotes the intuitionistic index or hesitancy degree. If \(\pi _{{{\hat{S}}}}(y_i) =0\), then IFSs become FSs. For an IFS, \((\mu _{{{\hat{S}}}}(y_i) , \nu _{{{\hat{S}}}}(y_i))\) is termed an intuitionistic fuzzy number (IFN), where every IFN is represented as \(\lambda = (\mu _{\lambda }, \nu _{\lambda })\) with \(\mu _{\lambda }\) and \(\nu _{\lambda }\) in [0, 1] and \(\mu _{\lambda }+ \nu _{\lambda } \le 1\). In addition, \({{\tilde{S}}}(\lambda ) = \mu _{\lambda }- \nu _{\lambda }\) and \({{\hat{H}}}(\lambda ) = \mu _{\lambda }+ \nu _{\lambda }\) represent the “score value” and “accuracy degree” of \(\lambda\), respectively.
In this article, IFS(Y) and FS(Y) denote the set of all IFSs and the set of all FSs on Y, respectively. Logarithms are taken to base \('2'\) unless otherwise stated.
Definition 3.3
(Xu 2007; Yager 2006) For IFNs \(\lambda _1 = (\mu _{\lambda _1}\), \(\nu _{\lambda _1})\), \(\lambda _2 = (\mu _{\lambda _2}\), \(\nu _{\lambda _2})\) and \(\lambda _3 = (\mu _{\lambda _3}\), \(\nu _{\lambda _3})\), the set operations are stated as below:
-
1.
\(\lambda _1 +\lambda _2 = (\mu _{\lambda _ 1} +\mu _{\lambda _2} -\mu _{\lambda _1} \mu _{\lambda _ 2},\nu _{\lambda _1} \nu _{\lambda _ 2} )\),
-
2.
\(\lambda _1 *\lambda _ 2 = (\mu _{\lambda _ 1} \mu _{\lambda _ 2}, \nu _{\lambda _ 1}+ \nu _{\lambda _ 2} -\nu _{\lambda _ 1} \nu _{\lambda _ 2})\),
-
3.
\(\beta \lambda =\left( 1-(1-\mu _{\lambda })^{\beta }, (\nu _{\lambda })^{\beta } \right) ; \beta > 0\),
-
4.
\(\lambda ^{\beta } = \left( (\mu _{\lambda })^{\beta } ,1-(1-\nu _{\lambda })^{\beta }\right) ; \beta >0\).
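The four operations above, together with the score and accuracy functions of Definition 3.2, can be sketched as follows (a minimal illustration; the class name `IFN` is ours):

```python
from dataclasses import dataclass

@dataclass
class IFN:
    mu: float   # membership degree
    nu: float   # non-membership degree

    def __post_init__(self):
        assert 0.0 <= self.mu and 0.0 <= self.nu and self.mu + self.nu <= 1.0

    def __add__(self, o):          # rule 1: (mu1+mu2-mu1*mu2, nu1*nu2)
        return IFN(self.mu + o.mu - self.mu * o.mu, self.nu * o.nu)

    def __mul__(self, o):          # rule 2: (mu1*mu2, nu1+nu2-nu1*nu2)
        return IFN(self.mu * o.mu, self.nu + o.nu - self.nu * o.nu)

    def scale(self, b):            # rule 3: beta * lambda
        return IFN(1 - (1 - self.mu) ** b, self.nu ** b)

    def power(self, b):            # rule 4: lambda ** beta
        return IFN(self.mu ** b, 1 - (1 - self.nu) ** b)

    def score(self):               # S(lambda) = mu - nu
        return self.mu - self.nu

    def accuracy(self):            # H(lambda) = mu + nu
        return self.mu + self.nu

a, b = IFN(0.6, 0.3), IFN(0.5, 0.4)
c = a + b
assert abs(c.mu - 0.8) < 1e-12 and abs(c.nu - 0.12) < 1e-12
```

Note that each operation returns a valid IFN whenever its arguments are valid, which the `__post_init__` check enforces.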
Definition 3.4
(Xia and Xu 2012) Let \(\lambda _i = (\mu _{\lambda _i}\), \(\nu _{\lambda _ i})\) \((i = 1,2,...,n)\) be a collection of IFNs. Suppose \(\gimel = (\gimel _1,\gimel _2,...,\gimel _n)^T\) is the weight vector of the \(\lambda _i\), where \(\gimel _i \in [0,1]\) and \(\sum _{i=1}^{n} \gimel _i = 1\). The function SIFWA : \(U^n \rightarrow U\) defined as
is known as symmetric intuitionistic fuzzy weighted averaging (SIFWA) operator.
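The display defining SIFWA is missing from this extraction. In Xia and Xu (2012) the symmetric operator has the following form, reproduced here as our best reconstruction (it should be checked against the original source):

```latex
\mathrm{SIFWA}(\lambda _1,\lambda _2,...,\lambda _n)
 = \left( \frac{\prod _{i=1}^{n}\mu _{\lambda _i}^{\gimel _i}}
               {\prod _{i=1}^{n}\mu _{\lambda _i}^{\gimel _i}
                + \prod _{i=1}^{n}(1-\mu _{\lambda _i})^{\gimel _i}},\;
          \frac{\prod _{i=1}^{n}\nu _{\lambda _i}^{\gimel _i}}
               {\prod _{i=1}^{n}\nu _{\lambda _i}^{\gimel _i}
                + \prod _{i=1}^{n}(1-\nu _{\lambda _i})^{\gimel _i}} \right) .
```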
Definition 3.5
(Atanassov 1986) ( Operations on IFSs ). For any \({{\hat{T}}},{{\hat{S}}} \in IFS(Y)\) defined by:
the regular set operations and relations are considered as:
-
1.
\({{\hat{T}}} \subseteq {{\hat{S}}}\) if and only if either \(\mu _{{{\hat{T}}}}(y_i)\le \mu _{{{\hat{S}}}}(y_i)\) and \(\nu _{{{\hat{T}}}}(y_i)\ge \nu _{{{\hat{S}}}}(y_i)\) when \(\mu _{{{\hat{S}}}}(y_i)\le \nu _{{{\hat{S}}}}(y_i)\), or \(\mu _{{{\hat{T}}}}(y_i)\ge \mu _{{{\hat{S}}}}(y_i)\) and \(\nu _{{{\hat{T}}}}(y_i)\le \nu _{{{\hat{S}}}}(y_i)\) when \(\mu _{{{\hat{S}}}}(y_i)\ge \nu _{{{\hat{S}}}}(y_i)\), for any \(y_i \in Y\);
-
2.
\({{\hat{T}}} = {{\hat{S}}}\) if and only if \({{\hat{T}}} \subseteq {{\hat{S}}}\) and \({{\hat{S}}} \subseteq {{\hat{T}}}\);
-
3.
\({{{\hat{T}}}}^c = \{ \langle y_i,\nu _{{{\hat{T}}}}(y_i),\mu _{{{\hat{T}}}}(y_i)\rangle |y_i \in Y \}\);
-
4.
\({{\hat{T}}} \cap {{\hat{S}}} = \{\langle y_i, \mu _{{{\hat{T}}}}(y_i)\wedge \mu _{{{\hat{S}}}}(y_i), \nu _{{{\hat{T}}}}(y_i)\vee \nu _{{{\hat{S}}}}(y_i)\rangle | y_i \in Y \}\);
-
5.
\({{\hat{T}}} \cup {{\hat{S}}} = \{\langle y_i, \mu _{{{\hat{T}}}}(y_i)\vee \mu _{{{\hat{S}}}}(y_i), \nu _{{{\hat{T}}}}(y_i)\wedge \nu _{{{\hat{S}}}}(y_i)\rangle | y_i \in Y \}\).
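A pointwise sketch of operations (3)–(5) above (complement, min/max intersection, max/min union), representing an IFS over a finite universe as a list of \((\mu ,\nu )\) pairs:

```python
def complement(T):
    # rule 3: swap membership and non-membership degrees
    return [(nu, mu) for (mu, nu) in T]

def intersect(T, S):
    # rule 4: min of memberships, max of non-memberships
    return [(min(m1, m2), max(n1, n2)) for (m1, n1), (m2, n2) in zip(T, S)]

def union(T, S):
    # rule 5: max of memberships, min of non-memberships
    return [(max(m1, m2), min(n1, n2)) for (m1, n1), (m2, n2) in zip(T, S)]

T = [(0.6, 0.3), (0.2, 0.7)]
S = [(0.5, 0.4), (0.4, 0.5)]
assert intersect(T, S) == [(0.5, 0.4), (0.2, 0.7)]
assert union(T, S) == [(0.6, 0.3), (0.4, 0.5)]
assert complement(complement(T)) == T   # complement is an involution
```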
4 Entropy for FSs and IFSs
Definition 4.1
(Zadeh 1965) (Fuzzy entropy). A real function \({{\tilde{\phi }}} : FS(Y) \rightarrow [0,\infty )\) is termed a fuzzy entropy if it satisfies the following properties:
-
1.
\({{\tilde{\phi }}} ({{\tilde{S}}}) = 0\) if and only if \({{\tilde{S}}}\) is a crisp set, for all \({{\tilde{S}}} \in FS(Y)\).
-
2.
\({{\tilde{\phi }}} ({{\tilde{S}}})\) is maximum if and only if \(\mu _{{{\tilde{S}}}} = 0.5\), for all \({{\tilde{S}}} \in FS(Y)\).
-
3.
For any \({{\tilde{T}}},{{\tilde{S}}} \in FS(Y), {{\tilde{\phi }}}({{\tilde{T}}}) \le {{\tilde{\phi }}}({{\tilde{S}}})\) if \({{\tilde{T}}}\) is crisper than \({{\tilde{S}}}\), that is, \(\mu _{{{\tilde{T}}}}\le \mu _{{{\tilde{S}}}}\) if \(\mu _{{{\tilde{S}}}} \le 0.5\) and \(\mu _{{{\tilde{T}}}} \ge \mu _{{{\tilde{S}}}}\) if \(\mu _{{{\tilde{S}}}} \ge 0.5\).
-
4.
\({{\tilde{\phi }}}({{\tilde{S}}}) = {{\tilde{\phi }}}(({{\tilde{S}}})^c)\), where \(({{\tilde{S}}})^c\) denotes the complement of \({{\tilde{S}}} \in FS(Y)\).
Since for an IFS \(\mu + \nu + \pi = 1\), viewing \((\mu , \nu , \pi )\) as a probability distribution, Hung and Yang (2006) gave a new entropy for IFSs by extending the idea of Luca and Termini (1972).
Definition 4.2
(Atanassov 1986) (Intuitionistic fuzzy entropy). A real function \({{\bar{\phi }}}: IFS(Y) \rightarrow [0, \infty )\) is known as an entropy on IFS(Y) if the following properties are satisfied:
-
1.
\({{\hat{S}}}\) is a crisp set if and only if \({{\bar{\phi }}} ({{\hat{S}}}) = 0\).
-
2.
The value of \({{\bar{\phi }}} ({{\hat{S}}})\) is maximum at \(\mu _{{{\hat{S}}}} = \nu _{{{\hat{S}}}} = \pi _{{{\hat{S}}}} = \frac{1}{3}\).
-
3.
\({{\bar{\phi }}}({{\hat{T}}}) \le {{\bar{\phi }}}({{\hat{S}}})\) if and only if \({{\hat{T}}}\) is crisper than \({{\hat{S}}}\), that is, \(\mu _{{{\hat{T}}}} \ge \mu _{{{\hat{S}}}} , \nu _{{{\hat{T}}}} \ge \nu _{{{\hat{S}}}}\) if \(min(\mu _{{{\hat{T}}}} ,\nu _{{{\hat{S}}}}) \ge \frac{1}{3}\), and \(\mu _{{{\hat{T}}}} \le \mu _{{{\hat{S}}}} , \nu _{{{\hat{T}}}} \le \nu _{{{\hat{S}}}}\) if \(max( \mu _{{{\hat{S}}}} , \nu _{{{\hat{S}}}} ) \le \frac{1}{3}\).
-
4.
\({{\bar{\phi }}}({{\hat{S}}}) = {{\bar{\phi }}}(({{\hat{S}}})^c)\), where \(({{\hat{S}}})^c\) denotes the complement of \({{\hat{S}}}\).
Definition 4.3
(Szmidt and Kacprzyk 2002) An entropy on IFS(Y) is a real-valued function A: IFS\((Y) \rightarrow [0,1]\), which fulfills the properties as below:
\(\aleph _1\). \(A({{\hat{T}}})= 1\) if and only if \(\mu _{{{\hat{T}}}}(y_i) = \nu _{{{\hat{T}}}}(y_i)\), for all \(y_i \in Y\).
\(\aleph _2\). \(A({{\hat{T}}}) = 0\) if and only if \({{\hat{T}}}\) is crisp set, i.e., \(\mu _{{{\hat{T}}}}(y_i) = 0, \nu _{{{\hat{T}}}}(y_i) = 1\) or \(\mu _{{{\hat{T}}}}(y_i) = 1, \nu _{{{\hat{T}}}}(y_i) = 0\) for all \(y_i \in Y\).
\(\aleph _3\). \(A({{\hat{T}}}) \le A({{\hat{S}}})\) if and only if \({{\hat{T}}} \subseteq {{\hat{S}}}\).
\(\aleph _4\). \(A({{\hat{T}}}) = A(({{\hat{T}}}) ^c)\).
Definition 4.4
[Correlation coefficients (Gerstenkorn and Manko 1991)]. Let \(\hat{S_1} = \{ \langle y_i,\mu _{\hat{S_1}}(y_i), \nu _{\hat{S_1}}(y_i) \rangle |y_i \in Y \}\) and \(\hat{S_2} = \{ \langle y_i,\mu _{\hat{S_2}}(y_i) , \nu _{\hat{S_2}}(y_i) \rangle |y_i \in Y \}\) be two IFSs. Gerstenkorn and Manko defined the correlation coefficient \(\beth {(\hat{S_1} , \hat{S_2})}\) between \(\hat{S_1}\) and \(\hat{S_2}\) as follows:
where
represents the correlation between two IFSs \(\hat{S_1}\) and \(\hat{S_2}\), and
are respective informational energies of \(\hat{S_1}\) and \(\hat{S_2}\).
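The displays for the correlation \(C\), the informational energies \(E\), and the coefficient \(\beth\) were lost in extraction; the standard Gerstenkorn and Manko (1991) forms consistent with the surrounding text are:

```latex
C(\hat{S_1},\hat{S_2}) = \sum _{i=1}^{n}\left[ \mu _{\hat{S_1}}(y_i)\,\mu _{\hat{S_2}}(y_i)
  + \nu _{\hat{S_1}}(y_i)\,\nu _{\hat{S_2}}(y_i)\right] ,
\qquad
E(\hat{S}) = \sum _{i=1}^{n}\left[ \mu _{\hat{S}}^{2}(y_i) + \nu _{\hat{S}}^{2}(y_i)\right] ,
\qquad
\beth (\hat{S_1},\hat{S_2}) = \frac{C(\hat{S_1},\hat{S_2})}{\sqrt{E(\hat{S_1})\,E(\hat{S_2})}} .
```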
The correlation coefficient \(\beth (\hat{S_1} ,\hat{S_2})\) satisfies the following properties:
-
1.
\(0 \le \beth (\hat{S_1} ,\hat{S_2}) \le 1\).
-
2.
\(\beth (\hat{S_1} ,\hat{S_2}) = \beth (\hat{S_2} ,\hat{S_1})\).
-
3.
\(\beth (\hat{S_1} ,\hat{S_2}) = 1\) if \(\hat{S_1} = \hat{S_2}\).
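These three properties can be checked numerically. The sketch below assumes the standard Gerstenkorn–Manko forms \(C=\sum (\mu _1\mu _2+\nu _1\nu _2)\) and \(E=\sum (\mu ^2+\nu ^2)\), with each IFS given as a list of \((\mu ,\nu )\) pairs:

```python
import math

def correlation(S1, S2):
    """Gerstenkorn-Manko correlation coefficient between two IFSs."""
    C = sum(m1 * m2 + n1 * n2 for (m1, n1), (m2, n2) in zip(S1, S2))
    E = lambda S: sum(m * m + n * n for (m, n) in S)
    return C / math.sqrt(E(S1) * E(S2))

S1 = [(0.6, 0.3), (0.2, 0.7)]
S2 = [(0.5, 0.4), (0.4, 0.5)]

k = correlation(S1, S2)
assert 0.0 <= k <= 1.0                         # property 1 (Cauchy-Schwarz)
assert abs(k - correlation(S2, S1)) < 1e-12    # property 2 (symmetry)
assert abs(correlation(S1, S1) - 1.0) < 1e-12  # property 3
```

Property 1 holds because \(C \le \sqrt{E(\hat{S_1})E(\hat{S_2})}\) by the Cauchy–Schwarz inequality.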
Now we introduce a new entropy measure called Logarithmic intuitionistic fuzzy entropy of IFSs with the help of above concepts.
Now, corresponding to Shannon entropy, Luca and Termini (1972) proposed a fuzzy entropy as below:
where \({{\tilde{S}}} \in FS(Y)\) and \(y_i \in Y\). Bhandari and Pal (1975) extended Renyi’s idea to introduce a new fuzzy entropy, given by
Further extending the idea of Verma and Sharma (2014), we propose a scale-invariant information measure for IFSs.
Definition 4.5
For any \({{\hat{T}}} \in IFS(Y)\), we define:
Then (24) is a Logarithmic \(\beta\)-Norm intuitionistic fuzzy information measure.
Particular cases:
-
1.
If \(\beta =1\), then (24) becomes Vlachos and Sergiadis (2007) entropy.
-
2.
If \(\beta =1\) and Hesitancy \(\pi _{{{\hat{T}}}}(y_i)=0\), (24) becomes De Luca and Termini (1972) entropy.
-
3.
If \(\pi _{{{\hat{T}}}}(y_i)=0\), then (24) becomes the fuzzy information measure corresponding to the measure (5).
-
4.
\(A_{\beta }({{\hat{T}}}) = A_{\beta ^{-1}}({{\hat{T}}})\), i.e., (24) is symmetric in \((\beta ,\beta ^{-1})\) for the intuitionistic case and also satisfies the scale-invariance property.
Now, we establish the validity of the measure (24) as below.
4.1 Proof of validity for (24)
Theorem 4.1
The measure \(A_{\beta }({{\hat{T}}})\) in (24) is a valid intuitionistic fuzzy entropy of order \(\beta\).
Proof
We validate the measure (24) by satisfying the axioms \(\aleph _1 - \aleph _4\).
(\(\aleph _1\)). We prove that \(A_{\beta }({{\hat{T}}}) = 1\) if and only if \(\mu _{{{\hat{T}}}}(y_i) = \nu _{{{\hat{T}}}}(y_i)\). Putting \(\mu _{{{\hat{T}}}}(y_i) = \nu _{{{\hat{T}}}}(y_i)\) in (24), we get \(A_{\beta }({{\hat{T}}}) = 1\). For the converse, suppose \(A_{\beta }({{\hat{T}}}) = 1\); then we have to show that \(\mu _{{{\hat{T}}}}(y_i) = \nu _{{{\hat{T}}}}(y_i)\) for all \(y_i \in Y\). To prove this, we use the following inequality:
where equality holds if and only if \(\omega = \kappa\). For \(\beta > 1\), we have
By taking \(\mu _{{{\hat{T}}}}(y_i) = \omega , \nu _{{{\hat{T}}}}(y_i) = \kappa\) and \(\mu _{{{\hat{T}}}}(y_i) + \nu _{{{\hat{T}}}}(y_i)+ \pi _{{{\hat{T}}}}(y_i) = 1\) in (26), we have
Therefore, \(A_{\beta }({{\hat{T}}}) = 1\) if and only if \(\mu _{{{\hat{T}}}}(y_i) = \nu _{{{\hat{T}}}}(y_i)\).
(\(\aleph _2\)). Let \({{\hat{T}}}\) be a crisp set. This implies, either \(\mu _{{{\hat{T}}}}(y_i) = 1\), \(\nu _{{{\hat{T}}}}(y_i) = 0\); or \(\mu _{{{\hat{T}}}}(y_i) = 0\), \(\nu _{{{\hat{T}}}}(y_i) = 1\).
Then, from (24), we find that \(A_{\beta }({{\hat{T}}}) = 0\). Conversely, let \(A_{\beta }({{\hat{T}}}) = 0\). Then (24) gives
Therefore, (28) will hold only if
Since \(\beta \ne 1\), (28) holds. We conclude that \({{\hat{T}}}\) is a crisp set if and only if \(A_{\beta }({{\hat{T}}}) = 0\).
\((\aleph _3\)). \(A_{\beta }({{\hat{T}}}) \le A_{\beta }({{\hat{S}}})\) if and only if \({{\hat{T}}} \subseteq {{\hat{S}}}\). We have to prove that (24) satisfies \((\aleph _3)\). For this, we define the function:
The function \(\rho\) is increasing in \(g_1\) and decreasing in \(g_2\). Computing the critical points by setting \(\frac{\partial {\rho }(g_1,g_2)}{\partial {g_1}} = 0\) and \(\frac{ \partial {\rho }(g_1,g_2)}{\partial {g_2}} = 0\) gives
\(\square\)
Some cases arise:
From cases 1 and 2 above, we conclude that \(\rho (g_1,g_2)\) is an increasing function:
From cases 3 and 4 above, we conclude that \(\rho (g_1,g_2)\) is a decreasing function.
Therefore, we may conclude that \(A_{\beta }({{\hat{T}}}) \le A_{\beta }({{\hat{S}}})\) if \({{\hat{T}}} \subseteq {{\hat{S}}}\).
(\(\aleph _4\)). \(A_{\beta }({{\hat{T}}}) = A_{\beta }(({{\hat{T}}})^c)\).
We know that \(({{\hat{T}}})^c = \{\langle y_i,\nu _{{{\hat{T}}}}(y_i) , \mu _{{{\hat{T}}}}(y_i) \rangle |y_i \in Y \}\). This implies
We get from (24)
Therefore, \(A_{\beta }({{\hat{T}}})\) is a valid intuitionistic fuzzy entropy measure. Moreover, (24) satisfies the following additional properties.
Theorem 4.2
For any two IFSs \({{\hat{T}}}\) and \({{\hat{S}}}\) satisfying \({{\hat{T}}} \subseteq {{\hat{S}}}\) or \({{\hat{S}}} \subseteq {{\hat{T}}}\), the following holds:
Proof
Suppose \(Y_1\) and \(Y_2\) are two parts of Y,
We get for all \(y_i \in Y_1\) and \(y_i \in Y_2\),
Using (32), relations (33) and (34) may be proved easily. \(\square\)
Corollary
Let \({{\hat{T}}} \in IFS(Y)\) and let \(({{\hat{T}}})^c\) denote the complement of \({{\hat{T}}}\); then
Theorem 4.3
If \({{\hat{T}}}\) is a crisp set, then \(A_{\beta }({{\hat{T}}})\) attains its minimum value, and if \({{\hat{T}}}\) is the most intuitionistic fuzzy set, then \(A_{\beta }({{\hat{T}}})\) attains its maximum value. Thus, the maximum and minimum values are attained independently of \(\beta\).
5 Application of the proposed measure in decision-making
The VIKOR method of Opricovic (1998) determines the best alternative on the basis of the closeness of the alternatives to the extreme solutions, i.e., the positive and negative ideal solutions. On the basis of the intuitionistic fuzzy measure, this section presents a stepwise algorithm for the proposed VIKOR method. The determination of justified criteria weights is therefore quite important; criteria weights are divided into two types, subjective and objective weights.
5.1 Algorithm to determine criteria weights
Criteria weights play a deciding role in finding the solution of MCDM problems. The algorithm to determine the weights based on the proposed entropy is as follows:
-
1.
An MCDM problem may be represented by an \(m \times n\) matrix with m rows representing alternatives \((\varphi _i)_{1 \le i \le m}\) and n columns representing criteria \((\varrho _j)_{1 \le j \le n}\). Consider the intuitionistic fuzzy decision matrix specified by
(36) where \(d_{ij} = (\bar{\mu _{ij}},\bar{\nu _{ij}}); i = 1,2,...,m\) and \(j = 1,2,...,n\).
-
2.
In this step, using (24), the intuitionistic fuzzy entropy of \({{\tilde{D}}}\) is obtained as \(F_j\); \(j = 1,2,...,n\).
The whole process of criteria weight evaluation is divided into two parts as follows:
5.1.1 (a) In case the criteria weights are known partially
Practically, there are two constraints in criteria weight determination: first, it is difficult to find an expert with expertise in all fields, and second, hiring experts from all fields is costly. In fact, it is easier to extract their views in forms other than precise numbers, for example as linguistic variables or intervals. In such cases, only partial details about the criteria weights are available. All the details expressed by the decision-makers can be compiled into a set denoted by \(W^T\). We use the concept of minimum entropy proposed by Wang and Wang (2012) to extract criteria weights from the set \(W^T\).
The entropy value of an alternative \(\varphi _i\) over the criterion \(\varrho _j\) is specified by:
where
To determine the optimal criteria weights, we construct a linear programming model specified by
which fulfills \(\sum _{j=1}^n \zeta _j = 1\), \(\zeta _j \in W^T\). On solving (39), the criteria weight vector is given by \(\arg \min F = (\zeta _1,\zeta _2,...,\zeta _n)^{'}\).
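Since the objective in (39) is linear in the weights and the feasible region is the simplex intersected with the intervals collected in \(W^T\), the model can be solved exactly by a greedy continuous-knapsack argument. The sketch below illustrates this under the assumption that the objective takes the form \(F=\sum _{i}\sum _{j}\zeta _j E_{ij}\); the cost vector `c` and the intervals `W` are hypothetical stand-ins for the paper's data.

```python
def min_entropy_weights(costs, bounds):
    """Solve: minimise sum_j costs[j] * zeta[j]
       s.t.  sum_j zeta[j] = 1,  lo_j <= zeta[j] <= hi_j  (the set W^T).
    Because the objective is linear, filling the cheapest criteria first
    (a continuous-knapsack argument) yields the exact optimum."""
    zeta = [lo for lo, _ in bounds]        # start every weight at its lower bound
    remaining = 1.0 - sum(zeta)
    for j in sorted(range(len(costs)), key=lambda j: costs[j]):
        take = min(bounds[j][1] - zeta[j], remaining)
        zeta[j] += take
        remaining -= take
    return zeta

# Hypothetical per-criterion entropy totals c_j = sum_i E_ij and intervals W^T
c = [1.1, 0.7, 1.5]
W = [(0.2, 0.5), (0.2, 0.5), (0.1, 0.4)]
zeta = min_entropy_weights(c, W)
```

The criterion with the smallest total entropy receives as much weight as its interval allows, subject to the simplex constraint.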
5.1.2 (b) In case the criteria weights are unknown
In the case where the criteria weights are unknown, we employ the procedure introduced by Chen and Li (2010) as follows:
where
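Chen and Li's entropy-based procedure assigns larger weight to criteria with smaller entropy (i.e., with more discriminating information). A minimal sketch, assuming the commonly used form \(\zeta _j^b=(1-F_j)/\sum _k(1-F_k)\) and hypothetical entropy values:

```python
def entropy_objective_weights(entropies):
    """Objective weights from criterion entropies: lower entropy means a more
    informative criterion, so it receives a larger weight."""
    divergences = [1.0 - e for e in entropies]
    total = sum(divergences)
    return [d / total for d in divergences]

F = [0.55, 0.70, 0.85]   # hypothetical entropy values F_j
w = entropy_objective_weights(F)
```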
5.2 VIKOR method
In the VIKOR method, multi-criteria decision-making (MCDM) problems are solved in the IFS setting. We introduce a new criteria weighting method combining subjective and objective weights. On the basis of correlation coefficients, this section presents a stepwise algorithm for the proposed IF-VIKOR method. Consider a MCDM problem with m alternatives \(\varphi _i (i=1,2,...,m)\), and let \(FY_j (j=1,2,...,n)\) be the decision-makers who decide on the best alternative. Each of the n decision-makers has a weight \(\omega _j (j=1,2,...,n)\) satisfying \(\sum _{j=1}^n \omega _j = 1\). The proposed VIKOR method is summarized in the following steps.
5.2.1 (a) Construction of IF decision matrix
Compile the information obtained from the decision-makers \(FY^p\) into an intuitionistic fuzzy decision matrix. Let \(d_{ij}^p\) be the IFN given by the pth decision-maker. The entries of the matrix are the IFNs \(d_{ij}^p = (\mu _{ij}^p,\nu _{ij}^p)\), which can be computed as below:
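The individual opinions \(d_{ij}^p\) are usually fused with the intuitionistic fuzzy weighted averaging (IFWA) operator of Xu (2007), \(\mu = 1-\prod _p(1-\mu ^p)^{\omega _p}\), \(\nu = \prod _p(\nu ^p)^{\omega _p}\). A sketch with hypothetical expert opinions:

```python
from math import prod

def ifwa(ifns, weights):
    """IFWA aggregation of IFNs (mu, nu) with expert weights summing to 1:
    mu = 1 - prod_p (1 - mu_p)^w_p,  nu = prod_p (nu_p)^w_p."""
    mu = 1.0 - prod((1.0 - m) ** w for (m, _), w in zip(ifns, weights))
    nu = prod(n ** w for (_, n), w in zip(ifns, weights))
    return mu, nu

opinions = [(0.6, 0.3), (0.7, 0.2), (0.5, 0.4)]   # hypothetical expert IFNs
omega = [0.35, 0.40, 0.25]                        # expert weights
d = ifwa(opinions, omega)
```

The result is again an IFN, so \(\mu + \nu \le 1\) is preserved by construction.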
5.2.2 (b) Normalize the IF decision matrix
Let \(({{\tilde{t}}}_{ij})\) be an intuitionistic decision matrix, which is normalized by the method proposed by Xu and Hu (2010) as follows:
5.2.3 (c) Determination of subjective weights
Let \(\zeta _j^{p} = (\mu _j^p ,\nu _j^p)\) be the weight given by decision-maker \(FY^p\). We calculate the IF weights \((\zeta _j)\) for the different criteria by using the SIFWA operator:
where \(\zeta _j = (\mu _j,\nu _j), j= 1,2,...,n\).
5.2.4 (d) Normalize the subjective weights
We obtain the subjective weights \((\zeta _j^z)\) (Li et al. 2015; Boran et al. 2011) by normalizing the weights \((\zeta _j)\) so that \(\sum _{j=1}^n \zeta _j^z = 1\):
where \(\tau _j = 1-\mu _j-\nu _j\).
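A common way to obtain crisp weights from the IF weights \(\zeta _j=(\mu _j,\nu _j)\) distributes the hesitancy \(\tau _j\) in proportion to \(\mu _j/(\mu _j+\nu _j)\) before normalizing (cf. Boran et al. 2011); the exact form used in the paper may differ. A sketch with hypothetical IF weights:

```python
def crisp_weights(if_weights):
    """Convert IF weights (mu_j, nu_j) into crisp weights summing to 1.
    The hesitancy tau_j = 1 - mu_j - nu_j is split in proportion to
    mu_j / (mu_j + nu_j) before normalisation."""
    scores = [mu + (1.0 - mu - nu) * mu / (mu + nu) for mu, nu in if_weights]
    total = sum(scores)
    return [s / total for s in scores]

if_w = [(0.8, 0.1), (0.6, 0.3), (0.5, 0.4)]   # hypothetical IF criteria weights
w = crisp_weights(if_w)
```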
5.2.5 (e) Determination of objective weights
The objective weights \(\zeta _j^b\) are calculated as described in Sect. 5.1 (a), (b).
5.2.6 (f) Determination of the best and worst solutions
We define the relative intuitionistic fuzzy best solution \(\Upsilon _j^{++} = (\mu _j^{++},\nu _j^{++})\) and the relative intuitionistic fuzzy worst solution \(\Upsilon _j^{--} = (\mu _j^{--},\nu _j^{--})\) as follows:
and
5.2.7 (g) Calculation of \(T_i\) and \(R_i\)
Let us find the values of \(T_i\), \(R_i\) and \(Q_i\), \(i=1,2,...,m\), where \(T_i\) is the group utility value, \(R_i\) is the individual regret value and \(Q_i\) is the compromise value:
where \(\widehat{w_j} = \Theta \zeta _j^z + (1-\Theta )\zeta _j^b\) is the amalgamation of the subjective and objective weights, and \(\Theta \in [0,1]\) represents the trade-off between them; we take \(\Theta = 0.5\).
5.2.8 (h) Determination of VIKOR index \((Q_i)\)
where \(\Psi\) and \((1-\Psi )\) represent the weights of \(T_i\) and \(R_i\); we take \(\Psi = 0.5\). In (50) we take \(T^{++} = \max T_i\), \(T^{--} = \min T_i\), \(R^{++} = \max R_i\) and \(R^{--} = \min R_i\).
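For orientation, steps (g)–(h) can be sketched on a crisp score matrix using the classical normalized-distance form of VIKOR; the proposed method instead measures closeness via correlation coefficients, so the ratio below is illustrative only. All numbers are hypothetical.

```python
def vikor_indices(scores, weights, psi=0.5):
    """Compute group utility T, individual regret R and VIKOR index Q for a
    crisp m x n score matrix of benefit criteria (higher score = better)."""
    n = len(scores[0])
    best = [max(row[j] for row in scores) for j in range(n)]    # relative best
    worst = [min(row[j] for row in scores) for j in range(n)]   # relative worst
    T, R = [], []
    for row in scores:
        # weighted normalised distance of this alternative from the best value
        d = [weights[j] * (best[j] - row[j]) / (best[j] - worst[j])
             for j in range(n)]
        T.append(sum(d))   # group utility value
        R.append(max(d))   # individual regret value
    Tpp, Tmm, Rpp, Rmm = max(T), min(T), max(R), min(R)
    Q = [psi * (T[i] - Tmm) / (Tpp - Tmm)
         + (1 - psi) * (R[i] - Rmm) / (Rpp - Rmm)
         for i in range(len(scores))]
    return T, R, Q

scores = [[0.7, 0.5, 0.8],   # hypothetical crisp ratings of 3 alternatives
          [0.6, 0.9, 0.4],
          [0.9, 0.6, 0.6]]
weights = [0.4, 0.35, 0.25]  # combined weights w_j
T, R, Q = vikor_indices(scores, weights)
```

The alternative with the smallest \(Q_i\) is ranked first.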
5.2.9 (i) The values of \(T_i , R_i\) and \(Q_i\) are arranged in ascending order
Then we rank the alternatives. The best-ranked alternative must satisfy the following conditions:
\({{\bar{C1}}}\) (acceptable advantage) \(Q(\Upsilon ^2)-Q(\Upsilon ^1) \ge \frac{1}{n-1}\), where \(\Upsilon ^1\) and \(\Upsilon ^2\) are the first- and second-ranked alternatives in the \(Q_i\) column.
\({{\bar{C2}}}\) (acceptable stability) The alternative \(\Upsilon ^1\) should also be ranked first in \(R_i\) and \(T_i\) columns.
The alternative \(\Upsilon ^1\) will be the most desirable one if both conditions are satisfied concurrently. If they are not satisfied simultaneously, then we proceed to compromise solutions as follows:
1. If condition \({{\bar{C2}}}\) is not satisfied, then \((\Upsilon ^1,\Upsilon ^2)\) is the set of compromise solutions.
2. If condition \({{\bar{C1}}}\) is not satisfied, then the set \((\Upsilon ^1,\Upsilon ^2,...,\Upsilon ^{M})\) constitutes the compromise solution, where \(\Upsilon ^{M}\) is defined by
The number n in (51) denotes the total number of criteria, and M represents the maximum rank of the alternatives in column \(Q_i\) satisfying (51).
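The acceptance tests \({{\bar{C1}}}\)–\({{\bar{C2}}}\) and compromise rules 1–2 can be sketched as follows (the alternative names and Q-values are hypothetical):

```python
def compromise_set(q_ranked, t_first, r_first, n_criteria):
    """q_ranked: list of (alternative, Q) pairs sorted by ascending Q.
    t_first / r_first: alternatives ranked first in the T and R columns.
    Returns the compromise solution set under conditions C1 and C2."""
    alts = [a for a, _ in q_ranked]
    qs = [q for _, q in q_ranked]
    threshold = 1.0 / (n_criteria - 1)
    c1 = qs[1] - qs[0] >= threshold          # acceptable advantage
    c2 = alts[0] == t_first == r_first       # acceptable stability
    if c1 and c2:
        return [alts[0]]                     # single best alternative
    if not c2:
        return alts[:2]                      # rule 1
    # rule 2: every alternative whose Q lies within 1/(n-1) of the best Q
    return [a for a, q in q_ranked if q - qs[0] < threshold]

ranked = [("phi5", 0.0), ("phi1", 0.0616), ("phi2", 0.30),
          ("phi3", 0.55), ("phi4", 1.0)]    # hypothetical Q column
cs = compromise_set(ranked, "phi5", "phi5", 5)
```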
The step-by-step procedure of the proposed VIKOR method is demonstrated in Fig. 1. After the compilation of information from the different experts, the IF decision matrix is constructed using Eq. (42). In the next step, the IF decision matrix is normalized with the help of Eq. (43). In the following steps, the criteria weights and the extreme solutions are determined with the help of Eqs. (44)–(47). After that, the VIKOR index is determined from the group utility and individual regret values using Eq. (50). In the final step, rankings are assigned to the different alternatives.
6 Numerical example
Now, we solve a problem by applying the proposed multi-criteria decision-making (MCDM) VIKOR method.
6.1 Approach 1: in case criteria weights are known partially
A manufacturer of laptops wants to hire potential suppliers for the supply of parts. It receives five options, say \(\varphi _i(i=1,2,3,4,5)\). To select the most appropriate supplier, the company has hired five experts, say \(FY_i (i=1,2,3,4,5)\). The council of the company has fixed five criteria: Functionality \((\xi _1)\), Reliability \((\xi _2)\), Customer Satisfaction \((\xi _3)\), Quality \((\xi _4)\) and Cost \((\xi _5)\).
In Tables 1 and 2, we express the alternative ratings and criteria weights as linguistic terms using IFNs. Note that the IFNs could be defined on the basis of historical data and/or a questionnaire answered by experts of the concerned domain. The experts' assessments and the importance of the criteria weights are shown in Tables 3 and 4.
Step 1: The intuitionistic fuzzy decision matrix obtained by Eq. (42) is depicted in Table 5.
Step 2: The subjective criteria weights calculated by Eq. (44) are depicted in Table 6.
Step 3: In this step, we normalize the subjective criteria weights with the help of Eq. (45); the results are given in Table 7.
Step 4: In order to compute the objective weights, consider the partial weight information given by the set:
The programming model constructed for the objective weights is:
The objective criteria weight vector is obtained by solving (53):
Step 5: The normalized intuitionistic fuzzy decision matrix obtained by Eq. (43) is depicted in Table 8.
Step 6: The values of \(T_i\), \(R_i\) and \(Q_i\) for all alternatives are calculated using Eqs. (48)–(50), where \(T_i\) is the group utility value, \(R_i\) is the individual regret and \(Q_i\) is the VIKOR index.
Step 7: After that, we rank the values of T, R and Q calculated in Table 9; the rankings are depicted in Table 10. The first rank is given to the alternative having the lowest value among all the alternatives, and so on.
Step 8: In this step, we provide the preferential sequences of the alternatives on the basis of the rating scale; for example, corresponding to T, \(\varphi _4\) has the lowest rating, whereas \(\varphi _5\) has the highest. In a similar fashion, preferences with respect to R and Q are given depending upon the corresponding values.
Analysis of Table 11 reveals that alternatives \(\varphi _5\) and \(\varphi _1\) are ranked first and second, respectively, in the \(Q_i\) column. Also, \(Q(\varphi _1) -Q(\varphi _5) = 0.0616 - 0 = 0.0616 < \frac{1}{5-1} = 0.25\); therefore, \({{\bar{C1}}}\) is not satisfied. Moreover, the alternative \(\varphi _5\) is also ranked first in the T and R columns, which means \({{\bar{C2}}}\) is satisfied. The ranking is \(\varphi _5> \varphi _1>\varphi _2>\varphi _3>\varphi _4\), and we obtain \(\varphi _5\) as the best alternative.
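The \({{\bar{C1}}}\) check above is a one-line computation on the reported Q-values:

```python
# Q-values reported in Table 11: Q(phi5) = 0, Q(phi1) = 0.0616; n = 5 criteria
advantage = 0.0616 - 0.0
threshold = 1 / (5 - 1)             # 1/(n-1) = 0.25
c1_holds = advantage >= threshold   # False: C1 is not satisfied
```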
6.2 Sensitivity analysis
In this section, we carry out a sensitivity analysis to show the behavior of the compromise solution. The balance between group utility and individual regret is affected by changing the weight \(\Psi\); in this way, the compromise solution becomes more versatile for practical purposes. We have taken distinct values of \(\Psi\) to show its impact on the compromise solution. The values obtained by changing the weight \(\Psi\) are shown in Table 12, and the ranking obtained is \(\varphi _5> \varphi _1>\varphi _2>\varphi _3>\varphi _4\). Analysis of Table 12 shows that the same ranking sequence is obtained for the different values of \(\Psi\). Thus \(\Psi\) has no effect on the ranking sequence, and consequently the compromise solution accounts for all the linguistic information. The sensitivity outcomes at the distinct values of \(\Psi\) are depicted in Fig. 2.
6.3 Approach 2: in case of unknown criteria weights
In this section, we solve the above example for the case of unknown criteria weights. The step-by-step computational procedure is as follows:
Step 1: First, we calculate the criteria weights corresponding to the different alternatives with the help of Eq. (40). The calculated values are shown in Eq. (54):
Step 2: After that, we calculate the positive \((\Upsilon _j^{++})\) and negative \((\Upsilon _j^{--})\) ideal solutions using Eqs. (46) and (47). The computed values are depicted in Table 13.
Step 3: In this step, we have computed the values of \(T_i\) , \(R_i\) and \(Q_i\) using Eqs. (48)–(50) corresponding to the different alternatives. The calculated values are demonstrated in Table 14.
Step 4: Depending on the values of \(T_i\), \(R_i\) and \(Q_i\), rankings are assigned to the different alternatives, as shown in Table 15.
Step 5: The preferential sequences of alternatives on the basis of Table 14 are shown in Table 16.
Analysis of Table 16 shows that alternatives \(\varphi _5\) and \(\varphi _1\) are ranked first and second, respectively, in the \(Q_i\) column. Also, \(Q(\varphi _1) -Q(\varphi _5) = 0.3990 - 0 = 0.3990 > \frac{1}{5-1} = 0.25\); therefore, \({{\bar{C1}}}\) is satisfied. Moreover, the alternative \(\varphi _5\) is also ranked first in the T and R columns, which means \({{\bar{C2}}}\) is satisfied. The ranking is \(\varphi _5> \varphi _1>\varphi _2>\varphi _3>\varphi _4\), and we obtain \(\varphi _5\) as the best alternative. Therefore, the supplier represented by the alternative \(\varphi _5\) is the most preferred one.
6.3.1 Comparative analysis
To further explore the efficacy and performance of the proposed work, a comparison has been carried out on the same numerical example with the same assumptions and weight information. Radhika and Sammeer Kumar (2017) introduced a MCDM method that ranks the available alternatives based on a distance measure. Divsalar (2017) presented the VIKOR method as a qualitative multi-attribute group decision-making approach based on extended hesitant fuzzy linguistic term set (EHFLTS) distance measures. Victor (2017) presented a hybrid system combining FCM with the VIKOR method, which is also based on a distance measure, whereas Afful-Dadzie (2014) proposed a fuzzy VIKOR framework to find rankings in a fuzzy environment with the help of linguistic variables to deal with uncertainty and subjectivity. We have proposed a new entropy measure that is scale invariant for complete probability distributions, together with a correlation coefficient-based VIKOR approach for finding the ranking and measuring the uncertainty in place of a distance measure. From the exploration of Table 17, it is observed that the ranking orders obtained by the different researchers differ, but the optimal alternative is the same, which reveals that the proposed method has evident reliability in the intuitionistic fuzzy domain. Moreover, the strategy of the proposed work is more efficient than the existing approaches and valid for different multi-criteria decision-making (MCDM) problems.
The proposed work can easily handle the cases where the weight information is unknown or partially known, whereas the compared approaches cannot manage such situations. The proposed work incorporates entropy measures, in addition to the normalization of subjective and objective weights, to compute more consistent and reliable information. Additionally, we evaluate the objective weight vector by using a mathematical model. The main advantage of this work is that the decision matrix is normalized using the correlation coefficient-based VIKOR method. Secondly, the proposed work provides an adjustable technique to deal with MCDM problems where the weight information is partially known or fully unknown. Thus we can conclude that the outcomes of the proposed work are more practicable and straightforward for ranking.
7 Conclusion
In this paper, we generalized the entropy measure for intuitionistic fuzzy sets, called the scale-invariant generalization of Shannon entropy. Moreover, a novel multi-criteria decision-making (MCDM) process using a weighted correlation coefficient-based VIKOR approach is introduced. The working of the proposed decision-making process is thoroughly interpreted with the help of a numerical illustration concerning the selection of the most appropriate supplier via ranking. The numerical example is discussed with two different approaches to the evaluation of criteria weights: in the first approach, we consider the case of partially known criteria weights, whereas unknown criteria weights are discussed in the second approach. The output of the proposed MCDM method is compared with other well-known decision-making methods based on the VIKOR method. The proposed IF entropy measure can be extended to more general sets such as interval-valued intuitionistic fuzzy sets, picture fuzzy sets, q-rung orthopair fuzzy sets, etc. Further, some representative computational intelligence algorithms can be used to solve such problems, like monarch butterfly optimization (MBO), the earthworm optimization algorithm (EWA), elephant herding optimization (EHO) and the moth search (MS) algorithm.
References
Arya V, Kumar S (2020c) Extended VIKOR - TODIM approach based on entropy weight for intuitionistic fuzzy sets. In: Singh P, Gupta RK, Ray K, Bandyopadhyay A (eds) Advances in intelligent systems and computing. Springer, Singapore, pp 95–108
Arya V, Kumar S (2020) Multi-criteria decision making problem for evaluating ERP system using entropy weighting approach and q-rung orthopair fuzzy TODIM. Granul Comput. https://doi.org/10.1007/s41066-020-00242-2
Arya V, Kumar S (2020a) Knowledge measure and entropy: a complementary concept in fuzzy theory. Granul Comput. https://doi.org/10.1007/s41066-020-00221-7
Atanassov KT (1986) Intuitionistic fuzzy sets. Fuzzy Sets Syst 20(1):87–96
Benayoun R, Roy B, Sussman B (1966) ELECTRE Une methode pour guider le choix en presence de points de vue multiples, Note de travail 49. Direction Scientifique: SEMAMETRA International
Bhandari D, Pal NR (1993) Some new information measures for fuzzy sets. Inf Sci 67(3):209–228
Boran FE, Genc S, Akay D (2011) Personal selection based on intuitionistic fuzzy sets. Hum Factors Ergon Manuf Serv Ind 21:493–503
Brans JP, Mareschal V (1984) PROMETHEE: a new family of outranking methods in multicriteria analysis. Oper Res 84:477–490
Chen SM, Chang CH (2016) Fuzzy multiattribute decision making based on transformation techniques of intuitionistic fuzzy values and intuitionistic fuzzy geometric averaging operators. Inf Sci 352:133–149
Chen SM, Lan TC (2016) Multicriteria decision making based on the TOPSIS method and similarity measures between intuitionistic fuzzy values. Inf Sci 367:279–295
Chen T, Li C (2010) Determining objective weights with intuitionistic fuzzy entropy measures: a comparative analysis. Inf Sci 180:4207–4222
De Luca A, Termini S (1972) A definition of non-probabilistic entropy in the setting of fuzzy set theory. Inf Control 20:301–312
Afful-Dadzie E (2014) Fuzzy VIKOR approach: evaluating quality of internet health information. https://doi.org/10.15439/2014F203
Fahmi A, Amin F, Ullah H (2019) Multiple attribute group decision making based on weighted aggregation operators of triangular neutrosophic cubic fuzzy numbers. Granul Comput 2019:1–3
Gerstenkorn T, Manko J (1991) Correlation of intuitionistic fuzzy sets. Fuzzy Sets Syst 44:39–43
Hung WL, Yang MS (2006) Fuzzy entropy on intuitionistic fuzzy sets. Int J Intell Syst 21(4):443–451
Hwang CL, Lin MJ (1987) Group decision making under multiple criteria: methods and applications. Springer, Berlin
Jamkhaneh EB, Garg H (2018) Some new operations over the generalized intuitionistic fuzzy sets and their application to decision making process. Granul Comput 3(2):111–122
Joshi R, Kumar S (2018) A new parametric intuitionistic fuzzy entropy and its application in multiple attribute decision making. Int J Appl Comput Math 4(1):52
Joshi R, Kumar S (2018) A novel fuzzy decision making method using entropy weights based correlation coefficients under intuitionistic fuzzy environment. Int J Fuzzy Syst. https://doi.org/10.1007/s40815-018-0538-8
Li HC, You JX, You XY, Shan MM (2015) A novel approach for failure mode and effects analysis using combination weighting and fuzzy VIKOR method. Appl Soft Comput 28:579–588
Liu MF, Ren HP (2015) A study of multi-attribute decision making based on a new intuitionistic fuzzy entropy measure. Syst Eng Theor Pract 35(11):2909–2916
Mahmood T, Liu P, Ye J, Khan Q (2018) Several hybrid aggregation operators for triangular intuitionistic fuzzy sets and their application in multi-criteria decision-making. Granul Comput 3(2):153–168
Mao JJ, Yao DB, Wang CC (2013) A novel cross entropy and entropy measures of IFSs and their applications. Knowl Based Syst 48:37–45
Divsalar M (2017) Extension of the VIKOR method for group decision making with extended hesitant fuzzy linguistic information. https://doi.org/10.1007/s00521-017-2944-5
Mishra AR, Rani P, Jain D (2017) Information measures based topsis method for multi criteria decision making problem in intuitionistic fuzzy environment. Iran J Fuzzy Syst 14(6):41–63
Opricovic S (1998) Multi-criteria optimization of civil engineering systems. Ph.D. Thesis, University of Belgrade
Radhika S, Sammeer Kumar D (2017) VIKOR method for multicriteria decision making in academic staff selection. J Product Res Manag 3(2). ISSN:2249-4766
Rani P, Jain D, Hooda DS (2019) Extension of intuitionistic fuzzy TODIM technique for multi-criteria decision making method based on shapley weighted divergence measure. Granul Comput. 4(3):407–420
Ratika K, Kumar S (2020) A novel intuitionistic Renyi’s-Tsallis discriminant information measure and its applications in decision-making. Granul Comput. https://doi.org/10.1007/s41066-020-00237-z
Renyi A (1961) On measures of entropy and information. In: Proceedings of the 4th Berkeley symposium on mathematical statistics and probability. University of California Press, vol 1, p 547
Shannon CE (1948) A mathematical theory of communication. Bell Syst Tech J 27:379–423
Szmidt E, Kacprzyk J (2002) Using intuitionistic fuzzy sets in group decision-making. Control Cybern 31:1037–1054
Verma RK, Sharma BD (2014) On intuitionistic fuzzy entropy of order-\(\alpha\). Adv Fuzzy Syst (article ID 789890) 1–8
Victor DA (2017) A hybrid scenario FCM with VIKOR technique for ranking the factors. 119(9):233–244. ISSN:1314-3345
Vlachos IK, Sergiadis GD (2007) Intuitionistic fuzzy information application to pattern recognition. Pattern Recognit Lett 28(2):197–206
Wang J, Wang P (2012) Intuitionistic linguistic fuzzy multi criteria decision-making method based on intuitionistic fuzzy entropy. Control Decis 27:1694–1698
Xia M, Xu ZS (2012) Entropy/Cross entropy based group decision making under intuitionistic fuzzy environment. Inf Fusion 13:31–47
Xiao L, Xuanzi W (2017) Extended VIKOR method for intuitionistic fuzzy multiattribute decision-making based on a new distance measure. Hindawi Math Probl Eng (article ID 4072486) 1–16
Xiong SH, Wu S, Chen ZS, Li YL (2017) Generalized intuitionistic fuzzy entropy and its application in weight determination. Control Decis 329(5):845–854
Xu ZS (2007) Intuitionistic fuzzy aggregation operators. IEEE Trans Fuzzy Syst 15:1179–1187
Xu ZS, Hu H (2010) Projection models for intuitionistic fuzzy multiple attribute decision making. Int J Inf Technol Decis Mak 9:267–280
Xu ZS, Yager RR (2006) Some geometric aggregation operators based on intuitionistic fuzzy sets. Int J Gen Syst 35:417–433
Ye J (2010) Fuzzy decision making method based on the weighted correlation coefficient under intuitionistic fuzzy environment. Eur J Oper Res 205:202–204
Zadeh LA (1965) Fuzzy sets. Inf Control 8(3):338–353
Zadeh LA (1968) Probability measures of fuzzy events. J Math Anal Appl 23:421–427
Zeng S, Chen SM, Kuo LW (2019) Multiattribute decision making based on novel score function of intuitionistic fuzzy values and modified VIKOR method. Inf Sci 488:76–92
Zhu YJ, Li DF (2016) A new definition and formula of entropy for intuitionistic fuzzy sets. J Intell Fuzzy Syst 30(6):3057–3066
Gupta, R., Kumar, S. Intuitionistic fuzzy scale-invariant entropy with correlation coefficients-based VIKOR approach for multi-criteria decision-making. Granul. Comput. 7, 77–93 (2022). https://doi.org/10.1007/s41066-020-00252-0