1 Introduction

The concepts of fuzziness and uncertainty pervade decision making procedures, owing to the complexity of the mechanisms inherent in decision processes. To model real decision making procedures, many useful and interesting methodologies have been developed so far, in which the information is expressed by linguistic term sets (LTSs) [9, 21], intuitionistic fuzzy sets (IFSs) [1, 33], and hesitant fuzzy sets (HFSs) [9, 10]; the latter is widely used to model quantitative expressions when a decision maker is likely to hesitate among several values in evaluating an alternative.

In recent years, linguistic terms have been used to make expert judgments more reliable in decision making. For instance, Rodríguez et al. [21] introduced the concept of the hesitant fuzzy linguistic term set (HFLTS) by combining the concepts of HFS and LTS. Chen and Hong [3] introduced an approach to multi-criteria linguistic decision making that exploits both the pessimistic and the optimistic attitudes of the decision makers. Liao et al. [16] applied a hesitant fuzzy linguistic VIKOR method to qualitative multiple criteria decision making. Wang et al. [27] introduced a likelihood-based TODIM technique based on multi-hesitant fuzzy linguistic information for the evaluation of logistics outsourcing. Gou et al. [11] combined the double hierarchy HFLTS with the MULTIMOORA method to evaluate the implementation status of haze controlling measures.

Information measures, such as entropy, similarity, and distance measures, have been extensively studied in the literature and applied successfully in various fields, for instance in pattern recognition [6] and decision making [4]. Although the similarity and distance measures [18, 19] are interesting topics in their own right, the notion of entropy deserves further attention, because it quantifies how difficult it is to decide whether an element belongs to a given set or not. Among the many papers devoted to entropy measures for fuzzy sets and their extensions, we may refer to De Luca and Termini [20], Kaufmann [14], Yager [32], Burillo and Bustince [1], Zeng and Li [34], Szmidt and Kacprzyk [22], and Farhadinia [8].

Recently, Gou et al. [12] defined entropy and cross-entropy measures for HFLTSs on the basis of an equivalent transformation function. However, Gou et al.’s [12] entropy measures cannot correctly distinguish different HFLTSs in some situations, namely whenever different HFLTSs coincide with their complements. Liang et al. [15] proposed a variety of entropy measures for HFLTSs that capture both the hesitation and the fuzziness of an HFLTS and are therefore applicable to measuring the uncertainty of HFLTSs. In that work, Liang et al. [15] showed that Farhadinia’s [9] entropies, which are based on distance and similarity measures of HFLTSs, measure only the degree of fuzziness of an HFLTS, whereas in defining an entropy measure for HFLTSs, the hesitation of the HFLTS has to be taken into consideration besides the fuzziness degree.

To further enhance the efficiency of information measures for HFLTSs, we investigate HFLTS entropy measures from another viewpoint. The main objective of the present contribution is to develop a theoretical framework that assists researchers in constructing HFLTS entropy measures by transforming HFLTSs into interval-transformed HFLTSs (ITHFLTSs) and by employing the concepts of component-wise average and deviation, which differ from the concepts of total average and deviation considered by Liang et al. [15].

The remainder of this paper is organized as follows. In Sect. 2, we review the concepts of LTSs and HFLTSs and then present a number of existing entropy measures for HFLTSs. Section 3 introduces new entropy measures for HFLTSs by means of a novel transformation of HFLTSs, called interval-transformed HFLTSs (ITHFLTSs). In Sect. 4, we apply the new entropy measures to MCDM problems whose information is given in the form of HFLTSs. Finally, Sect. 5 concludes the paper.

2 Entropy Measures for Hesitant Fuzzy Linguistic Term Sets

In most decision making problems involving linguistic information, we are usually more comfortable stating opinions by means of linguistic variables, and a linguistic approach is assumed to be sufficiently suitable and accurate for modeling the human cognitive process [13]. Usually, to provide a preference over an object with linguistic terms, a suitable linguistic evaluation scale has to be predefined [13]. In this regard, Xu [31] defined the finite and totally ordered discrete LTS \({\mathfrak {{S}}}=\{s_\alpha \,|\,\alpha =-\tau , \ldots ,-1,0,1, \ldots ,\tau \},\) where \(\tau\) is a positive integer and \(s_\alpha\) stands for a possible value of a linguistic variable. Moreover, the LTS \({\mathfrak {S}}\) is totally ordered and equipped with a negation operator, that is, for any \(s_\alpha ,s_\beta \in {\mathfrak {S}}\):

  1.

    \(s_\alpha <s_\beta\) if and only if \(\alpha <\beta\);

  2.

    \(N(s_\alpha )=s_{-\alpha }\) where \(s_{-\alpha }\) indicates the negation of \(s_{\alpha }\).

Inspired by the idea of hesitant fuzzy sets (HFSs) [24], Rodríguez et al. [21] introduced the hesitant fuzzy linguistic term set (HFLTS) to overcome the difficulties arising in qualitative settings where a decision maker may hesitate among several terms at the same time, or needs a complex linguistic term instead of a single linguistic term to assess a linguistic variable. Continuing that work, Liao et al. [17] refined the concept of HFLTS mathematically as follows:

Definition 2.1

[17] Let \(X=\{x_1,x_2, \ldots ,x_N\}\) be the reference set, and \({\mathfrak {S}}=\{s_\alpha \,|\,\alpha =-\tau , \ldots ,-1,0,1, \ldots ,\tau \}\) be a LTS. Then, a hesitant fuzzy linguistic term set (HFLTS) on X is mathematically defined as

$$\begin{aligned} H_{{\mathfrak {S}}}=\{\langle x_i, h_{{\mathfrak {S}}}(x_i)\rangle \,|\,x_i\in X\}, \end{aligned}$$
(1)

in which the set of \(h_{{\mathfrak {S}}}(x_i)\) includes some possible values in the LTS \({{\mathfrak {S}}}\) such that

$$\begin{aligned} h_{{\mathfrak {S}}}(x_i)=\{s_{\delta _l}(x_i)\,|\,s_{\delta _l}(x_i)\in {\mathfrak {S}},\,l=1,2, \ldots ,L\}, \end{aligned}$$
(2)

where the parameter L denotes the number of linguistic terms in \(h_{{\mathfrak {S}}}(x_i)\), and the notation \(s_{\delta _l}(x_i)\) stands for the linguistic term \(s_{\delta _l}\) associated with the element \(x_i\).

Example 2.2

Assume that \(x_1,x_2\) and \(x_3\) are three cars whose approximate speed should be evaluated by an expert using linguistic terms instead of numerical values. We consider

$$\begin{aligned} {\mathfrak {{S}}}= & {} \{s_{-3}=\mathrm{very \, slow},\, s_{-2}=\mathrm{slow},\, s_{-1}=\mathrm{slightly \, slow},\\ s_{0}= & {} \mathrm{average},\, s_{1}=\mathrm{slightly \,fast},\, s_{2}=\mathrm{fast},\, s_{3}=\mathrm{very\, fast}\}, \end{aligned}$$

as the set of linguistic terms. If the expert's judgments on the three cars are characterized by the linguistic expressions "at least fast" for \(x_1\), "between very slow and average" for \(x_2\), and "greater than fast" for \(x_3\), then these linguistic expressions are represented by the terms of \({\mathfrak {{S}}}\) as follows

$$\begin{aligned} \{\langle x_1,\mathrm{fast,\, very\, fast} \rangle ,\langle x_2,\mathrm{very\, slow,\,slow,\, slightly\, slow,\,average} \rangle ,\langle x_3,\mathrm{very\, fast} \rangle \}, \end{aligned}$$

which constructs an HFLTS as follows

$$\begin{aligned} H_{\mathfrak {{S}}}=\{\langle x_1,h_{{\mathfrak {S}}}(x_1)=\{s_{2},\, s_{3}\} \rangle ,\langle x_2,h_{{\mathfrak {S}}}(x_2)=\{s_{-3},\, s_{-2},\, s_{-1},\,s_{0}\} \rangle , \langle x_3,h_{{\mathfrak {S}}}(x_3)=\{s_{3}\} \rangle \}. \end{aligned}$$

To simplify the subsequent discussion, the hesitant fuzzy linguistic element (HFLE) \(h_{{\mathfrak {S}}}(x_i)\) is denoted briefly by \(h_{{\mathfrak {S}}}\).
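To make the formulas below easier to check numerically, an HFLTS can be encoded as a plain mapping from objects to lists of term indices. The following minimal Python sketch is an illustrative encoding of ours (not part of the cited definitions), applied to the HFLTS of Example 2.2.

```python
# Minimal illustrative encoding (ours, not from the cited works): an HFLTS on a
# LTS with indices -tau..tau is stored as a dict mapping each object x_i to the
# sorted list of term indices delta_l appearing in h_S(x_i).
TAU = 3                      # the LTS of Example 2.2 has tau = 3
H_S = {
    "x1": [2, 3],            # {s_2, s_3}          -- "at least fast"
    "x2": [-3, -2, -1, 0],   # {s_-3, ..., s_0}    -- "between very slow and average"
    "x3": [3],               # {s_3}               -- "greater than fast"
}
```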

Recently, Wang and Xu [29] extended the concept of HFLTS to the extended HFLTS (EHFLTS), whose elements are ordered but not necessarily consecutive linguistic terms. Since a detailed description of EHFLTSs is beyond the scope of this contribution, we do not go into details and refer the reader to [29].

Now, we are in a position to review the existing entropy measures for HFLTSs from different viewpoints by discussing their possible advantages and disadvantages.

First of all, let us investigate the entropy measures of HFLTSs that result from Farhadinia's [9] transformation of distance and similarity measures of HFLTSs.

Definition 2.3

[9] Given a LTS \({\mathfrak {S}}=\{s_\alpha \,|\,\alpha =-\tau , \ldots ,-1,0,1, \ldots ,\tau \}\) together with the two HFLTSs \(H^1_{{\mathfrak {S}}}\) and \(H^2_{{\mathfrak {S}}}\), the function E is said to be an entropy measure for HFLTSs whenever

  1. (E0)

    \(0\le E(H_{{\mathfrak {S}}})\le 1\);

  2. (E1)

    \(E(H_{{\mathfrak {S}}})=0\) iff \(H_{{\mathfrak {S}}}=H^{[-\tau ]}_{{\mathfrak {S}}}\) or \(H_{{\mathfrak {S}}}=H^{[\tau ]}_{{\mathfrak {S}}}\);

  3. (E2)

    \(E(H_{{\mathfrak {S}}})=1\) iff \(H_{{\mathfrak {S}}}=H^{[0]}_{{\mathfrak {S}}}\);

  4. (E3)

    \(E(H_{{\mathfrak {S}}})=E({\overline{H}}_{{\mathfrak {S}}})\);

  5. (E4)

    If \(H^1_{{\mathfrak {S}}}{\preceq } H^2_{{\mathfrak {S}}}{\preceq } H^{[0]}_{{\mathfrak {S}}}\) or \(H^{[0]}_{{\mathfrak {S}}}{\preceq } H^2_{{\mathfrak {S}}}{\preceq } H^1_{{\mathfrak {S}}}\), then \(E(H^1_{{\mathfrak {S}}})\le E(H^2_{{\mathfrak {S}}})\),

where \(X=\{x_1,x_2, \ldots ,x_N\}\) is used for denoting the reference set, and

$$\begin{aligned}&H^{[0]}_{{\mathfrak {S}}}=\{\langle x_i, h^{[0]}_{{\mathfrak {S}}}(x_i)\rangle \,|\,x_i\in X\}=\{\langle x_i, \{s_{\delta _l}(x_i):=s_{0}(x_i)\,,\,\forall l=1,2, \ldots ,L\}\rangle \,|\,x_i\in X\};\\&H^{[-\tau ]}_{{\mathfrak {S}}}=\{\langle x_i, h^{[-\tau ]}_{{\mathfrak {S}}}(x_i)\rangle \,|\,x_i\in X\}=\{\langle x_i, \{s_{\delta _l}(x_i):=s_{-\tau }(x_i)\,,\,\forall l=1,2, \ldots ,L\}\rangle \,|\,x_i\in X\};\\&H^{[\tau ]}_{{\mathfrak {S}}}=\{\langle x_i, h^{[\tau ]}_{{\mathfrak {S}}}(x_i)\rangle \,|\,x_i\in X\}=\{\langle x_i, \{s_{\delta _l}(x_i):=s_{\tau }(x_i)\,,\,\forall l=1,2, \ldots ,L\}\rangle \,|\,x_i\in X\};\\&{\overline{H}}_{{\mathfrak {S}}}=\{\langle x_i, {\overline{h}}_{{\mathfrak {S}}}(x_i)\rangle \,|\,x_i\in X\}=\{\langle x_i, \{s_{\delta _l}(x_i):=s_{-\delta _l}(x_i)\,,\,\forall l=1,2, \ldots ,L\}\rangle \,|\,x_i\in X\}. \end{aligned}$$

Moreover, \({\preceq }\) denotes the partial order of HFLTSs introduced in [9] as follows:

$$\begin{aligned} H^1_{{\mathfrak {S}}}{\preceq } H^2_{{\mathfrak {S}}}\quad \text {if and only if} \quad h^{1}_{{\mathfrak {S}}}(x_i)\le h^{2}_{{\mathfrak {S}}}(x_i), \,\forall x_i\in X \quad \text {if and only if} \quad s^1_{\delta _l}(x_i)\le s^2_{\delta _l}(x_i),\,\forall x_i\in X,\,\forall l=1,2, \ldots ,L. \end{aligned}$$

Theorem 2.4

[9] Suppose that \(Z: [0, 1]\rightarrow [0, 1]\) is a strictly monotone decreasing real function, and also d is a distance measure between HFLTSs. Then,

$$\begin{aligned}&E_{d}(H_{{\mathfrak {S}}})=\frac{Z(2d(H_{{\mathfrak {S}}},H^{[0]}_{{\mathfrak {S}}}))-Z(1)}{Z(0)-Z(1)} \end{aligned}$$
(3)

defines an entropy measure of HFLTS with respect to the distance d.

The latter theorem enables us to create a number of entropy measures of HFLTSs from a distance measure between HFLTSs. Taking the strictly monotone decreasing function \(Z:[0,1]\rightarrow [0,1]\) as \(Z({t})=1-{t}\), \(Z({t})=\frac{1-{t}}{1+{t}}\), \(Z({t})=1-{t}e^{{t}-1}\), or \(Z({t})=1-{t}^2\) yields different formulas of entropy measures for HFLTSs. For instance, if we take \(Z({t})=1-{t}\), then

$$\begin{aligned}&E_{d_{g}}(H_{{\mathfrak {S}}})=1- \frac{2}{N}\sum ^{N}_{i=1}\left[ \left( \frac{1}{L}\sum _{l=1}^{L}\left( \frac{|\delta ^{(i)}_l|}{2\tau }\right) ^{\lambda }\right) ^{\frac{1}{\lambda }}\right] ,\quad \lambda >0, \end{aligned}$$
(4)

stands for an entropy measure of HFLTS \(H_{{\mathfrak {S}}}\).
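As a quick numerical illustration, the entropy \(E_{d_{g}}\) of (4) can be evaluated with a few lines of code. The sketch below is an illustration of ours (assuming the dictionary encoding sketched above); it also exhibits the behavior that motivates Sect. 3: with \(\lambda =1\) and \(\tau =4\), the two different HFLEs \(\{s_{-3}\}\) and \(\{s_{-4},s_{-3},s_{-2}\}\) receive the same entropy value.

```python
def entropy_E_dg(H, tau, lam=1.0):
    """Distance-based entropy E_{d_g} of Eq. (4); H maps each x_i to its index list."""
    total = 0.0
    for deltas in H.values():
        inner = sum((abs(d) / (2 * tau)) ** lam for d in deltas) / len(deltas)
        total += inner ** (1.0 / lam)
    return 1.0 - 2.0 * total / len(H)

# With tau = 4 and lam = 1 (cf. Example 3.7 below):
# entropy_E_dg({"x": [-3]}, 4)         -> 0.25
# entropy_E_dg({"x": [-4, -3, -2]}, 4) -> 0.25  (fuzziness alone cannot separate them)
```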

Moreover, Farhadinia [9] defined another class of entropy measures as:

Theorem 2.5

[9] Assume that \({\mathfrak {S}}=\{s_\alpha \,|\,\alpha =-\tau , \ldots ,-1,0,1, \ldots ,\tau \}\) is a LTS, and moreover S is a similarity measure for HFLTSs. Then,

$$\begin{aligned} E_{S}(H_{{\mathfrak {S}}})=S(H_{{\mathfrak {S}}},{\overline{H}}_{{\mathfrak {S}}}) \end{aligned}$$
(5)

defines a similarity-based entropy measure of HFLTS \(H_{{\mathfrak {S}}}\).

Taking Theorem 2.5 into account, Farhadinia [9] constructed an entropy measure for HFLTSs as follows:

$$\begin{aligned}&E_{S_{g}}(H_{{\mathfrak {S}}})=1- \left[ \frac{1}{N}\sum ^{N}_{i=1}\left[ \left( \frac{1}{L}\sum _{l=1}^{L}\left( \frac{|\delta ^{(i)}_l|}{\tau }\right) ^{\lambda }\right) ^{\frac{1}{\lambda }}\right] \right] ^2,\quad \lambda >0. \end{aligned}$$
(6)

Recently, Tao et al. [23] considered a linguistic fuzzy set in the form of a pair \((X, {\mathcal {L}})\) where \(X=\{x_1,x_2, \ldots ,x_N\}\) is the reference set, and \({\mathcal {L}}:X\rightarrow {\mathfrak {S}}\) is called the grade of linguistic membership function.

Based on the concept of an entropy measure for fuzzy sets, Tao et al. [23] developed an axiomatic definition of entropies for linguistic information similar to the one stated by Farhadinia [9] in the form of Definition 2.3. In fact, Tao et al. [23] presented three kinds of entropy measures which are consistent with the idea of Wang and Chiu [25], who proposed the following well-known entropy generators: \(e(u) = 4u(1- u)\); \(e(u) = -(u\ln (u) +(1-u)\ln (1-u))\); and \(e(u) =\left\{ \begin{array}{ll} 2u , &{}\quad u\in [0, 0.5]; \\ 2(1-u) , &{}\quad u\in [0.5,1]. \end{array} \right.\)

We point out that \(e:[0, 1]\rightarrow [0, 1]\) is monotonically increasing on [0, 0.5] and monotonically decreasing on [0.5, 1], with \(e(u) = 0\) if \(u = 0\) or \(u=1\), and \(e(u) = 1\) if \(u = 0.5\).

Drawing on the idea of entropies for linguistic fuzzy sets provided by Tao et al. [23], we introduce the following entropy measures for HFLTSs with some minor modifications:

Definition 2.6

For the LTS \({\mathfrak {S}}=\{s_\alpha \,|\,\alpha =-\tau , \ldots ,-1,0,1, \ldots ,\tau \}\), the following functions

$$\begin{aligned}&E^{T1}_{S}(H_{{\mathfrak {S}}})=\frac{4}{N}\sum ^{N}_{i=1}\left[ \frac{1}{L}\sum _{l=1}^{L}\left( \frac{\delta ^{(i)}_l}{2\tau }\left( 1-\frac{\delta ^{(i)}_l}{2\tau }\right) \right) \right] ; \end{aligned}$$
(7)
$$\begin{aligned}&E^{T2}_{S}(H_{{\mathfrak {S}}})=\frac{-1}{N \ln (2)}\sum ^{N}_{i=1}\left[ \frac{1}{L}\sum _{l=1}^{L}\left( \frac{\delta ^{(i)}_l}{2\tau }\ln \left( \frac{\delta ^{(i)}_l}{2\tau }\right) + \left( 1-\frac{\delta ^{(i)}_l}{2\tau }\right) \ln \left( 1-\frac{\delta ^{(i)}_l}{2\tau }\right) \right) \right] ; \end{aligned}$$
(8)
$$\begin{aligned}&E^{T3}_{S}(H_{{\mathfrak {S}}})=\left\{ \begin{array}{ll} \frac{2}{N}\sum ^{N}_{i=1}\left[ \frac{1}{L}\sum _{l=1}^{L}\left( \frac{\delta ^{(i)}_l}{2\tau }\right) \right] , &{}\quad \frac{\delta ^{(i)}_l}{2\tau }\in [0,0.5]; \\ \frac{2}{N}\sum ^{N}_{i=1}\left[ \frac{1}{L}\sum _{l=1}^{L}\left( 1-\frac{\delta ^{(i)}_l}{2\tau }\right) \right] , &{}\quad \frac{\delta ^{(i)}_l}{2\tau }\in [0.5,1], \end{array} \right. \end{aligned}$$
(9)

are all called entropy measures for the HFLTS \(H_{{\mathfrak {S}}}=\{\langle x_i, h_{{\mathfrak {S}}}(x_i)=\{s_{\delta _l}(x_i)\,|\,s_{\delta _l}(x_i)\in {\mathfrak {S}},\,l=1,2, \ldots ,L\}\rangle \,|\,x_i\in X\}\) where \(\delta ^{(i)}_l\) is used for \({\delta _l}(x_i)\).

Recently, Liang et al. [15] introduced a class of entropy measures for HFLTSs that combines two aspects of information: fuzziness and hesitation. Liang et al. [15] showed that the fuzziness is given by the difference between the averaging value of \(H_{{\mathfrak {S}}}\) and \(H^{[0]}_{{\mathfrak {S}}}\), while the hesitation reflects the deviation degree of all elements in \(H_{{\mathfrak {S}}}\). Putting these two concepts together, Liang et al. [15] defined:

Definition 2.7

[15] Given a LTS \({\mathfrak {S}}=\{s_\alpha \,|\,\alpha =-\tau , \ldots ,-1,0,1, \ldots ,\tau \}\) together with a HFLE \(h_{{\mathfrak {S}}}=\{s_{\delta _l}\,|\,s_{\delta _l}\in {\mathfrak {S}},\,l=1,2, \ldots ,L\}\), we define

$$\begin{aligned}&\mu (h_{{\mathfrak {S}}})=\frac{1}{L}\sum _{l=1}^{L}\delta _l, \end{aligned}$$
(10)
$$\begin{aligned}&\nu (h_{{\mathfrak {S}}})={\frac{1}{(L)_2}\sum _{l=1}^{L-1} \sum _{k=l+1}^L (\delta _l-\delta _k)}, \end{aligned}$$
(11)

where \((L)_2=\frac{L!}{(L-2)!2!}\).

Notice 2.1

In order to represent the deviation degree of all elements in \(H_{{\mathfrak {S}}}\) correctly, we modify the relation proposed by Liang et al. [15] as follows

$$\begin{aligned}&\nu (h_{{\mathfrak {S}}})=\left\{ \begin{array}{ll} 0, &{} L=1; \\ {\frac{1}{(L)_2}\sum _{l=1}^{L-1} \sum _{k=l+1}^L (\delta _l-\delta _k)}, &{} L>1. \end{array} \right. \end{aligned}$$
(12)

where \((L)_2=\frac{L!}{(L-2)!2!}\).
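A short sketch of the two quantities (10) and (12) is given below. It is an illustration of ours; in the deviation, we take absolute pairwise differences so that the value does not depend on the order in which the terms are listed (an assumption about the intended convention).

```python
from math import comb

def mu(deltas):
    """Average value of an HFLE, Eq. (10)."""
    return sum(deltas) / len(deltas)

def nu(deltas):
    """Deviation of an HFLE, Eq. (12), including the L = 1 case of Notice 2.1.
    Absolute pairwise differences are used (our assumption on the convention)."""
    L = len(deltas)
    if L == 1:
        return 0.0
    pair_sum = sum(abs(deltas[l] - deltas[k])
                   for l in range(L) for k in range(l + 1, L))
    return pair_sum / comb(L, 2)

# For instance, mu([-2, -1, 0]) == -1.0 and nu([-2, -1, 0]) == (1 + 2 + 1) / 3 == 4/3.
```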

Needless to say, the above fuzziness and hesitation of information are nothing other than the two quantities proposed by Farhadinia [9] for ranking HFLTSs. These two quantities are used here to rank the HFLEs \(h^{1}_{{\mathfrak {S}}}=\{s_{\delta ^{1}_l}\,|\,s_{\delta ^{1}_l}\in {\mathfrak {S}},\,l=1,2, \ldots ,L\}\) and \(h^{2}_{{\mathfrak {S}}}=\{s_{\delta ^{2}_l}\,|\,s_{\delta ^{2}_l}\in {\mathfrak {S}},\,l=1,2, \ldots ,L\}\) with respect to the following total ranking order \(\le\):

$$\begin{aligned}&\text {if}\, \mu (h^{1}_{{\mathfrak {S}}})<\mu (h^{2}_{{\mathfrak {S}}}),\,\text {then}\,h^{1}_{{\mathfrak {S}}}< h^{2}_{{\mathfrak {S}}};\\&\text {if}\, \mu (h^{1}_{{\mathfrak {S}}})=\mu (h^{2}_{{\mathfrak {S}}})\,\text {and}\, \nu (h^{1}_{{\mathfrak {S}}})>\nu (h^{2}_{{\mathfrak {S}}}),\,\text {then}\,h^{1}_{{\mathfrak {S}}}< h^{2}_{{\mathfrak {S}}};\\&\text {if}\, \mu (h^{1}_{{\mathfrak {S}}})=\mu (h^{2}_{{\mathfrak {S}}})\,\text {and}\, \nu (h^{1}_{{\mathfrak {S}}})=\nu (h^{2}_{{\mathfrak {S}}}),\,\text {then}\,h^{1}_{{\mathfrak {S}}}= h^{2}_{{\mathfrak {S}}}. \end{aligned}$$

For a comprehensive review of ranking orders on HFLTSs, we refer the interested reader to [28].

By taking the averaging value \(\mu\) and the deviation function value \(\nu\) into account, Liang et al. [15] proposed an axiomatic definition of entropy for an HFLTS.

Definition 2.8

[15] Given a LTS \({\mathfrak {S}}=\{s_\alpha \,|\,\alpha =-\tau , \ldots ,-1,0,1, \ldots ,\tau \}\) together with the two HFLTSs \(H^1_{{\mathfrak {S}}}\) and \(H^2_{{\mathfrak {S}}}\), the function \(E_L\) is said to be an entropy measure for HFLTSs if

  1. (E0)

    \(0\le E_L(H_{{\mathfrak {S}}})\le 1\);

  2. (E1)

    \(E_L(H_{{\mathfrak {S}}})=0\) iff \(H_{{\mathfrak {S}}}=H^{[-\tau ]}_{{\mathfrak {S}}}\) or \(H_{{\mathfrak {S}}}=H^{[\tau ]}_{{\mathfrak {S}}}\);

  3. (E2)

    \(E_L(H_{{\mathfrak {S}}})=1\) iff \(\mu (H_{{\mathfrak {S}}})=H^{[0]}_{{\mathfrak {S}}}\);

  4. (E3)

    \(E_L(H_{{\mathfrak {S}}})=E_L({\overline{H}}_{{\mathfrak {S}}})\);

  5. (E4)

    If \(\mu (H^1_{{\mathfrak {S}}})\le \mu (H^2_{{\mathfrak {S}}})\) and \(\nu (H^1_{{\mathfrak {S}}})\le \nu (H^2_{{\mathfrak {S}}})\) for \(\mu (H^2_{{\mathfrak {S}}})\le H^{[0]}_{{\mathfrak {S}}}\); or \(\mu (H^1_{{\mathfrak {S}}})\ge \mu (H^2_{{\mathfrak {S}}})\) and \(\nu (H^1_{{\mathfrak {S}}})\le \nu (H^2_{{\mathfrak {S}}})\) for \(\mu (H^2_{{\mathfrak {S}}})\ge H^{[0]}_{{\mathfrak {S}}}\), then \(E_L(H^1_{{\mathfrak {S}}})\le E_L(H^2_{{\mathfrak {S}}})\).

Theorem 2.9

[15] Let \(H_{{\mathfrak {S}}}\) be an HFLTS. Then, for any HFLTS \(H_{{\mathfrak {S}}}\)

$$\begin{aligned}&E_{L}(H_{{\mathfrak {S}}})=\frac{f(\mu (H_{{\mathfrak {S}}}))+\alpha \frac{\nu (H_{{\mathfrak {S}}})}{2\tau }}{1+\alpha \frac{\nu (H_{{\mathfrak {S}}})}{2\tau }} \end{aligned}$$
(13)

defines an entropy measure for HFLTS whenever \(f:[-\tau ,\tau ]\rightarrow [0,1]\) satisfies

  1. (i)

    \(f(-x)=f(x)\);

  2. (ii)

    f(x) is strictly monotonically increasing on \([-\tau ,0]\) and strictly monotonically decreasing on \([0, \tau ]\);

  3. (iii)

    f(x) interpolates three points \((-\tau , 0)\), (0, 1) and \((\tau , 0)\).

In the above relation, the parameter \(\alpha \in (0, 1]\) reflects the relative importance of the deviation \(\nu\) with respect to the average \(\mu\) in the uncertainty of \(H_{{\mathfrak {S}}}\).

By assuming \(\alpha =1\) together with the functions \(f(x)=1-|1-\frac{x}{\tau }|\), \(f(x)=\sin (\frac{x}{2\tau }\pi )\) and \(f(x)=1-(\frac{x}{\tau }-1)^2\), Liang et al. [15] obtained, respectively,

$$\begin{aligned}&E_{L1}(H_{{\mathfrak {S}}})= \frac{1-|1-\frac{\mu (H_{{\mathfrak {S}}})}{\tau }| + \frac{\nu (H_{{\mathfrak {S}}})}{2\tau }}{1+ \frac{\nu (H_{{\mathfrak {S}}})}{2\tau }}; \end{aligned}$$
(14)
$$\begin{aligned}&E_{L2}(H_{{\mathfrak {S}}})= \frac{\sin (\frac{\mu (H_{{\mathfrak {S}}})}{2\tau }\pi ) + \frac{\nu (H_{{\mathfrak {S}}})}{2\tau }}{1+ \frac{\nu (H_{{\mathfrak {S}}})}{2\tau }}; \end{aligned}$$
(15)
$$\begin{aligned}&E_{L3}(H_{{\mathfrak {S}}})= \frac{1-(\frac{\mu (H_{{\mathfrak {S}}})}{\tau }-1)^2 + \frac{\nu (H_{{\mathfrak {S}}})}{2\tau }}{1+ \frac{\nu (H_{{\mathfrak {S}}})}{2\tau }}. \end{aligned}$$
(16)

To put the entropy measures proposed in the next section into perspective, we briefly point out the main limitations of the above-mentioned entropies; these observations are, in fact, the motivation for constructing a new class of entropy measures. From the above formulas, we find that Farhadinia's entropy measures \(E_{d_{g}}\) and \(E_{S_{g}}\) given by (4) and (6) only measure the fuzziness degree of an HFLTS, that is, the deviation of an HFLTS from the most fuzzy element, whereas not only the fuzziness degree of an HFLTS but also its hesitation needs to be taken into account. On the other hand, Tao et al.'s entropy measures \(E^{T1}_{S}\), \(E^{T2}_{S}\) and \(E^{T3}_{S}\) given by (7)–(9), although apparently of a different form than Farhadinia's entropy measures \(E_{d_{g}}\) and \(E_{S_{g}}\), suffer from the same shortcoming. Moreover, Liang et al.'s entropy measures \(E_{L1}\), \(E_{L2}\) and \(E_{L3}\) given by (14)–(16), which are constructed from the total average \(\mu\) and the total deviation function \(\nu\), turn out to be less reliable than the proposed entropy measures, which are based on the component-wise average measured from the value \(\frac{1}{2}\) and the component-wise deviation function.

3 New Entropy Measure for Hesitant Fuzzy Linguistic Term Sets

In this section, we first introduce a new concept, called the interval-transformed hesitant fuzzy linguistic term set, which is used to build the foundation of the new entropy measures for HFLTSs. In this way, we build a “bridge” between involutive interval-valued fuzzy sets (IIVFSs) and hesitant fuzzy elements (HFEs), which also raises an issue for future work, namely the construction of entropy measures of HFEs from those of IIVFSs.

To simplify the notation, we henceforth refer to such a set of IIVFSs as an involutive interval-valued hesitant fuzzy element (IIVHFE).

Before any more progress can be made, let us first introduce the main object of this study which plays an important role in the next discussions.

Definition 3.1

Suppose that \({\mathfrak {S}}=\{s_\alpha \,|\,\alpha =-\tau , \ldots ,-1,0,1, \ldots ,\tau \}\) is a LTS and \(H_{\mathfrak {S}}=\{s_{\sigma (j)}\}_{j=1}^{l_H}\) is a HFLTS on \({\mathfrak {S}}\). Then, we define the interval-transformed HFLTS (ITHFLTS) \({\mathbb {H}}_{\mathfrak {S}}\) associated with the HFLTS \(H_{\mathfrak {S}}\) in terms of the closed intervals by

$$\begin{aligned} {\mathbb {H}}_{\mathfrak {S}}=\left\{ \left[ \frac{1}{{2}\tau }{\sigma (j)},\frac{1}{{2}\tau }{\sigma (l_H-j+1)}\right] \right\} _{j=1}^{\lceil \frac{l_H}{2}\rceil }, \end{aligned}$$
(17)

where \({\lceil \frac{l_H}{2}\rceil }\) denotes the smallest integer no smaller than \(\frac{l_H}{2}\).

Example 3.2

Suppose that \({\mathfrak {S}}=\{s_\alpha \,|\,\alpha =-\tau , \ldots ,-1,0,1, \ldots ,\tau =3\}\), and moreover, let \(H^1_{\mathfrak {S}}=\{s_0\}\), \(H^2_{\mathfrak {S}}=\{s_{-2},s_{-1}\}\), and \(H^3_{\mathfrak {S}}=\{s_1,s_2,s_3\}\). Then, the associated ITHFLTSs are, respectively, in the forms of

$$\begin{aligned} {{\mathbb {H}}^1_{\mathfrak {S}}=\{[0,0] \}, \quad {\mathbb {H}}^2_{\mathfrak {S}}=\left\{ \left[ \frac{1}{6}(-2),\frac{1}{6}(-1)\right] \right\} , \quad {\mathbb {H}}^3_{\mathfrak {S}}=\left\{ \left[ \frac{1}{6}(1),\frac{1}{6}(3)\right] , \left[ \frac{1}{6}(2),\frac{1}{6}(2)\right] \right\} .} \end{aligned}$$
(18)
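The transformation (17) is easy to implement. The sketch below is ours and assumes that \(\sigma (\cdot )\) lists the term indices in increasing order, which is consistent with Example 3.2; it reproduces the three ITHFLTSs in (18).

```python
from math import ceil

def to_ithflts(deltas, tau):
    """Interval-transformed HFLTS of Eq. (17): pair the j-th smallest with the
    j-th largest term index and scale by 1/(2*tau)."""
    d = sorted(deltas)
    l_H = len(d)
    return [(d[j] / (2 * tau), d[l_H - 1 - j] / (2 * tau))
            for j in range(ceil(l_H / 2))]

# Example 3.2 (tau = 3):
# to_ithflts([0], 3)        -> [(0.0, 0.0)]
# to_ithflts([-2, -1], 3)   -> [(-2/6, -1/6)]
# to_ithflts([1, 2, 3], 3)  -> [(1/6, 3/6), (2/6, 2/6)]
```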

Note 3.1

Without loss of generality and in order to consider only positive intervals, we suppose hereafter \({\mathfrak {S}}=\{s_\alpha \,|\,\alpha =0,1, \ldots ,\tau \}\) as a LTS, and therefore, any ITHFLTS \({\mathbb {H}}_{\mathfrak {S}}\) is associated with the HFLTS \(H_{\mathfrak {S}}\) in terms of the positive closed intervals.

In the following, we describe how to build the bridge between an HFLTS and an IIVHFE using the concept of ITHFLTS.

If we consider \(J=\{1, \ldots ,{\lceil \frac{l_H}{2}\rceil }\}\) together with I([0, 1]), the set of all closed subintervals of [0, 1], then any ITHFLTS can be regarded as an IIVHFE on J, expressed by

$$\begin{aligned} {\mathbb {H}}_{\mathfrak {S}}=\{\langle j,{\mathbb {H}}_{\mathfrak {S}}(j) \rangle \,|\,j\in J\}, \end{aligned}$$
(19)

such that

$$\begin{aligned} \left\{ \begin{array}{l} {\mathbb {H}}_{\mathfrak {S}}:J\rightarrow I([0,1]), \\ j\rightarrow {\mathbb {H}}_{\mathfrak {S}}(j):=[\frac{1}{\tau }{\sigma (j)},\frac{1}{\tau }{\sigma (l_H-j+1)}]\in I([0,1]). \end{array} \right. \end{aligned}$$
(20)

Note that the bijective correspondence (20) allows us to investigate the relation between ITHFLTSs and IIVHFEs, which may be a subject of future work.

On the basis of the above relationship between ITHFLTSs and IIVHFEs, we now deal with the second main part of this paper, namely the procedure of introducing entropy measures for HFLTSs.

For notational convenience, we hereinafter denote the ITHFLTS \({\mathbb {H}}_{\mathfrak {S}}=\{[\frac{1}{\tau }{\sigma (j)},\frac{1}{\tau }{\sigma (l_H-j+1)}] \}_{j=1}^{\lceil \frac{l_H}{2}\rceil }\) by \({\mathbb {H}}_{\mathfrak {S}}=\{I_j:=[{\mathbb {H}}^L_{\mathfrak {S}}{(j)},{\mathbb {H}}^U_{\mathfrak {S}}{(j)}] \}_{j=1}^{\lceil \frac{l_H}{2}\rceil }\). Using this notation, we associate each interval element of the ITHFLTS \({\mathbb {H}}_{\mathfrak {S}}\) with the following two quantities:

$$\begin{aligned}&\triangle _j=|{\mathbb {H}}^L_{\mathfrak {S}}{(j)}+{\mathbb {H}}^U_{\mathfrak {S}}{(j)}-1|, \end{aligned}$$
(21)
$$\begin{aligned}&\nabla _j={\mathbb {H}}^U_{\mathfrak {S}}{(j)}-{\mathbb {H}}^L_{\mathfrak {S}}{(j)}, \end{aligned}$$
(22)

where \(\triangle _j\) measures the distance of the average value of \(I_j=[{\mathbb {H}}^L_{\mathfrak {S}}{(j)},{\mathbb {H}}^U_{\mathfrak {S}}{(j)}]\) from the midpoint \(\frac{1}{2}\) of [0, 1] (up to a factor of 2), and \(\nabla _j\) gives the deviation (width) of the corresponding component of \({\mathbb {H}}_{\mathfrak {S}}\).

In the following, we describe the axiomatic definition of the ITHFLTS entropy measure, which is defined mainly on the basis of the two quantities \(\triangle _j\) and \(\nabla _j\) for \({j=1,2, \ldots , \lceil \frac{l_H}{2}\rceil }\).

Definition 3.3

Suppose that \({\mathbb {H}}_{\mathfrak {S}}=\{I_j:=[{\mathbb {H}}^L_{\mathfrak {S}}{(j)},{\mathbb {H}}^U_{\mathfrak {S}}{(j)}] \}_{j=1}^{\lceil \frac{l_H}{2}\rceil }\) is an ITHFLTS with its corresponding two parameters \(\triangle _j\) and \(\nabla _j\) for \({j=1,2, \ldots , \lceil \frac{l_H}{2}\rceil }\). Then, the function E is said to be an entropy measure for ITHFLTS if

  1. (E1)

    \(E({\mathbb {H}}_{\mathfrak {S}})=0\) if and only if \({\mathbb {H}}_{\mathfrak {S}}=\{I_j:=[0,0] \}_{j=1}^{\lceil \frac{l_H}{2}\rceil }\) or \({\mathbb {H}}_{\mathfrak {S}}=\{I_j:=[1,1] \}_{j=1}^{\lceil \frac{l_H}{2}\rceil }\);

  2. (E2)

    \(E({\mathbb {H}}_{\mathfrak {S}})=1\) if and only if \({\mathbb {H}}_{\mathfrak {S}}=\{I_j:=[0,1] \}_{j=1}^{\lceil \frac{l_H}{2}\rceil }\);

  3. (E3)

    \(E({\mathbb {H}}_{\mathfrak {S}})=E(\overline{{\mathbb {H}}}_{\mathfrak {S}})\);

  4. (E4)

    \(E({\mathbb {H}}_{\mathfrak {S}})\) is monotonically decreasing in \(\triangle _j\) and monotonically increasing in \(\nabla _j\) for all \({j=1,2, \ldots , \lceil \frac{l_H}{2}\rceil }\).

Now, we are in a position to construct a family of entropy formulas of ITHFLTS that satisfy the requirements given in Definition 3.3.

Theorem 3.4

Suppose that \(\Xi =\{(x, y)\in [0, 1]\times [0, 1]\,|\,x+y\le 1\}\), and \(\phi :\Xi \longrightarrow [0, 1]\) is a continuous function. Then, the following function

$$\begin{aligned} E({\mathbb {H}}_{\mathfrak {S}})=\frac{1}{\lceil \frac{l_H}{2}\rceil }\sum _{j=1}^{\lceil \frac{l_H}{2}\rceil }\phi (\triangle _j,\nabla _j) \end{aligned}$$
(23)

satisfies the requirements (E1)–(E4) if and only if \(\phi\) possesses the following properties:

  1. (i)

    \(\phi (x,y)=0\) if and only if \(x=1\) and \(y=0\);

  2. (ii)

    \(\phi (x,y)=1\) if and only if \(x=0\) and \(y=1\);

  3. (iii)

    \(\phi (x,y)\) is monotonically decreasing with respect to x and monotonically increasing with respect to y.

Proof

First of all, assume that \(E({\mathbb {H}}_{\mathfrak {S}})\) satisfies the axiomatic requirements (E1)–(E4) given in Definition 3.3. Based on this assumption, we show that \(\phi\) has the above properties (i)–(iii).

(1) Let \(\phi (x, y)=0\) with \(x + y\le 1\) where \(x, y\in [0, 1]\). If we take \(\triangle _j = x\) together with \(\nabla _j = y\) for every \({j=1,2, \ldots , \lceil \frac{l_H}{2}\rceil }\), then it can be concluded that

$$\begin{aligned} E({\mathbb {H}}_{\mathfrak {S}})=\frac{1}{\lceil \frac{l_H}{2}\rceil }\sum _{j=1}^{\lceil \frac{l_H}{2}\rceil }\phi (\triangle _j,\nabla _j)=0. \end{aligned}$$
(24)

Comparing this relation and the requirement (E1) shows that the relation (24) holds if and only if \({\mathbb {H}}_{\mathfrak {S}}=\{I_j:=[0,0] \}_{j=1}^{\lceil \frac{l_H}{2}\rceil }\) or \({\mathbb {H}}_{\mathfrak {S}}=\{I_j:=[1,1] \}_{j=1}^{\lceil \frac{l_H}{2}\rceil }\), that is, \(\triangle _j = 1\) and \(\nabla _j = 0\) for every \({j=1,2, \ldots , \lceil \frac{l_H}{2}\rceil }\). In other words, the relation (24) holds if and only if \(x = 1\) and \(y = 0\), and this is nothing else but the property (i).

(2) Suppose that \(\phi (x, y)=1\) with \(x + y\le 1\) where \(x, y\in [0, 1]\). If we let \(\triangle _j = x\) together with \(\nabla _j = y\) for every \({j=1,2, \ldots , \lceil \frac{l_H}{2}\rceil }\), then we obtain

$$\begin{aligned} E({\mathbb {H}}_{\mathfrak {S}})=\frac{1}{\lceil \frac{l_H}{2}\rceil }\sum _{j=1}^{\lceil \frac{l_H}{2}\rceil }\phi (\triangle _j,\nabla _j)=1. \end{aligned}$$
(25)

By referring to requirement (E2), we conclude that relation (25) holds if and only if \({\mathbb {H}}_{\mathfrak {S}}=\{I_j:=[0,1] \}_{j=1}^{\lceil \frac{l_H}{2}\rceil }\), that is, \(\triangle _j = 0\) and \(\nabla _j = 1\) for every \({j=1,2, \ldots , \lceil \frac{l_H}{2}\rceil }\). This means that relation (25) holds if and only if \(x = 0\) and \(y = 1\), and therefore \(\phi\) satisfies property (ii).

(3) To prove (iii), assume, to the contrary, that there are \(z_1, z_2, y\in [0, 1]\) with \(z_1< z_2\), \(z_1 +y\le 1\) and \(z_2+y\le 1\) such that \(\phi (z_1,y)< \phi (z_2, y)\). Now, for the two ITHFLTSs \({\mathbb {H}}^{1}_{\mathfrak {S}}\), described by \(\triangle ^{1}_j=z_1\) and \(\nabla ^{1}_j=y\) for every \({j=1,2, \ldots , \lceil \frac{l_H}{2}\rceil }\), and \({\mathbb {H}}^{2}_{\mathfrak {S}}\), given by \(\triangle ^{2}_j=z_2\) and \(\nabla ^{2}_j=y\) for every \({j=1,2, \ldots , \lceil \frac{l_H}{2}\rceil }\), we find from (23) that

$$\begin{aligned}&E({\mathbb {H}}^{1}_{\mathfrak {S}})=\frac{1}{\lceil \frac{l_H}{2}\rceil }\sum _{j=1}^{\lceil \frac{l_H}{2}\rceil }\phi (\triangle ^{1}_j,\nabla ^{1}_j)=\phi (z_1,y)\\&\qquad < \phi (z_2,y)= \frac{1}{\lceil \frac{l_H}{2}\rceil }\sum _{j=1}^{\lceil \frac{l_H}{2}\rceil }\phi (\triangle ^{2}_j,\nabla ^{2}_j)=E({\mathbb {H}}^{2}_{\mathfrak {S}}), \end{aligned}$$

which contradicts the axiomatic requirement (E4), since \(\triangle ^{1}_j< \triangle ^{2}_j\) and \(\nabla ^{1}_j=\nabla ^{2}_j\) for all j. This establishes (iii). The converse direction is not hard to verify. \(\square\)

By means of Theorem 3.4, one can construct an entropy measure for ITHFLTSs as follows.

Let \(\Xi =\{(x, y)\in [0, 1]\times [0, 1]\,|\,x+y\le 1\}\), and the function \(\phi :\Xi \longrightarrow [0, 1]\) be defined as \(\phi (x,y)=\frac{(1-x)(1+y)}{2}\). Then, we can see that \(\phi\) fulfills all the properties given in Theorem 3.4, and therefore, the derived entropy formula of the ITHFLTS \({\mathbb {H}}_{\mathfrak {S}}\) is in the form of

$$\begin{aligned} E_1({\mathbb {H}}_{\mathfrak {S}})=\frac{1}{\lceil \frac{l_H}{2}\rceil }\sum _{j=1}^{\lceil \frac{l_H}{2}\rceil } \frac{(1-|{\mathbb {H}}^L_{\mathfrak {S}}{(j)}+{\mathbb {H}}^U_{\mathfrak {S}}{(j)}-1|)({\mathbb {H}}^U_{\mathfrak {S}}{(j)}-{\mathbb {H}}^L_{\mathfrak {S}}{(j)}+1)}{2}. \end{aligned}$$
(26)
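A minimal sketch of \(E_1\) (ours, purely illustrative) is given below; the boundary checks correspond to requirements (E1) and (E2) of Definition 3.3 for the positive intervals of Note 3.1.

```python
def delta_nabla(interval):
    """The two parameters of Eqs. (21)-(22) for a single interval [L, U]."""
    lo, up = interval
    return abs(lo + up - 1.0), up - lo

def entropy_E1(ithflts):
    """Entropy E_1 of Eq. (26): phi(x, y) = (1 - x)(1 + y)/2 averaged over components."""
    values = []
    for interval in ithflts:
        tri, nab = delta_nabla(interval)
        values.append((1.0 - tri) * (1.0 + nab) / 2.0)
    return sum(values) / len(values)

# Boundary checks (cf. (E1)-(E2) of Definition 3.3):
# entropy_E1([(0.0, 0.0)]) == 0.0
# entropy_E1([(1.0, 1.0)]) == 0.0
# entropy_E1([(0.0, 1.0)]) == 1.0
```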

Admittedly, it is not always easy to find a suitable bivariate function like the one considered in Theorem 3.4; therefore, we prefer to replace such a function with a combination of univariate functions.

Theorem 3.5

Suppose the aggregation function \(\Theta : [0,1]\times [0,1]\longrightarrow [0,1]\) is symmetric, and \(\Theta (x,.): [0, 1]\longrightarrow [0, 1]\) is a strictly increasing function for any \(x\in [0, 1]\). If we take \(\theta : [0, 1]\longrightarrow [0, 1]\) as a continuous function, then \(\phi (x, y) = \Theta (1-\theta (x),\theta (y))\) satisfies all the properties given in Theorem 3.4 if and only if the function \(\theta\) fulfills

  1. (i′)

    \(\theta (x)=0\) if and only if \(x=0\);

  2. (ii′)

    \(\theta (x)=1\) if and only if \(x=1\);

  3. (iii′)

    \(\theta (x)\) is a monotone and non-decreasing function on [0, 1].

Proof

From the boundary conditions of the aggregation function \(\Theta\), its symmetry, and the strictly increasing property of \(\Theta (x,.)\) for any \(x\in [0, 1]\), we find that

$$\begin{aligned} \Theta (x,y)=0,\quad \text {if \, and \, only \, if} \quad x=y=0,\\ \Theta (x,y)=1,\quad \text {if \, and \, only \, if} \quad x=y=1. \end{aligned}$$

Now, suppose that \(\phi (x, y) = \Theta (1-\theta (x),\theta (y))\) satisfies all the properties given in Theorem 3.4. Then, from \(\phi (1, 0) = \Theta (1-\theta (1),\theta (0))=0\) and the above property of \(\Theta\), we conclude that \(\theta (1)=1\) and \(\theta (0)=0\). Assume that these are not the only points with this property. Then we can find \(x_1\not =1\) such that \(\theta (x_1)=1\), which gives \(\phi (x_1, 0) = \Theta (1-\theta (x_1),\theta (0))=\Theta (0, 0)=0\); this contradicts property (i) of Theorem 3.4. Similarly, if there were \(x_2\not =0\) with \(\theta (x_2)=0\), then \(\phi (x_2, 1) = \Theta (1-\theta (x_2),\theta (1))=\Theta (1, 1)=1\), which is again a contradiction, this time with property (ii). This completes the proofs of (i\(^\prime\)) and (ii\(^\prime\)) above.

Finally, we prove part (iii\(^\prime\)) by contradiction. Consider \(0\le x\le y\le 1\) such that \(\theta (x)>\theta (y)\). In this case, we have

$$\begin{aligned} \phi (x,1-y) = \Theta (1-\theta (x),\theta (1-y)),\\ \phi (y,1-y) = \Theta (1-\theta (y),\theta (1-y)). \end{aligned}$$

Now, using the symmetry of \(\Theta\) and the strict monotonicity of \(\Theta (x,.)\), together with \(1-\theta (x)<1-\theta (y)\), we find that

$$\begin{aligned} \phi (x,1-y) < \phi (y,1-y), \end{aligned}$$

which contradicts property (iii) of Theorem 3.4.

The converse statement can be verified by a similar argument. \(\square\)

Using Theorem 3.5, one can construct entropy measures for ITHFLTSs as follows.

Let the function \(\theta :[0,1]\longrightarrow [0, 1]\) be defined as \(\theta (x)=x\). Clearly, \(\theta\) satisfies all the properties given in Theorem 3.5. Moreover, if we take \(\Theta :[0,1]\times [0,1]\longrightarrow [0, 1]\) as \(\Theta (x,y)=\frac{x+y}{2}\), then the derived entropy formula of the ITHFLTS \({\mathbb {H}}_{\mathfrak {S}}\) is in the form of

$$\begin{aligned} E_2({\mathbb {H}}_{\mathfrak {S}})=\frac{1}{\lceil \frac{l_H}{2}\rceil }\sum _{j=1}^{\lceil \frac{l_H}{2}\rceil } \frac{(1-|{\mathbb {H}}^L_{\mathfrak {S}}{(j)}+{\mathbb {H}}^U_{\mathfrak {S}}{(j)}-1|)+({\mathbb {H}}^U_{\mathfrak {S}}{(j)}-{\mathbb {H}}^L_{\mathfrak {S}}{(j)})}{2}. \end{aligned}$$
(27)

If we take \(\theta\): \([0,1]\longrightarrow [0, 1]\) as \(\theta (x)=\sin (\frac{\pi }{2}x)\) together with the above form of \(\Theta :[0,1]\times [0,1]\longrightarrow [0, 1]\), that is, \(\Theta (x,y)=\frac{x+y}{2}\), then the derived entropy formula of the ITHFLTS \({\mathbb {H}}_{\mathfrak {S}}\) is in the form of

$$\begin{aligned} E_3({\mathbb {H}}_{\mathfrak {S}})=\frac{1}{\lceil \frac{l_H}{2}\rceil }\sum _{j=1}^{\lceil \frac{l_H}{2}\rceil } \frac{1-\sin (\frac{\pi }{2}|{\mathbb {H}}^L_{\mathfrak {S}}{(j)}+{\mathbb {H}}^U_{\mathfrak {S}}{(j)}-1|)+ \sin (\frac{\pi }{2}({\mathbb {H}}^U_{\mathfrak {S}}{(j)}-{\mathbb {H}}^L_{\mathfrak {S}}{(j)}))}{2}. \end{aligned}$$
(28)
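For completeness, the two formulas (27) and (28) can be sketched in the same way, directly in terms of the interval endpoints (again, an illustration of ours, not code from the cited works).

```python
from math import sin, pi

def entropy_E2(ithflts):
    """Entropy E_2 of Eq. (27): Theta(x, y) = (x + y)/2 with theta(x) = x."""
    return sum(((1.0 - abs(lo + up - 1.0)) + (up - lo)) / 2.0
               for lo, up in ithflts) / len(ithflts)

def entropy_E3(ithflts):
    """Entropy E_3 of Eq. (28): Theta(x, y) = (x + y)/2 with theta(x) = sin(pi*x/2)."""
    return sum((1.0 - sin(pi / 2 * abs(lo + up - 1.0))
                + sin(pi / 2 * (up - lo))) / 2.0
               for lo, up in ithflts) / len(ithflts)
```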

Remark 3.6

So far, in obtaining an entropy measure for ITHFLTSs, we have only considered \(\phi :\Xi \longrightarrow [0, 1]\) with \(\Xi =\{(x, y)\in [0, 1]\times [0, 1]\,|\,x+y\le 1\}\). In real situations, however, the endpoints of an interval component need not satisfy the condition \({\mathbb {H}}^L_{\mathfrak {S}}{(j)}+{\mathbb {H}}^U_{\mathfrak {S}}{(j)}\le 1\); indeed, a pair in \([0, 1]\times [0, 1]\) may well satisfy \(1\le {\mathbb {H}}^L_{\mathfrak {S}}{(j)}+{\mathbb {H}}^U_{\mathfrak {S}}{(j)}\le 2\). Therefore, in order to define the entropy in this more general situation, we should set

$$\begin{aligned} E({\mathbb {H}}_{\mathfrak {S}})=\frac{1}{\lceil \frac{l_H}{2}\rceil }\sum _{j=1}^{\lceil \frac{l_H}{2}\rceil }\left\{ \begin{array}{ll} \phi (\triangle _j,\nabla _j), &{} (0\le ){\mathbb {H}}^L_{\mathfrak {S}}{(j)}+{\mathbb {H}}^U_{\mathfrak {S}}{(j)}\le 1; \\ \phi ({\widetilde{\triangle }}_j,{\widetilde{\nabla }}_j), &{} (2\ge ){\mathbb {H}}^L_{\mathfrak {S}}{(j)}+{\mathbb {H}}^U_{\mathfrak {S}}{(j)}>1. \end{array} \right. \end{aligned}$$
(29)

So far, the first case, in which \({\mathbb {H}}^L_{\mathfrak {S}}{(j)}+{\mathbb {H}}^U_{\mathfrak {S}}{(j)}\le 1\), has been comprehensively taken into account. It remains to discuss the second case, in which \({\mathbb {H}}^L_{\mathfrak {S}}{(j)}+{\mathbb {H}}^U_{\mathfrak {S}}{(j)}\in (1,2]\). If we apply the transformation \([{\mathbb {H}}^L_{\mathfrak {S}}{(j)},{\mathbb {H}}^U_{\mathfrak {S}}{(j)}]\rightarrow [1-{\mathbb {H}}^U_{\mathfrak {S}}{(j)}, 1-{\mathbb {H}}^L_{\mathfrak {S}}{(j)}]\) to such an interval, then the resulting interval \([1-{\mathbb {H}}^U_{\mathfrak {S}}{(j)}, 1-{\mathbb {H}}^L_{\mathfrak {S}}{(j)}]\) obviously falls back into the first case. In this regard,

$$\begin{aligned}&{\widetilde{\triangle }}_j=|(1-{\mathbb {H}}^U_{\mathfrak {S}}{(j)})+(1-{\mathbb {H}}^L_{\mathfrak {S}}{(j)})-1| =|{\mathbb {H}}^L_{\mathfrak {S}}{(j)}+{\mathbb {H}}^U_{\mathfrak {S}}{(j)}-1|={\triangle }_j, \end{aligned}$$
(30)
$$\begin{aligned}&{\widetilde{\nabla }}_j=(1-{\mathbb {H}}^L_{\mathfrak {S}}{(j)})-(1-{\mathbb {H}}^U_{\mathfrak {S}}{(j)}) ={\mathbb {H}}^U_{\mathfrak {S}}{(j)}-{\mathbb {H}}^L_{\mathfrak {S}}{(j)}={\nabla }_j. \end{aligned}$$
(31)

These findings prove that the entropy \(E({\mathbb {H}}_{\mathfrak {S}})\) can still be defined in terms of \(\phi (\triangle _j,\nabla _j)\), with only minor modifications. Therefore, in practical applications, we should consider

$$\begin{aligned} E({\mathbb {H}}_{\mathfrak {S}})=\frac{1}{\lceil \frac{l_H}{2}\rceil }\sum _{j=1}^{\lceil \frac{l_H}{2}\rceil }\phi (\triangle _j,\nabla _j) \left( \chi {({\mathbb {H}}^L_{\mathfrak {S}}{(j)}+{\mathbb {H}}^U_{\mathfrak {S}}{(j)}\le 1)} +\chi {({\mathbb {H}}^L_{\mathfrak {S}}{(j)}+{\mathbb {H}}^U_{\mathfrak {S}}{(j)}> 1)}\right) \end{aligned}$$
(32)

where \(\chi\) denotes the characteristic function as follows

$$\begin{aligned} \chi _S (x)=\left\{ \begin{array}{ll} 1, &{}\quad x\in S; \\ 0, &{}\quad x\not \in S. \end{array} \right. \end{aligned}$$

Now, we are in a position to present a comparative analysis to show the validity of the proposed entropy measures of HFLTSs.

In order to develop our analysis, we use here the framework defined by Liang et al. [15].

Example 3.7

[15] Suppose that eight HFLTSs are defined on the LTS

$$\begin{aligned}&{\mathfrak {S}} = \{s_{-4}:\, \mathrm{nothing},\, s_{-3}:\,\mathrm{very\,low},\, s_{-2}: \,\mathrm{low},\, s_{-1}:\,\mathrm{slightly\,low},\\& \, s_{0}:\,\mathrm{medium},\, s_{1}:\,\mathrm{slightly\, high},\, s_{2}:\, \mathrm{high},\, s_{3}:\,\mathrm{very\, high},\, s_{4}:\, \mathrm{perfect}\} \end{aligned}$$

such that \(H^1_{{\mathfrak {S}}}=\{s_{-4}\}\), \(H^2_{{\mathfrak {S}}}=\{s_{-3}\}\), \(H^3_{{\mathfrak {S}}}=\{s_{-4},s_{-3},s_{-2}\}\), \(H^4_{{\mathfrak {S}}}=\{s_{-2}\}\), \(H^5_{{\mathfrak {S}}}=\{s_{-1}\}\), \(H^6_{{\mathfrak {S}}}=\{s_{-2},s_{-1},s_{0}\}\), \(H^7_{{\mathfrak {S}}}=\{s_{-3},s_{-2},s_{-1},s_{0},s_{1}\}\) and \(H^8_{{\mathfrak {S}}}=\{s_{0}\}\).

The entropies of the above HFLTSs are calculated by Farhadinia’s entropy measures \(E_{d_{g}}\) and \(E_{S_{g}}\) given by (4) and (6) with \(\lambda =1\); Tao et al.’s entropy measures \(E^{T1}_{S}\), \(E^{T2}_{S}\) and \(E^{T3}_{S}\) given by (7)–(9); Liang et al.’s entropy measures \(E_{L1}\), \(E_{L2}\) and \(E_{L3}\) given by (14)–(16); and the proposed entropy measures \(E_{1}\), \(E_{2}\) and \(E_{3}\) given by (26)–(28). The results of the above example are summarized in Table 1.

Table 1 Results of entropy measures applied to HFLTSs of Example 3.7

Referring to Table 1, the values in bold-face type indicate counter-intuitive cases produced by the corresponding entropy measures. In more detail, although the HFLTSs \(H^2_{{\mathfrak {S}}}\) and \(H^3_{{\mathfrak {S}}}\), as well as the HFLTSs \(H^5_{{\mathfrak {S}}}\), \(H^6_{{\mathfrak {S}}}\), and \(H^7_{{\mathfrak {S}}}\), are not the same, the entropy measures \(E_{d_{g}}\) and \(E_{S_{g}}\) return the same value for them, as shown in the first two rows of Table 1. Furthermore, the values of the entropies \(E^{T1}_{S}\), \(E^{T2}_{S}\) and \(E^{T3}_{S}\) do not appear in increasing order, which is clearly counter-intuitive. As can be seen from Table 1, only the entropy measures of Liang et al. [15] and the proposed ones appear in strictly increasing order, which is consistent with the behavior of the considered HFLTSs. However, Liang et al.'s [15] entropies are constructed from the total average \(\mu\) and the total deviation function \(\nu\), while the proposed entropies are computed by means of the component-wise average measured from the value \(\frac{1}{2}\) and the component-wise deviation function. Such a component-wise comparison allows the proposed entropies to produce more reasonable results than those of Liang et al. [15].

4 MCDM with Information Assessed in Hesitant Fuzzy Linguistic Term Sets

In most MCDM problems, the criteria have different importance degrees, while the information about the criteria weights is sometimes incomplete [7]. Needless to say, such a situation increases the uncertainty and complexity of practical decision making problems.

Here, following Ye [33], we do not assign the criteria weights beforehand; instead, the weights of the criteria are determined by means of the information entropy of the evaluation values of the alternatives with respect to the criteria.

We now assume that the decision maker is asked to select one of n alternatives \(x_i\,(i=1,2, \ldots ,n)\), which are evaluated with respect to m criteria \(c_j\,(j=1,2, \ldots ,m)\). In a linguistic approach, the decision maker assesses each alternative \(x_i\) with respect to each criterion \(c_j\) using a set of linguistic expressions. The linguistic information is transformed into HFLTSs with the help of a context-free grammar. Therefore, the characteristics of the alternatives \(x_i\,(i=1,2, \ldots ,n)\) with respect to the criteria \(c_j\,(j=1,2, \ldots ,m)\) are presented in the form of an \(n\times m\) decision matrix whose HFLE element \(h^j_{{\mathfrak {S}}}(x_i)\) denotes the degree to which the alternative \(x_i\) satisfies the criterion \(c_j\). Since the entropy method is one of the best objective weight-assessing techniques, we employ it here in the given multi-criteria decision making method. For further information in this regard, the interested reader is referred to Farhadinia's [9] work.

Suppose that the decision matrix \(D({H}_{{\mathfrak {S}}})\) is in the form of HFLTS as follows:

$$\begin{aligned} D({H}_{{\mathfrak {S}}}) =[h^j_{{\mathfrak {S}}}(x_i)]_{n\times m}:=[h^{ij}_{{\mathfrak {S}}}]_{n\times m}=\left( \begin{array}{cccc} h^{11}_{{\mathfrak {S}}} &{}\quad h^{12}_{{\mathfrak {S}}} &{}\quad \cdots &{}\quad h^{1m}_{{\mathfrak {S}}} \\ h^{21}_{{\mathfrak {S}}} &{}\quad h^{22}_{{\mathfrak {S}}} &{}\quad \cdots &{}\quad h^{2m}_{{\mathfrak {S}}} \\ \vdots &{}\quad \vdots &{}\quad \vdots &{}\quad \vdots \\ h^{n1}_{{\mathfrak {S}}} &{}\quad h^{n2}_{{\mathfrak {S}}} &{}\quad \cdots &{}\quad h^{nm}_{{\mathfrak {S}}} \\ \end{array} \right) . \end{aligned}$$
(33)

Here, we denote the converted HFLTS decision matrix \(D({H}_{{\mathfrak {S}}})\) to that for ITHFLTSs as follows:

$$\begin{aligned} D({\mathbb {H}}_{{\mathfrak {S}}}) =[\hbar ^j_{{\mathfrak {S}}}(x_i)]_{n\times m}:=[\hbar ^{ij}_{{\mathfrak {S}}}]_{n\times m}=\left( \begin{array}{cccc} \hbar ^{11}_{{\mathfrak {S}}} &{}\quad \hbar ^{12}_{{\mathfrak {S}}} &{}\quad \cdots &{}\quad \hbar ^{1m}_{{\mathfrak {S}}} \\ \hbar ^{21}_{{\mathfrak {S}}} &{}\quad \hbar ^{22}_{{\mathfrak {S}}} &{}\quad \cdots &{}\quad \hbar ^{2m}_{{\mathfrak {S}}} \\ \vdots &{}\quad \vdots &{}\quad \vdots &{}\quad \vdots \\ \hbar ^{n1}_{{\mathfrak {S}}} &{}\quad \hbar ^{n2}_{{\mathfrak {S}}} &{}\quad \cdots &{}\quad \hbar ^{nm}_{{\mathfrak {S}}} \\ \end{array} \right) , \end{aligned}$$
(34)

where \(\hbar ^{ij}_{{\mathfrak {S}}}=\{[{\mathbb {H}}^L_{\mathfrak {S}}{(k)}(x_i), {\mathbb {H}}^U_{\mathfrak {S}}{(k)}(x_i)] \}_{k=1}^{\lceil \frac{l_H}{2}\rceil }\) is an ITHFLTS.

In the following way and by taking the ITHFLTS decision matrix \(D({\mathbb {H}}_{{\mathfrak {S}}})\) into account, we specify the weights of criteria by the use of entropy-based weights

$$\begin{aligned} w_j=\frac{1-E_j}{m-\sum _{j=1}^m E_j}, \quad j=1, \ldots ,m, \end{aligned}$$
(35)

where \(w_j\in [0,1], \sum _{j=1}^{m}w_j=1\). Moreover, \(E_j\) is any ITHFLTS entropy measure given by

$$\begin{aligned} E_j=\frac{1}{n}\sum _{i=1}^{n}E(\hbar ^{ij}_{{\mathfrak {S}}}), \quad j=1, \ldots ,m. \end{aligned}$$
(36)
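The weight formulas (35)–(36) amount to averaging the component entropies column-wise and normalizing the complements; a minimal sketch of ours follows.

```python
def entropy_weights(E_matrix):
    """Entropy-based criteria weights of Eqs. (35)-(36).
    E_matrix[i][j] is the entropy of the ITHFLE of alternative i under criterion j."""
    n, m = len(E_matrix), len(E_matrix[0])
    E_col = [sum(E_matrix[i][j] for i in range(n)) / n for j in range(m)]  # Eq. (36)
    denom = m - sum(E_col)
    return [(1.0 - e) / denom for e in E_col]                              # Eq. (35)

# The weights sum to 1, and a criterion whose assessments are less entropic
# (i.e., more informative) receives a larger weight.
```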

Before proceeding further, we consider the following two notions, which are given in [18] for HFLTSs. Let \(D({\mathbb {H}}_{{\mathfrak {S}}}) =[\hbar ^{ij}_{{\mathfrak {S}}}]_{n\times m}=[\{[{\mathbb {H}}^L_{\mathfrak {S}}{(k)}(x_i), {\mathbb {H}}^U_{\mathfrak {S}}{(k)}(x_i)] \}_{k=1}^{\lceil \frac{l_H}{2}\rceil }]_{n\times m}\) be the decision matrix; then we define

  • Interval-transformed hesitant fuzzy linguistic positive ideal solution (ITHFLPIS)

    $$\begin{aligned} x^+=\{\hbar ^{+1}_{{\mathfrak {S}}},\hbar ^{+2}_{{\mathfrak {S}}}, \ldots ,\hbar ^{+m}_{{\mathfrak {S}}}\}, \end{aligned}$$
    (37)

    where

    $$\begin{aligned} \hbar ^{+j}_{{\mathfrak {S}}}=\left\{ \begin{array}{ll} \max _{Lex}{_{i=1, \ldots ,n}} \{[{\mathbb {H}}^L_{\mathfrak {S}}{(1)}(x_i), {\mathbb {H}}^U_{\mathfrak {S}}{(1)}(x_i)]\}, &{} \hbox {for benefit criterion } c_j, \\ \min _{Lex}{_{i=1, \ldots ,n}} \{[{\mathbb {H}}^L_{\mathfrak {S}}{({\lceil \frac{l_H}{2}\rceil })}(x_i), {\mathbb {H}}^U_{\mathfrak {S}}{({\lceil \frac{l_H}{2}\rceil })}(x_i)]\}, &{} \hbox {for cost criterion } c_j, \end{array}\quad for \,j=1, \ldots ,m; \right. \end{aligned}$$
    (38)
  • Interval-transformed hesitant fuzzy linguistic negative ideal solution (ITHFLNIS)

    $$\begin{aligned} x^-=\{\hbar ^{-1}_{{\mathfrak {S}}},\hbar ^{-2}_{{\mathfrak {S}}}, \ldots ,\hbar ^{-m}_{{\mathfrak {S}}}\}, \end{aligned}$$
    (39)

    where

    $$\begin{aligned} \hbar ^{-j}_{{\mathfrak {S}}}=\left\{ \begin{array}{ll} \min _{Lex}{_{i=1, \ldots ,n}} \{[{\mathbb {H}}^L_{\mathfrak {S}}{(1)}(x_i), {\mathbb {H}}^U_{\mathfrak {S}}{(1)}(x_i)]\}, &{} \hbox {for benefit criterion } c_j, \\ \max _{Lex}{_{i=1, \ldots ,n}} \{[{\mathbb {H}}^L_{\mathfrak {S}}{({\lceil \frac{l_H}{2}\rceil })}(x_i), {\mathbb {H}}^U_{\mathfrak {S}}{({\lceil \frac{l_H}{2}\rceil })}(x_i)]\}, &{} \hbox {for cost criterion } c_j, \end{array}\quad for \,j=1, \ldots ,m; \right. \end{aligned}$$
    (40)

    Here, the operators \(\max _{Lex}\) and \(\min _{Lex}\) stand for the lexicographical order of the second coordinate given by Bustince et al. [2] in the form of

    $$\begin{aligned}{}[a, b]\le _{Lex} [c, d]\quad \text {if and only if}\quad b < d \,\text {or}\, (b = d\, \text {and}\, a \le c). \end{aligned}$$
    (41)

Now, we are able to define the relative closeness coefficient of an alternative \(x_i\) with respect to the ITHFLPIS \(x^+\) as follows

$$\begin{aligned} CC(x_i)=\frac{D(x_i,x^+)}{D(x_i,x^+)+D(x_i,x^-)}:=\frac{\sum _{j=1}^m w_jd(\hbar ^{ij}_{{\mathfrak {S}}},\hbar ^{+j})}{\sum _{j=1}^m w_jd(\hbar ^{ij}_{{\mathfrak {S}}},\hbar ^{+j}) +\sum _{j=1}^m w_jd(\hbar ^{ij}_{{\mathfrak {S}}},\hbar ^{-j})}, \end{aligned}$$
(42)

where d is an arbitrary interval-valued distance measure, and \(w_j\,(j=1,2, \ldots ,m)\) are the entropy-based weights of the criteria determined by (35). Consequently, the smaller the relative closeness coefficient \(CC(x_i)\), that is, the closer the alternative \(x_i\) is to the ITHFLPIS \(x^+\) relative to the ITHFLNIS \(x^-\), the better the alternative \(x_i\).
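Since the distance d in (42) is left arbitrary, the following sketch of ours combines the lexicographical comparator of (41) with a simple component-wise interval distance of our own choosing, merely to illustrate how \(CC(x_i)\) is assembled from the entropy-based weights.

```python
def lex_leq(I, J):
    """Lexicographical order of Eq. (41): [a, b] <=_Lex [c, d] iff b < d,
    or b == d and a <= c.  It can be used as the comparator for the
    max/min operations in Eqs. (38) and (40)."""
    (a, b), (c, d) = I, J
    return b < d or (b == d and a <= c)

def interval_distance(I, J):
    """An assumed elementary distance between two intervals (our choice;
    the paper leaves d arbitrary)."""
    return (abs(I[0] - J[0]) + abs(I[1] - J[1])) / 2.0

def ithflts_distance(A, B):
    """Component-wise average of interval distances (equal lengths assumed)."""
    return sum(interval_distance(I, J) for I, J in zip(A, B)) / len(A)

def closeness(row, pis, nis, w):
    """Relative closeness coefficient of Eq. (42) for one alternative.
    row, pis, nis: lists over criteria of ITHFLEs; w: entropy-based weights.
    Smaller values indicate alternatives closer to the positive ideal."""
    d_plus = sum(wj * ithflts_distance(h, hp) for wj, h, hp in zip(w, row, pis))
    d_minus = sum(wj * ithflts_distance(h, hn) for wj, h, hn in zip(w, row, nis))
    return d_plus / (d_plus + d_minus)
```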

Summarizing the above arguments, we arrive at a practical method for solving MCDM problems in which the information about the criteria weights is completely unknown and the criteria values take the form of ITHFLTS information. The method is described by the following steps:

Algorithm 4.1

  • Step 1 Construct the decision matrix \(D({\mathbb {H}}_{{\mathfrak {S}}}) =[\hbar ^j_{{\mathfrak {S}}}(x_i)]_{n\times m}\) whose ITHFLTS entries are obtained from the assessments provided by the decision maker for each alternative \(x_i\,(i=1,2, \ldots ,n)\) under each criterion \(c_j\,(j=1,2, \ldots ,m)\).

  • Step 2 Compute the entropy-based weights of the criteria from the decision matrix \(D({\mathbb {H}}_{{\mathfrak {S}}})\) with the help of Eq. (35).

  • Step 3 Use (37) and (39) to specify the corresponding ITHFLPIS \(x^+\) and ITHFLNIS \(x^-\), respectively.

  • Step 4 Use (42) to determine the relative closeness coefficient \(CC(x_i)\) of each alternative \(x_i\) with respect to the ITHFLPIS \(x^+\).

  • Step 5 Rank all the alternatives according to their relative closeness coefficients \(CC(x_i)\) and specify the best choice(s).

It is worth mentioning that the proposed method can be applied to many situations in which entropy-based criteria weights are taken into account, for instance, evaluating and ranking the overall service quality of online bookstores [26]; selecting, within water resources management, the best location for the construction of a dam in the basin of the Nestos river [5]; and evaluating the after-sales service providers (ASPs) of an automobile company [30].

As an application example, we consider the MCDM problem discussed originally by Liao et al. [18] and later by Farhadinia [9]. Briefly, in this problem, a company intends to rate movies with respect to some criteria and to select the most appropriate one. The company is searching for the best movie among five movies \(x_1,x_2,x_3,x_4\) and \(x_5\) with respect to four criteria: story (\(c_1\)), acting (\(c_2\)), visuals (\(c_3\)), and direction (\(c_4\)). Unlike Liao et al. [18], Farhadinia [9] assumed that the weighting vector of the criteria is not completely known, which is also the case considered here. Since such criteria are all qualitative, the decision makers express their assessments by means of the linguistic terms \({\mathfrak {S}}=\{s_{-3}=\mathrm{terrible},\, s_{-2}= \mathrm{very\, bad},\,s_{-1}= \mathrm{bad},\,s_{0}=\mathrm{medium},\,s_{1}=\mathrm{well},\,s_{2}=\mathrm{very\, well},\,s_{3}=\mathrm{perfect}\}\). During the evaluation process, the group of decision makers may consider that the acting of the movie \(x_2\) is between medium and perfect. This kind of expression is closer to human cognition than a single linguistic term, and it is appropriately represented by the HFLTS \(\{s_0, s_1, s_2, s_3\}\). Sometimes, the decision makers do not share the same opinion on a movie. For example, one decision maker may assign to the direction of movie \(x_2\) the term \(\{s_3\}\), that is, perfect, while another may consider a term between medium and very well, that is, \(\{s_0, s_1, s_2\}\). In this situation, if they cannot persuade each other, the assessment should be taken as the HFLTS \(\{s_0, s_1, s_2,s_3\}\).

The final assessments of the movies are summarized in the form of a hesitant fuzzy linguistic judgment matrix, shown in Table 2.

Table 2 Hesitant fuzzy linguistic judgment matrix

As mentioned in Note 3.1, we first convert the linguistic terms of \({\mathfrak {S}}\) into terms with non-negative indices. Accordingly, the HFLTSs in Table 2 are converted into the interval form shown in Table 3.

Table 3 Interval-valued hesitant fuzzy linguistic judgment matrix

We are now in a position to perform Step 2 of Algorithm 4.1. We take the decision matrix \(D({\mathbb {H}}_{{\mathfrak {S}}}) =[\hbar ^{ij}_{{\mathfrak {S}}}]_{{5\times 4}}\) shown in Table 3 into account. Next, we employ the ITHFLTS entropy measure

$$\begin{aligned} E_1({\mathbb {H}}_{\mathfrak {S}})=\frac{1}{\lceil \frac{l_H}{2}\rceil }\sum _{k=1}^{\lceil \frac{l_H}{2}\rceil } \frac{(1-|{\mathbb {H}}^L_{\mathfrak {S}}{(k)}+{\mathbb {H}}^U_{\mathfrak {S}}{(k)}-1|)({\mathbb {H}}^U_{\mathfrak {S}}{(k)}-{\mathbb {H}}^L_{\mathfrak {S}}{(k)}+1)}{2} \end{aligned}$$

given by (26) to obtain the entropy-based weights of the criteria according to Eq. (35). In this regard, one gets

$$\begin{aligned} w_j=\frac{1-E_{j,1}}{4-\sum _{j=1}^4 E_{j,1}}= \frac{1-\frac{ {1}}{{5}}\sum _{i=1}^{{5}}E_{1}(\hbar ^{ij}_{{\mathfrak {S}}})}{4-\sum _{j=1}^4 \left( \frac{ {1}}{{5}}\sum _{i=1}^{{5}}E_{1} \left( \hbar ^{ij}_{{\mathfrak {S}}} \right) \right) }, \quad j=1, \ldots ,4, \end{aligned}$$

where, for example,

$$\begin{aligned} E_{1}(\hbar ^{11}_{{\mathfrak {S}}})= & {} \frac{1}{2}\sum _{k=1}^{2} \frac{\left( 1-|{\mathbb {H}}^L_{\mathfrak {S}}{(k)}+{\mathbb {H}}^U_{\mathfrak {S}}{(k)}-1| \right) \left( {\mathbb {H}}^U_{\mathfrak {S}}{(k)}-{\mathbb {H}}^L_{\mathfrak {S}}{(k)}+1\right) }{2}\\= & {} \frac{1}{2} \left[ \frac{\left( 1-|\frac{1}{6}+\frac{3}{6}-1| \right) \left( \frac{3}{6}-\frac{1}{6}+1 \right) }{2}+\frac{\left( 1-|\frac{2}{6}+\frac{2}{6}-1| \right) \left( \frac{2}{6}-\frac{2}{6}+1 \right) }{2}\right] \\= \, & {} 0.3889. \end{aligned}$$

In the same way, we get \({E_{1}(\hbar ^{21}_{{\mathfrak {S}}})= 0.3889, \,E_{1}(\hbar ^{31}_{{\mathfrak {S}}})= 0.0973, \,E_{1}(\hbar ^{41}_{{\mathfrak {S}}})= 0.3889, \,E_{1}(\hbar ^{51}_{{\mathfrak {S}}})= 0.1945,}\) which results in \({E_{1,1}= {\frac{1}{5}\sum _{i=1}^{5}E_{1}(\hbar ^{i1}_{{\mathfrak {S}}})={0.2917}}}\). A similar calculation for \(E_{2,1},\,E_{3,1}\) and \(E_{4,1}\) leads to the following entropy-based weights of the criteria \(c_j\,(j=1,2,3,4)\):

$$\begin{aligned} {w_1=0.2513,\,\,w_2=0.2363, \,\,w_3=0.2513 ,\,\,w_4=0.2610.} \end{aligned}$$
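The hand computation above is easy to confirm with the \(E_1\) sketch from Sect. 3 (our illustrative helper, not code from the cited works):

```python
# h^{11}_S = {[1/6, 3/6], [2/6, 2/6]}, taken from Table 3:
h_11 = [(1 / 6, 3 / 6), (2 / 6, 2 / 6)]
print(round(entropy_E1(h_11), 4))   # 0.3889, matching the value obtained above
```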

From the above procedure, we are able to compute the criteria weights using the entropy measures, referred here to as \(E_{2}\) and \(E_{3}\), given by (27) and (28), respectively. The detailed results are shown in Table 4.

Table 4 Criteria weights and their ranking orders generated by entropy measures

In Table 4, the ranking orders of the criteria weights are given from the least important to the most important. For instance, we observe from the first row of Table 4 that

$$\begin{aligned}&{w_1\quad w_2 \quad w_3 \quad w_4}\\&{{ \,2\quad \,\,1 \quad \,\,3 \quad \,\,4}} \end{aligned}$$

which means that \(w_4\) is the most important weight, \(w_3\) takes the second place, \(w_1\) the third place, and \(w_2\) is the least important weight.

Let us recall Farhadinia’s [9] entropy measure \(E_{d_{g}}\) given here by Eq. (4) together with the entropy measure based on generalized Hausdorff distance

$$\begin{aligned}&E_{d_{gh}}(H_{{\mathfrak {S}}})=1- \frac{2}{N}\sum ^{N}_{i=1}\left[ \left( \max _{l=1,2, \ldots ,L}\left( \frac{|\delta _l|}{2\tau }\right) ^{\lambda }\right) ^{\frac{1}{\lambda }}\right] ,\quad \lambda >0; \end{aligned}$$
(43)

and the entropy measure based on generalized hybrid Hamming distance

$$\begin{aligned}&E_{d_{ghh}}(H_{{\mathfrak {S}}})=1- \frac{2}{N}\sum ^{N}_{i=1}\left[ \left( \frac{\frac{1}{L}\sum _{l=1}^{L}\left( \frac{|\delta _l|}{2\tau }\right) ^{\lambda } +\max _{l=1,2, \ldots ,L}\left( \frac{|\delta _l|}{2\tau }\right) ^{\lambda }}{2}\right) ^{\frac{1}{\lambda }}\right] ,\quad \lambda >0. \end{aligned}$$
(44)

Moreover, on the basis of the latter entropies, Farhadinia [9] defined the arithmetic mean and the geometric mean entropies which are given, respectively, by

$$\begin{aligned}&E_{\psi _{_{AM}}}(H_{{\mathfrak {S}}})=\frac{1}{3}(E_{d_{g}}(H_{{\mathfrak {S}}})+E_{d_{gh}}(H_{{\mathfrak {S}}})+E_{d_{ghh}}(H_{{\mathfrak {S}}})), \end{aligned}$$
(45)
$$\begin{aligned}&E_{\psi _{_{GM}}}(H_{{\mathfrak {S}}})=({E_{d_{g}}(H_{{\mathfrak {S}}})\times E_{d_{gh}}(H_{{\mathfrak {S}}})\times E_{d_{ghh}}}(H_{{\mathfrak {S}}}))^{\frac{1}{3}}. \end{aligned}$$
(46)

Taking the above-mentioned entropy measures into account, Farhadinia [9] determined the weight of each criterion, as reported in Tables 2 through 4 of [9]. Here, we only consider the ranking indices of Tables 2 through 4 of [9] and compare them with the ranking indices given in Table 4. The summarized results are presented in Table 5.

Table 5 Ranking orders generated by entropy measures

From the results of Farhadinia's entropies presented in Table 5, one can observe that, on the one hand, the priority of the criteria weights is sensitive to the choice of \(\lambda\) and, on the other hand, even for a fixed choice of \(\lambda\), a strict priority is not guaranteed. By contrast, the results in the last three rows are unique, and this finding supports the conclusion that the proposed entropy measures are more reliable and valid.

5 Conclusions and Future Works

In the present contribution, we first review the existing entropy measures for HFLTSs and address their limitations. Then, we present some challenges concerning HFLTS entropies and use the ITHFLTS as a bridge between HFLTSs and IVFSs. By constructing ITHFLTS entropy measures based on a new axiomatic framework, we show that the proposed entropies of HFLTSs distinguish different HFLTSs more reliably than most of the existing entropy measures. Further, we define three entropy measures that satisfy the new axiomatic framework. Eventually, a multiple criteria decision making problem with HFLTS information is employed to illustrate the validity and, of course, the applicability of the proposed entropies.