1 Introduction

Information fusion is now an important research topic [1]. By combining data from multiple different sensors and integrating fuzzy set theory [2], evidence theory [3, 4], D-number theory, evidential reasoning [5, 6], Z-number theory [7], and so on, information fusion can reach more credible conclusions in a more reasonable way. Because of its effectiveness in real applications, it has been widely used in fields such as decision-making [8], supplier selection [9], and preventive maintenance planning [10].

When belief mass can be assigned to multi-element subsets, the resulting mass function is a basic probability assignment (BPA). When the belief mass is assigned only to singletons, it is consistent with the Bayesian method [11], and the model degenerates into a Bayesian probability model. The negation of a proposition can measure the ambiguity degree of information more intuitively [12], as shown in Fig. 1 (where the cardinality of the hypothesis, |Ω|, is N). Yager proposed a negation operation based on the Bayesian probability model [13]. For an exhaustive frame of discernment (closed-world assumption), Gao and Deng proposed a scheme that allocates belief mass over the power set [14]. Lefevre et al. proposed a method of assigning the mass of conflicts to the power set according to weights in the case of conflicting information fusion [15]. Similarly, Luo and Deng combined the negation with weights in the frame of discernment (FOD) [16]. Moreover, they proposed a more intuitive matrix negation method and verified that matrix negation, measured by the total uncertainty measure, satisfies the property that entropy increases after each negation operation [17]. After multiple negation iterations, the BPA tends to an average distribution according to cardinality.

Fig. 1
figure 1

Ambiguity degree. For a completely trustful message (Mass = 0), its negation is completely distrustful (Mass = 1), where the cardinality of the hypothesis, |Ω|, is N. Because of the huge gap between completely trustful and completely distrustful, the hypothesis Ω is called definite information. On the contrary, when the confidence mass tends to be relatively average (uniformly trustful), the negation shows that the ambiguity of the information grows increasingly large (Definite → Ambiguous → Most ambiguous)

The mass represents the belief degree of a certain proposition, i.e., how much confidence mass is assigned to it [18]. However, the mass cannot represent the belief of every element within a hypothesis. Therefore, the negation of a probability distribution is not suitable for the BPA model [19]. Unfortunately, most previous negations were designed either for the maximum of Shannon entropy over probability distributions or for the maximum of Nguyen entropy over BPAs. As a tool of uncertainty measurement, each negation is usually suited only to a specific entropy. However, there are currently many uncertainty measures applicable to BPAs obtained in the open world, yet there is no suitable negation method serving these measures as a tool to quantify the ambiguity degree. In particular, for some entropies based on the belief interval (belief function and plausibility function) [20, 21], it is difficult to find a negation directly in the BPA framework. Therefore, establishing the belief interval framework and finding the negation indirectly under the BPA framework is an important research topic.

In Section 2.1, the D-S evidence theory and the Bayesian framework will be reviewed. In Section 2.2, the negation of Bayesian probability model will be reviewed. In Section 2.3, a series of BPA negation approaches will be reviewed. In Section 2.4, distinct uncertainty measures will be compared. In Section 3.1, two novel uncertainty measures are proposed. In Section 3.2, a new belief interval negation combination method is defined. In Section 3.3, the calculation method of the newly proposed belief interval negation method is defined. In Section 3.4, the property of entropy increment is verified. In Section 4.1, the newly proposed belief interval negation method is compared with some previous negation methods. In Section 4.2, some numerical examples are exhibited. In Section 4.3, the application in medical pattern recognition is explained. In Section 4.4, convergent mass distribution is discussed. In Section 5, the paper draws a conclusion.

2 Preliminaries

2.1 Dempster Shafer theory

Uncertainty information processing is inevitable in real applications [22]. So far, many methods have been proposed to deal with uncertain information, such as probabilistic linguistic term sets [23, 24], fuzzy sets [25], intuitionistic fuzzy sets [26, 27], belief rule bases [28, 29], and so on [30]. As one of the most useful methods for handling uncertainty, evidence theory was first proposed by Dempster and then developed by Shafer [31, 32]. The subjective Bayesian method first has to specify a prior probability; in contrast, evidence theory has the ability to directly express uncertainty when dealing with conflicts [33]. When the probabilities are known, evidence theory degenerates into probability theory.

Suppose that there is a hypothesis Ω whose elements are mutually exclusive; we call Ω a frame of discernment (FOD). The cardinality of Ω is N, and the cardinality of the power set 2Ω is 2N, satisfying

$$ {{\varOmega}} = \left\{A, B, C {\cdots} \right\} $$
(1)
$$ 2^{{{\varOmega}}} = \left\{\emptyset, \left\{A \right\}, \left\{B \right\}, \left\{C \right\}, \cdots, \left\{A, B \right\}, \cdots, {{\varTheta}} \right\} $$
(2)

where ∅ is the empty set. The proposition Θ consisting of all elements is defined as the support set of Ω. To simplify these symbols, the power set corresponds to

$$ 2^{{{\varOmega}}} = \left\{A_{1}, A_{2}, \cdots, A_{2^{N}} \right\} $$
(3)

where \(A_{2^{N}}\) has the same meaning as Θ. For the proposition Θ, m(Θ) indicates the confidence mass whose allocation is unknown. If m(Θ) = 1, the allocation of the confidence mass is totally unknown [15]. Each subset, regarded as a proposition, corresponds to a mass, and the magnitude of the mass represents the precise belief in that proposition, satisfying:

$$ m(A) \in [0,1] $$
(4)
$$ m(\emptyset)=0 $$
(5)
$$ \sum\limits_{A \subseteq 2^{{{\varOmega}}}}m(A)=1 $$
(6)

Dempster's combination rule fuses the masses from two sources:

$$ m_{\oplus} = m_{1} \oplus m_{2} $$
(7)
$$ m_{\oplus}(A) = \frac{{\sum}_{B \cap C = A} m_{1}(B) \times m_{2}(C)}{1-m(\emptyset)} $$
(8)
$$ m(\emptyset) = \sum\limits_{B \cap C = \emptyset}m_{1}(B) \times m_{2}(C) $$
(9)
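As a rough sketch, the combination rule (7)-(9) can be written in Python; the dictionary-of-frozensets representation, the example masses, and the function name here are our own illustrative choices:

```python
from itertools import product

def combine_dempster(m1, m2):
    """Dempster's rule of combination, Eqs. (7)-(9).
    BPAs are dicts mapping frozenset focal elements to masses."""
    raw = {}
    conflict = 0.0  # m(emptyset), Eq. (9)
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            raw[inter] = raw.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc
    # Normalize the non-conflicting mass by 1 - m(emptyset), Eq. (8)
    return {a: v / (1.0 - conflict) for a, v in raw.items()}

# Hypothetical two-source example on Omega = {A, B}
m1 = {frozenset("A"): 0.6, frozenset("AB"): 0.4}
m2 = {frozenset("A"): 0.5, frozenset("B"): 0.5}
fused = combine_dempster(m1, m2)  # mass concentrates on {A}
```

In this hypothetical example, the conflicting product m1({A}) × m2({B}) is discarded and the remaining mass is renormalized by 1 − m(∅).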

The belief function (Bel) can be interpreted as confidence that a proposition is correct. The plausibility function (Pl) can be regarded as a belief assignment that a proposition may be correct [34].

$$ Bel(A)=\sum\limits_{B \subseteq A}m(B) $$
(10)
$$ Pl(A)=\sum\limits_{A \cap B \neq \emptyset}m(B) $$
(11)
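A minimal sketch of (10) and (11) under the same dictionary-of-frozensets representation (the helper names and example BPA are ours):

```python
def bel(m, a):
    """Belief function, Eq. (10): total mass of subsets of a."""
    return sum(v for b, v in m.items() if b <= a)

def pl(m, a):
    """Plausibility function, Eq. (11): total mass of sets meeting a."""
    return sum(v for b, v in m.items() if b & a)

# Illustrative BPA on Omega = {A, B, C}
m = {frozenset("A"): 0.2, frozenset("B"): 0.1,
     frozenset("AC"): 0.3, frozenset("ABC"): 0.4}
interval = [bel(m, frozenset("AC")), pl(m, frozenset("AC"))]  # [0.5, 0.9]
```

Here Bel({A, C}) collects m({A}) + m({A, C}) = 0.5, while Pl({A, C}) also counts m({A, B, C}), giving 0.9, so the belief interval of {A, C} is [0.5, 0.9].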

2.2 Negation of probability distribution

Uncertainty management has been widely used in fault diagnosis [35], as well as pattern classification [36], data fusion [37], and so on [38, 39]. The negation is an essential tool of uncertainty measures.

Example 1

In the medical field, hormones act on specific target cells to regulate their metabolism. Assume that the target-cell tracking event is the hypothesis X, where each mutually exclusive element of \(X= \left \{x_{1}, x_{2}, \cdots , x_{n}\right \}\) is a target cell that may be traced.

In the Bayesian probability model, the hypothesis X corresponds to the sample space, and the elements of X correspond to samples. If the target cell is x1, it is recorded as "On trace x1" in the fuzzy set with probability P(x1). In contrast, if the target cell is not x1, it is recorded as "Not on trace x1" with probability \(P(\overline {x_{1}})\).

When the probability of each element xi is known, the FOD degenerates into a Bayesian model, corresponding to the probability distribution \(P = \left \{p_{1}, p_{2}, \cdots , p_{n}\right \}\):

$$ \sum\limits_{i=1}^{n} p_{i} =1 $$
(12)

where pi ∈ [0,1].

$$ \bar{P} = [\bar{p_{1}}, \bar{p_{2}}, \cdots, \bar{p_{n}}] $$
(13)
$$ \bar{p_{i}} = \frac{1-p_{i}}{n-1} $$
(14)

Let pj denote the probability distribution after j − 1 negation iterations, starting from p1. We suppose that

$$ \begin{array}{@{}rcl@{}} p_{i+1} = \bar{p_{i}} = \frac{1-p_{i}}{n-1} \end{array} $$
(15)

Then,

$$ \begin{array}{@{}rcl@{}} p_{i+1} - \frac{1}{n} &=& \frac{1-p_{i}}{n-1} - \frac{1}{n}\\ &=& \frac{1}{n-1} - \frac{n\times p_{i} + n-1}{n \times(n-1)}\\ &=& \frac{p_{i} - \frac{1}{n}}{1-n} \end{array} $$
(16)

Thus, \(p_{i}-\frac {1}{n}\) forms a geometric sequence with common ratio \( \frac {1}{1-n}\):

$$ p_{j} = (p_{1}-\frac1n) \times (\frac{1}{1-n})^{j-1} + \frac{1}{n} $$
(17)

We conclude

$$ \begin{array}{@{}rcl@{}} p_{j} = (-1)^{j-1} \times \frac{np_{1}-1}{n(n-1)^{j-1}} + \frac{1}{n} \end{array} $$
(18)
$$ \lim_{j \to \infty}p_{j} = \frac{1}{n} $$
(19)

Yager suggests that the maximal uncertainty measure corresponds to a unique distribution, namely the uniform one.

Example 2

Continuing Example 1, suppose that hormones act on 3 kinds of specific target cells to regulate their metabolism, \(X=\left \{x_{1}, x_{2}, x_{3} \right \}\). Initially, the prior probability distribution is \(P = \left \{0.1, 0.3, 0.6 \right \}\).

Given the known prior probabilities and cardinality n = 3, the probability negation is obtained as:

$$ P(\bar{x_{1}})=\frac{1-P(x_{1})}{n-1}=0.45 $$
$$ P(\bar{x_{2}})=\frac{1-P(x_{2})}{n-1}=0.35 $$
$$ P(\bar{x_{3}})=\frac{1-P(x_{3})}{n-1}=0.20 $$

After multiple negation iterations, the distribution converges to the point of maximal uncertainty, which corresponds to the uniform distribution.
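The negation (14) and its convergence (19) can be checked numerically; a small sketch using the numbers of Example 2 (the function name is ours):

```python
def negation(p):
    """Yager's negation of a probability distribution, Eq. (14)."""
    n = len(p)
    return [(1 - pi) / (n - 1) for pi in p]

p = [0.1, 0.3, 0.6]   # the prior distribution of Example 2
q = negation(p)       # approximately [0.45, 0.35, 0.20]

# Iterating the negation drives the distribution toward uniform, Eq. (19)
for _ in range(60):
    p = negation(p)
assert all(abs(pi - 1/3) < 1e-12 for pi in p)
```

Each iteration shrinks the distance to 1/n by the factor 1/(n − 1), so the convergence in (19) is geometric.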

2.3 Negation of basic probability assignment method

A large number of approaches allocate belief mass in information fusion under the D-S framework. Entropy can describe the uncertainty or reliability of a system [40], and its application in artificial intelligence and neural networks has received growing attention [41, 42]. The negation of a probability distribution cannot solve the BPA negation problem for multi-element subsets under an exhaustive FOD:

$$ 2^{{{\varOmega}}}=\left\{ \emptyset , A_{2}, A_{3}, \cdots, A_{2^{N}-1}, {{\varTheta}} \right\} $$
(20)

where the empty set is ∅ = A1 and the support set is \({{\varTheta }} = A_{2^{N}}\). All subsets satisfy

$$ \overline{m}(A)=m(\overline{A}) \quad \forall A \subseteq 2^{{{\varOmega}}} $$
(21)

Example 3

Continuing Example 2, suppose there is an exhaustive FOD in which the 3 kinds of target cells are marked as \({{\varOmega }}=\left \{x_{1}, x_{2}, x_{3} \right \}\). There are 4 propositions as focal elements in the hypothesis Y

$$Y = \left\{\left\{x_{1} \right\}, \left\{x_{3} \right\}, \left\{x_{1}, x_{2} \right\}, \left\{x_{1},x_{2},x_{3} \right\} \right\} = \left\{y_{1}, y_{2}, y_{3}, y_{4} \right\}$$

which satisfies yi ∈ Y, and the BPAs are m(Y) = {0.5, 0.3, 0.15, 0.05}.

Yin and Deng proposed assigning each yi's residual mass 1 − m(yi) to the other focal elements [43] as:

$$ m(\overline{y_{i}}) = \frac{1-m(y_{i})}{n-1} $$
(22)

where the number of focal elements is n = 4. After multiple negation iterations, the belief mass converges to the uniform distribution \(m(y_{i})=\frac {1}{4}\).

Gao and Deng proposed assigning each yi's residual mass 1 − m(yi) to the power set except the empty set and yi itself [14] as

$$ m(\overline{y_{i}}) = \frac{1-m(y_{i})}{2^{n}-2} $$
(23)

After multiple negation iterations, the belief mass converges to the uniform distribution \(m(y_{i})=\frac {1}{2^{3}-1}=\frac {1}{7}\) over the non-empty subsets.
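Both redistribution schemes, (22) and (23), can be sketched and iterated numerically; the dictionary representation and the element labels 1, 2, 3 (standing in for x1, x2, x3) are our own choices:

```python
from itertools import combinations

def yin_negation(m):
    """Yin & Deng, Eq. (22): residual mass spread over the n - 1
    other focal elements (n = number of focal elements)."""
    n = len(m)
    return {a: (1 - v) / (n - 1) for a, v in m.items()}

def gao_negation(m, omega):
    """Gao & Deng, Eq. (23): residual mass spread over the 2^N - 2
    non-empty subsets other than the proposition itself."""
    subsets = [frozenset(c) for r in range(1, len(omega) + 1)
               for c in combinations(omega, r)]
    d = 2 ** len(omega) - 2
    return {a: (1 - m.get(a, 0.0)) / d for a in subsets}

# The BPA of Example 3 on Omega = {x1, x2, x3}
m0 = {frozenset({1}): 0.5, frozenset({3}): 0.3,
      frozenset({1, 2}): 0.15, frozenset({1, 2, 3}): 0.05}

m_yin, m_gao = dict(m0), dict(m0)
for _ in range(60):
    m_yin = yin_negation(m_yin)
    m_gao = gao_negation(m_gao, {1, 2, 3})
# m_yin -> 1/4 on each focal element; m_gao -> 1/7 on each subset
```

Yin's scheme keeps the 4 focal elements and converges to 1/4 on each, while Gao's scheme spreads mass over all 7 non-empty subsets and converges to 1/7 on each, as stated above.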

After that, Luo and Deng proposed a negation matrix (\([G]_{2^{N}\times 2^{N}}\)) method in the basic belief assignment vector (BBAV) space to simplify this problem, in which almost all belief mass ends up assigned to the support set Θ [16].

$$ {\text{g(i,j) = }}\left\{ {\begin{array}{*{20}{l}} &{{0}}\quad & {i=j, j\neq 2^{N}}\\ &{\frac{|A_{i}\cap \overline{A}_{j}|}{{\sum}_{A_{k}\neq A_{j}, A_{k}\in 2^{{{\varOmega}}}}{|A_{k}\cap \overline{A}_{j}|}}}\quad & {i\neq j, j\neq 2^{N}}\\ &{{1}}\quad & {i=j, j = 2^{N}}\\ &{{0}}\quad & {i\neq j, j = 2^{N}} \end{array}} \right. $$
(24)

where g(i,j) is an element of the negation matrix G. This algorithm exhibits an increment of conflict (m(∅) by (9)) during the negation iterations.

Furthermore, Xie and Xiao [44] made greater improvements to the weight assignment problem. Their method, with the negation matrix E, preserves the support set throughout the negation iterations.

$$ {\text{E = }}\left[ {\begin{array}{*{20}{c}} {{e_{1,1}}}&{{e_{1,2}}}& {\cdots} &{{e_{1,{2^{N}}}}}\\ {{e_{2,1}}}&{{e_{2,2}}}& {\cdots} &{{e_{2,{2^{N}}}}}\\ {\vdots} & {\vdots} & {\ddots} & {\vdots} \\ {{e_{{2^{N}},1}}}&{{e_{{2^{N}},2}}}& {\cdots} &{{e_{{2^{N}},{2^{N}}}}} \end{array}} \right] $$
(25)

When j ≠ 2N and j ≠ 1, the entries satisfy:

$$ e_{i,j}=\left\{ \begin{array}{lr} 0&i= j\\ \frac{{\left| {{A_{i}} \cap \bar {A}_{j} } \right|}}{{{\sum}_{{A_{k}} \ne {{{\varTheta}}},{A_{k}} \ne \emptyset,{A_{k}} \subseteq {2^{{{\varOmega}}} }} {\left| {{A_{k}} \cap \bar{A}_{j} } \right|} }} &i\ne j \end{array} \right. $$
(26)

The new method re-allocates the mass m(yi) according to the cardinalities of the intersections between the proposition \(\overline {y_{i}}\) and the other focal elements. This gives it a better and much more intuitive physical meaning. It also converges according to the cardinality of elements after multiple negation iterations. Returning to Example 3, the cardinalities in the hypothesis Y satisfy

$$ \left\{|\overline{y_{1}} |, |\overline{y_{2} }|, |\overline{y_{3} }|, |\overline{y_{4} }| \right\} = \left\{2, 2, 1, 0 \right\} $$
$$ \left\{|\overline{A_{2}}|, |\overline{A_{3}}|, |\overline{A_{4}}|, |\overline{A_{5}}|, |\overline{A_{6}}|, |\overline{A_{7}}| \right\} = \left\{2, 2, 2, 1, 1, 1 \right\} $$

After multiple negation iterations, m(A1) = m(∅) = 0, while \(m(A_{2^{N}})=m({{\varTheta }})=0.05\) is preserved and is not reassigned to any other subset. The mass distinctly converges to m(A2) : m(A3) : m(A4) : m(A5) : m(A6) : m(A7) = 1 : 1 : 1 : 2 : 2 : 2, and it satisfies entropy increment during the negation iterations as well.

There are three completely different methods to allocate belief mass during the negation iterations. These three methods are compared in Section 4.

2.4 Uncertainty measure methods

Uncertainty measures play an important role in quantifying the ambiguity degree of a system. They are applied in medical fields [45], prediction [46, 47], recognition [48], classification [49, 50], awareness [51] and decision making [52, 53].

Entropy originally derives from thermodynamics. In 1948, Claude Elwood Shannon introduced thermodynamic entropy into information theory; it is therefore widely known as Shannon entropy or information entropy [54, 55]. Given a random variable \(X = \left \{x_{1}, x_{2}, \cdots , x_{n}\right \}\) with probability distribution P(X = xi) = pi, where i = 1,2,⋯ ,n, Shannon entropy satisfies

$$ H_{S}(X)=-\sum\limits_{i=1}^{n} P(x_{i}) \times \log_{2}(P(x_{i})) $$
(27)

Information refers to the objects transmitted and processed by messages and communication systems, covering everything that human society spreads [56]. A "message" represents an event, sample, or feature from a distribution or data stream. Entropy is a measure of uncertainty: the more information a hypothesis contains, the greater the entropy [57]. The concept of entropy allows information to be quantified. The probability distribution of the samples is yet another feature of the information.

Nguyen defines an entropy of a BPA based on mass in the FOD [58] as follows:

$$ H_{N}(m) = \sum\limits_{A \subseteq 2^{{{\varOmega}}}} m(A) \times \log_{2}(\frac{1}{m(A)}) $$
(28)

It is different from Shannon entropy: the entropy defined by Nguyen is based on a framework weaker than the Bayesian model, which lays a foundation for the uncertainty measurement of BPAs. It satisfies the probabilistic consistency property. When the BPA over the propositions is uniform, the entropy attains its maximum. One of the earliest entropies was defined by Höhle [59] based on the belief function as

$$ H_{O}(m) = \sum\limits_{A \subseteq 2^{{{\varOmega}}}} m(A) \times \log_{2}(\frac{1}{Bel(A)}) $$
(29)

Another entropy was defined by Yager based on the plausibility function [60] as

$$ H_{Y}(m) = \sum\limits_{A \subseteq 2^{{{\varOmega}}}} m(A) \times \log_{2}(\frac{1}{Pl(A)}) $$
(30)
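The three entropies (28)-(30) can be sketched with the same dictionary representation used earlier (helper names ours); on a Bayesian BPA with only singleton focal elements, all three coincide with Shannon entropy:

```python
from math import log2

def bel(m, a):
    """Belief function, Eq. (10)."""
    return sum(v for b, v in m.items() if b <= a)

def pl(m, a):
    """Plausibility function, Eq. (11)."""
    return sum(v for b, v in m.items() if b & a)

def h_nguyen(m):
    """Nguyen entropy, Eq. (28): masses weighted by -log2 m(A)."""
    return -sum(v * log2(v) for v in m.values() if v > 0)

def h_hohle(m):
    """Hohle entropy, Eq. (29): masses weighted by -log2 Bel(A)."""
    return -sum(v * log2(bel(m, a)) for a, v in m.items() if v > 0)

def h_yager(m):
    """Yager entropy, Eq. (30): masses weighted by -log2 Pl(A)."""
    return -sum(v * log2(pl(m, a)) for a, v in m.items() if v > 0)

# On a Bayesian BPA (singletons only), all three reduce to Shannon entropy
m = {frozenset("A"): 0.5, frozenset("B"): 0.5}
values = [h_nguyen(m), h_hohle(m), h_yager(m)]  # each equals 1.0 bit
```

For singletons, Bel(A) = Pl(A) = m(A), which is why the three values agree here; they differ as soon as multi-element focal elements appear.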

3 New belief interval negation of BPA

3.1 Newly proposed uncertainty measure approaches under D-S structure

This paper proposes two new uncertainty measures, based on the belief function (HB) and the plausibility function (HP). They are defined as:

$$ H_{B} = -\sum\limits_{A \subseteq 2^{{{\varOmega}}}} Bel(A) \times \log_{2} (Bel(A)) $$
(31)
$$ H_{P} = -\sum\limits_{A \subseteq 2^{{{\varOmega}}}} Pl(A) \times \log_{2} (Pl(A)) $$
(32)
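A sketch of (31) and (32), summing over all non-empty subsets and using the usual 0·log 0 = 0 convention where Bel(A) vanishes (helper names ours):

```python
from itertools import combinations
from math import log2

def plogp(x):
    """x * log2(x) with the usual 0 log 0 = 0 convention."""
    return x * log2(x) if x > 0 else 0.0

def h_belief(m, omega):
    """Proposed belief entropy H_B, Eq. (31), over non-empty subsets."""
    subsets = (frozenset(c) for r in range(1, len(omega) + 1)
               for c in combinations(omega, r))
    return -sum(plogp(sum(v for s, v in m.items() if s <= a))
                for a in subsets)

def h_plausibility(m, omega):
    """Proposed plausibility entropy H_P, Eq. (32), over non-empty subsets."""
    subsets = (frozenset(c) for r in range(1, len(omega) + 1)
               for c in combinations(omega, r))
    return -sum(plogp(sum(v for s, v in m.items() if s & a))
                for a in subsets)

# For the vacuous BPA m(Theta) = 1, every Bel is 0 or 1 and every Pl is 1,
# so both entropies vanish
m = {frozenset("ABC"): 1.0}
values = [h_belief(m, "ABC"), h_plausibility(m, "ABC")]
```

Note that, unlike a probability distribution, the Bel and Pl values over the power set do not sum to 1, so these entropies are unnormalized totals rather than Shannon entropies of a distribution.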

The belief interval [Bel, Pl] ranges from the degree to which a proposition is certainly correct to the degree to which it may be correct. The two new entropies are suitable for the belief interval negation method. This paper defines these two new uncertainty measures to demonstrate the wide applicability of the belief interval negation method.

3.2 Definition of belief interval negation

The belief interval of D-S evidence theory is widely used in various fields. The D-S structure is different from the Bayesian probability model: a BPA is a weaker construct than a probability distribution [61], and it expresses more uncertainty through the mass, belief function, and plausibility function in the FOD.

This paper proposes a belief interval negation based on the belief interval [Bel, Pl]. The newly proposed method first establishes belief intervals for all sets in the power set (\(\forall A \subseteq 2^{{{\varOmega }}}\)), then finds the negations of the belief intervals, and finally converts the negated belief interval back into a BPA, completing the entire negation process. Meanwhile, the process mass → belief interval → belief interval negation → mass is realized. The negation of the belief interval is defined as follows, and its rationality is proved.

$$ \begin{array}{@{}rcl@{}} Pl(\bar{A}) &=& Pl({{\varOmega}}-A)\\ &=&{\sum}_{B\cap ({{\varOmega}} -A) \neq \emptyset }m(B) \end{array} $$
(33)

Thus,

$$ \begin{array}{@{}rcl@{}} Pl(\bar{A}) &=& {\sum}_{B \subseteq 2^{{{\varOmega}}}}m(B)-{\sum}_{C\subseteq A }m(C)\\ &=&1-Bel(A) \end{array} $$
(34)

Similarly, replacing A with \(\overline {A}\), we obtain

$$ Bel(\bar{A})=1-Pl(A) $$
(35)

which satisfies

$$ Bel(\bar{A})=\overline{Bel}({A}) $$
(36)
$$ Pl(\bar{A})=\overline{Pl}({A}) $$
(37)

Therefore, belief interval negation \([\overline {Bel} ,\ \overline {Pl}]\) is defined as

$$ [\overline{Bel}(A), \overline{Pl}(A)] = [1-Pl(A), 1-Bel(A)] \quad \forall A \subseteq 2^{{{\varOmega}}} $$
(38)

Besides, we define \(\overline {m}_{\oplus }\) as the orthogonal mass function and \(\overline {Pl}_{\oplus }\) as the orthogonal plausibility function:

$$ \begin{array}{@{}rcl@{}} K_{m}=\sum\limits_{A\subseteq 2^{{{\varOmega}}}}{\overline{m}(A)} \end{array} $$
(39)
$$ \begin{array}{@{}rcl@{}} \overline{m}_{\oplus}(A) = \frac{\overline{m}(A)}{K_{m}} \end{array} $$
(40)
$$ \begin{array}{@{}rcl@{}} K_{P}=\sum\limits_{A\subseteq 2^{{{\varOmega}}}}{\overline{Pl}(A)} \end{array} $$
(41)
$$ \begin{array}{@{}rcl@{}} \overline{Pl}_{\oplus}(A) = \frac{\overline{Pl}(A)}{K_{P}} \end{array} $$
(42)

The newly proposed belief interval negation method is applicable not only to the belief entropy and plausibility entropy proposed above, but also to Nguyen entropy, Höhle entropy, and Yager entropy. In total, 5 uncertainty measure methods are suitable for belief interval negation calculations. From a global perspective, the belief interval negation method satisfies entropy increment during the negation iterations.

3.3 Calculation and properties of Belief interval negation

In an exhaustive FOD, the process of belief interval negation comprises mass → belief interval → belief interval negation → mass. We give a simple numerical case as an introduction to the derivation, for better understanding.

Example 4

There is a hypothesis \({{\varOmega }} = \left \{A, B, C\right \}\), and initial BPA satisfies:

$$m(A)=0.2$$
$$m(B)=0.1$$
$$m(A, C)=0.3$$
$$m(A, B, C)=0.4$$

3.3.1 Step 1: belief interval establishment

Establish belief intervals for all subsets in the power set (except for the empty set ∅) by (10) and (11).

3.3.2 Step 2: belief interval negation

Establish the negated belief intervals of the power set by (34), (35), and (38).

3.3.3 Step 3: transformation from belief interval negation to BPA

We regard all the belief function values (\(\overline {Bel}\)) obtained from the belief interval negation method as the new BPA (\(\overline {m}\)), although this treatment is contrary to the definition of the belief function, because the belief interval negation method has its own irrationality. This irrationality will be discussed in detail in Section 3.4.

Therefore, in order to deal with such irrationality, we directly regard \(\overline {Bel}\) obtained by belief interval negation as BPA.

3.3.4 Step 4: BPA orthogonalization

The BPA is orthogonalized after step 3. The orthogonal BPA (\(\overline {m}_{\oplus }\)) and plausibility function (\(\overline {Pl}_{\oplus }\)) are obtained from (39)-(42).
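The four steps can be sketched end to end on the BPA of Example 4 (the representation and function name are ours; only (10), (11), (38), (39), and (40) are assumed):

```python
from itertools import combinations

def belief_interval_negation(m, omega):
    """One iteration of the proposed belief interval negation:
    mass -> [Bel, Pl] -> [1 - Pl, 1 - Bel] -> Bel-bar taken as the new
    mass -> normalization; a sketch of the four steps of Section 3.3."""
    subsets = [frozenset(c) for r in range(1, len(omega) + 1)
               for c in combinations(omega, r)]
    # Steps 1-2: belief interval and its negation; Bel-bar(A) = 1 - Pl(A)
    bel_bar = {a: 1.0 - sum(v for s, v in m.items() if s & a)
               for a in subsets}
    # Steps 3-4: take Bel-bar as the new BPA and orthogonalize, Eq. (40)
    k = sum(bel_bar.values())
    return {a: v / k for a, v in bel_bar.items()}

m = {frozenset("A"): 0.2, frozenset("B"): 0.1,
     frozenset("AC"): 0.3, frozenset("ABC"): 0.4}  # Example 4
m_bar = belief_interval_negation(m, "ABC")
```

For this BPA, the raw \(\overline{Bel}\) values sum to K = 1.2, so, for instance, the negated mass of {B} becomes (1 − Pl(B))/K = 0.5/1.2.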

After that, the newly proposed belief interval negation method completes all operations. The mass in belief interval negation iterations is shown as Fig. 2 and Table 1.

Fig. 2
figure 2

Mass in belief interval negation iterations

Table 1 Belief interval negation data

The new belief interval negation works for Bel entropy. Bel (obtained from step 1) can be used to calculate belief entropy.

The new belief interval negation is available for Pl entropy. Pl (obtained from step 1) can be used to calculate plausibility entropy.

The new belief interval negation works for Höhle entropy. BPA (before negation process) and Bel (obtained from step 1) can be used to calculate Höhle entropy as (29).

The new belief interval negation works for Yager entropy. The orthogonal BPA and \(\overline {Pl}_{\oplus }\) (both obtained after step 4) can be used to calculate Yager entropy as in (30). It is worth noting, however, that we need to modify Yager entropy so that the belief interval negation applies to it, as follows

$$ H_{Y}(m) = \sum\limits_{A \subseteq 2^{{{\varOmega}}}} \overline{m}_{\oplus}(A) \times \log_{2}(\frac{1}{\overline{Pl}_{\oplus}(A)}) $$
(43)

The new belief interval negation applies to Nguyen entropy. The orthogonal BPA (obtained after step 4) can be used to calculate Nguyen entropy as (28), which could be rewritten as

$$ H_{N}(m) = \sum\limits_{A \subseteq 2^{{{\varOmega}}}} \overline{m}_{\oplus}(A) \times \log_{2}(\frac{1}{\overline{m}_{\oplus}(A)}) $$
(44)

Uncertainty measures in belief interval negation iterations are shown in Fig. 3 and Table 2. It exactly satisfies entropy increment during the negation iterations.

Fig. 3
figure 3

Entropy in belief interval negation iterations

Table 2 Uncertainty measures in belief interval negation iterations

3.4 View from belief interval negation

The belief interval negation method involves an unreasonable aspect in its conversion back to a BPA (steps 3 and 4). Continuing Example 4: regarding the plausibility function (Pl), the subset \(A_{2} = \left \{A\right \}\) is a singleton with only one element, while the subset \(A_{5} = \left \{A, B\right \}\) has two elements. Therefore, Pl(A5) ≥ Pl(A2). However, after belief interval negation, \(\overline {Bel} (A_{5}) \leq \overline {Bel} (A_{2})\). This is counter-intuitive, because

$$ \overline {Bel} (A_{2})=\overline{m}(\left\{A\right\}) $$
$$ \overline {Bel} (A_{5})=\overline{m}(\left\{A\right\}) + \overline{m}(\left\{B\right\}) + \overline{m}(\left\{A, B\right\}) $$

This is because the belief function represents the mass of trust assigned to a proposition being completely correct. To handle the counter-intuitive case where |A5| > |A2|, we directly regard \(\overline {Bel}(A)\) (\(\forall A \subseteq 2^{{{\varOmega }}}\)) obtained after the negation as a new BPA. Finally, the obtained BPA is orthogonalized, which completes the whole belief interval negation process.

4 Numerical examples and discussion

4.1 Numerical example 1

Continuing Example 4, there is an exhaustive FOD, and the BPA is given.

In this section, we compare several previous BPA negation methods and demonstrate the effectiveness of the newly proposed belief interval negation method. Through this simple case, we show that neither Yin's [43] nor Gao's [14] method is suitable for the five uncertainty measures mentioned above, because negation iterations based on Yin's method (Figs. 4 and 5) or Gao's method (Figs. 6 and 7) cause the uncertainty measure values to oscillate. The negation method proposed by Luo [16] does not apply to the above five uncertainty measures either, since the entropy decreases sharply during the iterative process (Figs. 8 and 9). The negation matrix method proposed by Xie [44] may be applicable to Shannon, Höhle, and belief entropy, but it is not applicable to plausibility and Yager entropy (Figs. 10 and 11).

Fig. 4
figure 4

Mass in Yin's negation iterations

Fig. 5
figure 5

Entropy in Yin's negation iterations

Fig. 6
figure 6

Mass in Gao's negation iterations

Fig. 7
figure 7

Entropy in Gao's negation iterations

Fig. 8
figure 8

Mass in Luo's negation iterations

Fig. 9
figure 9

Entropy in Luo's negation iterations

Fig. 10
figure 10

Mass in Xie's negation iterations

Fig. 11
figure 11

Entropy in Xie's negation iterations

The new belief interval negation method is applicable to all five uncertainty measures above, because it manifests an entropy increment during the negation iterations.

4.2 Numerical example 2

There is an exhaustive FOD (\({{\varOmega }} = \left \{A, B, C \right \}\)) similar to Example 4, but the BPAs are unknown. The 10 hypotheses below are listed in Table 3.

Table 3 BPAs of 10 different hypotheses when cardinality N= 3

Experiments show that the newly proposed belief interval negation method is widely applicable to various hypotheses and suits all 5 uncertainty measure methods mentioned above. During the negation iterations, it satisfies the entropy increment, as shown in Figs. 12-31. These examples demonstrate that the belief interval negation method is an important tool for uncertainty measures.

Fig. 12
figure 12

Mass of Hypo.1

Fig. 13
figure 13

Entropy of Hypo.1

Fig. 14
figure 14

Mass of Hypo.2

Fig. 15
figure 15

Entropy of Hypo.2

Fig. 16
figure 16

Mass of Hypo.3

Fig. 17
figure 17

Entropy of Hypo.3

Fig. 18
figure 18

Mass of Hypo.4

Fig. 19
figure 19

Entropy of Hypo.4

Fig. 20
figure 20

Mass of Hypo.5

Fig. 21
figure 21

Entropy of Hypo.5

Fig. 22
figure 22

Mass of Hypo.6

Fig. 23
figure 23

Entropy of Hypo.6

Fig. 24
figure 24

Mass of Hypo.7

Fig. 25
figure 25

Entropy of Hypo.7

Fig. 26
figure 26

Mass of Hypo.8

Fig. 27
figure 27

Entropy of Hypo.8

Fig. 28
figure 28

Mass of Hypo.9

Fig. 29
figure 29

Entropy of Hypo.9

Fig. 30
figure 30

Mass of Hypo.10

Fig. 31
figure 31

Entropy of Hypo.10

4.3 Application in medical pattern recognition

Suppose that there are 3 medical patterns: the diagnosis result of a patient might be Malaria, Typhoid, or Enteritis, respectively marked as the three elements of the hypothesis \({{\varOmega }}=\left \{A, B, C\right \}\). According to different expert experience and evidence, the medical patterns of 3 experts are given as follows:

$$ m_{1}(A)=0.4, m_{1}(B)=0.3, m_{1}(A,C)=0.1, m_{1}(A,B,C)=0.2 $$
$$ m_{2}(A)=0.1, m_{2}(B)=0.4, m_{2}(A,C)=0.2, m_{2}(A,B,C)=0.3 $$
$$m_{3}(A)=0.2, m_{3}(C)=0.1, m_{3}(A,B)=0.3, m_{3}(A,B,C)=0.4 $$

Before and after a belief interval negation, the resulting belief masses are displayed in Table 4, and the five uncertainty measure approaches are compared in Table 5. Four uncertainty measures (HB, HP, HO, HY) point to the 3rd expert (m3) as having the lowest uncertainty, which indicates the largest reliability; Nguyen entropy (HN), in contrast, fails to discriminate among the experts in this case. After a belief interval negation, the ambiguity of the medical patterns given by the experts increases under all 5 uncertainty measures. Although HP and HY indicate that expert 1 (\(\bar {m}_{1}\)) is more reliable, HB, HN, and HO still indicate that expert 3 (\(\bar {m}_{3}\)) is more reliable. This application demonstrates the consistency of the two newly proposed uncertainty measures and the belief interval negation method with Nguyen, Höhle, and Yager entropy.

Table 4 Belief interval negation data from 3 medical patterns in belief interval negation
Table 5 Uncertainty measures from 3 medical patterns in belief interval negation

4.4 View from convergent mass

In this section we discuss the convergence of the mass. After multiple belief interval negation iterations, subsets with the same cardinality converge to the same mass. We label the common mass of all singletons \({e_{1}^{c}}\), where the subscript 1 indicates cardinality 1 and the superscript c indicates convergence. Similarly, we label all subsets of cardinality 2 with \({e_{2}^{c}}\). The support set Θ disappears during the negation iterations, so \(m ({e_{3}^{c}}) = m({{\varTheta }}) = 0\). After multiple negation iterations, the mass converges; therefore, both before and after the belief interval negation, the BPA is related to the cardinality of the subsets. Taking N = 3 as an example, we can obtain

$$ \frac{m({e_{1}^{c}})}{m({e_{2}^{c}})}=\frac{1-m({e_{1}^{c}}) - 2 \times m({e_{2}^{c}})}{1-2 \times m({e_{1}^{c}}) - 3 \times m({e_{2}^{c}})} $$
(45)
$$ 3 \times m({e_{1}^{c}}) + 3 \times m({e_{2}^{c}})=1 $$
(46)

Thus,

$$ m({e_{1}^{c}}) = \frac{\sqrt 2}{6} $$
$$m({e_{2}^{c}}) = \frac13 - \frac{\sqrt 2}{6}$$
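These closed-form values can be checked numerically against (45) and (46):

```python
from math import sqrt, isclose

# Claimed convergent masses for N = 3, derived from Eqs. (45)-(46)
e1 = sqrt(2) / 6          # each singleton
e2 = 1 / 3 - sqrt(2) / 6  # each two-element subset

# Eq. (46): three singletons and three pairs carry the total mass
assert isclose(3 * e1 + 3 * e2, 1.0)

# Eq. (45): the fixed-point ratio condition
lhs = e1 / e2
rhs = (1 - e1 - 2 * e2) / (1 - 2 * e1 - 3 * e2)
assert isclose(lhs, rhs)
```

Substituting \(e_2 = \frac13 - e_1\) into (45) reduces it to \(e_1^2 = \frac19 - e_1^2\), whose positive root is \(e_1 = \frac{\sqrt 2}{6}\), consistent with the values above.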

We generalize the calculation method of the convergence mass to cardinality N = n, then we can obtain

$$ n \times m({e_{1}^{c}}) + n \times m({e_{2}^{c}}) + {\cdots} + n \times m({e_{n}^{c}}) =1 $$
(47)

Then repeat the above method to obtain the convergent mass as

$$ \begin{array}{@{}rcl@{}} && m({e_{1}^{c}}): m({e_{2}^{c}}): {\cdots} : m({e_{j}^{c}}): {\cdots} \\ &&=(1-m({e_{1}^{c}}) - \sum\limits_{i=2}^{n-1} C_{n-1}^{i-1} \times m({e_{i}^{c}}))\\ &&:(1-2\times m({e_{1}^{c}}) - \sum\limits_{i=2}^{n-1} ({C_{n}^{i}}-C_{n-i}^{i}) \times m({e_{i}^{c}}) )\\ &&: (\cdots) \\ &&: (1-j\times m({e_{1}^{c}}) - \sum\limits_{i=2}^{n-1} ({C_{n}^{i}}-C_{n-i}^{i}) \times m({e_{i}^{c}}) )\\ &&: (\cdots) \end{array} $$
(48)

When i ≤ n − i, we have \(C_{n-i}^{i}\geq 1\); when i > n − i, we take \(C_{n-i}^{i} = 0\).

5 Conclusion

This paper proposes a belief interval negation method, which converts a BPA to a belief interval and negates it. After the belief interval negation step is completed, the result is converted back to a BPA and applied to various entropies. In particular, after the belief interval negation, the belief function is directly regarded as the BPA of each subset. Many previous BPA negation methods are not applicable to Höhle entropy and Yager entropy; the newly proposed negation method solves this problem. Besides, this paper also proposes two new uncertainty measure methods. In addition, this paper verifies that the newly proposed negation method is applicable to the five uncertainty measure methods above, and that they satisfy the property of entropy increment during negation iterations. Moreover, the newly proposed uncertainty measures and belief interval negation approach are applied to medical pattern recognition, and the experimental results show that the novel approaches are reasonable and consistent with previous methods. Furthermore, an important contribution of this paper is that the newly proposed belief interval negation method, as an important tool for uncertainty measurement, builds a relationship among the belief interval (belief function and plausibility function), the BPA, and the uncertainty measure. It will play an essential role in describing system ambiguity in the future.