
1 Introduction

Clustering is a data analysis method used regularly in strategy formulation, market study and business system planning. Partitioning commodities is a conventional problem in inventory control and management. In most industries there are many types of materials and components of machines or other apparatus to be managed in order to achieve a desired goal. To improve the efficiency of material management, a practical approach is to sort the different materials into groups. However, different enterprises have different requirements in this area, so a purely precision-based cluster analysis may not be practical. A clustering method based on a fuzzy equivalence relation, on the other hand, can comfortably handle the separation analysis in a fuzzy environment. Conventional clustering assigns each data point to exactly one cluster, whereas fuzzy clustering assigns a degree of membership to each object, so that the membership of an object is shared among several clusters. In this paper we propose fuzzy equivalence class clustering using the Minkowski, Mahalanobis, Cosine, Chebychev and Correlation distance functions for the performance grading of different binders used in the Turner–Fairbank Highway Research Center Polymer Research Program [19]. Shenoy observed in his research that the Superpave specification parameter \( \left| {G^{*}} \right|/\sin \delta \) is not adequate for classifying polymer-modified binders for the high temperature performance grading of paving asphalts. It was therefore of interest to refine this parameter to gain more sensitivity to pavement performance and also to detect other latent parameters that may better relate to rutting resistance. The refined Superpave specification parameter, namely \( \left| {G^{*}} \right|/\left( {1 - \frac{1}{\tan \delta \,\sin \delta }} \right) \), has the highest merit for possible use.
It is a viable alternative for obtaining the high temperature specification, since it is more sensitive to field performance. Owing to the variations in the phase angle \( \delta \), the parameter \( \left| {G^{*}} \right|/\left( {1 - \frac{1}{\tan \delta \,\sin \delta }} \right) \) can easily surpass the efficiency of the original Superpave specification parameter. Another alternative is to first define an equal-stiffness temperature \( \left( {T_{e} \,^\circ {\text{C}}} \right) \) at which the complex shear modulus \( \left| {G^{*}} \right| \) takes a specific value of 50 kPa. This takes care of the rheological contribution coming from one portion of the term \( \left| {G^{*}} \right|/\left[ {1 - \frac{1}{{\tan \delta \,\sin \delta }}} \right] \). The resulting high specification temperature \( \left( {T_{TH} \,^\circ {\text{C}}} \right) \), defined as \( \left( {T_{e} \,^\circ {\text{C}}} \right)/\left[ {1 - \frac{1}{{\tan \delta \,\sin \delta }}} \right] \), is more meaningful for achieving an eminent high specification temperature. To obtain a better discrimination between the performances of binders at different membership grades with the specification parameters \( \left| {G^{*}} \right|/\sin \delta \), \( \left| {G^{*}} \right|/\left( {1 - \frac{1}{\tan \delta \,\sin \delta }} \right) \) and \( \left( {T_{e} \,^\circ {\text{C}}} \right)/\left[ {1 - \frac{1}{{\tan \delta \,\sin \delta }}} \right] \), fuzzy equivalence class clustering is proposed. Cluster analysis is one of the leading approaches to pattern recognition. In order to obtain a better classification of objects, the idea of fuzzy clustering has been developed by many researchers, who made significant contributions to decision making in the presence of fuzziness and incomplete information. The first fuzzy clustering approach was initiated by Bellman et al.
[1] and Ruspini [2]; then Dunn [3] explained well-separated clusters and optimal fuzzy partitions. Tamura et al. [4] worked out an n-step procedure using max–min fuzzy compositions (the max–min similarity relation) and achieved a multi-level hierarchical clustering. Fuzzy clustering has since been extensively examined and practiced in multifarious areas by Bezdek et al. [5], Bezdek [6], Aldenderfer et al. [7], Trauwaert et al. [8], and Yang et al. [9,10,11]. Groenen [12] used the Minkowski distance function for fuzzy cluster analysis. Yang et al. [13] concentrated on cluster analysis based on fuzzy relations and created a clustering algorithm for the max-t similarity relation matrix; three critical max-t compositions (max-min, max-prod and max-∆) were then compared. Liang et al. [14] determined the best number of clusters using a cluster validity index by taking a suitable \( \lambda \)-cut value: trapezoidal fuzzy numbers are first defined from the subjects' attribute ratings, the distance between two trapezoidal fuzzy numbers is computed to obtain the compatibility relation, and the objects are then categorized by a fuzzy equivalence relation. Recently, Gustafson et al. [15], Fu et al. [17] and Kumam et al. [18] concentrated on fuzzy clustering analysis based on equivalence classes and illustrated the desired clusters. Bataineh et al. [16] compared the performances of the fuzzy C-means clustering algorithm and the subtractive clustering algorithm according to their capabilities. The aim of this paper is to classify the binders' performances using fuzzy equivalence clustering via the Minkowski, Cosine, Chebychev, Correlation and a new (Mahalanobis) distance function. The reliability and adequacy of the Mahalanobis distance for the clustering performance of binders is examined against the Minkowski and other distance functions.

2 Clustering Analysis Method

2.1 Mathematical Preliminaries

In this section, some basic notions of fuzzy cluster analysis are reviewed. First, we recapitulate the fundamentals of fuzzy sets and fuzzy relations.

Definition 1.

Let X be a universal space. A fuzzy set \( \tilde{A} \) on X is defined by \( \tilde{A} = \{ ({\text{x}}, \mu_{{\tilde{A}}} (x))|{\text{x}} \in {\text{X}}\} \), where \( \mu_{{\tilde{A}}} :{\text{X}} \to [0,1] \) is the membership function, which assigns to each element of X a membership grade in the range [0, 1].

Definition 2.

Let \( \tilde{A} \) and \( \tilde{B} \) be two fuzzy sets defined on universal spaces \( {\text{X}}\;{\text{and}}\;{\text{Y}} \). A fuzzy relation on \( \left( {{\text{X}} \times {\text{Y}}} \right) \) is defined by \( \tilde{R} = \left\{ {\left[ {\left( {x,y} \right),\mu_{{\tilde{R}}} \left( {x,y} \right)} \right]|\left( {x,y} \right) \,\epsilon\, {\text{X}} \times {\text{Y}}} \right\} \) where

$$ \mu_{{\tilde{R}}} \left( {x,y} \right) \le { \hbox{min} }\{ \mu_{{\tilde{A}}} (x),\;\mu_{{\tilde{B}}} (y)\} $$

Definition 3.

Let \( \tilde{R}_{1} \) on \( \left( {{\text{X}} \times {\text{Y}}} \right) \) and \( \tilde{R}_{2} \) on \( \left( {{\text{Y}} \times {\text{Z}}} \right) \) be two fuzzy relations. The max-min composition \( \tilde{R}_{1} \circ \tilde{R}_{2} \) is defined by \( \tilde{R}_{1} \circ \tilde{R}_{2} = \left\{ {\left[ {\left( {x,z} \right),\mathop { \hbox{max} }\limits_{y \in Y} \left\{ {\hbox{min} \left\{ {\mu_{{\tilde{R}_{1} }} \left( {x,y} \right),\mu_{{\tilde{R}_{2} }} \left( {y,z} \right)} \right\}} \right\}} \right]|x \,\epsilon\, X,y \,\epsilon\, Y,z \,\epsilon\, Z} \right\} \).

Definition 4.

Let \( \tilde{R} \) be a fuzzy relation on \( \left( {{\text{X}} \times {\text{X}}} \right) \); then

  1. \( \tilde{R} \) is called reflexive if \( \mu_{{\tilde{R}}} \left( {x,x} \right) = 1,\forall x \in {\text{X}} \);

  2. \( \tilde{R} \) is called \( \varepsilon \)-reflexive if \( \mu_{{\tilde{R}}} \left( {x,x} \right) \ge \varepsilon ,\forall x \in {\text{X}} \);

  3. \( \tilde{R} \) is called weakly reflexive if \( \mu_{{\tilde{R}}} \left( {x,y} \right) \le \mu_{{\tilde{R}}} \left( {x,x} \right)\;{\text{and}}\;\mu_{{\tilde{R}}} \left( {y,x} \right) \le \mu_{{\tilde{R}}} \left( {x,x} \right),\forall x,y \in {\text{X}} \).

Definition 5.

A fuzzy relation \( \tilde{R} \) is called symmetric if \( \mu_{{\tilde{R}}} \left( {x,y} \right) = \mu_{{\tilde{R}}} \left( {y,x} \right) \forall x,y \in {\text{X}} \).

Definition 6.

A fuzzy relation \( \tilde{R} \) is called transitive if \( \mu_{{\tilde{R}}} \left( {x,z} \right) \ge \mathop { \hbox{max} }\limits_{y \in X} \{ { \hbox{min} }\{ \mu_{{\tilde{R}}} \left( {x,y} \right),\;\mu_{{\tilde{R}}} \left( {y,z} \right)\} \} \); \( \forall x,z \in {\text{X}} \).

Definition 7.

A fuzzy relation \( \tilde{R} \) on X is said to be compatible on X if it is reflexive and symmetric.

Definition 8.

A fuzzy relation \( \tilde{R} \) on X is said to be a fuzzy equivalence relation on X if it is reflexive, symmetric and transitive.
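For a finite universal space, a fuzzy relation is simply a square membership matrix, and the properties in Definitions 4–8 can be checked directly. The following sketch (function names are ours, chosen for illustration) verifies the reflexive, symmetric and transitive conditions:

```python
import numpy as np

def is_reflexive(R):
    """Definition 4(1): mu(x, x) = 1 for every element x."""
    return bool(np.allclose(np.diag(R), 1.0))

def is_symmetric(R):
    """Definition 5: mu(x, y) = mu(y, x) for all pairs."""
    return bool(np.allclose(R, np.transpose(R)))

def is_transitive(R):
    """Definition 6: mu(x, z) >= max over y of min(mu(x, y), mu(y, z))."""
    composed = np.max(np.minimum(R[:, :, None], R[None, :, :]), axis=1)
    return bool(np.all(R >= composed - 1e-12))

def is_equivalence(R):
    """Definition 8: reflexive, symmetric and transitive."""
    return is_reflexive(R) and is_symmetric(R) and is_transitive(R)
```

A matrix satisfying all three tests induces a crisp equivalence relation at every \( \alpha \)-cut, which is what the clustering procedure below relies on.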

2.2 The Construction of a Fuzzy Compatible and Equivalence Relation

To obtain fuzzy cluster analysis through equivalence classes, the distances between the crisp data points are required. Here we propose the Mahalanobis, Chebychev, Minkowski, Cosine and Correlation metric distances, respectively, on crisp data. Let X be a universal space and \( X_{i\;k} \;{\text{and}}\;X_{j\;k} \in X \); then the Minkowski metric distance on crisp data is defined as

$$ D_{{w\left( {i,j} \right)_{k} }} = \left[ {\sum\nolimits_{k = 1}^{n} {\left| {X_{i k} - X_{j\;k} } \right|^{w} } } \right]^{{\frac{1}{w}}} $$
(1)

The Minkowski measure holds for \( {\text{w}} \in \left[ {1,\infty } \right) \). For the special case w = 1 it becomes the Hamming (city-block) distance, and for w = 2 it is the Euclidean distance.
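Equation (1) can be sketched in a few lines of Python; the function name and the NumPy usage are illustrative, not part of the original study:

```python
import numpy as np

def minkowski_distance(x_i, x_j, w=2):
    """Minkowski distance of order w between two feature vectors, as in Eq. (1).

    w = 1 gives the Hamming (city-block) distance, w = 2 the Euclidean distance.
    """
    x_i = np.asarray(x_i, dtype=float)
    x_j = np.asarray(x_j, dtype=float)
    return float(np.sum(np.abs(x_i - x_j) ** w) ** (1.0 / w))
```

For example, `minkowski_distance([0, 0], [3, 4], w=2)` returns the familiar Euclidean value 5.0.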

The Mahalanobis distance is defined as

$$ M_{{d\left( {i,j} \right)_{k} }} = \left\{ {\left[ {(X_{i } {-}X_{j } )^{T} V^{ - 1} (X_{i } {-}X_{j } )} \right]} \right\}^{1/2} $$
(2)

where V is the sample covariance matrix. If the covariance matrix V is the identity matrix, then \( M_{{d\left( {i,j} \right)}} \) reduces to the Euclidean distance. If V is diagonal with entries \( v_{k} \), the resulting distance measure is called a normalized Euclidean distance, defined as \( M_{{d\left( {i,j} \right)_{k} }} = \sqrt {\mathop \sum \nolimits_{k = 1}^{n} \frac{{\left| {X_{i\;k} - X_{j\;k} } \right|^{2} }}{{v_{k} }}} \).
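A minimal sketch of Eq. (2), assuming the sample covariance matrix V is supplied and invertible (the function name is illustrative):

```python
import numpy as np

def mahalanobis_distance(x_i, x_j, V):
    """Mahalanobis distance of Eq. (2); V is the sample covariance matrix."""
    d = np.asarray(x_i, dtype=float) - np.asarray(x_j, dtype=float)
    # Quadratic form d^T V^{-1} d, then the square root, per Eq. (2)
    return float(np.sqrt(d @ np.linalg.inv(V) @ d))
```

With V equal to the identity matrix this reduces to the Euclidean distance, as noted above; a diagonal V rescales each coordinate by its variance.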

The Cosine distance is defined as

$$ D_{{cos\left( {i,j} \right)_{k} }} = 1 - \frac{{\mathop \sum \nolimits_{k = 1}^{n} X_{i\;k} .X_{j\;k} }}{{\sqrt {\mathop \sum \nolimits_{k = 1}^{n} (X_{i\;k} )^{2} } .\sqrt {\mathop \sum \nolimits_{k = 1}^{n} (X_{j\;k} )^{2} } }} $$
(3)

The Correlation distance is defined as

$$ D_{{corr\left( {i,j} \right)_{k} }} = 1 - \frac{{\mathop \sum \nolimits_{k = 1}^{n} (X_{i\;k} - \overline{{X_{i\;k} }} ).(X_{j\;k} - \overline{{X_{j\;k} }} )}}{{\sqrt {\mathop \sum \nolimits_{k = 1}^{n} (X_{i\;k} - \overline{{X_{i\;k} }} )^{2} } .\sqrt {\mathop \sum \nolimits_{k = 1}^{n} (X_{j\;k} - \overline{{X_{j\;k} }} )^{2} } }} $$
(4)

where \( \overline{{X_{i\;k} }} = \frac{1}{n}\sum\nolimits_{k = 1}^{n} {X_{i\;k} } \) and \( \overline{{X_{j\;k} }} = \frac{1}{n}\sum\nolimits_{k = 1}^{n} {X_{j\;k} } \).

And the Chebychev distance is defined as

$$ D_{{max\left( {i,j} \right)}} = { \hbox{max} }_{k} \left| {X_{i\;k} - X_{j\;k} } \right| $$
(5)
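The remaining three measures, Eqs. (3)–(5), can be sketched in the same style (function names are illustrative):

```python
import numpy as np

def cosine_distance(x_i, x_j):
    """Cosine distance, Eq. (3): one minus the cosine of the angle between vectors."""
    x_i, x_j = np.asarray(x_i, float), np.asarray(x_j, float)
    return float(1.0 - (x_i @ x_j) / (np.linalg.norm(x_i) * np.linalg.norm(x_j)))

def correlation_distance(x_i, x_j):
    """Correlation distance, Eq. (4): the cosine distance of the mean-centred vectors."""
    x_i = np.asarray(x_i, float) - np.mean(x_i)
    x_j = np.asarray(x_j, float) - np.mean(x_j)
    return float(1.0 - (x_i @ x_j) / (np.linalg.norm(x_i) * np.linalg.norm(x_j)))

def chebychev_distance(x_i, x_j):
    """Chebychev distance, Eq. (5): the largest coordinate-wise difference."""
    return float(np.max(np.abs(np.asarray(x_i, float) - np.asarray(x_j, float))))
```

Note that the correlation distance vanishes for perfectly linearly related vectors, while the cosine distance vanishes only for parallel vectors.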

From these distances, the fuzzy compatible relation matrix is obtained. For the Minkowski class (w = 2) the relation matrix is

$$ \tilde{R}\left( {X_{i } , X_{j} } \right) = 1 - \delta \left[ {\sum\nolimits_{k = 1}^{n} {\left| {X_{i\;k} - X_{j\;k} } \right|^{2} } } \right]^{{\frac{1}{2}}} ,\;{\text{where}}\;\delta = \left\{ {\mathop {\hbox{max} }\limits_{i,j} \left[ {\sum\nolimits_{k = 1}^{n} {\left| {X_{i\;k} - X_{j\;k} } \right|^{2} } } \right]^{{\frac{1}{2}}} } \right\}^{ - 1} $$
(6)

And the fuzzy compatible relation matrix for the Mahalanobis distance is generated by

$$ \tilde{R}\left( {X_{i } , X_{ j} } \right) = 1 - \lambda \sqrt {(X_{i } {-}X_{j } )^{T} V^{ - 1} (X_{i } {-}X_{j } )} ,\;{\text{where}}\;\lambda = \left\{ {\mathop {\hbox{max} }\limits_{i,j} \sqrt {(X_{i } {-}X_{j } )^{T} V^{ - 1} (X_{i } {-}X_{j } )} } \right\}^{ - 1} $$
(7)

The fuzzy compatible relation matrix for the Cosine distance is

$$ \tilde{R}\left( {X_{i } , X_{ j} } \right) = 1 - \gamma \left\{ {1 - \frac{{\mathop \sum \nolimits_{k = 1}^{n} X_{i\;k} .X_{j\;k} }}{{\sqrt {\mathop \sum \nolimits_{k = 1}^{n} (X_{i\;k} )^{2} } .\sqrt {\mathop \sum \nolimits_{k = 1}^{n} (X_{j\;k} )^{2} } }}} \right\},\;{\text{where}}\;\gamma = \left\{ {\mathop {\hbox{max} }\limits_{i,j} \left\{ {1 - \frac{{\mathop \sum \nolimits_{k = 1}^{n} X_{i\;k} .X_{j\;k} }}{{\sqrt {\mathop \sum \nolimits_{k = 1}^{n} (X_{i\;k} )^{2} } .\sqrt {\mathop \sum \nolimits_{k = 1}^{n} (X_{j\;k} )^{2} } }}} \right\}} \right\}^{ - 1} $$
(8)

The fuzzy compatible relation matrix for the Correlation distance is

$$ \tilde{R}\left( {X_{i } ,X_{ j} } \right) = 1 - \rho \left\{ {1 - \frac{{\mathop \sum \nolimits_{k = 1}^{n} (X_{i\;k} - \overline{{X_{i\;k} }} ).(X_{j\;k} - \overline{{X_{j\;k} }} )}}{{\sqrt {\mathop \sum \nolimits_{k = 1}^{n} (X_{i\;k} - \overline{{X_{i\;k} }} )^{2} } .\sqrt {\mathop \sum \nolimits_{k = 1}^{n} (X_{j\;k} - \overline{{X_{j\;k} }} )^{2} } }}} \right\},\;{\text{where}}\;\rho = \left\{ {\mathop {\hbox{max} }\limits_{i,j} \left\{ {1 - \frac{{\mathop \sum \nolimits_{k = 1}^{n} (X_{i\;k} - \overline{{X_{i\;k} }} ).(X_{j\;k} - \overline{{X_{j\;k} }} )}}{{\sqrt {\mathop \sum \nolimits_{k = 1}^{n} (X_{i\;k} - \overline{{X_{i\;k} }} )^{2} } .\sqrt {\mathop \sum \nolimits_{k = 1}^{n} (X_{j\;k} - \overline{{X_{j\;k} }} )^{2} } }}} \right\}} \right\}^{ - 1} $$
(9)

The fuzzy compatible relation matrix for the Chebychev distance is

$$ \tilde{R}\left( {X_{i } , X_{ j} } \right) = 1 - \sigma \left\{ {\mathop {\hbox{max} }\limits_{k} \left| {X_{i\;k} - X_{j\;k} } \right|} \right\},\;{\text{where}}\;\sigma = \left\{ {\mathop {\hbox{max} }\limits_{i,j} \left\{ {\mathop {\hbox{max} }\limits_{k} \left| {X_{i\;k} - X_{j\;k} } \right|} \right\}} \right\}^{ - 1} $$
(10)
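Each of Eqs. (6)–(10) follows the same pattern: compute all pairwise distances, then rescale so that the largest distance maps to membership 0 and a zero distance to membership 1. A sketch of this common construction (the helper names are ours, not from the paper):

```python
import numpy as np

def compatible_relation(X, dist):
    """Fuzzy compatible relation matrix, Eqs. (6)-(10): R(i,j) = 1 - d(i,j)/max d.

    X is an (m, n) array of m objects; dist is any pairwise distance function.
    The result is reflexive (unit diagonal) and symmetric by construction.
    """
    m = X.shape[0]
    D = np.zeros((m, m))
    for i in range(m):
        for j in range(m):
            D[i, j] = dist(X[i], X[j])
    return 1.0 - D / D.max()

# Illustrative use with a plain Euclidean (Minkowski, w = 2) distance:
X = np.array([[0.0, 0.0], [1.0, 0.0], [4.0, 0.0]])
R = compatible_relation(X, lambda a, b: float(np.linalg.norm(a - b)))
```

Swapping the `dist` argument reproduces the Mahalanobis, Cosine, Correlation or Chebychev variants without changing the rescaling step.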

After the fuzzy compatible relation matrices are obtained, the fuzzy transitive closure is constructed for each matrix. If \( \tilde{R} \circ \tilde{R} \subseteq \tilde{R} \), then \( \tilde{R} \) is already transitive and is its own transitive closure (\( k = 1 \)). If \( \tilde{R} \circ \tilde{R}{ \varsubsetneq }\tilde{R} \), construct \( \tilde{R}^{2} \circ \tilde{R}^{2} \); if \( \tilde{R}^{2} \circ \tilde{R}^{2} \subseteq \tilde{R}^{2} \), then \( \tilde{R}^{2} \) is the transitive closure of \( \tilde{R} \) (\( k = 2 \)), and so on. If there are n elements in the universal space, the squaring is repeated at most until \( 2^{k} \ge n - 1 \), at which point the fuzzy transitive closure is guaranteed to be reached.
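The squaring procedure amounts to repeated max-min composition until the relation absorbs its own composition. A minimal NumPy sketch (names are illustrative):

```python
import numpy as np

def max_min_composition(R, Q):
    """(R ∘ Q)(x, z) = max over y of min(R(x, y), Q(y, z)), per Definition 3."""
    # Broadcasting gives a (m, m, m) cube of min(R[x, y], Q[y, z]); max over y.
    return np.max(np.minimum(R[:, :, None], Q[None, :, :]), axis=1)

def transitive_closure(R):
    """Square R under max-min composition until R ∘ R ⊆ R (the transitive closure)."""
    R = np.asarray(R, dtype=float)
    while True:
        R2 = max_min_composition(R, R)
        if np.all(R2 <= R):          # R ∘ R ⊆ R: transitivity reached
            return R
        R = np.maximum(R, R2)        # otherwise absorb the composition and repeat
```

For a reflexive, symmetric relation on n elements, the loop terminates after at most \( \lceil \log_{2} (n - 1) \rceil \) squarings, matching the \( 2^{k} \ge n - 1 \) bound in the text.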

The \( \alpha \)-cut relation is obtained from the transitive fuzzy relation by retaining the pairs whose membership degrees are not less than \( \alpha \).

$$ \tilde{R}_\alpha = \left\{ {\left[ {\left( {x,y} \right),\mu_{{\tilde{R}}} \left( {x,y} \right) \ge \alpha } \right]|\left( {x,y} \right) \,\epsilon\, {\text{X}} \times {\text{Y}}} \right\}. $$
(11)
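Given the transitive closure, the \( \alpha \)-cut of Eq. (11) yields a crisp equivalence relation whose classes are the clusters. A sketch under the same illustrative naming:

```python
import numpy as np

def alpha_cut_clusters(R_T, alpha):
    """Clusters from the crisp alpha-cut of a fuzzy transitive closure R_T, Eq. (11).

    Two objects share a cluster iff their membership degree is at least alpha;
    transitivity of R_T guarantees that this crisp relation is an equivalence.
    """
    crisp = np.asarray(R_T) >= alpha
    clusters, seen = [], set()
    for i in range(crisp.shape[0]):
        if i not in seen:
            # The row of object i lists its whole equivalence class
            group = [j for j in range(crisp.shape[0]) if crisp[i, j]]
            seen.update(group)
            clusters.append(group)
    return clusters
```

Sweeping alpha from 0 to 1 and recording how the partition refines reproduces a clustering tree of the kind reported in Sect. 5.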

3 Experimental Data

The experimental data used were taken from research [19] conducted under the NCHRP (National Cooperative Highway Research Program) at the TFHRC (Turner–Fairbank Highway Research Center).

4 Result

Using the described methodology, the performance grades of the binders were clustered using the Mahalanobis, Minkowski (w = 2), Chebychev, Cosine and Correlation metric distances. The fuzzy compatible relation matrices and transitive closures were derived for each distance (Figs. 1, 2, 3, 4 and 5).

Fig. 1.
figure 1

Graphical representation of results achieved by the Minkowski distance.

Fig. 2.
figure 2

Graphical representation of results achieved by the Mahalanobis distance.

Fig. 3.
figure 3

Graphical representation of results achieved by Cosine distance.

Fig. 4.
figure 4

Graphical representation of results achieved by Chebychev distance.

Fig. 5.
figure 5

Graphical representation of results achieved by Correlation distance.

5 Analysis

All the binders are separated according to their performances at different levels of \( \alpha \) by using the Minkowski, Mahalanobis, Cosine, Correlation and Chebychev distances separately. It is observed from Fig. 6 that binder \( x_{1} \) = flux (B6224) is detached first at \( \alpha = 0.68 \); then polymer-modified binder \( x_{5} \) = Elvaloy No. 1 (B6228) is separated at \( \alpha = 0.77 \), and after that \( x_{2} \) = unmodified base (B6225) is separated at \( \alpha = 0.85 \). Similarly \( x_{16} ,x_{13} , \ldots ,x_{12} \) are clustered successively at increasingly high degrees of \( \alpha \). The desired clusters can also be identified from the clustering tree for a suitable value of \( \alpha \); for example, if \( \alpha = 0.90 \), the clusters are: \( \left\{ {\left\{ {x_{1} } \right\},\left\{ {x_{5} } \right\},\left\{ {x_{2} } \right\},\left\{ {x_{16} } \right\},\left\{ {x_{13} } \right\},\left\{ {x_{3} } \right\},\left\{ {x_{4} ,x_{6} ,x_{7} ,x_{8} ,x_{14} ,x_{15} } \right\},\left\{ {x_{9} ,x_{10} ,x_{11} ,x_{12} ,x_{17} } \right\}} \right\} \).

Fig. 6.
figure 6

Clustering tree by Minkowski distance

The total number of clusters N(c) can be obtained from the clustering tree by the Minkowski metric distance for different values of alpha: if \( \alpha \in \left[ {0,\;0.68} \right] \) then N(c) = 1; if \( \alpha \in \left( {0.68,\;0.77} \right] \) then N(c) = 2; if \( \alpha \in \left( {0.77,\;0.85} \right] \), N(c) = 3; if \( \alpha \in \left( {0.85,\;0.86} \right] \), N(c) = 4; if \( \alpha \in \left( {0.86,\;0.88} \right] \), N(c) = 5; if \( \alpha \in \left( {0.88,\;0.89} \right] \), N(c) = 6; if \( \alpha \in \left( {0.89,\;0.93} \right] \), N(c) = 7; if \( \alpha \in \left( {0.93,\;0.94} \right] \), N(c) = 8; if \( \alpha \in \left( {0.94,\;0.95} \right] \), N(c) = 9; if \( \alpha \in \left( {0.95,\;0.96} \right] \), N(c) = 11; if \( \alpha \in \left( {0.96,\;0.97} \right] \), N(c) = 13; and if \( \alpha \in \left( {0.97,\;1} \right] \), N(c) = 17.

According to the clustering tree by the Mahalanobis metric distance, it is observed from Fig. 7 that binder \( x_{1} \) = flux (B6224) is detached first at \( \alpha = 0.49 \); then polymer-modified binder \( x_{5} \) = Elvaloy No. 1 (B6228) is separated at \( \alpha = 0.54 \), and after that \( x_{16} \) = B6252 is separated at \( \alpha = 0.64 \). Similarly \( x_{15} ,x_{2} , \ldots ,x_{8} \) are clustered successively at increasingly high degrees of \( \alpha \). The desired clusters can also be identified for a suitable value of \( \alpha \). If \( \alpha = 0.90 \), the binders are separated differently compared with the Minkowski and other distances, and the clusters are: \( \left\{ {\left\{ {x_{1} } \right\},\left\{ {x_{5} } \right\},\left\{ {x_{16} } \right\},\left\{ {x_{15} } \right\},\left\{ {x_{2} } \right\},\left\{ {x_{13} } \right\},\left\{ {x_{3} } \right\},\left\{ {x_{10} } \right\},\left\{ {x_{4} ,x_{11} ,x_{14} } \right\},\left\{ {x_{9} ,x_{12} } \right\},\left\{ {x_{6} ,x_{7} ,x_{8} } \right\}} \right\} \).

Fig. 7.
figure 7

Clustering tree by Mahalanobis distance.

The total number of clusters N(c) can be obtained from the clustering tree by the Mahalanobis metric distance for different values of alpha: if \( \alpha \in \left[ {0,\;0.49} \right] \) then N(c) = 1; if \( \alpha \in \left( {0.49,\;0.54} \right] \) then N(c) = 2; if \( \alpha \in \left( {0.54,\;0.61} \right] \), N(c) = 3; if \( \alpha \in \left( {0.61,\;0.73} \right] \), N(c) = 4; if \( \alpha \in \left( {0.73,\;0.75} \right] \), N(c) = 5; if \( \alpha \in \left( {0.75,\;0.84} \right] \), N(c) = 6; if \( \alpha \in \left( {0.84,\;0.85} \right] \), N(c) = 7; if \( \alpha \in \left( {0.85,\;0.86} \right] \), N(c) = 8; if \( \alpha \in \left( {0.86,\;0.87} \right] \), N(c) = 9; if \( \alpha \in \left( {0.87,\;0.88} \right] \), N(c) = 10; if \( \alpha \in \left( {0.88,\;0.90} \right] \), N(c) = 11; if \( \alpha \in \left( {0.90,\;0.91} \right] \), N(c) = 12; if \( \alpha \in \left( {0.91,\;0.93} \right] \), N(c) = 14; if \( \alpha \in \left( {0.93,\;0.94} \right] \), N(c) = 15; if \( \alpha \in \left( {0.94,\;0.96} \right] \), N(c) = 16; and if \( \alpha \in \left( {0.96,\;1} \right] \), N(c) = 17.

According to the clustering tree by the Correlation distance, it is observed from Fig. 8 that binder \( x_{1} \) = flux (B6224) is detached first at \( \alpha = 0.87 \); then polymer-modified binder \( x_{5} \) = Elvaloy No. 1 (B6228) is separated at \( \alpha = 0.96 \). The remaining binders are clustered into two groups only after \( \alpha = 0.99 \). The desired clusters can also be identified from the clustering tree for a suitable value of \( \alpha \); for example, if \( \alpha = 0.90 \), the clusters are: \( \left\{ {\left\{ {x_{1} } \right\},\left\{ {x_{2} ,x_{3} ,x_{4} ,x_{5} ,x_{6} ,x_{7} ,x_{8} ,x_{9} ,x_{10} ,x_{11} ,x_{12} ,x_{13} ,x_{14} ,x_{15} ,x_{16} ,x_{17} } \right\}} \right\} \).

Fig. 8.
figure 8

Clustering tree by Correlation distance.

The total number of clusters N(c) can be obtained from the clustering tree by the Correlation distance for different values of alpha: if \( \alpha \in \left[ {0,\;0.87} \right] \) then N(c) = 1; if \( \alpha \in \left( {0.87,\;0.96} \right] \) then N(c) = 2; if \( \alpha \in \left( {0.96,\;0.99} \right] \), N(c) = 3; and if \( \alpha \in \left( {0.99,\;1} \right] \), N(c) = 4.

According to the clustering tree by the Chebychev distance, it is observed from Fig. 9 that binder \( x_{1} \) = flux (B6224) is detached first at \( \alpha = 0.66 \); then polymer-modified binder \( x_{5} \) = Elvaloy No. 1 (B6228) is separated at \( \alpha = 0.73 \), and after that \( x_{16} \) = polymer-modified Ethylene–Styrene–Interpolymer No. 2 (B6252) is separated at \( \alpha = 0.79 \). Similarly \( x_{2} ,x_{3} , \ldots ,x_{9} \) are clustered successively at increasingly high degrees of \( \alpha \). The binders are separated differently compared with the Mahalanobis, Cosine and Correlation distances, but the separation is quite similar to that of the Minkowski metric distance. The desired clusters can also be identified from the clustering tree for a suitable value of \( \alpha \); for example, if \( \alpha = 0.90 \), the clusters are: \( \left\{ {\left\{ {x_{1} } \right\},\left\{ {x_{5} } \right\},\left\{ {x_{16} } \right\},\left\{ {x_{2} } \right\},\left\{ {x_{3} } \right\},\left\{ {x_{13} } \right\},\left\{ {x_{4} ,x_{6} ,x_{7} ,x_{8} ,x_{9} ,x_{10} ,x_{11} ,x_{12} ,x_{14} ,x_{15} ,x_{17} } \right\}} \right\} \).

Fig. 9.
figure 9

Clustering tree by Chebychev distance.

The total number of clusters N(c) can be obtained from the clustering tree by the Chebychev distance for different values of alpha: if \( \alpha \in \left[ {0,\;0.66} \right] \) then N(c) = 1; if \( \alpha \in \left( {0.66,\;0.73} \right] \) then N(c) = 2; if \( \alpha \in \left( {0.73,\;0.79} \right] \), N(c) = 3; if \( \alpha \in \left( {0.79,\;0.84} \right] \), N(c) = 4; if \( \alpha \in \left( {0.84,\;0.86} \right] \), N(c) = 5; if \( \alpha \in \left( {0.86,\;0.87} \right] \), N(c) = 6; if \( \alpha \in \left( {0.87,\;0.92} \right] \), N(c) = 7; if \( \alpha \in \left( {0.92,\;0.94} \right] \), N(c) = 8; if \( \alpha \in \left( {0.94,\;0.95} \right] \), N(c) = 12; if \( \alpha \in \left( {0.95,\;0.96} \right] \), N(c) = 13; if \( \alpha \in \left( {0.96,\;0.97} \right] \), N(c) = 15; and if \( \alpha \in \left( {0.97,\;1} \right] \), N(c) = 17.

According to the clustering tree by the Cosine distance, it is observed from Fig. 10 that no binder is separated until \( \alpha = 0.91 \); beyond \( \alpha = 0.91 \), the two binders \( x_{5} \) = Elvaloy No. 1 (B6228) and \( x_{16} \) = polymer-modified Ethylene–Styrene–Interpolymer No. 2 (B6252) are detached together as one cluster. Beyond \( \alpha = 0.93 \), binders \( x_{5} ,x_{16} \) and binder \( x_{1} \) = flux (B6224) are detached separately. The remaining binders are clustered successively at increasingly high degrees of \( \alpha \). The desired clusters can also be identified from the clustering tree for a suitable value of \( \alpha \).

Fig. 10.
figure 10

Clustering tree by Cosine distance.

The total number of clusters N(c) can be obtained from the clustering tree for different values of alpha: if \( \alpha \in \left[ {0,\;0.91} \right] \) then N(c) = 1; if \( \alpha \in \left( {0.91,\;0.93} \right] \) then N(c) = 2; if \( \alpha \in \left( {0.93,\;0.98} \right] \), N(c) = 4; if \( \alpha \in \left( {0.98,\;0.99} \right] \), N(c) = 5; if \( \alpha \in \left( {0.99,\;1} \right) \), N(c) = 6; and if \( \alpha = 1 \), N(c) = 9.

The following graph shows the number of clusters achieved by the Mahalanobis, Chebychev, Minkowski \( \left( {w = 2} \right) \), Cosine and Correlation distances with respect to the membership grade. All five distances yield the same number of clusters (N(c) = 1) up to the membership grade \( \alpha = 0.49 \). Beyond \( \alpha = 0.49 \), significant differences appear in the numbers of clusters given by the five distance functions. The Mahalanobis distance produces more clusters than the other four distance functions for every \( \alpha \in \left( {0.49,\;0.97} \right] \). The clustering performance achieved by the Chebychev distance function is somewhat better than that of the Minkowski \( \left( {w = 2} \right) \) distance and substantially finer than those of the Cosine and Correlation distance functions. The Mahalanobis, Chebychev and Minkowski \( \left( {w = 2} \right) \) distances all reach the same number of clusters (N(c) = 17) for every \( \alpha \in \left( {0.97,\;1} \right] \). Overall, the Mahalanobis distance shows greater discriminating power than the Chebychev, Minkowski \( \left( {w = 2} \right) \), Cosine and Correlation distance functions in terms of the desired number of clusters (Fig. 11).

Fig. 11.
figure 11

The comparison of clustering by the Mahalanobis, Chebychev, Minkowski, Cosine and Correlation distances.

6 Conclusion

In this paper, a comparative fuzzy equivalence class clustering of binders based on their performance is proposed. The performances of the binders were graded in terms of high specification temperature using three different parameters. Five distance functions, namely the Minkowski (w = 2), Mahalanobis, Cosine, Chebychev and Correlation distances, are applied in the separation methodology; to our knowledge this is the first attempt in which the Mahalanobis distance function is used for equivalence fuzzy clustering. The fuzzy compatible relation matrices and transitive closures are derived for each distance function. Separate cluster analyses are then carried out for the Minkowski, Mahalanobis, Cosine, Chebychev and Correlation distance functions, and the desired clusters are identified for suitable values of the membership grade. It was observed that the overall clustering performance of the binders by the Mahalanobis distance function is better than that obtained by the Minkowski and other distance functions. The overall performance achieved by the Chebychev distance function is somewhat better than that of the Minkowski \( \left( {w = 2} \right) \) distance and substantially finer than those of the Cosine and Correlation distance functions. The Mahalanobis distance function produces more clusters at most \( \alpha \)-levels, and its separation stages are also considerably better than those of the other distances. Hence, fuzzy cluster analysis with the Mahalanobis distance function can provide an effective tool for separation analysis and strategy formulation.