Introduction

Electronic learning (E-learning), a new approach to knowledge sharing, has seen growing adoption among young learners in recent years. This rise can be attributed to rapidly advancing information and communication technology. The basic notion of this approach is to educate and impart knowledge to learners through modern technologies such as the internet. In other words, E-learning is a network-enabled technology that educates students through means such as the internet and virtual classes. Mahanta and Ahmed (2012) describe web- and computer-based learning technology in which students access study sources and acquire knowledge through digital channels. The growth of E-learning depends on improvements in the quality and effectiveness of knowledge transfer. CDs, DVDs and the internet are the most common electronic means used in the E-learning approach (Covella and Olsina 2002).

Current conditions show that E-learning is quickly gaining ground over the traditional classroom education system. This approach provides learners with quality education accessible anywhere in the world at any time. Its advantages include low cost, good educational standards, online access, constantly improving study modules and learning at one's own pace. Research shows that the performance of E-learning depends on many factors, such as study modules, user interface and support (Zaman et al. 2012). These aspects are governed by pedagogical styles, multimedia enhancements, the element of interactivity, the use of teaching aids and a logical style of presentation.

In the absence of face-to-face teaching and learning facilitated by a teacher or instructor, the E-learning platform plays a crucial role: it must take care not only of the content but also of its style of presentation and of ease of learning through modern tools such as graphics, audiovisuals and tables. Such a platform makes it easy and interesting for the learner to grasp the content. Educators who develop E-learning materials should be aware of the various modern tools that can be employed in typical practical situations. Developers should use graphical animation and audiovisual components frequently in E-learning websites; these break the monotony of heavy text content, help sustain the user's concentration and support the imagination where the content is abstract in nature.

Organisations are also increasingly opting for E-learning services rather than hiring trainers, which proves beneficial in many ways, such as lower cost, better content and reduced physical classroom training. Today, many world-renowned universities also provide open courseware for students who wish to learn from some of the best faculties. Given the growing popularity of E-learning platforms and the swift rise in the number of learning websites available, choosing the right platform becomes crucial for learners. This paper therefore treats the selection of a suitable E-learning website, evaluated on different performance criteria, as a multi-criteria decision-making (MCDM) problem and attempts to solve it using the PIV method.

MCDM literature review

The existing research shows the application of various MCDM techniques to rank E-learning websites based on certain performance criteria, with different researchers considering different criteria. Volery and Lord (2000) assessed websites based on technology and instructors facilitating knowledge transfer, whereas Blanc and Wands (2001) evaluated websites on success factors that include organisational, cognitive and general factors. Soong et al. (2001) examined websites considering attributes like infrastructure, technical ability, cooperation, attitude of users and service providers and other human factors. Govindasamy (2001) listed factors like support of learners, teachers and the e-platform, module design and development, and evaluation methods. Ehlers (2004) evaluated criteria like support to the user, service worth, module division, teaching method and transparency of the platform. Pruengkarn et al. (2005) considered quality parameters like ease of use, efficiency, functional performance, maintenance and access location to evaluate E-learning platforms. Selim (2007) solved the problem considering student and tutor skills, website structure and university appreciation as decision criteria. An evaluation model called HELAM was given by Ozkan and Koseler (2009) to evaluate E-learning platforms taking into consideration tutors' and students' perspectives, study module standardisation, quality of user interface and supportive help. Sela and Sivan (2009) suggested the following factors for becoming a successful service provider: incentives, organisation system, easy interface, learning time, compulsory use, need to learn, and support from management and advertising teams, whereas Mosakhani and Jamporazmey (2010) suggested factors like ICT, tutors' and students' skills, course module design and student–teacher interaction. Vukovac et al. (2010) studied two categories of factors extensively, i.e. general attributes and specific E-learning attributes. FitzPatrick (2012) suggested the following factors for higher education E-learning systems: institution assistance, technological advancement, human factor, assessment method and platform structure. Alias et al. (2012) mentioned factors like supportive attitude, appearance, communication, linked association, utility, effectiveness, layout, information, security and trust as the qualities most desired by students. XaymoungKhoun et al. (2012) evaluated E-learning websites using two methods, the analytic hierarchy process (AHP) (Saaty 1980) and Delphi, considering various criteria like architecture, student and tutor skills, module standard, motivating attitude, environment and support from the institute. Cheawjindakarn et al. (2012) identified critical parameters like instruction pattern, evaluation scheme, management, supportive attitude and institute's management for a successful online distance program. Oztekin et al. (2013) proposed to evaluate the usability of an E-learning platform using machine learning concepts.

Yunus and Salim (2013) proposed an E-learning evaluation model considering parameters like user interface, module quality and teaching method, inspiring attitude, efficiency, academic interaction among students and with teachers, infrastructure, instruction, interactivity and media. Öztürk (2014) used the analytic network process (ANP) to prioritise E-learning platforms by selecting factors like multimedia use, examination style, learner, infrastructure, and administrative and counselling services. Aparicio et al. (2016) considered facilities, interactors and technological influence to propose a theoretical framework for evaluating E-learning platforms. Jain et al. (2016) suggested the use of the WDBA method to rank E-learning platforms by considering factors like security, correct and easy-to-understand modules, complete modules, navigation, personal customisation and system interface. An extensive study of the research carried out by these authors suggests that the evaluation of E-learning websites is an MCDM problem.

Research framework and proposed MCDM method

In this research work, two illustrative examples related to the selection of E-learning websites, which have already been solved by previous researchers, have been selected and solved by the PIV method. Selection of E-learning websites is indeed an MCDM problem, as it comprises several alternatives that are evaluated on the basis of conflicting criteria. The first step in solving an MCDM problem is to select the alternatives and decision criteria. In this research, the E-learning website alternatives and the decision criteria already selected by previous researchers have been considered (Garg 2017; Garg and Jain 2017). Further, it is also necessary to determine criteria weights to reflect the relative importance of the involved criteria. Several methods, such as AHP, FAHP, entropy, standard deviation, the Best–Worst method and principal component analysis, are available in the literature for determining criteria weights. Since calculating the criteria weights is not the focus of this research, we have adopted the criteria weights calculated in the previous studies using FAHP. For ranking and selection of the E-learning websites, a recently developed MCDM method, the PIV method, has been used; it is described in the following section. The research framework adopted in this paper is shown in Fig. 1.

Fig. 1 Research framework

Proximity Indexed Value (PIV) method

The PIV method, developed by Mufazzal and Muzakkir (2018), can be used by decision makers to solve a variety of MCDM problems, including the selection of the most suitable E-learning website. The method involves the following simple steps (an illustrative implementation sketch is given after the list):

  1. Step 1:

    Identify the available alternatives Ai (i = 1, 2, …, m) and decision criteria Cj (j = 1, 2, …, n) involved in the decision problem.

  2. Step 2:

    Formulate the decision matrix Y by arranging alternatives in rows and criteria in columns as given in Eq. (1)

    $$Y = \left[ {Y_{ij} } \right]_{m \times n} = \left[ {\begin{array}{*{20}c} {Y_{11} } & {Y_{12} } & \ldots & {Y_{1j} } & \ldots & {Y_{1n} } \\ {Y_{21} } & {Y_{22} } & \ldots & \ldots & \ldots & {Y_{2n} } \\ \ldots & \ldots & \ldots & \ldots & \ldots & \ldots \\ {Y_{i1} } & \ldots & \ldots & {Y_{ij} } & \ldots & {Y_{in} } \\ \ldots & \ldots & \ldots & \ldots & \ldots & \ldots \\ {Y_{m1} } & \ldots & \ldots & {Y_{mj} } & \ldots & {Y_{mn} } \\ \end{array} } \right]$$
    $${\text{where}}\, i = 1, 2, \ldots , m;\,\,j = 1, 2, \ldots , n$$
    (1)

    where Yij represents ith alternative performance value on jth criterion, m is the number of alternatives and n is the number of criteria.

  3. Step 3:

    Determine the normalised decision matrix using Eq. (2)

    $$R_{ij} = \frac{{Y_{ij} }}{{\sqrt {\mathop \sum \nolimits_{i = 1}^{m} Y_{ij}^{2} } }},$$
    (2)

    where Rij is the normalised performance value of the ith alternative on the jth criterion.

  4. Step 4:

    Determine the weighted normalised decision matrix using Eq. (3)

    $$v_{ij} = w_{j} \times R_{ij},$$
    (3)

    where wj is the weight of the jth criterion.

  5. Step 5:

    Evaluate the Weighted Proximity Index (WPI), \(u_{ij}\), using Eq. (4)

    $$u_{ij} = \left\{ {\begin{array}{*{20}c} {v_{j}^{{{\text{max}}}} - v_{ij} ;} & {{\text{for}}\,{\text{beneficial}}\,{\text{attributes}}} \\ {v_{ij} - v_{j}^{{{\text{min}}}} ;} & {{\text{for}}\,{\text{cost}}\,{\text{attributes}}} \\ \end{array} } \right\}.$$
    (4)

    where \(v_{j}^{{{\text{max}}}}\) and \(v_{j}^{{{\text{min}}}}\) are the maximum and minimum weighted normalised values under the jth criterion.
  6. Step 6:

    Determine the Overall Proximity Value, \(d_{i}\) using Eq. (5)

    $$d_{i} = \mathop \sum \limits_{j = 1}^{n} u_{ij}.$$
    (5)
  7. Step 7:

    Rank the alternatives based on \(d_{i}\) values. The alternative with the least value of \(d_{i}\) represents the minimum deviation from the best and is therefore ranked first, followed by the alternatives with increasing \(d_{i}\).
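As referenced above, the sketch below implements Steps 2–7 in Python. It is a minimal illustration, not the original authors' implementation: it assumes the decision matrix, criteria weights and criteria-type indicators are supplied as NumPy arrays, and the function name `piv_rank` and its interface are chosen here purely for illustration.

```python
import numpy as np

def piv_rank(Y, w, beneficial):
    """Rank alternatives with the PIV method.

    Y          : (m x n) decision matrix, alternatives in rows, criteria in columns
    w          : (n,) vector of criteria weights
    beneficial : (n,) boolean vector, True for beneficial criteria, False for cost criteria
    """
    Y = np.asarray(Y, dtype=float)
    w = np.asarray(w, dtype=float)
    beneficial = np.asarray(beneficial, dtype=bool)

    # Step 3: vector normalisation of each criterion column (Eq. 2)
    R = Y / np.sqrt((Y ** 2).sum(axis=0))

    # Step 4: weighted normalised decision matrix (Eq. 3)
    v = w * R

    # Step 5: weighted proximity index (Eq. 4)
    u = np.where(beneficial, v.max(axis=0) - v, v - v.min(axis=0))

    # Step 6: overall proximity value (Eq. 5)
    d = u.sum(axis=1)

    # Step 7: the alternative with the smallest d gets rank 1
    ranks = d.argsort().argsort() + 1
    return d, ranks
```

A single call returns the overall proximity values together with the corresponding ranks, so the tabulated results in the examples that follow can be reproduced directly once the decision matrices and criteria weights are supplied.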

Illustrative examples

This section presents two examples pertaining to the selection of E-learning websites to demonstrate the applicability and efficacy of the combined FAHP–PIV approach in solving website selection problems.

Example 1

This example is taken from Garg (2017), in which the problem of selecting E-learning websites was considered. Table 1 shows the five alternative websites and ten criteria/attributes for this problem. Garg (2017) used FAHP for the determination of the criteria weights. In this decision problem, functionality (C1), maintainability (C2), portability (C3), reliability (C4), usability (C5) and efficiency (C6) are beneficial criteria for which high values are required, whereas ease of learning community (C7), personalisation (C8), system content (C9) and general factors (C10) are non-beneficial criteria for which lower values are preferred. The beneficial criteria and non-beneficial criteria are indicated with (+) and (−), respectively.

Table 1 Decision matrix for Example 1 (Garg 2017)

Weights of the criteria calculated by Garg (2017) using FAHP are shown in Table 2. Since our main objective is to demonstrate the applicability of the PIV method, not to calculate the criteria weights, we used the criteria weights obtained by Garg (2017) for ranking the alternatives using the PIV method.

Table 2 Criteria weights for Example 1 (Garg 2017)

The normalised decision matrix, shown in Table 3, was obtained using Eq. (2).

Table 3 Normalised decision matrix

Using the criteria weights (Table 2), the weighted normalised decision matrix was obtained using Eq. (3); it is shown in Table 4.

Table 4 Weighted normalised decision matrix

The weighted proximity indices (uij) and the overall proximity values (di) of all the alternatives were calculated using Eqs. (4) and (5), respectively, as shown in Table 5. Based on the values of di, the alternatives were ranked such that the alternative with the least value of di is ranked first, followed by the alternatives with increasing values of di. The ranking of alternatives is also shown in Table 5.
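These calculations can be reproduced with the `piv_rank` sketch given in the PIV method section. The snippet below only illustrates the call structure for this example; the matrix values and weights shown are hypothetical placeholders, since the actual data are those reported in Tables 1 and 2 (Garg 2017).

```python
import numpy as np
# piv_rank is the illustrative function defined in the PIV method section above

# Criteria types from Example 1: C1-C6 beneficial (+), C7-C10 non-beneficial (-)
beneficial = np.array([True] * 6 + [False] * 4)

# Hypothetical placeholder scores for the 5 websites on the 10 criteria;
# in the actual study these values come from Table 1.
Y = np.random.default_rng(0).uniform(1, 9, size=(5, 10))

# Hypothetical equal weights; the study uses the FAHP weights of Table 2.
w = np.full(10, 0.1)

d, ranks = piv_rank(Y, w, beneficial)
print("Overall proximity values:", np.round(d, 4))
print("Ranks (1 = best):", ranks)
```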

Table 5 Weighted proximity index, overall proximity index and ranking results

It is evident from Table 5 that the ranking order of the e-learning websites is CPW-5 > CPW-1 > CPW-3 > CPW-4 > CPW-2. Table 6 shows the comparison of ranking of all the five websites obtained by different MCDM methods.

Table 6 Ranking results of the five e-learning websites obtained by different MCDM methods

It is evident from Table 6 that the PIV method gives exactly the same ranking as AHP and COPRAS. However, there is a small difference between the rankings given by the PIV and WEDBA methods. The Spearman's correlation coefficient (r) values between the rankings of the websites obtained by the different methods are shown in Table 7, which reveals that all four MCDM methods perform almost identically.
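The correlation values of Table 7 can be verified with the standard Spearman formula for rankings without ties, r = 1 − 6 Σd² / (n(n² − 1)). The sketch below applies it to the PIV ranks reported above for CPW-1 to CPW-5 and to a hypothetical comparison ranking used purely for illustration; the actual rankings of the other methods are those listed in Table 6.

```python
import numpy as np

def spearman_rho(rank_a, rank_b):
    """Spearman's rank correlation for two rankings without ties."""
    rank_a = np.asarray(rank_a, dtype=float)
    rank_b = np.asarray(rank_b, dtype=float)
    n = len(rank_a)
    d = rank_a - rank_b
    return 1.0 - 6.0 * np.sum(d ** 2) / (n * (n ** 2 - 1))

# PIV ranks of CPW-1 ... CPW-5, from the ranking order CPW-5 > CPW-1 > CPW-3 > CPW-4 > CPW-2
piv_ranks = [2, 5, 3, 4, 1]
# Hypothetical comparison ranking that swaps two positions, for illustration only
other_ranks = [3, 5, 2, 4, 1]

print(spearman_rho(piv_ranks, other_ranks))  # 0.9
```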

Table 7 Spearman’s correlation coefficient (r) values

Example 2

This example is taken from Garg and Jain (2017), in which the authors considered the problem of selecting E-learning websites. Table 8 shows the eight alternative websites and ten criteria/attributes for this problem. The authors used FAHP for determination of the criteria weights. In this decision problem, functionality (C1), maintainability (C2), portability (C3), reliability (C4), usability (C5) and efficiency (C6) are beneficial criteria for which high values are required, whereas ease of learning community (C7), personalisation (C8), system content (C9) and general factors (C10) are non-beneficial criteria for which lower values are preferred. The beneficial criteria and non-beneficial criteria are indicated with (+) and (−), respectively.

Table 8 Decision matrix for Example 2 (Garg and Jain 2017)

The criteria weights shown in Table 2 were used for ranking the alternatives with the PIV method. The normalised decision matrix, shown in Table 9, was obtained using Eq. (2).

Table 9 Normalised decision matrix

Using the criteria weights (Table 2), the weighted normalised decision matrix was obtained using Eq. (3); it is shown in Table 10.

Table 10 Weighted normalised decision matrix

The weighted proximity indices (uij) and the overall proximity values (di) of all the alternatives were calculated using Eqs. (4) and (5), respectively, as shown in Table 11. Based on the values of di, the alternatives were ranked such that the alternative with the least value of di is ranked first, followed by the alternatives with increasing values of di. The ranking of alternatives is also shown in Table 11.

Table 11 Weighted proximity index, overall proximity index and ranking results

Table 11 reveals the ranking order of the e-learning websites as CPW-5 > CPW-7 > CPW-1 > CPW-3 > CPW-6 > CPW-4 > CPW-8 > CPW-2. Table 12 shows the comparison of the rankings of all eight websites obtained by different MCDM methods.

Table 12 Ranking results of the eight e-learning websites obtained by different MCDM methods

It is evident from Table 12 that the ranking of the E-learning websites given by the PIV method exactly matches those of COPRAS and VIKOR. However, there is a small difference between the rankings given by the PIV and WDBA methods. The Spearman's correlation coefficient (r) values between the rankings of the websites obtained by the different methods are shown in Table 13, which reveals that all four MCDM methods perform almost identically.

Table 13 Spearman’s correlation coefficient (r) values

Conclusions, limitations and future research directions

The main objective of this paper was to demonstrate the applicability and effectiveness of a newly developed MCDM method, the Proximity Indexed Value (PIV) method, for the selection of the best ‘C’ programming language E-learning website from the existing ones. The PIV method was applied to two problems related to the selection of E-learning websites that had previously been solved by researchers using relatively more complex methods. In the first problem, five E-learning websites were considered and ranked by the PIV method; the ranking order was found to be CPW-5 > CPW-1 > CPW-3 > CPW-4 > CPW-2. Similarly, in the second problem, eight E-learning websites were considered and their ranking order using the PIV method was found to be CPW-5 > CPW-7 > CPW-1 > CPW-3 > CPW-6 > CPW-4 > CPW-8 > CPW-2. For both problems, the ranking of the E-learning websites obtained by the PIV method was compared with those derived by other methods such as AHP, VIKOR, COPRAS, WEDBA and WDBA, and it was found that the ranking given by the PIV method exactly matched those given by the other methods except WEDBA and WDBA, where a small difference was observed. The description of the PIV method given in Sect. “Proximity Indexed Value (PIV) method” of this paper reveals that this method comprises relatively simple computational steps compared to other MCDM methods and also minimises rank reversal problems (Mufazzal and Muzakkir 2018), which is a major issue associated with MCDM methods. Hence, PIV, being a very simple MCDM method, may be used by decision makers for solving a variety of MCDM problems, including the selection of the most suitable E-learning websites.

Although the PIV method minimises the rank reversal problem compared to previously well-established techniques, it does not conclusively eliminate the issue. The reason is that it employs a normalisation process, which indirectly couples the alternatives: when more than two alternatives are compared at a time, the relative ranking of any two of them can be affected by the presence of other, irrelevant alternatives, because adding or removing an alternative changes the normalisation denominators. Hence, two directions follow: (i) make the method normalisation-free to eliminate the reversal problem, or (ii) reduce the influence of normalisation to mitigate the issue. The latter approach is adopted in the PIV method, and thus the problem is only reduced, not removed. This matters because susceptibility to rank reversal is an indicator of ranking reliability.
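To see why the normalisation couples the alternatives, the short sketch below (using hypothetical scores) normalises one criterion column before and after an alternative is dropped. The normalised scores of the remaining alternatives change because the denominator changes, which is the mechanism through which rank reversal can still arise in normalisation-based methods.

```python
import numpy as np

col = np.array([7.0, 5.0, 3.0, 6.0])   # hypothetical scores of four alternatives on one criterion

# Vector normalisation (Eq. 2) with all four alternatives present
print(col / np.sqrt((col ** 2).sum()))               # [0.6417 0.4583 0.275  0.55  ]

# Drop the last alternative and normalise again
col_reduced = col[:-1]
print(col_reduced / np.sqrt((col_reduced ** 2).sum()))  # [0.7683 0.5488 0.3293]

# The remaining normalised values grow because the denominator shrinks, and this change
# differs from one criterion column to another, so the weighted proximity indices,
# and potentially the final ranking, can shift when an alternative is added or removed.
```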

Further, this method only provides a ranking procedure and does not address the determination of criteria weights. Hence, it could be extended by combining both tasks to develop a complete framework for decision making.