Abstract
This paper presents the application of a newly developed multi-criteria decision-making (MCDM) method, the Proximity Indexed Value (PIV) method, to the ranking and selection of E-learning websites. PIV is computationally simpler than other MCDM methods such as AHP, VIKOR, COPRAS, WEDBA and WDBA, and it also minimises the rank reversal problem. The applicability and efficacy of the PIV method are demonstrated with the help of two illustrative examples pertaining to the selection of E-learning websites which have already been solved by researchers using different MCDM methods. The results of this study reveal that the ranking of the E-learning websites obtained by the PIV method exactly matched those derived by AHP, VIKOR and COPRAS, while a small difference was observed relative to the rankings given by WEDBA and WDBA. This suggests that the PIV method is a simple, effective and efficient method which can be used to solve different types of problems related to the ranking and selection of alternatives.
Introduction
A new approach to knowledge sharing, known as electronic learning (E-learning), has seen a growing trend among the youth in recent years. This rise can be attributed to rapidly advancing information and communication technology. The basic notion of this approach is to educate and impart knowledge to learners through modern technologies such as the internet. In other words, E-learning is a network-enabled technology that educates and teaches students through means such as the internet and virtual classes. Mahanta and Ahmed (2012) describe it as a web- and computer-based learning technology in which students access study sources and acquire knowledge via digital channels. The growth of E-learning depends on the development and improvement of the quality and effectiveness of knowledge transfer. CDs, DVDs and the internet are the most common electronic means used in the E-learning approach (Covella and Olsina 2002).
E-learning is rapidly gaining ground over the traditional classroom education system. This approach provides learners with quality education accessible anywhere in the world at any time. Some of the advantages of E-learning are as follows: low cost, good education standards, online access, constant improvement of study modules and learning at one's own pace. Research shows that the performance of E-learning depends on many factors such as study modules, user interface and support (Zaman et al. 2012). These aspects are controlled by factors such as pedagogical style, multimedia enhancements, interactivity, use of teaching aids and logical style of presentation. In the absence of the face-to-face mode of teaching and learning, in which the process is facilitated by a teacher/instructor, the E-learning platform plays a crucial role: it should take care not only of the content but also of its style of presentation and ease of learning through modern tools such as graphics, audiovisuals and tables. This style and nature of platform make it easy and interesting for the learner to grasp the content. Educators who develop E-learning materials should be aware of the various modern tools that can be employed in typical practical situations. The developer should use graphical animation and audiovisual components frequently in E-learning website content; these not only break the monotony of heavy text and help hold the user's concentration, but also support the imagination for content that is abstract in nature. Organisations are also increasingly opting for E-learning services rather than hiring trainers, which proves beneficial in many ways: lower expense, better content and reduced physical classroom training.
Today, many world-renowned universities also provide open courseware for students who wish to learn from some of the best faculties. Given the growing popularity of E-learning platforms and the swift rise in the number of available learning websites, choosing the right platform becomes crucial for learners. The selection of the best-performing website in terms of different criteria is discussed in this paper. This paper treats the selection of a suitable E-learning website as an MCDM problem and attempts to solve it using the PIV method.
MCDM literature review
The existing research shows the application of various MCDM techniques to rank E-learning websites based on certain performance criteria, with different researchers considering different criteria. Volery and Lord (2000) assessed websites based on technology and instructors facilitating knowledge transfer, whereas Blanc and Wands (2001) evaluated websites on success factors including organisational, cognitive and general factors. Soong et al. (2001) examined websites considering attributes like infrastructure, technical ability, cooperation, attitude of users and service providers, and other human factors. Govindasamy (2001) listed factors like support for learners, teachers and the e-platform, module design and development, and evaluation methods. Ehlers (2004) evaluated criteria like support to the user, service worth, module division, teaching method and transparency of the platform. Pruengkarn et al. (2005) considered quality parameters like ease of use, efficiency, functional performance, maintenance and access location to evaluate E-learning platforms. Selim (2007) solved the problem considering student and tutor skills, website structure and university appreciation as decision criteria. An evaluation model called HELAM was given by Ozkan and Koseler (2009) to evaluate E-learning platforms taking into consideration the tutors' and students' perspectives, study module standardisation, quality of the user interface and supportive help. Sela and Sivan (2009) suggested the following factors for becoming a successful service provider: incentives, organisation system, easy interface, learning time, compulsory use, need to learn, and support from management and advertising teams, whereas Mosakhani and Jamporazmey (2010) suggested factors like ICT, tutors' and students' skills, course module design and student–teacher interaction. Vukovac et al. (2010) studied two categories of factors extensively, i.e.
general attributes and specific E-learning attributes. FitzPatrick (2012) suggested the following factors for higher education E-learning systems: institution assistance, technological advancement, human factors, assessment method and platform structure. Alias et al. (2012) mentioned factors like supportive attitude, appearance, communication, linked association, utility, effectiveness, layout, information, security and trust to be the qualities most desired by students. Xaymoungkhoun et al. (2012) evaluated E-learning websites using two methods, the analytic hierarchy process (AHP) (Saaty 1980) and Delphi, considering various criteria such as architecture, student and tutor skills, module standard, motivating attitude, environment and support from the institute. Cheawjindakarn et al. (2012) identified critical parameters like instruction pattern, evaluation scheme, management, supportive attitude and the institute's management for a successful online distance program. Oztekin et al. (2013) proposed evaluating the usability of an E-learning platform using machine learning concepts.
Yunus and Salim (2013) proposed an E-learning evaluation model considering parameters like user interface, module quality and teaching method, inspiring attitude, efficiency, academic interaction among students and with teachers, infrastructure, instruction, interactivity and media. Öztürk (2014) used the analytic network process (ANP) to prioritise E-learning platforms, selecting factors like multimedia use, examination style, learner, infrastructure, and administrative and counselling services. Aparicio et al. (2016) considered facilities, interactors and technological influence to propose a theoretical framework for evaluating E-learning platforms. Jain et al. (2016) suggested the use of the WDBA method to rank E-learning platforms by considering factors like security, correct and easy-to-understand modules, complete modules, navigation, personal customisation and system interface. An extensive review of the research carried out by these authors suggests that the evaluation of E-learning websites is an MCDM problem.
Research framework and proposed MCDM method
In this research work, two illustrative examples related to the selection of E-learning websites, which have already been solved by previous researchers, have been selected and solved by the PIV method. Selection of E-learning websites is indeed an MCDM problem, as it comprises several alternatives that are evaluated on the basis of conflicting criteria. The first step in solving an MCDM problem is to select the alternatives and the decision criteria. In this research, the E-learning website alternatives and the decision criteria already selected by previous researchers have been considered (Garg 2017; Garg and Jain 2017). Further, it is also necessary to determine criteria weights to reflect the relative importance of the criteria involved. Several methods, such as AHP, FAHP, entropy, standard deviation, the Best–Worst method and principal component analysis, are available in the literature for determining criteria weights. Our concern in this research is not to calculate criteria weights; therefore, we have simply taken the criteria weights calculated in previous research studies using FAHP. For the ranking and selection of the E-learning websites, a recently developed MCDM method, the PIV method, has been used, as described in the following section. The research framework adopted in this paper is shown in Fig. 1.
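Among the objective weighting schemes mentioned above, the entropy method is easy to sketch in code. The function below is purely illustrative (this paper reuses the FAHP weights from the earlier studies rather than computing its own) and assumes a decision matrix with strictly positive entries; the small matrix in the usage line is invented, not the paper's data.

```python
import numpy as np

def entropy_weights(Y):
    """Objective criteria weights via Shannon entropy: criteria whose values
    vary more across the alternatives receive larger weights.
    Assumes all entries of Y are strictly positive."""
    Y = np.asarray(Y, dtype=float)
    m = Y.shape[0]
    P = Y / Y.sum(axis=0)                         # column-wise proportions
    E = -(P * np.log(P)).sum(axis=0) / np.log(m)  # entropy of each criterion
    div = 1.0 - E                                 # degree of divergence
    return div / div.sum()                        # normalise to sum to 1

# Illustrative 3-alternative, 2-criterion matrix (not the paper's data)
w = entropy_weights([[250, 16], [200, 20], [300, 12]])
print(w)
```

A criterion whose values are identical across all alternatives gets entropy 1 and therefore zero weight, which is the intended behaviour of the scheme.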
Proximity Indexed Value (PIV) method
The PIV method, developed by Mufazzal and Muzakkir (2018), can be used by decision makers to solve a variety of MCDM problems, including the selection of the most suitable E-learning website. The method involves the following simple steps:
Step 1:
Identify the available alternatives \(A_i\) (i = 1, 2, …, m) and the decision criteria \(C_j\) (j = 1, 2, …, n) involved in the decision problem.
Step 2:
Formulate the decision matrix Y by arranging the alternatives in rows and the criteria in columns, as given in Eq. (1):
$$Y = \left[ {Y_{ij} } \right]_{m \times n} = \left[ {\begin{array}{*{20}c} {Y_{11} } & {Y_{12} } & \ldots & {Y_{1j} } & \ldots & {Y_{1n} } \\ {Y_{21} } & {Y_{22} } & \ldots & {Y_{2j} } & \ldots & {Y_{2n} } \\ \vdots & \vdots & {} & \vdots & {} & \vdots \\ {Y_{i1} } & {Y_{i2} } & \ldots & {Y_{ij} } & \ldots & {Y_{in} } \\ \vdots & \vdots & {} & \vdots & {} & \vdots \\ {Y_{m1} } & {Y_{m2} } & \ldots & {Y_{mj} } & \ldots & {Y_{mn} } \\ \end{array} } \right], \quad i = 1, 2, \ldots, m;\; j = 1, 2, \ldots, n$$(1)
where \(Y_{ij}\) represents the performance value of the ith alternative on the jth criterion, m is the number of alternatives and n is the number of criteria.
Step 3:
Determine the normalised decision matrix using Eq. (2):
$$R_{ij} = \frac{{Y_{ij} }}{{\sqrt {\mathop \sum \nolimits_{i = 1}^{m} Y_{ij}^{2} } }},$$(2)
where \(R_{ij}\) is the normalised value of the ith alternative on the jth criterion.
Step 4:
Determine the weighted normalised decision matrix using Eq. (3):
$$v_{ij} = w_{j} \times R_{ij},$$(3)
where \(w_j\) is the weight of the jth criterion.
Step 5:
Evaluate the Weighted Proximity Index (WPI), \(u_{ij}\), using Eq. (4):
$$u_{ij} = \left\{ {\begin{array}{*{20}c} {v_{j}^{\max } - v_{ij} ;} & {{\text{for}}\,{\text{beneficial}}\,{\text{attributes}}} \\ {v_{ij} - v_{j}^{\min } ;} & {{\text{for}}\,{\text{cost}}\,{\text{attributes}}} \\ \end{array} } \right\}$$(4)
where \(v_j^{\max}\) and \(v_j^{\min}\) are, respectively, the largest and smallest weighted normalised values under the jth criterion.
Step 6:
Determine the Overall Proximity Value, \(d_i\), using Eq. (5):
$$d_{i} = \mathop \sum \limits_{j = 1}^{n} u_{ij}.$$(5)
Step 7:
Rank the alternatives based on their \(d_i\) values. The alternative with the least value of \(d_i\) deviates least from the best and is therefore ranked first, followed by the alternatives in order of increasing \(d_i\).
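The steps above translate directly into code. The sketch below is a minimal NumPy implementation of Steps 2–7, assuming numeric performance values and criteria weights that sum to one; the small decision matrix in the usage example is invented for illustration and is not the paper's data.

```python
import numpy as np

def piv_rank(Y, weights, beneficial):
    """Rank alternatives with the Proximity Indexed Value (PIV) method.

    Y          : (m x n) decision matrix, rows = alternatives, columns = criteria
    weights    : length-n criteria weights (assumed to sum to 1)
    beneficial : length-n flags, True for beneficial (higher-is-better) criteria
    Returns (d, ranks): d[i] is the overall proximity value of alternative i,
    ranks[i] its 1-based rank (rank 1 = best, i.e. smallest d).
    """
    Y = np.asarray(Y, dtype=float)
    w = np.asarray(weights, dtype=float)
    b = np.asarray(beneficial, dtype=bool)

    # Step 3 (Eq. 2): vector normalisation, column by column
    R = Y / np.sqrt((Y ** 2).sum(axis=0))

    # Step 4 (Eq. 3): weighted normalised decision matrix
    V = w * R

    # Step 5 (Eq. 4): weighted proximity index -- distance from the best
    # value in each column (max for beneficial, min for cost criteria)
    U = np.where(b, V.max(axis=0) - V, V - V.min(axis=0))

    # Step 6 (Eq. 5): overall proximity value; smaller means closer to the best
    d = U.sum(axis=1)

    # Step 7: rank in ascending order of d
    ranks = d.argsort().argsort() + 1
    return d, ranks

# Illustrative data (not from the paper): 3 alternatives, 2 criteria,
# the first beneficial and the second a cost criterion.
d, ranks = piv_rank([[250, 16], [200, 20], [300, 12]],
                    weights=[0.6, 0.4],
                    beneficial=[True, False])
print(ranks)  # [2 3 1] -> the third alternative is ranked first
```

Note that the third alternative is best on both criteria here, so its weighted proximity index is zero in every column and its overall proximity value is exactly zero.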
Illustrative examples
This section presents two examples pertaining to the selection of E-learning websites to demonstrate the applicability and efficacy of the combined FAHP–PIV approach in solving website selection problems.
Example 1
This example is taken from Garg (2017), in which the author considered the problem of selecting E-learning websites. Table 1 shows the five alternative websites and ten criteria/attributes for this problem. Garg (2017) used FAHP to determine the criteria weights. In this decision problem, functionality (C1), maintainability (C2), portability (C3), reliability (C4), usability (C5) and efficiency (C6) are beneficial criteria for which higher values are preferred, whereas ease of learning community (C7), personalisation (C8), system content (C9) and general factors (C10) are non-beneficial criteria for which lower values are preferred. The beneficial and non-beneficial criteria are indicated with (+) and (−), respectively.
Weights of the criteria calculated by Garg (2017) using FAHP are shown in Table 2. Since our main objective is to demonstrate the applicability of the PIV method rather than to calculate criteria weights, we used the criteria weights obtained by Garg (2017) when ranking the alternatives with the PIV method.
Normalised decision matrix, as shown in Table 3, was obtained using Eq. (2).
Using criteria weights (Table 2), weighted normalised decision matrix was obtained using Eq. (3) and it is shown in Table 4.
The weighted proximity index (\(u_{ij}\)) and the overall proximity value (\(d_i\)) of all the alternatives were calculated using Eqs. (4) and (5), respectively, as shown in Table 5. Based on the \(d_i\) values, the alternatives were ranked such that the alternative with the least \(d_i\) is ranked first, followed by the alternatives in order of increasing \(d_i\). The ranking of the alternatives is also shown in Table 5.
It is evident from Table 5 that the ranking order of the e-learning websites is CPW-5 > CPW-1 > CPW-3 > CPW-4 > CPW-2. Table 6 shows the comparison of ranking of all the five websites obtained by different MCDM methods.
It is evident from Table 6 that the PIV method gives exactly the same ranking as AHP and COPRAS. However, there is a small difference between the rankings given by the PIV and WEDBA methods. The Spearman's rank correlation coefficient (r) values between the rankings of the websites obtained by the different methods are shown in Table 7, which reveals almost the same performance for all four MCDM methods.
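Agreement between two rankings, as reported in Table 7, is typically quantified with Spearman's rank correlation coefficient. A minimal sketch follows, using the standard no-ties formula r = 1 − 6Σd²/(n(n² − 1)); the two rankings below are invented for illustration and are not the paper's data.

```python
def spearman_r(rank_a, rank_b):
    """Spearman's rank correlation for two rankings without ties:
    r = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1))."""
    n = len(rank_a)
    d2 = sum((a - b) ** 2 for a, b in zip(rank_a, rank_b))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical rankings of five websites by two methods (illustrative only):
# the methods agree except that they swap the 3rd- and 4th-placed sites.
piv_ranking   = [2, 5, 3, 4, 1]
other_ranking = [2, 5, 4, 3, 1]
print(spearman_r(piv_ranking, other_ranking))  # 0.9
```

An r close to 1 indicates near-identical rankings, which is why values like these are read as "almost the same performance" across methods.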
Example 2
This example is taken from Garg and Jain (2017), in which the authors considered the problem of selecting E-learning websites. Table 8 shows the eight alternative websites and ten criteria/attributes for this problem. The authors used FAHP to determine the criteria weights. In this decision problem, functionality (C1), maintainability (C2), portability (C3), reliability (C4), usability (C5) and efficiency (C6) are beneficial criteria for which higher values are preferred, whereas ease of learning community (C7), personalisation (C8), system content (C9) and general factors (C10) are non-beneficial criteria for which lower values are preferred. The beneficial and non-beneficial criteria are indicated with (+) and (−), respectively.
Weights of the criteria shown in Table 2 were used for ranking the alternatives with the PIV method. The normalised decision matrix, shown in Table 9, was obtained using Eq. (2).
Using criteria weights (Table 2), weighted normalised decision matrix was obtained using Eq. (3) and it is shown in Table 10.
The weighted proximity index (\(u_{ij}\)) and the overall proximity value (\(d_i\)) of all the alternatives were calculated using Eqs. (4) and (5), respectively, as shown in Table 11. Based on the \(d_i\) values, the alternatives were ranked such that the alternative with the least \(d_i\) is ranked first, followed by the alternatives in order of increasing \(d_i\). The ranking of the alternatives is also shown in Table 11.
Table 11 reveals the ranking order of the e-learning websites as CPW-5 > CPW-7 > CPW-1 > CPW-3 > CPW-6 > CPW-4 > CPW-8 > CPW-2. Table 12 shows the comparison of ranking of all eight websites obtained by different MCDM methods.
It is evident from Table 12 that the ranking of the E-learning websites given by the PIV method exactly matches that of COPRAS and VIKOR. However, there is a small difference between the rankings given by the PIV and WDBA methods. The Spearman's rank correlation coefficient (r) values between the rankings of the websites obtained by the different methods are shown in Table 13, which reveals almost the same performance for all four MCDM methods.
Conclusions, limitations and future research directions
The main objective of this paper was to demonstrate the applicability and effectiveness of a newly developed MCDM method, the Proximity Indexed Value (PIV) method, for the selection of the best 'C' programming language E-learning website from the existing ones. The PIV method was applied to two problems related to the selection of E-learning websites which had been solved by researchers using relatively more complex methods. In the first problem, five E-learning websites were considered and ranked by the PIV method; the ranking order was found to be CPW-5 > CPW-1 > CPW-3 > CPW-4 > CPW-2. Similarly, in the second problem, eight E-learning websites were considered, and their ranking order using the PIV method was found to be CPW-5 > CPW-7 > CPW-1 > CPW-3 > CPW-6 > CPW-4 > CPW-8 > CPW-2. For both problems, the ranking of the E-learning websites obtained by the PIV method was compared with those derived by other methods such as AHP, VIKOR, COPRAS, WEDBA and WDBA; the ranking given by the PIV method exactly matched those given by the other methods except WEDBA and WDBA, for which a small difference was observed. The description of the PIV method in Sect. "Proximity Indexed Value (PIV) method" reveals that the method comprises relatively simple computational steps compared to other MCDM methods, and that it minimises the rank reversal problem (Mufazzal and Muzakkir 2018), a major issue associated with MCDM methods. Hence, the PIV method, being a very simple MCDM method, may be used by decision makers to solve a variety of decision-making problems, including the selection of the most suitable E-learning websites.
Although the PIV method minimises the rank reversal problem compared to previously well-established techniques, it does not conclusively eliminate the issue. The reason is that it employs a normalisation process, which indirectly affects the relative ranking of alternatives. This means that when more than two alternatives are compared at a time, the relative ranking of two alternatives can be affected by the presence of other, irrelevant alternatives because of normalisation. Two remedies follow: (i) make the process normalisation-free to eliminate the reversal problem, or (ii) reduce the influence of normalisation to mitigate the issue. The latter approach is the one adopted in the PIV method, and thus the problem is only reduced, not removed. This is important because rank reversal gives an indication of ranking reliability.
Further, this method provides only a ranking procedure and does not address the determination of criteria weights. Hence, it could be extended by combining both tasks to develop a complete decision-making framework.
References
Alias, N., Zakariah, Z., Ismail, N. Z., & Aziz, M. N. A. (2012). E-learning successful elements for higher learning institution in Malaysia. Procedia -Social and Behavioral Sciences, 67, 484–489.
Aparicio, M., Bacao, F., & Oliveira, T. (2016). An e-learning theoretical framework. Journal of Educational Technology & Society, 19(1), 292–307.
Cheawjindakarn, B., Suwannatthachote, P., & Theeraroungchaisri, A. (2012). Critical success factors for online distance learning in higher education: a review of the literature. Creative Education, 3(8), 61.
Covella, G. J., & Olsina Santos, L. A. (2002). Specifying quality characteristics and attributes for E-learning sites. In IV Workshop de Investigadores en Ciencias de la Computación.
Ehlers, U. D. (2004). Quality in e-learning from a learner’s perspective. European Journal of Open, Distance and E-learning, 101, 1–7.
FitzPatrick, T. (2012). Key success factors of eLearning in education: A professional development model to evaluate and support eLearning. Online Submission.
Garg, R. (2017). Optimal selection of E-learning websites using multi attribute decision-making approaches. Journal of Multi-Criteria Decision Analysis, 24(3–4), 187–196.
Garg, R., & Jain, D. (2017). Fuzzy multi-attribute decision making evaluation of e-learning websites using FAHP, COPRAS, VIKOR. WDBA. Decision Science Letters, 6(4), 351–364.
Govindasamy, T. (2001). Successful implementation of e-learning: Pedagogical considerations. The Internet and Higher Education, 4(3–4), 287–299.
Jain, D., Garg, R., Bansal, A., & Saini, K. K. (2016). Selection and ranking of E-learning websites using weighted distance-based approximation. Journal of Computers in Education, 3(2), 193–207.
Le Blanc, A., & Wands, M. (2001). Critical success factors: E-learning solutions cappuccino. The Official E-Newsletter of the Change and Learning Practice, 2.
Mahanta, D., & Ahmed, M. (2012). E-learning objectives, methodologies, tools and its limitation. International Journal of Innovative Technology and Exploring Engineering, 2, 46–51.
Mosakhani, M., & Jamporazmey, M. (2010). Introduce critical success factors (CSFs) of e-learning for evaluating e-learning implementation success. In Proceedings of the 2010 International Conference on Educational and Information Technology (ICEIT) (Vol. 1, pp. V1-224). IEEE.
Mufazzal, S., & Muzakkir, S. M. (2018). A new multi-criterion decision making (MCDM) method based on proximity indexed value for minimizing rank reversals. Computers & Industrial Engineering, 119, 427–438.
Ozkan, S., & Koseler, R. (2009). Multi-dimensional evaluation of E-learning systems in the higher education context: An empirical investigation of a computer literacy course. In Proceedings of the 39th IEEE Frontiers in Education Conference (FIE'09) (pp. 1–6). IEEE.
Oztekin, A., Delen, D., Turkyilmaz, A., & Zaim, S. (2013). A machine learning-based usability evaluation method for eLearning systems. Decision Support Systems, 56, 63–73.
Öztürk, Z. K. (2014). Using a multi criteria decision making approach for Open and distance learning system selection. Anadolu University Journal of Science and Technology, 15(1), 1–14.
Pruengkarn, R., Praneetpolgrang, P., & Srivihok, A. (2005). An evaluation model for e-learning websites in Thailand universities. In Proceedings of the Fifth IEEE International Conference on Advanced Learning Technologies (ICALT 2005) (pp. 161–162). IEEE.
Saaty, T. L. (1980). The analytic hierarchy process. New York: McGraw-Hill.
Sela, E., & Sivan, Y. Y. (2009). Enterprise e-learning success factors: An analysis of practitioners’ perspective (with a downturn addendum). Interdisciplinary Journal of E-Learning and Learning Objects, 5(1), 335–343.
Selim, H. M. (2007). Critical success factors for E-learning acceptance: Confirmatory factor models. Computers & Education, 49(2), 396–413.
Soong, M. B., Chan, H. C., Chua, B. C., & Loh, K. F. (2001). Critical success factors for on-line course resources. Computers & Education, 36(2), 101–120.
Volery, T., & Lord, D. (2000). Critical success factors in online education. International Journal of Educational Management, 14(5), 216–223.
Vukovac, D. P., Kirinic, V., & Klicek, B. (2010). A comparison of usability evaluation methods for e-learning systems (pp. 271–289). Vienna: DAAAM International Scientific Book.
Xaymoungkhoun, O., Bhuasiri, W., Rho, J. J., Zo, H., & Kim, M. G. (2012). The critical success factors of e-learning in developing countries. Korea, 305, 701.
Yunus, Y., & Salim, J. (2013). E-learning evaluation in Malaysian public sector from the pedagogical perspective: Towards e-learning effectiveness. Journal of Theoretical & Applied Information Technology, 51(2), 201–210.
Zaman, W., Ghosh, P., Datta, K., & Basu, P. N. (2012). A framework to incorporate quality aspects for e-learning system in a consortium environment. International Journal of Information and Education Technology, 2(2), 159.
Cite this article
Khan, N.Z., Ansari, T.S.A., Siddiquee, A.N. et al. Selection of E-learning websites using a novel Proximity Indexed Value (PIV) MCDM method. J. Comput. Educ. 6, 241–256 (2019). https://doi.org/10.1007/s40692-019-00135-7