
1 Introduction

The pedagogical and visual design of MOOCs, together with their information architecture, usability and interaction design, can have a negative impact on learners’ engagement [1]. For disabled learners in particular, accessibility barriers can affect the learning experience; these barriers lie not only in access to the technology but also in the way educational resources are pedagogically designed.

A study from Blackboard [2], which assessed the overall accessibility of content in online courses over the five-year period from 2012 to 2017, found that progress in making educational resources accessible has been slow, describing such materials as having become “only slightly more accessible”. The study showed the value of an automated process in helping to quantify the issues that need to be addressed, and it supports the need for processes that make MOOCs accessible to disabled learners.

Rodrigo and Iniesto [3] also argue for the need to provide a holistic vision for creating accessible MOOCs. As part of a research programme at The Open University (UK), interviews were carried out with MOOC providers and learners [4]; these showed that the issues extend beyond the technical considerations typically covered in accessibility testing and compliance. In this paper, several accessibility evaluation methods are brought together into an accessibility audit for evaluating MOOCs, providing indicators of the accessibility barriers and proposing processes to address them.

2 MOOC Accessibility Audit

The methodology of the audit combines existing or adapted methods from four main evaluation areas to provide four checklists that can be applied in a heuristic evaluation approach. The selection of these components brings together different aspects of accessibility to provide a holistic approach, evaluating not only technical aspects of accessibility but also the experience of learners [5], the quality of the educational resources produced and their pedagogical design. The four components are listed below (an illustrative sketch of the checklist structure follows the list):

  1. Technical Accessibility evaluation. Conformance to guidelines and standards through WCAG, with additional analysis of the text-based files [6].

  2. User experience (UX) evaluation. Evaluation of usability and user experience characteristics of the user interface design and pedagogical design with cognitive and UX walkthroughs [7].

  3. Quality evaluation. Assessing the properties of MOOCs, the quality of the design, the platform and the support for learners, adapting an approach from OpenupEd [8].

  4. Learning design evaluation. Evaluation of the learning design characteristics within MOOCs through Universal Design for Learning (UDL) [9].
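
The four components are operationalised as checklists of criteria that experts score during the heuristic evaluation. The following is a minimal structural sketch only; the criterion wording and the 0–2 scoring scale are hypothetical placeholders rather than the audit’s actual items.

```python
from dataclasses import dataclass, field

@dataclass
class Criterion:
    """A single checklist item scored by an expert evaluator."""
    text: str
    score: int | None = None  # illustrative scale: 0 = not met, 1 = partially met, 2 = met

@dataclass
class Checklist:
    """One of the four audit components and its criteria."""
    component: str
    criteria: list[Criterion] = field(default_factory=list)

    def completion(self) -> float:
        """Share of the maximum possible score achieved on the scored criteria."""
        scored = [c for c in self.criteria if c.score is not None]
        return sum(c.score for c in scored) / (2 * len(scored)) if scored else 0.0

# The four components of the audit; the criterion wording is a placeholder.
audit = [
    Checklist("Technical accessibility", [Criterion("Video content provides captions")]),
    Checklist("User experience", [Criterion("Persona can complete the week 1 task flow")]),
    Checklist("Quality", [Criterion("Learning outcomes are clearly stated")]),
    Checklist("Learning design (UDL)", [Criterion("Content offers multiple means of representation")]),
]
```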

2.1 Technical Accessibility Evaluation

The WCAG-EM methodology was designed for experts to follow a common approach when evaluating the conformance of websites to WCAG. WCAG is a standardised and commonly used instrument for accessibility evaluation in MOOCs [5]. WCAG-EM was designed with a heuristic evaluation approach in mind and builds on previous methodologies such as the Unified Web Evaluation Methodology (UWEM). Due to its extensive use, WCAG was selected as the standard for the accessibility evaluation in the audit, applying the AAA conformance level (the most restrictive) and adding the evaluation of text-based files commonly used in MOOCs, such as PDFs.
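
The audit does not prescribe specific tooling for examining text-based files. Purely as an illustrative sketch, a few machine-checkable properties of a PDF (tagging, document language, title, encryption) can be pre-checked before the manual WCAG review; the example assumes the third-party pypdf library and a hypothetical file name.

```python
from pypdf import PdfReader  # third-party: pip install pypdf

def pdf_accessibility_precheck(path: str) -> dict:
    """Quick machine checks on a PDF; manual review against WCAG is still required."""
    reader = PdfReader(path)
    root = reader.trailer["/Root"]              # document catalogue
    mark_info = root.get("/MarkInfo")
    if mark_info is not None:
        mark_info = mark_info.get_object()      # resolve a possible indirect reference
    marked = mark_info.get("/Marked", False) if mark_info else False
    return {
        "tagged": bool(getattr(marked, "value", marked)),  # tagged (structured) PDF
        "language_set": "/Lang" in root,                   # default document language declared
        "has_title": bool(reader.metadata and reader.metadata.title),
        "encrypted": reader.is_encrypted,                  # restrictions can block assistive technology
    }

# Hypothetical file name for illustration
print(pdf_accessibility_precheck("week1_handout.pdf"))
```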

2.2 User Experience Evaluation

UX evaluation takes the approach of usability inspections following cognitive walkthroughs that include two separate activities: the use of personas and of scenarios [7]. This component required new development, as no established reference set for accessibility is available. A set of engaging personas was developed, incorporating goal-directed personas [10]. Engaging personas use realistic descriptions of people to draw evaluators into the lives of the personas, and so avoid stereotypical stories that focus only on behaviours rather than considering the whole person. To focus on accessibility, these personas were abstracted from the self-descriptions of disabled learners interviewed in related MOOC research [4].

The narrative scenarios were developed from those used in a major European project (EU4ALL), reviewed so that they could be reused in MOOCs [11]. The set of cognitive walkthroughs is complemented with UX walkthroughs oriented to the learning design, as used in the Fluid project. A UX walkthrough is a synthesis of methods that enables the evaluator to make assessments both from the learner’s point of view and from that of a design expert. In this case, the aim is to check whether the tasks designed within the MOOC can feasibly be achieved by the personas.
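
As a minimal sketch of how such walkthrough outcomes might be recorded (the persona, task and barrier values below are hypothetical examples rather than the audit’s actual set), each scenario task can be paired with each persona and its feasibility noted:

```python
from dataclasses import dataclass

@dataclass
class Persona:
    """Engaging, goal-directed persona abstracted from learner interviews."""
    name: str
    access_needs: list[str]   # e.g. assistive technologies or adjustments used
    learning_goal: str

@dataclass
class WalkthroughResult:
    """Outcome of one persona attempting one scenario task in the MOOC."""
    persona: Persona
    task: str
    achievable: bool
    barriers: list[str]

# Hypothetical record of one cognitive walkthrough step
sofia = Persona("Sofia", ["screen reader"], "complete the week 1 quiz")
result = WalkthroughResult(
    persona=sofia,
    task="Submit the week 1 quiz",
    achievable=False,
    barriers=["quiz timer cannot be extended", "unlabelled form controls"],
)
```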

2.3 Quality Evaluation

Quality evaluation was adapted from the OpenupEd quality label, which is influenced by the Quality Code of the Quality Assurance Agency (QAA) and based on the E-xcellence approach of using a benchmark for quality assessment in MOOCs [8]. The label has been used to evaluate the quality of MOOC platforms such as FutureLearn and UNED Abierta [12]. Several OpenupEd projects have addressed quality in MOOCs, including Score2020 and BizMOOC. The tested version of the checklists, produced and made available under a Creative Commons (CC) licence, was adapted to provide an evaluative perspective for this audit component.

2.4 Learning Design Evaluation

MOOCs by definition aim for “massiveness”, which makes a personalised approach difficult but makes them well suited to a universal design approach for evaluating the learning design. Universal design considers how to meet the needs of all learners through design. The approach selected for this audit component is UDL, due to its greater development and widespread use [13]. The UDL approach is to present information in ways that fit learners’ needs, rather than requiring learners to adapt to the information [9]. This is relevant for learners who may want to adjust the curriculum to their needs rather than adapt themselves to the curriculum. This component required new development to apply UDL in the context of MOOCs.
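
For example, the UDL principle of providing multiple means of representation can be approximated by checking whether each course resource is available in more than one modality. The resource fields below are illustrative assumptions, not the audit’s actual UDL criteria.

```python
def offers_multiple_representations(resource: dict) -> bool:
    """Illustrative UDL check: is the same content available in more than one modality?"""
    modalities = [
        resource.get("has_captions", False),          # video captioning
        resource.get("has_transcript", False),        # text alternative to audio/video
        resource.get("has_audio_description", False),
        resource.get("images_have_alt_text", False),
    ]
    return sum(bool(m) for m in modalities) >= 2

# Hypothetical course step
video_step = {"has_captions": True, "has_transcript": True, "images_have_alt_text": False}
print(offers_multiple_representations(video_step))  # True
```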

3 Conclusions and Future Work

A four-component audit has been designed to improve the accessibility of MOOCs for disabled learners from an expert evaluation perspective. The components for standards compliance, quality and learning design were developed by adapting existing tools after extensive research into the available options. The user experience personas were built from interviews with learners. At this stage:

  • The audit has been validated by ten experts through inter-rater reliability evaluations to establish its usefulness as a tool for identifying and addressing accessibility barriers (an illustrative agreement calculation is sketched after this list).

  • The audit has been trialled by applying it to MOOCs from four providers (FutureLearn, Coursera, edX and Canvas) to help understand the current state of accessibility in MOOCs.
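
The paper does not state which agreement statistic was used in the inter-rater reliability evaluations; purely as an illustration, agreement between any two of the experts on a shared set of checklist items could be quantified with Cohen’s kappa, as sketched below with hypothetical ratings.

```python
from collections import Counter

def cohens_kappa(rater_a: list, rater_b: list) -> float:
    """Cohen's kappa for two raters scoring the same set of checklist items."""
    assert rater_a and len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum((counts_a[c] / n) * (counts_b[c] / n) for c in set(rater_a) | set(rater_b))
    return (observed - expected) / (1 - expected) if expected != 1 else 1.0

# Hypothetical ratings (0 = not met, 1 = partially met, 2 = met) from two of the experts
expert_1 = [2, 1, 0, 2, 2, 1, 0, 2]
expert_2 = [2, 1, 1, 2, 2, 1, 0, 2]
print(round(cohens_kappa(expert_1, expert_2), 2))  # 0.8
```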

The validation and the implementations suggest that the audit is a robust tool with the following advantages: visualisation of the results; overlap between components, which reinforces the strength of the criteria; and complementarity of the checklists. The aim of the audit is to derive recommendations to address accessibility barriers. The processes of validation and implementation allow barriers to be identified and also facilitate discussions on how to address them at the MOOC design stages. Future work with the audit includes: evaluating further platforms; evaluating several MOOCs per platform; refining the audit itself; and involving stakeholders in the evaluation process.