1 Introduction

The web is an essential part of the current-day education system: it helps students access information and offers flexibility in the time and location of learning and personal growth [1]. Easily accessible and usable higher-education websites are essential because they allow a wide range of students with diverse abilities to use them. University websites facilitate teaching, learning, and communication [2]. Despite this, a digital divide exists in accessing information on webpages because many educational websites are not accessible and usable to all students, particularly to blind users who rely on assistive technologies to navigate websites [1]. Early work uncovered basic usability problems with learning management systems (LMS) [3]. This study investigates the accessibility level of Norwegian university websites and addresses accessibility and usability issues that screen reader users commonly face.

The concept of universal design emerged from North Carolina State University in 1997, where an expert group of advocates developed its seven well-known principles. They defined universal design as “the design of products and environments to be usable by all people, to the greatest extent possible, without the need for adaptation or specialized design”. A more refined definition of universal design focusing on all people has also been proposed: according to Steinfeld and Maisel [4], universal design is a process that empowers a diverse range of people by enhancing individuals’ potential, health, and participation in various social sectors.

Web accessibility refers to websites and tools that people with disabilities are able to use [5]. It means that people with disabilities can obtain all the information and use all the functionality available to users without disabilities, such as links, buttons, and form controls [1]. Web accessibility empowers individuals with disabilities or special needs to operate web content, making it a fundamental matter in web design [6]. W3C [7] elaborates on the accessibility requirements for people with disabilities as follows: (a) websites should work well with assistive technologies such as screen readers, screen magnifiers, and voice recognition tools for text input; and (b) general usability principles should be followed. One factor in the successful delivery of web accessibility is developers’ awareness of the aspects involved [8]. The level of web accessibility remains low on many websites even though various tools have been developed to help increase it [9].

The term usability refers to the extent to which a product or system can be used by specified users to achieve specified goals in a specified context of use with effectiveness, efficiency, and satisfaction [10]. Usability is also defined as the state of ease of use [11]. A product in a given context is considered usable if a person is satisfied using it. When people purchase products, they expect them to function well and be easy to use in order to meet their needs [12]. Nielsen [13] defined usability using five key components: learnability (how easy is it for users to accomplish tasks the first time they encounter the design?), efficiency (once users have learned the design, how quickly can they perform tasks?), memorability (when users return to the design after a period of not using it, how easily can they reestablish proficiency?), errors (how many errors do users make, how severe are those errors, and how easily can they recover from them?), and satisfaction (how pleasant is it to use the design?).

Web accessibility means people with disabilities can perceive, understand, navigate, and interact with websites’ tools and features without barriers [14]. Inclusive web design gives people with disabilities the same equitable access to the functionality of the web as those without disabilities. Web usability concerns users’ experience of ease of use when they browse a website. According to Kamal, Alsmadi, Wahsheh, and Al-Kabi [6], web accessibility and web usability share common concerns, but they are not identical.

This study investigated the accessibility level of Norwegian university websites using two automated tools against the Web Content Accessibility Guidelines (WCAG) 2.1. The study also addresses common accessibility and usability issues screen reader users encounter. The following research questions are asked: (1) To what level of compliance do the Norwegian university websites meet the criteria for successful inclusive web design following the WCAG 2.1 guidelines using automated tools? (2) What are the common accessibility barriers screen reader users face when interacting with the different Norwegian university webpages, based on user experience and automated tools? (3) Are there assessment discrepancies between the two automated tools employed?

2 Related Work

There is a vast body of work on web accessibility. Some studies have addressed the assistive technology in use, such as screen readers [15]. This study is concerned with the content. Kurt [16, 17] evaluated the accessibility of ten university websites over an interval of 5 years based on two automated tools, namely AChecker and SortSite. In the first study [16], none of the assessed websites met the minimum success criteria. The follow-up study [17] showed that the same websites had not improved much over the 5-year period; there was even a marginal decrease in accessibility.

Lazar, Allen, Kleinman, and Malarkey [18] investigated challenges faced by 100 screen reader users by collecting time-diary data. The researchers identified five causes of user frustration when interacting with websites using screen reader software: (a) page designs resulting in confusing screen reader responses, (b) incompatibility of screen reader software with internet browsers, (c) poorly designed, unlabeled forms, (d) missing alternative text for images, and (e) inaccessible PDF files and screen reader crashes. The results also showed that, on average, 30.4% of time was lost due to these frustrations.

Thompson, Burgstahler, and Moore [19] evaluated the homepages of 127 higher-education websites over a 5-year period with experts’ manual accessibility checks. They found significant accessibility improvements. However, most issues involved keyboard navigation, which the researchers attributed to the emergence of new dynamic web content.

Kesswani and Kumar [20] and Masood Rana, Fakrudeen, and Rana [21] noted that many educational institutions did not conform to recommended accessibility standards. Their comparative analyses of top university websites in different countries showed that most schools met less than half of the accessibility recommendations.

Ismail and Kuppusamy [22] evaluated the web accessibility of 302 Indian universities using three automated tools (WAVE, AChecker, and Webpage Analyzer). Common errors were uncovered based on the WCAG 2.0 conformance-level guidelines. The results showed that none of the university websites tested met the WCAG 2.0 accessibility criteria. Design recommendations for accessibility were then proposed as follows: (a) text alternatives should be provided for all non-text web content; (b) headers need to be included for each page, including sections and tables; (c) sufficient color contrast and full keyboard functionality need to be supported; (d) well-structured forms with interactive features should be considered; (e) adjustment controls for color contrast should be included and clearly visible in webpages; and (f) media players should allow users full control to resize and reposition media in videos/audios. A minimal sketch of an automated check for recommendation (a) is given below.
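To illustrate how recommendation (a) can be checked automatically, the following minimal sketch (ours, not from the cited study) uses Python's standard-library HTML parser to flag images that lack a text alternative; the sample markup and file names are hypothetical.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collect <img> tags that lack a non-empty alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            # WCAG 1.1.1: every informative image needs a text alternative.
            # alt="" is valid only for purely decorative images, so we flag
            # both absent and empty alt attributes for manual review.
            if not attr_map.get("alt"):
                self.missing_alt.append(attr_map.get("src", "<no src>"))

# Hypothetical markup: the first image has no alt text, the second does.
html = '<img src="logo.png"><img src="chart.png" alt="Enrolment by year">'
checker = AltTextChecker()
checker.feed(html)
print(checker.missing_alt)  # ['logo.png']
```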

Harper and DeWaters [23] evaluated the accessibility of 12 university homepages in the United States using the Watchfire Bobby automated tool according to the WCAG 1.0 guidelines. The results showed that only one university met all the accessibility criteria across WCAG 1.0’s three priority levels (corresponding to conformance levels A, AA, and AAA). Only 50% of the websites met the priority 1 and priority 2 criteria, and 33% of the websites met priority 1 conformance.

Menzi-Çetin, Alemdağ, Tüzün, and Yıldız [24] conducted a usability evaluation of a university website with six screen reader users, employing interviews, usability tasks, and satisfaction questionnaires. The results showed that the most challenging task was finding the final exam dates on the university calendar, and the most time-consuming task was locating the course schedule on the webpage. The participants complained about the missing search form on each page and suggested that a text version of all pages and properly structured link lists be provided.

Lazar, Olalere, and Wentz [25] evaluated the accessibility and usability of online job portal sites across eight states in the USA. Sixteen participants applied for at least two jobs each using screen readers. The results showed that most usability issues were the same for visually impaired users and people without disabilities. User testing was most fruitful when the participants performed tasks that included navigation between the various webpages and when they thought aloud during testing. The study concluded that most online accessibility and usability issues are easy to locate and can be fixed with little effort by web designers.

Another avenue of research relates to text readability [26,27,28], which in principle is covered by WCAG. However, it is hard to assess text readability in a practical and consistent manner.

3 Method

3.1 Participants

Ten partially blind and four fully blind individuals participated in the study (N = 16), with a mean age of 19.5 years. Fifteen of the participants were from Nepal and one was from Oslo Metropolitan University. All the participants had at least a bachelor’s degree and were proficient English readers. All participants used their own personal computers for the user testing. Nine participants used the NVDA screen reader and seven used the JAWS screen reader.

3.2 Material

Four internationally recognized Norwegian university websites were chosen for this research: University of Stavanger (UiS, https://www.uis.no/), University of Tromsø (UiT, https://en.uit.no/), University of South-Eastern Norway (USN, https://www.usn.no/), and University of Agder (UiA, https://www.uia.no/).

The above-listed websites were chosen arbitrarily. We evaluated the homepages, contact pages, and about pages using two automated tools. The homepages were evaluated first because they are the portals through which users access the websites. If the homepage is inaccessible, users with disabilities may find it challenging to access the remaining parts of the website [29]. Only level-1 pages were evaluated. As noted, the homepage alone does not represent the accessibility and usability of the entire website, but the homepage and its level-1 pages are considered representative of the site [30].

3.3 Equipment

Two automated tools, WAVE (Web Accessibility Evaluation Tool) [31] and Total Validator [32], were used to evaluate the accessibility of the university webpages. Automated tools are essential for checking the minimal accessibility level of a website; however, relying only on automated tests may not be sufficient, as automated tools cannot thoroughly check all accessibility issues of webpages [17, 33]. Total Validator is a free tool for web accessibility testing. It checks whether a website uses valid HTML and CSS, has no broken links, and complies with WCAG 2.1 [34]. Similarly, WAVE is a free web accessibility evaluation tool which presents a visual description of accessibility issues [34]. Both tools test webpages against the latest WCAG 2.1 guidelines, support direct URL submissions, and generate detailed WCAG 2.1 conformance-level reports (A, AA, and AAA).
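As a sketch of how such direct URL submissions can be scripted, the snippet below queries WAVE's subscription API from Python. The endpoint and parameter names (key, url, reporttype) follow WebAIM's public API documentation as we understand it and should be verified against the current documentation; the API key is a placeholder.

```python
import json
import urllib.parse
import urllib.request

# Placeholder; WAVE's subscription API requires a registered key.
WAVE_API_KEY = "YOUR_API_KEY"

def wave_report(page_url: str) -> dict:
    """Fetch a machine-readable WAVE report for one page.

    Endpoint and parameters are assumptions based on WebAIM's
    documentation; verify before relying on them.
    """
    query = urllib.parse.urlencode(
        {"key": WAVE_API_KEY, "url": page_url, "reporttype": 1}
    )
    with urllib.request.urlopen(
        f"https://wave.webaim.org/api/request?{query}"
    ) as resp:
        return json.load(resp)

report = wave_report("https://www.uia.no/")
# The JSON groups findings into categories such as "error" and "alert",
# each with a count, which is the kind of figure Table 1 aggregates.
print({k: v.get("count") for k, v in report.get("categories", {}).items()})
```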

3.4 Measurements

Web accessibility metrics indicate the accessibility level of websites [35]. WAVE and Total Validator were used to evaluate the different webpages of the university websites against WCAG 2.1. The guidelines are organized into four principles: perceivable, operable, understandable, and robust. These are subdivided into 13 guidelines, from which we selected the checks relevant to screen reader users. In this study, only Level AA conformance of the webpages was tested. According to the guidelines’ documentation [36], Level AAA conformance is not recommended as a general policy for entire websites because it is not possible to satisfy all Level AAA success criteria for some content. We thus chose Level AA conformance because it entails fulfilling both the Level A and Level AA success criteria, as sketched below.
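The following minimal sketch (with an illustrative, incomplete subset of success criteria) shows what scoping an evaluation to Level AA means in practice: all Level A and Level AA criteria are in scope, while Level AAA criteria are excluded.

```python
# Illustrative subset of WCAG 2.1 success criteria and their levels;
# the selection mirrors the checkpoints discussed in Sect. 5.
CRITERIA = {
    "1.1.1 Non-text Content": "A",
    "1.3.1 Info and Relationships": "A",
    "2.4.4 Link Purpose (In Context)": "A",
    "3.3.2 Labels or Instructions": "A",
    "1.4.3 Contrast (Minimum)": "AA",
    "1.4.4 Resize Text": "AA",
    "2.4.6 Headings and Labels": "AA",
    "2.4.9 Link Purpose (Link Only)": "AAA",
}

# Testing for Level AA conformance means checking all Level A and all
# Level AA criteria, while Level AAA criteria are out of scope.
in_scope = [c for c, level in CRITERIA.items() if level in ("A", "AA")]
print(in_scope)  # everything above except the 2.4.9 AAA criterion
```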

The System Usability Scale (SUS) is a 10-item questionnaire with five-point Likert-scale responses; it provides an overall view of subjective assessments of a system’s usability [37]. The SUS score indicates a usability interpretation in terms of effectiveness, efficiency, and satisfaction [38].
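For reference, SUS scoring follows a fixed scheme: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to give a 0–100 score. A minimal sketch with a hypothetical response sheet:

```python
def sus_score(responses: list[int]) -> float:
    """Compute a SUS score from ten 1-5 Likert responses.

    Standard SUS scoring: odd-numbered items contribute (response - 1),
    even-numbered items contribute (5 - response); the sum is scaled
    by 2.5 to yield a 0-100 score.
    """
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i = odd item
        for i, r in enumerate(responses)
    )
    return total * 2.5

# Hypothetical response sheet from one participant.
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 3]))  # 77.5
```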

A web accessibility questionnaire was devised, inspired by the structural issues given in [40]. It included the following checks: (a) page titles; (b) image text alternatives; (c) headings, contrast ratio, and text sizing; (d) keyboard access and visual focus, forms, labels, and errors; and (e) moving, flashing, or blinking content, multimedia alternatives, and basic structure checks.

3.5 Procedure

Both face-to-face and remote sessions were conducted. The four university websites were first evaluated using the two automated tools, and the most frequently recurring findings for each webpage were extracted. The participants were given five sets of tasks for each university. They were then given the SUS questionnaire to measure the usability of each website. Further, they were given the accessibility questionnaire and open-ended questions to assess the accessibility of the websites. The face-to-face sessions lasted approximately 1.5 h; remote sessions lasted longer (Fig. 1).

Fig. 1. Experimental procedure overview.

4 Results

4.1 Automated Accessibility Testing

Table 1 shows the results obtained with the WAVE and Total Validator tools on the four university websites. The number of errors reported by Total Validator (M = 46.75, SD = 24.65) was higher than that of WAVE (M = 36.5, SD = 23.35). In contrast, the number of warnings reported by WAVE (M = 50.5, SD = 33.13) was higher than that reported by Total Validator (M = 45.75, SD = 28.91). Errors are more severe than warnings and need to be minimized to achieve successful accessibility.

Table 1. Means and standard deviations of errors and warnings reported by the automated tools

4.2 Perceived Usability

Figure 2 shows the perceived usability results of the four university websites based on the participants’ responses to the SUS questionnaire. Following the usability interpretation in [41], only the University of Agder (UiA) website reached an acceptable score (M = 69.53, SD = 7.14). The other three university websites (UiS, UiT, and USN) fell below the SUS average of 68. USN came close to this average (M = 67.81, SD = 10.95), while UiS (M = 54.37, SD = 10.7) and UiT (M = 51.25, SD = 8.36) were well below it.

Fig. 2. Mean SUS scores for university webpages. Error bars show SD.

4.3 Interviews

After the online survey, the participants were asked open-ended questions about commonly occurring issues they encountered on the university websites. The results can be summarized as follows. First, videos played automatically and the pages did not offer an option to pause them. Next, some instances of duplicate page titles were observed; as a result, users who relied on screen readers had difficulty distinguishing the pages. Instances of unstructured link lists and headings were also reported, so users had to scan an entire page with the screen reader to find the desired content. Further, the screen reader read the webpages with a Norwegian accent. Some breadcrumbs were poorly designed, which again confused the screen readers. There were also browser compatibility issues with the screen readers: some participants had to switch to other browsers to complete tasks they could not accomplish in one browser. Finally, the search forms within the websites did not provide relevant results; instead, when searching with the widely available Google search engine, the participants were able to locate the pertinent content.

After completing the online accessibility questionnaire, the participants were asked two open-ended questions related to accessibility issues. The accessibility problems encountered included poor heading structures, poor link-list structures, ambiguous links, screen reader incompatibility with the browser itself, and inaccessible keyboard navigation.

5 Discussion

This study aimed to uncover common accessibility and usability issues screen reader users experience when interacting with the content provided by typical Norwegian university webpages. The results from the two automated tools indicate that all four universities had Level A checkpoint 1.1.1 (non-text content) issues. This checkpoint has been found to be the most commonly violated issue on other university websites as well [16, 17, 41,42,43]. This accessibility problem is frustrating for people with disabilities, especially screen reader users; fixing it would enable users to perceive the web content more effectively.

Level A checkpoints 1.3.1 (info and relationships) and 3.3.2 (labels or instructions) also stood out as distinct issues, as found on other educational institutions’ websites [22, 41,42,43]. These issues should likewise be addressed to increase accessibility for users relying on screen readers. Note that USN and UiT had slightly different ratings compared to the other two universities: the UiT homepage had a 3.3.1 issue (instead of 3.3.2) reported by both tools, and its contact page had no 3.3.2 issue according to WAVE. USN’s homepage and contact page had no 3.3.2 issues according to WAVE, while both pages had such issues according to Total Validator. Additionally, there appears to be a trend whereby more Level AA 1.4.3 (contrast) issues were detected by WAVE while more 1.4.4 (resize text) issues were detected by Total Validator.
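The 1.4.3 findings can be spot-checked by hand: WCAG 2.1 defines contrast as the ratio (L1 + 0.05)/(L2 + 0.05) of the relative luminances of the lighter and darker colours. A minimal sketch of this computation (the grey-on-white example is hypothetical):

```python
def relative_luminance(rgb: tuple[int, int, int]) -> float:
    """WCAG 2.1 relative luminance of an sRGB colour (0-255 channels)."""
    def linearize(c: int) -> float:
        s = c / 255
        return s / 12.92 if s <= 0.03928 else ((s + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg) -> float:
    """Contrast ratio (L1 + 0.05) / (L2 + 0.05), lighter colour first."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# SC 1.4.3 requires at least 4.5:1 for normal text (3:1 for large text).
# Grey #777777 on white narrowly fails the 4.5:1 threshold.
ratio = contrast_ratio((119, 119, 119), (255, 255, 255))
print(f"{ratio:.2f}:1 -> {'pass' if ratio >= 4.5 else 'fail'}")  # 4.48:1 -> fail
```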

Further, Level A checkpoint 2.4.4 (link purpose) was violated by all four university websites. Meeting this checkpoint would ensure that all links have a meaningful purpose and that users can understand the context of the links. This checkpoint issue has also been identified on other websites [22, 40, 41]. Also, 2.4.6 (headings and labels) was the only Level AA checkpoint issue detected on all the selected websites by WAVE alone, except on USN’s contact page, where it was instead detected by Total Validator. This has also been a major issue on other university websites [41,42,43]. A simple heuristic check for link purpose is sketched below.
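One common, if crude, way to screen for 2.4.4 violations is to flag links whose visible text is generic; automated tools apply similar heuristics. A minimal sketch (our own heuristic, with hypothetical markup):

```python
from html.parser import HTMLParser

# Generic phrases that convey no link purpose on their own.
GENERIC = {"click here", "here", "read more", "more", "link"}

class LinkPurposeChecker(HTMLParser):
    """Flag links whose visible text does not convey their purpose (SC 2.4.4)."""

    def __init__(self):
        super().__init__()
        self.in_link = False
        self.text = ""
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link, self.text = True, ""

    def handle_data(self, data):
        if self.in_link:
            self.text += data

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_link = False
            if self.text.strip().lower() in GENERIC:
                self.flagged.append(self.text.strip())

html = '<a href="/exams">Exam dates</a> <a href="/s">Read more</a>'
checker = LinkPurposeChecker()
checker.feed(html)
print(checker.flagged)  # ['Read more']
```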

The participants responded that navigation was the most frequently recurring issue they experienced when accessing the Norwegian university websites with screen readers. This finding corroborates earlier research [18] showing that navigation is one of the most frustrating challenges screen reader users face when accessing the web. Previous studies [45,46,47] also point out that navigation issues should be considered in educational websites.

In addition, the participants experienced that the screen reader read out all the links and headings when browsing the webpages, which was annoying. Previous studies [18, 24, 48] also report that users get frustrated when the same content is read out every time a webpage is loaded. Including a skip link within a webpage is recommended [18] so that screen reader users can bypass unwanted links; a sketch of a corresponding check is given below.
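A typical skip link is an in-page anchor placed first in the tab order, e.g. <a href="#main">Skip to main content</a>. The following minimal sketch (our own heuristic, not a normative WCAG test) checks whether the first link on a page is such an in-page anchor:

```python
from html.parser import HTMLParser

class SkipLinkChecker(HTMLParser):
    """Heuristic: the first link on a page should be an in-page 'skip to
    main content' anchor so screen reader users can bypass repeated
    navigation (cf. WCAG 2.4.1, Bypass Blocks)."""

    def __init__(self):
        super().__init__()
        self.first_href = None

    def handle_starttag(self, tag, attrs):
        if tag == "a" and self.first_href is None:
            self.first_href = dict(attrs).get("href", "")

# Hypothetical markup with a skip link before the navigation block.
checker = SkipLinkChecker()
checker.feed('<a href="#main">Skip to main content</a><nav>...</nav>')
print(bool(checker.first_href and checker.first_href.startswith("#")))  # True
```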

Another common usability issue is the incompatibility of screen reader software with internet browsers. A few participants were observed switching between browsers to complete a task. This reveals a violation of the learnability principle coined by Nielsen [13], as also addressed in other studies [18, 24].

During the testing, it was observed that most participants turned to external search engines to locate desired information that they could not find using the websites’ internal search engines. Most were able to accomplish the tasks via the external search engines. Menzi-Çetin and colleagues [24] detected similar usability barriers.

The automated tools revealed that none of the university websites investigated met the minimum WCAG accessibility guidelines. The tools reported inconsistent accessibility issues and warnings. This finding is in agreement with that of Molinero and Frederick [44], who used three automated tools to evaluate 50 websites with differing results. They warn that relying on a single automated tool is risky since different tools provide different accessibility results.

In this study we recruited only fully blind and partially blind participants. Other groups who may also rely on screen readers (e.g., motor-impaired users and users with cognitive disorders) were not included. Most of the tests were performed remotely; remote studies can be error-prone, and it is difficult to observe all issues during remote sessions [49]. Further, all four tested websites are dynamic, with contents updated daily, so the results could differ when evaluations are repeated over time. Moreover, the evaluation covered only three webpages of each university website. Including more evaluation tools and manual evaluations may help reaffirm the findings, and further analysis of the open-ended questions may help clarify related issues.

6 Conclusion

Our experiment suggests that the accessibility level of the higher-education websites at the time of the study was inadequate: none of the evaluated sites met the minimum WCAG 2.1 guidelines. Additionally, relying entirely on automated tools is probably not optimal practice for uncovering website accessibility issues. This study reveals that the most common usability issues universities need to consider are clear labelling of page titles, easy keyboard access for navigation, breadcrumbs that are easy to locate, and search forms with proper interfaces that return relevant results. Universities should also focus on accessibility aspects including well-organized heading and link structures, proper labelling of headings and links, and keyboard navigation. It is advisable that screen reader developers make their software compatible with most browsers. Future work includes assessing larger samples and conducting face-to-face interviews to gather more complete impressions. Manual evaluations may also enable more in-depth analyses of accessibility issues. Future studies may address individuals with different disabilities who also rely on screen readers to access web information, using different assistive and evaluation tools.