
1 Introduction

Driving is a daily activity for many people and, since no one is immune to road problems, there are services that provide on-the-spot assistance to drivers. Unfortunately, deaf people are unable to use travel assistance services that rely on traditional voice-based communication. The same barrier affects other users, such as people with sudden difficulties in articulating words, vocal diseases, or situations of shock and panic caused by violent events.

Consequently, there is an increased drive to overcome barriers to human communication and interaction and to improve the quality of life of the deaf community, including meeting the needs of deaf drivers. Hence, the research question is: can a mobile application based on a visual interface be a usable solution for the deaf community in emergency situations?

Thus, we present MyCarMobile, a mobile application that allows the user to contact roadside assistance through a smartphone without the need for audio. The solution is a native mobile application built through rapid multiplatform prototyping, by simply defining the application flow, with services made permanently available through a mediation server and dynamic communication channels. In previous work [1] we focused on the literature and on technical issues; in this paper we focus on the assessment.

The paper is structured as follows: in the second section, we list and analyze related studies; in the third, we present MyCarMobile: we identify the travel assistance problems to be overcome, describe the development process (methodology, architecture design and implementation), and report the usability assessment results; in the fourth, we present the conclusions and future work.

2 Related Studies

The popularity and massification of smartphone usage have boosted the development of mobile applications, which provide huge benefits to their users [2, 3]. However, many applications were initially developed and designed without any consideration of accessibility and usability, hindering interaction for users with disabilities. Indeed, developers of mobile applications often ignore the fact that almost one in five people in the world lives with a recognized disability [4], and they are all potential users of these technologies [5]. Despite this, some mobile applications have already been developed to minimize these limitations, helping people with special needs in their daily lives [6, 7], particularly people who are deaf or have visual or speech impairments [8].

Although they are similar, each development platform has its own accessibility guidelines for mobile application development, such as those for Android [9], for Windows Phone applications [10], and for iOS [11]. Furthermore, there are also general guidelines for accessibility (such as the World Wide Web Consortium (W3C) accessibility guidelines (WAI - WCAG 2.0) [12] and the European standard EN 301 549 [13]) and for usability [14, 15] (such as ISO 9241-11 [16], ISO 13407 [17] and ISO 9241-210, an updated replacement for ISO 13407 [18]).

Moreover, when developing communication systems, easy-to-use tools can facilitate a clear and efficient conversation. The CommunicateHealth design team presented three tools specifically for health communication systems, focused on improving communication during health emergencies: the Show Me booklet (“a spiral-bound, laminated, dry erase booklet for use in emergency shelters”); the Show Me mobile app (for volunteers and staff who work in a particular location or go door-to-door); and the Show Me FAC mobile app (for staff and volunteers at family assistance centers). The booklet and the two mobile apps used icon-based forms of communication [19].

In the work of Buttussi et al., a mobile system for use in the health context is also presented. The proposed solution was based on a “collection of emergency-related sentences, showing videos of the corresponding translations in sign language to the deaf patients” [20].

The eCall system, presented in the work of Cabo et al., automatically calls for help in case of a car accident. The system sends the geographic location and the vehicle identification data while simultaneously establishing a 112 voice connection. The interface of the proposed eCall system relies on text-based communication [21].

Another solution is PeacePHONE, a simulated mobile phone designed to compensate for existing mobile phone functions that were not practical for deaf individuals, by evaluating these functions and providing a conceptual design based on the daily life requirements of the Deaf community. These functions relate to the overall interaction with a multifunction mobile phone [22].

The system proposed by Vaso et al. consists of a mobile application that provides feedback on an emergency situation to the emergency services. The solution offered three emergency contact options: the police, an ambulance, and the fire department. Communication is text- and icon-based [23].

Weppner and Lukowicz also presented an application that lets people with hearing and speech impairments make emergency calls to standard emergency call centers, although interaction with the interface is mostly done through text input and output [24].

As can be seen, icon-based communication is considered a powerful tool in the development of interfaces for deaf users, as it allows communication feedback between the user in need and the emergency services. This design solution can also serve other users, not only deaf people, as it can be useful for people with low literacy and/or communication challenges [19].

Regarding commercial solutions, there are also mobile applications that serve as a workaround for the phone call when getting in touch with travel assistance, allowing users to describe their claim and ask for proper assistance. In this context, we analyzed several travel assistance apps; however, their service was provided through a telephone call, unsuitable for the deaf or for people with speech impairments, and/or required an internet connection to report occurrences. For example, the Portuguese company AXA offers the My AXA mobile application for Android and iPhone devices. In terms of accessibility, this solution does not meet the requirements of deaf users because, during the car claim process, the user is requested to make an emergency call in case of injuries; it also requests a call to the assistance service if the vehicle is immobile. These call options are inaccessible to deaf users as they require audio [25]. Another solution is iBrisa, an application for iOS (iPhone and iPad), Android and Windows 8, and a fundamental tool for drivers on the Brisa Group’s motorways. In terms of accessibility for deaf users, the iBrisa travel assistance system is not feasible because proper assistance requires an audio stimulus [26]. A further example is the company Seguros Directo, which offers the Direct mobile application for Android and iOS devices. The application provides a travel assistance service via a telephone call to the company’s number, which is likewise not feasible for deaf users due to the need for audio [27].

3 MyCarMobile

After analyzing the apps described above, we felt it was necessary to develop an automatic system for mediating non-verbal and asynchronous communication, one that would remove the need for an audio stimulus to make the emergency call and/or for an internet connection to report occurrences. As an alternative, we present a solution based on a rapid development method for native mobile applications: applications are made always available through rapid multiplatform prototyping, by simply defining the application flow, with services permanently available through a mediation server and dynamic communication channels. This solution focuses on providing a generic service that guarantees interactive solutions based on iconographic flows for non-verbal communication, as an integrated system of mediation and communication between different entities [28]. The idea arose as a means of facilitating the daily activities of the deaf through mobile applications.

Therefore, the first application based on this system was the SOSPhone mobile application, aimed at assisting deaf users, although it is not specifically geared towards travel assistance. The concept of SOSPhone is to get in touch with emergency services without using a voice call. The application has an iconographic interface that facilitates interaction with deaf users, allowing them to select images that describe the problem they intend to report. This selection of images results in an SMS message containing the codes corresponding to the selected information, which is immediately sent to the emergency services. The solution was developed to ensure access to emergency services for the Deaf community, but it can be used by the general population given its universal design [29].

Furthermore, the same concept was applied in two other scenarios: setting up medical appointments (M3App) and travel assistance (MyCarMobile).

The MyCarMobile application is presented in more detail in the next section.

3.1 MyCarMobile: Identifying the Problem

Travel assistance services aim to provide support to all drivers; however, when a deaf driver gets stuck on the road due to a breakdown or accident, he/she cannot call assistance using a telephone call, because this action involves an audio stimulus.

To understand how deaf people overcome this situation, we carried out a survey with the Portuguese Deaf community, aiming to collect statistical data on the major impediments that deaf people felt when communicating with travel assistance services [30].

On the basis of the statistical data from the survey, it was found that 80% of the respondents had a driving license, and 56% of them had had to resort to travel assistance at least once. The means they used to contact travel assistance differed: 55% said that they had to send an SMS to a friend or family member to call the travel assistance; 30% asked another driver to make the call; 5% contacted a Sign Language interpreter by 3G; 5% contacted FPAS through the GNR; and 5% used another, unspecified form of contact.

Furthermore, 40% of participants in the study said they had gone without assistance due to communication problems, and 64% agreed that travel assistance companies do not have support services for communicating with hard-of-hearing or deaf people.

3.2 MyCarMobile: Presenting the Mobile Application

The survey results and the analysis of some of the most relevant mobile travel assistance applications currently available show that current solutions connecting Deaf people with travel assistance services are not efficient, and that there is a need to develop a solution that can be used autonomously by deaf people.

Accordingly, MyCarMobile is presented as a mobile application that allows assistance requests without the usual emergency telephone call, in order to guarantee use by the Deaf or by people who are unable to speak. The application provides an iconographic interface that allows the user to report occurrences through simple touches on the smartphone’s touch screen. This way, the user can easily and intuitively contact travel assistance and report a specific occurrence. The design takes into consideration the accessibility and usability guidelines referenced in Sect. 2.

Regarding the application development process, it followed the ISO 9241-210 standard, which addresses a user-centered design methodology [31].

The solution comprises two functional prototypes: the first operates as a client application and the second as a server application.

The client application prototype was implemented for the Android operating system, using the Eclipse Integrated Development Environment (IDE) together with the Java programming language support components, the Java Development Kit (JDK) and the Java Runtime Environment (JRE).

The prototype has two main phases of operation. The first collects the data for a given occurrence. The second consists of sending the data, through an SMS message, to the server application of the travel assistance services.

In the first phase, a simple and intuitive interface was implemented, capable of collecting all the information necessary to characterize a given occurrence. The user can quickly provide data by simply tapping the smartphone’s touch screen. This method is intended to simplify the use of the travel assistance service, because the user can describe the occurrence without having to enter text or make an audio call. The content displayed in the interface focuses on the description of the occurrence, which can be classified into three main categories (malfunction, accident, or other situations) with three degrees of severity (light, serious or very serious). In the case of another situation, a list of possible situations (broken glass, lost or locked keys, lack of fuel or battery, robbery or theft, fire or explosion) is available.

During the data collection phase, the user always has an option, via a slide menu, to change data that has already been selected. During data collection, the values are stored in strings defined as global variables, so that the contents can be accessed from any part of the application interface. Once all options have been selected, a global string containing all the required information about the occurrence is obtained, and the GPS coordinates of the location of the occurrence are automatically appended to it. For this, the user must grant the appropriate GPS permissions. Once the data has been collected, the stored content is sent via an SMS message to the server application with all the necessary information about the occurrence. Before sending the message, the application checks the network coverage to avoid failed messages or lost information. If the message cannot be sent, a failure message is shown to the user, so that he/she is aware that the request for assistance has not been sent and needs to be sent again. If the SMS message with the occurrence information is sent successfully, the user can then use a live chat implemented in the application to add further information. This live chat also works by exchanging SMS messages, which are displayed by the interface in a synchronized way, in real time, as they are transferred between the client application and the server application. In Fig. 1, we can see the MyCarMobile client application interface running on the Android operating system.
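The composition of the coded occurrence string described above can be sketched as follows. This is a minimal illustration only: the field separator, the category and degree labels, and the `APP_CODE` validation value are all assumptions for the example, not the exact encoding used by MyCarMobile.

```java
import java.util.Locale;

// Sketch of composing the coded occurrence message sent by SMS.
// Codes, separator and layout are illustrative assumptions.
public class OccurrenceMessage {

    // Hypothetical validation code identifying messages from the app
    static final String APP_CODE = "MCM1";

    // Joins the selected category, degree and GPS coordinates into
    // the single global string that is sent to the server by SMS.
    static String build(String category, String degree,
                        double lat, double lon) {
        return String.join(";", APP_CODE, category, degree,
                String.format(Locale.US, "%.5f", lat),
                String.format(Locale.US, "%.5f", lon));
    }

    public static void main(String[] args) {
        System.out.println(build("MALFUNCTION", "SERIOUS", 41.14961, -8.61099));
    }
}
```

A delimiter-separated layout like this keeps the message well within the 160-character SMS limit and lets the server split it back into fields with a single `split(";")`.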

Fig. 1. Screenshots of the client application prototype implemented in Android.

The server application prototype was developed with Microsoft Visual Studio Ultimate 2012. To transmit information from one application to the other through the SMS messaging service, the communication solution uses a GSM modem to access the GSM network [32]. Implementing this requires a GSM modem with a SIM card connected to the computer’s serial port. Since the exchange of information between the computer and the GSM modem uses the AT command set (a command language used to control modems), it is essential to define the AT commands that perform the intended operations.
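For illustration, the standard AT commands for sending one SMS in text mode (defined in 3GPP TS 27.005) can be sketched as below. The serial-port I/O itself is omitted; the sketch only builds the command strings, and the phone number used in the example is hypothetical.

```java
import java.util.List;

// Sketch of the AT command sequence for sending one SMS in text mode.
// Only the command strings are shown; serial-port I/O is omitted.
public class AtSms {

    static final char CTRL_Z = '\u001A'; // terminates the message body

    static List<String> sendSequence(String number, String text) {
        return List.of(
                "AT",                          // check the modem responds
                "AT+CMGF=1",                   // select SMS text mode
                "AT+CMGS=\"" + number + "\"",  // start a message to number
                text + CTRL_Z);                // body, ended with Ctrl-Z
    }
}
```

Each command is written to the serial port followed by a carriage return, and the modem answers `OK` (or an error code) before the next command is issued.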

When the application is executed, the communication window is shown to the user. Before interacting, the user must first open the serial port that connects the application to the modem; only then can the application access the GSM network to send and receive information. This communication method provides a solid basis for the integration of the two applications.

Once application integration was ensured through a reliable communication channel that avoids data loss, the mechanism for managing and analyzing occurrences was implemented. This management is based on the messages received from the Android application prototype.

The server application constantly listens for new messages, but only accepts messages sent by the Android application, because these messages contain a code that validates them; messages without this code are ignored. Whenever there are new messages, the server application shows the number of new occurrences as a notification in the upper left corner of the window. Clicking on these notifications lists on screen all occurrences that have not yet been viewed.
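The validation-code filter can be sketched as follows. The server prototype was built in the .NET environment, but for consistency with the other examples the sketch is in Java, and the `"MCM1"` code value is an illustrative assumption.

```java
// Sketch of the server-side filter that accepts only SMS messages
// carrying the application's validation code. "MCM1" is illustrative.
public class MessageFilter {

    static final String APP_CODE = "MCM1";

    // A message is accepted only if it starts with the app code
    // followed by the field separator; anything else is ignored.
    static boolean isValid(String smsBody) {
        return smsBody != null && smsBody.startsWith(APP_CODE + ";");
    }
}
```

This simple prefix check is enough to discard ordinary SMS traffic (spam, wrong numbers) before an occurrence is stored and counted in the notification badge.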

Each message received is stored by the application so that it can be consulted at any time. To list all registered occurrences, users simply click on the query button of the server application, which immediately reads the stored text files and displays the occurrences on screen.

In addition to these features, the server application includes a live chat option, synchronized with the live chat in the Android application, so that messages are shown simultaneously in both applications. This live chat system also works by exchanging SMS messages, which are listed on screen in a familiar chat interface; received and sent messages are displayed immediately. The purpose of this service is to complement the recording of occurrences when there is a need to detail a certain situation, or when the user wants to ask the helpdesk questions (or vice versa), offering the possibility of greater user interaction with the travel assistance services.

3.3 MyCarMobile: Assessing Usability

An assessment phase was performed to evaluate the usability of the MyCarMobile mobile application, also with the intention of validating the interface with deaf users.

Regarding participants, the tests were carried out with the collaboration of the Deaf Association of Porto. This phase included eleven deaf participants (five women and six men) who were invited to take part in the pilot study. Of the total, 26% were aged 20 to 29; 37% were between 30 and 39; and 37% were between 40 and 49. Educational levels varied: 10% had a Master’s degree; 18% had a Bachelor’s degree; 36% had completed the 12th grade; and the remaining 36% had left school before the 12th grade.

Regarding procedures, a script was prepared for the user tests. This script was translated by a Sign Language interpreter and used to give a short introduction to the MyCarMobile app. After this introduction, the tests started, with all participants performing them individually, under the same conditions. After completing the tests, participants filled out the Computer System Usability Questionnaire (CSUQ) to gather usability results.

The tests were performed on an Android smartphone with the MyCarMobile application installed: an HTC One X (Android 4.2.2) with a 4.7″ (720 × 1280 pixels) screen.

Concerning the experimental design, each user followed the script provided to them, with scenario information and specific tasks.

After performing the tasks, users answered the CSUQ questionnaire to assess the prototype [33]. This questionnaire is an instrument that measures user satisfaction with computer system usability. It consists of 19 questions, each answered on a seven-point Likert scale, where 1 corresponds to “I totally disagree” and 7 corresponds to “I totally agree”. The 19 questions are grouped into 4 subscales: overall, which measures overall system satisfaction (all 19 questions); sysuse, which measures system usefulness (questions 1 to 8); infoqual, which measures information quality (questions 9 to 15); and interqual, which measures interface quality (questions 16 to 18).
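The subscale grouping above can be sketched as a small scoring routine. This is an illustrative sketch of the averaging only, using the question ranges stated in the text; the class and method names are our own.

```java
import java.util.Arrays;

// Sketch of CSUQ subscale scoring. Questions are 1-indexed:
// overall = 1-19, sysuse = 1-8, infoqual = 9-15, interqual = 16-18.
public class CsuqScore {

    // responses[i] holds the Likert answer (1-7) to question i+1;
    // averages questions 'from' to 'to' inclusive.
    static double mean(int[] responses, int from, int to) {
        return Arrays.stream(responses, from - 1, to)
                .average().orElse(Double.NaN);
    }

    static double overall(int[] r)   { return mean(r, 1, 19); }
    static double sysuse(int[] r)    { return mean(r, 1, 8); }
    static double infoqual(int[] r)  { return mean(r, 9, 15); }
    static double interqual(int[] r) { return mean(r, 16, 18); }
}
```

In a full analysis, N/A answers would be excluded from the count before averaging, and the standard deviation computed per question in the same pass.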

Summarizing, users had to follow the instructions of the script (translated by a Sign Language interpreter), perform the scenario described in the script, and then respond to the questionnaire to provide data.

For the results, the average of the responses on the Likert scale (1 to 7) was calculated for each question to ascertain the level of user agreement, along with the standard deviation and the total number of responses (Total N) included in the average. The number of not applicable (N/A) responses was also recorded.

The results showed that users rated the simplicity of system usage highly, with an average of 6.60 and a standard deviation of 0.52. In terms of general satisfaction, the application obtained an average of 6.55 with a standard deviation of 0.69. Despite these positive results, users expected to find more functions and capabilities (5.91, with a standard deviation of 1.22). Overall, concerning the usability metrics under analysis, the overall satisfaction rate was 6.31 (out of 7) with a standard deviation of 0.18. The other metrics, sysuse (average: 6.35, standard deviation: 0.19), infoqual (average: 6.30, standard deviation: 0.12) and interqual (average: 6.16, standard deviation: 0.22), also showed a good user assessment. It is important to note that users found the MyCarMobile mobile application easy to use and were satisfied with it, even though they wished it had more features.

4 Conclusions and Future Work

To create the MyCarMobile mobile application, we took accessibility and usability standards into consideration to develop a client application that allows the use of travel assistance services without recourse to audio, as well as a server application that receives and manages the occurrences sent by the client application. After the prototypes were successfully implemented, the client prototype was validated with Deaf users through user tests. The results showed that MyCarMobile can be a useful travel assistance solution for deaf people, as it proved to be easy to use and users were satisfied after usage. As future work, we propose adding new functionalities to the application, since the tests revealed that, despite their satisfaction with how easily they could call the travel assistance services, users would have liked the application to offer more features.