1 Introduction

The development of Virtual Reality (VR) devices and applications at inviting, affordable prices is increasingly attractive to consumers eager for novelty. However, this context still lacks studies that assess in depth the ergonomic constraints and risks related to these products and systems, both with respect to the physical world and to the adopted virtual environment.

The fact that the user is simultaneously engaged in the physical and virtual worlds requires the ergonomist to adopt a new attitude, being attentive (1) to the human factors related to the experience of the physical and virtual body in the use of these products and systems and (2) to the users' safety, effectiveness, efficiency and satisfaction.

2 About Virtual Reality (VR)

According to the Encyclopædia Britannica, Virtual Reality (VR) is characterized by the use of computational modeling and simulation to allow a person to interact with an artificial three-dimensional (3-D) visual or other sensory environment. In VR applications, the user is immersed in a computer-generated environment that simulates reality through interactive devices. These devices, which send and receive information, are worn as goggles, helmets, gloves or body suits. Typically, the VR user, wearing a helmet with a stereoscopic screen, sees animated images of a simulated environment.

Although it takes place in a simulated, synthetic environment, VR takes the physical body as a reference and retains characteristics of and similarities with the physical world, so that the user feels present and contextually perceives the experience through his/her own sensation of physical involvement [1].

Similar to a book or a picture, which transports the reader or observer from the physical environment to that of the story or the image, VR transports the person from the physical world to an environment in which he/she is not physically present, although he/she feels as if he/she were [2].

Even when it resorts to fiction, the simulation in this environment presents situations with a degree of realism that allows the user to make decisions and solve problems in the physical world [3].

This feature implies, by definition, the coexistence of biological and virtual bodies that act in a coordinated way, so that the VR user can present him/herself in, interact with and even modify the virtual world [4].

In other words, by using multimedia resources, computer graphics, image processing and other techniques to create synthetic environments, VR allows the body to act both as a support for cultural prostheses (the user dressed with VR devices) and as a sign (the user immersed, metaphorized, in the virtual environment) [4].

In this context, concepts such as immersion, presence, interaction and involvement, fundamental to the study of VR, become indispensable to the understanding of users' physical and psychological experience in these systems [2, 3, 5].

Thus, this study focuses on the concept of immersion, which is related to the psychological state in which the user perceives him/herself as enveloped by, included in and interacting with an environment that provides a flow of stimuli [2].

3 VR Usability Testing

ISO 9241-11 [6] defines usability as the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.

For Jordan [7], effectiveness corresponds to the extent to which a goal is achieved or a task is performed. Efficiency is characterized by the effort required to achieve a goal: the lower the effort, the greater the efficiency. Satisfaction relates both to the level of comfort during product use and to the level of acceptance of the product by its users.
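As a minimal illustration of how these three dimensions are often operationalized in a usability test (task completion for effectiveness, time on task for efficiency, a post-task rating for satisfaction), the sketch below uses hypothetical per-participant records; the field names and the 1-7 rating scale are assumptions, not prescribed by ISO 9241-11 or by Jordan.

```python
# Minimal sketch of operationalizing effectiveness, efficiency and satisfaction.
# The field names and the 1-7 satisfaction scale are illustrative assumptions.
from dataclasses import dataclass
from statistics import mean

@dataclass
class TaskRecord:
    completed: bool        # did the participant reach the goal?
    time_seconds: float    # effort proxy: time spent on the task
    satisfaction: int      # post-task rating, e.g. 1 (worst) to 7 (best)

def summarize(records: list[TaskRecord]) -> dict:
    return {
        "effectiveness_pct": 100 * mean(r.completed for r in records),
        "mean_time_s": mean(r.time_seconds for r in records),   # lower = more efficient
        "mean_satisfaction": mean(r.satisfaction for r in records),
    }

if __name__ == "__main__":
    data = [TaskRecord(True, 42.0, 6), TaskRecord(False, 95.5, 3), TaskRecord(True, 60.2, 5)]
    print(summarize(data))
```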

Iida relates usability to ease and convenience in the use of products, both in domestic and professional environments. However, he warns that usability depends on the interaction between the user, the product, the task and the environment, so that the same product may be suitable for certain types of users and unsatisfactory for others, or suitable for certain situations and not for others [8].

Due to the competition among manufacturers, who constantly seek to improve the quality of their products, and to the accelerating pace of technological evolution over the years, usability has become a strategic factor of competitiveness, differentiation and good practice [9].

In view of the above, for a product to present good usability the following attributes are indispensable: efficiency; efficacy; ease of learning; ease of remembering; few errors in use; feedback on users' actions; safety; and user satisfaction.

In this regard, when evaluating VR devices and applications, it is important that both the physical products and the virtual environments be tested while the user performs a specific task.

4 Physical Evaluation in VR Usability Testing

In a global, competitive economy, the design and development of products and systems need to consider the wide variation in users' characteristics [8].

All this diversity requires adapting the product/system (scale, weight, shape, etc.) to human characteristics, to ensure that the demands of the product/system are compatible with the capabilities and limitations of the user.

Thus, the usability evaluation process is present in product/system design from the initial stages, with the definition of specifications, through prototyping, until culminating in the final product/system.

In this context, product/system usability can be improved by conducting user tests in order to examine the users' interactions with the product/system in detail. This assessment allows a thorough investigation of the design details that may adversely affect the performance and acceptance of these products/systems by their users.

Regarding the physical aspects, this evaluation can be performed both in the laboratory and in the field and includes the measurement of biomechanical and physiological variables, such as torque, muscle activity and heart rate.
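The sketch below illustrates, under simplified assumptions, two summaries commonly derived from such recordings: mean heart rate from beat-to-beat (RR) intervals and the root-mean-square (RMS) amplitude of a surface EMG window as an index of muscle activity; the numeric values are illustrative only.

```python
# Sketch of two common physiological summaries in physical usability testing:
# mean heart rate from RR intervals and the RMS amplitude of an EMG segment.
# Input values are illustrative; real data come from the acquisition hardware.
import math

def mean_heart_rate_bpm(rr_intervals_s: list[float]) -> float:
    """Average heart rate (beats per minute) from RR intervals in seconds."""
    mean_rr = sum(rr_intervals_s) / len(rr_intervals_s)
    return 60.0 / mean_rr

def emg_rms(samples_mv: list[float]) -> float:
    """Root-mean-square amplitude of an EMG window, a standard index of muscle activity."""
    return math.sqrt(sum(x * x for x in samples_mv) / len(samples_mv))

print(mean_heart_rate_bpm([0.82, 0.79, 0.85, 0.80]))   # ~73.6 bpm
print(emg_rms([0.1, -0.2, 0.15, -0.12, 0.08]))
```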

These results are then used to improve the design or even to propose the redesign of a product already on sale.

5 Psychological and Behavioral Evaluation in VR Usability Testing

During usability testing, the participants are often asked to complete tasks and answer questionnaires. However, there are other forms of evaluation, especially with regard to psychological and behavioral aspects in situations of system use.

While performing tasks, users may smile, sigh, blush, squirm in the chair, lean forward, show impatience, and so on. Although non-verbal, these behaviors can be recorded and potentially measured in order to provide input for product redesign.

Both verbal and non-verbal behaviors can be observed. In some cases, these behaviors happen so quickly that the observer cannot record them in time, as occurs with facial expressions, for example. In such situations, video recordings of user actions are often employed [10].

In other cases, vital signs are investigated to assess the physiological response of the user during the task, yielding information about increases in heart rate, respiratory rate, blood pressure, capillary blood glucose and body temperature, among others. Some responses, such as sweating and pupil dilation, occur without conscious control and require monitoring. Such procedures require the ergonomist to use specialized equipment for recording and/or monitoring this information.
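As one possible way to organize such data, the sketch below compares vital signs recorded during the task against a resting baseline and flags relative increases; the 10% threshold and the sign names are illustrative choices, not clinical criteria.

```python
# Sketch of comparing task-phase vital signs against a resting baseline and
# flagging increases above a chosen threshold. Values, names and the 10%
# threshold are illustrative assumptions.
BASELINE = {"heart_rate_bpm": 72, "resp_rate_rpm": 14, "systolic_mmHg": 118, "temp_C": 36.5}
DURING_TASK = {"heart_rate_bpm": 88, "resp_rate_rpm": 19, "systolic_mmHg": 121, "temp_C": 36.6}

def flag_increases(baseline: dict, task: dict, threshold: float = 0.10) -> dict:
    """Return the relative change per sign and whether it exceeds the threshold."""
    report = {}
    for sign, base in baseline.items():
        change = (task[sign] - base) / base
        report[sign] = {"relative_change": round(change, 3), "flagged": change > threshold}
    return report

for sign, info in flag_increases(BASELINE, DURING_TASK).items():
    print(sign, info)
```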

6 Equipment to Capture Users' Behavior

A more refined analysis of verbal and non-verbal behavior in usability testing requires the use of equipment for capturing facial expressions, physiological variations, body temperature variations, eye tracking and brain mapping, among others.

The information obtained with such equipment provides objective and reliable indicators for identifying ergonomic risks and difficulties in users' performance during the task, contributing to an adequate usability evaluation of the system under analysis.

6.1 Videographic Records of the Users’ Actions and Facial Expressions

Recognizing and interpreting users' actions and facial expressions is critical to understanding human communication and interaction.

In addition, it should be noted that facial expressions indicate more accurately what users are really feeling than what they say [10], as well as revealing users' reactions to the product.

In such situations, video records of users' actions and facial expressions contribute to a more detailed analysis of product usability, providing input for ergonomic recommendations and suggestions for improving the evaluated system.

6.2 Eye Tracking

In recent years, the use of eye tracking for the usability evaluation of products and systems has become quite common. With its development, this technology has become increasingly reliable and easy to use [10], offering valuable information about where the user looks while performing the task.

Tullis and Albert [10] define a fixation as a pause in eye movement over a well-defined area of the system screen. In the resulting visualizations, fixations are numbered to indicate their sequence, and the size of each circle is proportional to the duration of the fixation. The movements between fixations, known as saccades, are shown as lines. This real-time eye tracking provides information that would be difficult to obtain in any other way.
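To make the notion of fixations and saccades concrete, the sketch below implements a simple dispersion-threshold (I-DT) detector over raw gaze samples; the sampling rate, minimum duration and dispersion threshold are illustrative assumptions that real studies tune to their equipment, not values taken from [10].

```python
# Minimal sketch of dispersion-threshold (I-DT) fixation detection on raw gaze
# samples. The 60 Hz sampling rate, 100 ms minimum duration and 30 px maximum
# dispersion are illustrative assumptions.

def _dispersion(window):
    """Horizontal plus vertical spread of a window of (x, y) gaze points, in pixels."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(gaze, sample_rate_hz=60, min_duration_s=0.1, max_dispersion_px=30):
    """Return (start_index, end_index, centroid) for each detected fixation."""
    min_samples = int(min_duration_s * sample_rate_hz)
    fixations, start = [], 0
    while start + min_samples <= len(gaze):
        end = start + min_samples
        if _dispersion(gaze[start:end]) <= max_dispersion_px:
            # Grow the window while the points stay tightly clustered.
            while end < len(gaze) and _dispersion(gaze[start:end + 1]) <= max_dispersion_px:
                end += 1
            window = gaze[start:end]
            centroid = (sum(p[0] for p in window) / len(window),
                        sum(p[1] for p in window) / len(window))
            fixations.append((start, end, centroid))
            start = end          # the samples between fixations form the saccades
        else:
            start += 1           # slide past noisy or saccadic samples
    return fixations

# Example: a fixation around (100, 100) followed by a jump to (400, 300).
samples = [(100, 100), (102, 101), (99, 103), (101, 100), (100, 102), (103, 101),
           (400, 300), (402, 301), (401, 299), (399, 300), (400, 302), (401, 301)]
print(detect_fixations(samples))
```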

6.3 Digital Infrared Thermal Imaging

Digital infrared thermal imaging is an advanced non-invasive, painless, precise and safe medical examination technique that captures the heat emitted by the body in the form of infrared radiation.

The records are obtained with a digital thermographic camera that captures the surface temperature of the skin through specific software, which transforms this information into digital images called thermograms.

Since it does not contact the body, the equipment causes no pain or discomfort to the user. Each thermogram is composed of individual pixels, each one representing the exact measurement of the temperature at a specific point of the body, with an accuracy of 0.01 °C. From these images, a colored map of the body surface is obtained.
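Since each pixel of a thermogram carries a temperature value, the image can be handled as a 2-D array; the sketch below summarizes a rectangular region of interest, using a synthetic array in place of a real camera export.

```python
# Sketch of treating a thermogram as a 2-D array of temperatures (one value per
# pixel) and summarizing a region of interest. The array here is synthetic;
# real values would come from the thermal camera's export software.
import numpy as np

# Synthetic 240x320 thermogram around 33 °C with mild noise (illustrative only).
rng = np.random.default_rng(0)
thermogram = 33.0 + 0.4 * rng.standard_normal((240, 320))

def roi_stats(temps: np.ndarray, top: int, left: int, height: int, width: int) -> dict:
    """Mean, minimum and maximum temperature inside a rectangular region of interest."""
    roi = temps[top:top + height, left:left + width]
    return {"mean_C": float(roi.mean()), "min_C": float(roi.min()), "max_C": float(roi.max())}

print(roi_stats(thermogram, top=100, left=150, height=40, width=40))
```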

6.4 Galvanic Skin Response (Skin Conductance)

Galvanic skin response is a technique used to evaluate psychophysiological changes by monitoring the electrical activity of the sweat glands on the palms of the hands and fingertips, regions that are more sensitive to emotions and thoughts.

It allows strong emotions to be detected through electrodes in contact with the fingers and can be used to identify situations that generate stress and anxiety [11].
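A common way to quantify such events is to detect skin conductance responses (SCRs) as upward deflections of the signal; the sketch below uses a simple rise-over-baseline rule, with an illustrative amplitude threshold and sampling rate rather than values taken from [11].

```python
# Minimal sketch of detecting skin conductance responses (SCRs) as upward
# deflections above the recent signal level. The 0.05 microsiemens threshold
# and the 1 s comparison window are illustrative assumptions tuned per study.
def detect_scr_onsets(conductance_us, sample_rate_hz=4, threshold_us=0.05, window_s=1.0):
    """Return sample indices where conductance rises more than threshold_us
    relative to the level window_s seconds earlier."""
    lag = max(1, int(window_s * sample_rate_hz))
    onsets = []
    for i in range(lag, len(conductance_us)):
        if conductance_us[i] - conductance_us[i - lag] > threshold_us:
            # Keep only the first sample of each rising episode.
            if not onsets or i - onsets[-1] > lag:
                onsets.append(i)
    return onsets

signal = [2.00, 2.00, 2.01, 2.01, 2.02, 2.10, 2.18, 2.25, 2.24, 2.20, 2.15, 2.12]
print(detect_scr_onsets(signal))  # index where a response starts
```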

The biofeedback (biological feedback) obtained with this equipment enables the user to voluntarily regulate his/her physiological and emotional reactions, becoming aware of and acting upon this information through learning, training and self-regulation mechanisms.

The sensors are connected to a person whose physiological responses are monitored and sent to a computer that processes the obtained data. By viewing the responses on the computer screen, the user is able to modify his/her own bodily responses, which characterizes the process of self-regulation.

In education and training, biofeedback can be used to measure and increase attention and cognition potentials, speed of decision making, and ability to manage stress in challenging or assessment situations.

6.5 The EEG Technology

Considered the oldest and most common technique for measuring brain activity, the electroencephalogram (EEG) can detect the tiny electrical currents produced by neurons [12].

Each electrode placed on the user's scalp detects the electrical potential difference between the area immediately underneath it and a reference electrode placed at an electrically inert spot (e.g., the earlobe) [12].

The EEG signal comprises three types of neural activity: (1) spontaneous activity, not correlated with any particular task; (2) induced activity, related to the task but not to particular events; and (3) evoked activity, related to particular events [13].

The maps produced from the electrical activity of neurons, as measured by the EEG, illustrate the differential activation of the brain. Although it does not provide good spatial detail, the EEG allows scientists to observe the occurrence of events on a time scale of milliseconds [12].

The EEG record of this neural activity is a graph of electrical brain activity in which the vertical axis represents the voltage difference between two different locations on the user's scalp (measured by the attached electrodes) and the horizontal axis represents the time interval.
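From such a voltage-versus-time record, one frequently derived measure is the power within classical frequency bands (e.g., alpha, 8-13 Hz); the sketch below estimates band power with a plain periodogram on a synthetic segment, leaving out the filtering, re-referencing and artifact rejection a real pipeline would include.

```python
# Sketch of estimating band power (e.g., alpha, 8-13 Hz) from one EEG channel
# using a simple periodogram. Sampling rate and band limits are illustrative.
import numpy as np

def band_power(signal_uv: np.ndarray, sample_rate_hz: float, low_hz: float, high_hz: float) -> float:
    """Sum of periodogram power within [low_hz, high_hz]."""
    signal_uv = signal_uv - signal_uv.mean()            # remove DC offset
    spectrum = np.abs(np.fft.rfft(signal_uv)) ** 2
    freqs = np.fft.rfftfreq(len(signal_uv), d=1.0 / sample_rate_hz)
    mask = (freqs >= low_hz) & (freqs <= high_hz)
    return float(spectrum[mask].sum())

# Synthetic 2-second segment at 128 Hz with a 10 Hz (alpha-band) component.
fs = 128
t = np.arange(2 * fs) / fs
segment = 20 * np.sin(2 * np.pi * 10 * t) + np.random.default_rng(1).normal(0, 5, t.size)
print("alpha:", band_power(segment, fs, 8, 13))
print("beta: ", band_power(segment, fs, 13, 30))
```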

Techniques such as electroencephalography (EEG), which measures and maps the brain's electrical activity, are therefore allies in testing hypotheses about the neural circuits involved in VR users' experience.

These measurements and electrical signals, in turn, can provide detailed information about the duration, time course and location of brain activity in these users, which helps the ergonomist to construct a narrative about the events and psychological aspects related to the use of these products and systems.

6.6 The EEG Emotiv EPOC®

Because of their low cost, simpler versions of the EEG can be found in commercial products [12] such as the Emotiv.

The Emotiv EPOC® consists of a headset with 14 data channels arranged along main and secondary arms, whose sensors capture the electrical activity of the human brain.

The data are obtained through the EEG measurement system, which analyzes the user experience according to the electrical signals produced by the neurons, which in turn are captured by the sensors arranged on the user's scalp.

These data are recorded and stored on a computer by a simplified Brain-Computer Interface (BCI) through the wireless Emotiv EPOC® hardware, which makes it possible to observe what the user does and thinks during the execution of the task [14].
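For illustration only, the sketch below shows the general shape of such a 14-channel acquisition loop that timestamps each sample and appends it to a CSV file; read_sample() is a hypothetical stand-in for the vendor's interface (it is not the Emotiv SDK), and the channel labels follow the 10-20 positions commonly cited for this headset.

```python
# Illustrative shape of a 14-channel EEG acquisition loop. read_sample() is a
# hypothetical stub standing in for the headset vendor's own API; it is NOT
# the Emotiv SDK. Channel labels are the 10-20 positions usually listed for
# this class of headset.
import csv
import random
import time

CHANNELS = ["AF3", "F7", "F3", "FC5", "T7", "P7", "O1",
            "O2", "P8", "T8", "FC6", "F4", "F8", "AF4"]  # 14 channels

def read_sample() -> list[float]:
    """Hypothetical stub: returns one value (microvolts) per channel."""
    return [random.gauss(0, 10) for _ in CHANNELS]

def record(path: str, duration_s: float = 5.0, sample_rate_hz: int = 128) -> None:
    """Append timestamped samples to a CSV file for the given duration."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp"] + CHANNELS)
        for _ in range(int(duration_s * sample_rate_hz)):
            writer.writerow([time.time()] + read_sample())
            time.sleep(1.0 / sample_rate_hz)

record("session.csv", duration_s=1.0)
```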

Currently, Emotiv® uses brain activation as an alternative means of controlling computer games and electronic devices.

7 Conclusion

The evaluation of VR products and systems requires the ergonomist to be aware of the ergonomic constraints and risks related to the use of the physical devices (e.g., goggles, haptic gloves, body suits), as well as to the user's representation and the actions performed in the virtual, simulated environment.

As the user assumes new body configurations, the physical and virtual bodies act in coordination. Although metaphorized as an avatar, the user can both physically feel the experience and learn from it in order to solve problems in the physical world. Paradoxically, the user, as an avatar, takes the physical body and everything that is familiar as references to act and interact in the synthetic, simulated environment.

In such situations, it is up to the ergonomist to investigate how the virtual body interferes with the physical body and vice versa, since the user's experience with the product/system may or may not contribute to its acceptance by VR consumers and, consequently, to greater competitiveness of the product/system.