1 Introduction

Life is a narrative, and meaningful communication needs to be shaped within the boundaries of this narrative. The implication of this statement is that as a teacher you are a storyteller, as a marketing expert you are a storyteller, as an entertainer you are a storyteller—or as Walter Fisher (1987) says: We are all “homo narrans”.

Across cyberspace, there is a new grammar of storytelling. Almost all disciplines are touching upon it, from various perspectives and with diverging terminology, but with one common denominator: the experience of the story is enabled through the use of various digital tools. An enormous media convergence is taking place, evolving our ability to communicate. The well-known media guru Marshall McLuhan would probably have seen it as an extension of our voices as storytellers, or perhaps as a whole nervous system evolving outside our physical bodies. Think of it as the stories told around the campfires of long ago echoing into our time with the help of new tools. Our aim is to understand this media convergence from an interdisciplinary perspective and to learn how to speak the new grammar of digital storytelling. We are embracing it as both content developers and researchers.

Regardless of your purpose as a digital storyteller—be it to sell products, to enhance learning, or simply to entertain—the core of this new grammar is about understanding the audience/learner/customer and how you can engage them in your storyworld across media platforms. Therefore, the concept of user experience (UX) is highly relevant, as it places the human at the centre of any kind of design solution, i.e. user-centred design. Taking this one step further, experience design treats the experience, rather than the technology in itself, as the guiding star of the design. According to this design genre, any design should start with a "Why?"—referring to the motives for which end users would want to engage with your product, the so-called be-goals (Hassenzahl 2010; Carver and Scheier 1989). Our emphasis in both research and digital content development lies in experience design, in which technology becomes transparent and human needs and UX guide the design process. It is simply the idea of using technology for all the right reasons (Hassenzahl 2010).

Our aim in this chapter is to describe our notion of UX and audience research related to media content creation and format development. We discuss our mixed research methods in relation to research questions. We further illustrate this with an example of how mixed data can be integrated in UX research during the iterative design of a website, and how eye tracking is an essential method in several steps of the analysis.

2 Experience as a Design Aim

Experience design is becoming more widespread among digital content creators. For instance, instructional designers are starting to realize that the job of designing an e-learning course is not only about delivering material to be learned, but also about managing a learning experience. Matthew Moore wrote a post in The eLearning Guild's discussion forum (30 December 2009) on LinkedIn, titled "Do we need to move from instructional design to experience design?" The topic generated 80 comments in 15 days, which, after a quick glance at the forum, is about 99 % more than usual.

An experience always depends on many interrelated dimensions and aspects, which differ from person to person (Hassenzahl 2010). We are interested in the dimensions of UX created by encountering various media content (mostly digital) and in how we can design media solutions that target specific user needs in order to facilitate a great experience and user satisfaction. When testing digital media content on users, we emphasize the difference between usability and UX. These two aspects of human interaction with any system are two sides of the same coin, whether the focus is on the technological product or the digital media content. Nonetheless, these perspectives raise different questions, require different methods and research instruments, and generate different results. In our research on UX we target both sides of the coin: for instance, users' attributions of product qualities, pragmatic "do-goals", hedonic "be-goals" (Hassenzahl 2010; Carver and Scheier 1989), the affective experience (Watson and Clark 1994; Eilers et al. 1986), needs related to the experience (Sheldon et al. 2001), and mental effort (Zijlstra and van Doorn 1985). We apply several research methods that are specifically chosen to support each other in order to form a holistic picture of the experience. This allows us to capture a broad perspective of UX, and we can validate findings by triangulating data collected with a variety of research instruments. This will be explained in the methods section discussing mixed research.

Hassenzahl (2010) describes experiences as both actual and reflective. Actual experiences are in the moment, objective phenomena dealing with usability, functionality, as well as visceral stimulation. Reflective experiences are subjective memories activated by experienced emotions, benefits felt, as well as stories we remember and tell about our experiences to other people. There are four essential characteristics to an experience: it is always subjective, holistic, situated, and dynamic. Given these characteristics, experiences can be shaped. This fact is something that the field of human–computer interaction still needs to recognize on a wider scale (Hassenzahl 2010), and it is something that we find of great importance in our research on UX and media content development.

Experience can be regarded as meaning making. Four concepts can guide us in the design of technology and digital content in order to support the end-user experience: (1) experience in interaction involves the dynamic process, (2) experience in interpretation involves how we actively construct meaning, (3) experience is both what the designers offer and what the users bring to it, and (4) experience involves four inseparable and interrelated dimensions, experienced as a coherent whole: practical, cognitive, emotional, and sensual involvement (Vyas and van der Veer 2006).

Table 1 illustrates the categorization of experience according to Hassenzahl (2010), as well as Vyas and van der Veer (2006). We can look at an experience from three perspectives based on time: before, during, and after—or in other words, pre-experience, actual experience, and post-experience. This has implications for the methods we choose for looking at different aspects of an experience.

Table 1 Perspectives on the structure of an experience

3 Needs as Determinants of Experiences

The ultimate goal of the digital media content and products we are designing and developing is to invoke a positive UX. One approach to this endeavour is to look at the role fundamental human needs play in the content of our experiences. We suggest that the source of UX lies in the needs that drive our actions and choices (Hassenzahl 2010; Wiklund-Engblom et al. 2009). Research on needs addresses the content of experiences, rather than focusing on the structure of experience. Where, for instance, McCarthy and Wright's (2004) experience framework stops with the observation that experiences have an emotional–motivational and a meaning-making thread, the needs perspective clarifies where the emotion, motivation, and meaning come from, as well as what their essence is. This makes it much easier to address experiences in the context of product and media content design (Wiklund-Engblom et al. 2009).

According to Sheldon et al. (2001), fundamental needs are limited in number. By looking at various theories, they found ten needs that seemed to represent the most important categories. This summary of needs builds upon well-known theories, such as Ryan and Deci's self-determination theory, Maslow's theory of personality, Epstein's cognitive–experiential self-theory, and Derber's lay theory of human needs. The ten needs are: self-esteem, autonomy, competence, relatedness, pleasure-stimulation, physical well-being, self-actualization-meaning, security, popularity-influence, and money-luxury.

Sheldon et al. (2001) wanted to know which basic human needs were most related to satisfying events. They developed a Likert-scale-based questionnaire including three items for each of the ten needs. In a number of studies, college students were asked to think of the most satisfying event they remembered from the last month, and then fill in the questionnaire while thinking of this event. These studies consistently showed that the four needs people most related to a satisfying event were self-esteem, relatedness, competence, and autonomy. We are interested in creating positive experiences from digital media content. The needs approach is one way of defining the intended experience. This approach was, for example, applied in the pre-design phase of the multiplatform music show The Mill Sessions. The ten fundamental needs were used as a base for a survey to capture people's conceptions of what a great music experience is in relation to the ten needs (Esch et al. 2011). The aim of the study was to confirm the design idea of the show.
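As a minimal illustration of how ratings of this kind can be summarized, the sketch below averages three hypothetical Likert items per need and ranks the needs by their mean score; the item names, the 1–5 scale, and the data are our assumptions, not the original instrument.

```python
# Sketch: summarizing a Sheldon-style needs questionnaire (assumed item
# names, scale, and data; not the original instrument).
import pandas as pd

# One row per respondent; three 1-5 Likert items per need (only two needs shown).
responses = pd.DataFrame({
    "self_esteem_1": [4, 5, 3], "self_esteem_2": [5, 4, 3], "self_esteem_3": [4, 4, 2],
    "autonomy_1":    [3, 5, 4], "autonomy_2":    [4, 5, 4], "autonomy_3":    [3, 4, 5],
})

# Average the three items belonging to each need.
needs = {"self_esteem": ["self_esteem_1", "self_esteem_2", "self_esteem_3"],
         "autonomy":    ["autonomy_1", "autonomy_2", "autonomy_3"]}
need_scores = pd.DataFrame({need: responses[items].mean(axis=1)
                            for need, items in needs.items()})

# Rank needs by their mean score across respondents (most salient first).
print(need_scores.mean().sort_values(ascending=False))
```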

4 Research Questions

Our questions involve looking at UX from (1) an ontological perspective: what a great UX is in specific contexts; (2) a methodological perspective: which methods should be applied in various studies in order to capture end-users' experiences; and (3) a content design perspective: how to design media content for a great UX.

The media convergence culture is upon us, and new generations more or less expect it. The convergence involves "the flow of content across multiple media platforms, the cooperation between multiple media industries, and the migratory behavior of media audiences who will go almost anywhere in search of the kinds of entertainment experiences they want" (Jenkins 2006, p. 2). To experience the flow of a story is often the aim of digital content. When the story flows between different media platforms, it becomes transmedia storytelling. One of the most obvious questions designers face when creating transmedia solutions is how to keep the audience with the flow of the story across options of content and discontinuities between technological platforms. The experience of the story (transmedia content) is intended to evoke feelings and thoughts by allowing for identification and participation. The question is how this transmedia UX is optimized—especially at the trigger points where users need to be motivated to move with the flow of the story. This is where UX research becomes critical in order to spot flaws in the design, as well as to explore potential design choices. To design for a transmedia experience, we have to account for media discontinuity as part of the experience, and this is not necessarily a negative thing. However, it is not only about delivering content; it is also about participation—letting the audience become part of a story in order to enable a deeper level of identification.

For transmedia learning, UX research further involves the dimensions of the learning process with its specific aims, such as motivating self-regulation, reflection, and higher-order thinking. From the user's perspective, transmedia storybuilding is the process of taking in a story told across platforms and creating your own version of it through your own choices and your own experience of the content (Wiklund-Engblom et al. 2013). The transmedia seams across media discontinuities represent the choices you make for your own learning as you move with the flow in co-constructing the content. The totality of your experience becomes your own storybuilding and transmedia learning process, that is, what you have learned. The what, how, and why of your choices for constructing your own story reveal important aspects of your UX. These are, for instance, the physical, sensory, cognitive, and affective influences on the experience. Other aspects are expectations, prior knowledge, prior experiences, social context, etc.

5 Mixed Methods

The UX of media users is in many ways an abstract notion. How best to measure it is a debatable question. For instance, multidimensionality is an essential feature of UX, yet few studies explore the interrelations of its dimensions (Bargas-Avila and Hornbæk 2011). Over the years, we have used many different methods to collect both quantitative and qualitative data targeting the multidimensionality of UX, for instance, users' observable behaviour, a variety of physiological reactions, product evaluations and attitudes, judgements of affect and effort, as well as users' needs and reflections.

"Mixed methods" is the most common term for the blending of research methods and methodological paradigms in a study. Broader terms are mixed research or integrative research, which according to Johnson and Onwuegbuzie (2004) provide a more inclusive meaning. Creswell and Plano Clark (2007, p. 5) refer to mixed methods research "as a research design with philosophical assumptions as well as quantitative and qualitative methods". Others argue that the minimum criterion for defining a research design as mixed methods is an "interdependence of component approaches during the analytic writing process" (Bazeley and Kemp 2012, p. 70). The key is the integration of data at some phase of the analysing process (Creswell et al. 2003; Bazeley and Kemp 2012).

The purpose of using multiple methods in our UX research has been to capture the phenomenon as closely as possible, not being restricted by any methodological paradigm, but rather exploring how the use of methods can be expanded. Bazeley and Kemp (2012, p. 16) suggest that researchers should take "every opportunity to fully exploit the integrative potential of mixed data sources and analysis methods" if it benefits the research by resulting in better validity and stronger conclusions. Our goal is to find the best ways of integrating mixed data in order to answer research questions. Our approach to research is, therefore, pragmatic. This is understandable, as we are looking at different media solutions in various media contexts, and these are forever changing into innovative new forms. Similarly, we need to constantly re-evaluate the best ways to answer our research questions. For instance, new technological solutions and software provide new possibilities for analysing larger amounts of mixed data, which was not an option before because of the time it demanded.

Although mixed methods research has been carried out in various forms since the 1950s, it is only in recent years that it has become a more widely acknowledged research approach (Creswell and Plano Clark 2007). Historically, there have been two camps of purists when it comes to methodological approaches: the quantitative vs. the qualitative advocates. These often argue for the incompatibility thesis: that the two paradigms never should, or even could, be mixed. However, nowadays there is a new set of advocates suggesting that the focus should rather be on the many similarities between the two approaches. In this debate, mixed methods research is seen as a third paradigm drawing on the strengths and counteracting the weaknesses of the first two methodological research approaches. As such, qualitative and quantitative research are seen as end points on a continuum, where mixed methods are represented by the large middle area (Johnson and Onwuegbuzie 2004).

5.1 Differentiating Methods

We explore methods to discover the added value of combining instruments and approaches for collecting data. This methodological research process in itself is one of our areas of interest. We use a number of laboratory-based methods to collect both quantitative and qualitative data targeting the multidimensionality of UX. In our UX laboratory, we have equipment such as eye trackers, electroencephalography (EEG) helmets, and various psychophysiological instruments for measuring skin response and heart rate. In combination with these methods, we use stimulated recall interviews, think-aloud protocols, as well as traditional questionnaires for measuring emotional experiences, needs, and evaluations targeting hedonic and pragmatic qualities of media solutions.

In order to better understand the varieties of data we collect, we visualize them in Table 2. The table is a matrix of the methods used, categorizing them according to the type of data generated. The matrix is organized by four categories of data characteristics: objective, subjective, formative (qualitative), and summative (quantitative). Which methods we use in a study is, of course, determined by the particular research questions at hand—usually generated by the phase of development during the iterative design of the digital content.

Table 2 Methods matrix illustrating types of mixed data

The table further illustrates how the differing data target a variety of research questions. In the following, the four questions heading each of the four centre cells will be described in relation to the variety of methods we employ.

Question 1: What did they do?

Observations of user activities and eye movements recorded on the screen generate objective formative data. User activities on the screen are recorded by video and audio using the functions of the TobiiStudio recording software. The qualitative eye-tracking data, i.e. the eye movements or scan paths, which are synchronized with the screen recording, can be imported as video into, for instance, the QSR NVivo analysing software. Eye-tracking data may be used as a base for a narrative description of the event, or for describing and categorizing activities and task performance. Participants are also filmed during the session. These video recordings are used as a validity check for psychophysiological reactions, which are sensitive to body movements and other disturbances.

Question 2: Why did they do/think what they did?

Interviews and concurrent think aloud by participants during task performance generate subjective formative data. Instant recall interview questions relating to usability and UX are often asked after each task during a test session. Screen recordings, including eye tracking, are used as stimuli for the stimulated recall interviews. Furthermore, the participants are often encouraged to think aloud during task performance, in order to make their thought processes and motivations for actions visible. This is referred to as concurrent think aloud, in contrast to retrospective think aloud, where participants reflect on their actions while watching their eye movements after the actual task performance is over (Tobii 2009). This is an example of the difference between researching UX as a process or as a memory in retrospect. Interviews and the concurrent think aloud are usually transcribed and may be imported into, for instance, the QSR NVivo software for qualitative categorization and/or mixed analyses in which they are compared and correlated with quantitative data. The subjective formative data are in many ways the most critical in determining the essence of the UX, and also for validating objective data analyses.

Question 3: How much reaction and action was measured?

Objective summative data are generated by measuring psychophysiological reactions and eye movements. Skin response data are measured with the Affectiva Q-sensor or with clip-on sensors on the index finger. The sensor registers changes in the electrical conductance of the skin that are driven by sympathetic or parasympathetic nervous system activation controlled by the brain. The sympathetic nervous system is activated when we have an emotional response or a stress response to stimuli. In contrast, parasympathetic activation means that the subject is calm and has low stress levels. High skin conductance, or electrodermal activation (EDA), levels indicate sympathetic activation and emotional response; low EDA levels indicate parasympathetic activation and low emotional stress levels (Poh et al. 2010). The Affectiva Q-sensor is placed on the wrist and is unobtrusive to the test persons. EDA data are transferred from the sensor to a computer and synchronized with eye-tracking data and video recordings for the analysis. By defining areas of interest (AOI) in the eye-tracking data, we can correlate the user's attention to these AOIs with psychophysiological reactions during the defined attention span.
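As a minimal sketch of what such a correlation can look like at the data level, the example below averages EDA samples within each fixation window and aggregates them per AOI. The column names, sampling rate, and values are our assumptions, not the actual Tobii or Affectiva export formats.

```python
# Sketch: mean EDA level while the gaze rests inside a given AOI, assuming
# both streams have already been aligned to a common timeline in milliseconds.
import pandas as pd

# Hypothetical fixation export: one row per fixation with start/end time and AOI label.
fixations = pd.DataFrame({
    "start_ms": [0, 800, 2100],
    "end_ms":   [600, 1900, 3000],
    "aoi":      ["header", "menu", "header"],
})

# Hypothetical EDA stream sampled every 100 ms (microsiemens).
eda = pd.DataFrame({
    "time_ms": range(0, 3000, 100),
    "eda_uS":  [0.41, 0.42, 0.45, 0.50, 0.48, 0.47, 0.46, 0.52, 0.58, 0.61,
                0.60, 0.57, 0.55, 0.54, 0.53, 0.52, 0.55, 0.59, 0.62, 0.63,
                0.61, 0.58, 0.56, 0.54, 0.53, 0.52, 0.51, 0.50, 0.49, 0.48],
})

def mean_eda(row):
    """Average the EDA samples falling inside one fixation's time window."""
    window = eda[(eda["time_ms"] >= row["start_ms"]) & (eda["time_ms"] < row["end_ms"])]
    return window["eda_uS"].mean()

fixations["mean_eda"] = fixations.apply(mean_eda, axis=1)
print(fixations.groupby("aoi")["mean_eda"].mean())  # arousal per attended AOI
```

In practice, the timestamps of the two instruments first have to be aligned to a shared clock; this is the synchronization step referred to above.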

Participants' eye movements are recorded using the TobiiX120 eye tracker and the TobiiStudio recording software. Eye movements can be categorized in two ways: fixations, when the gaze rests on something we are looking at, and the movements in between fixations, which are called saccades. The eye tracker records both fixations and saccades (Tobii 2009). This enables us to measure, for instance, time spent in areas of interest, initial perception, and the order of actions taken and fixations made within the media environment. An analysis of the quantitative eye-tracking data would include measuring fixations and saccades in AOIs, mouse clicks, time, etc. The quantitative eye-tracking data, i.e. the number and length of fixations within pre-defined AOIs and areas of non-interest (AOINS), may be exported as an Excel file, which enables import into, for instance, the Statistical Package for the Social Sciences (SPSS) or the QSR NVivo software.
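To illustrate this step, the sketch below computes fixation counts and total fixation durations per participant and AOI from a hypothetical export table; the column names and values are our assumptions rather than the actual TobiiStudio export format.

```python
# Sketch: basic AOI statistics from a hypothetical fixation export
# (column names and values are assumed, not the actual TobiiStudio format).
import pandas as pd

export = pd.DataFrame({
    "participant": ["P1", "P1", "P1", "P2", "P2"],
    "aoi":         ["logo", "menu", "logo", "menu", "logo"],
    "duration_ms": [220, 340, 180, 410, 260],
})

# Count of fixations and total fixation duration per participant and AOI.
stats = (export.groupby(["participant", "aoi"])["duration_ms"]
               .agg(fixation_count="count", total_duration_ms="sum")
               .reset_index())
print(stats)

# A table like this can be written out for import into SPSS or NVivo.
stats.to_csv("aoi_statistics.csv", index=False)
```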

Brain waves (EEG data) are registered with the Emotiv EPOC, which includes 14 sensors placed on the scalp. The sensors capture the electrical signals the brain produces, from which indications of a person's emotional experience can be derived. In addition to raw EEG data, the EPOC uses an algorithm to estimate a person's levels of frustration, engagement, and meditation related to the experience under study.
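The EPOC's detection algorithms are proprietary, so the sketch below is only a loose illustration of one common low-level building block for indices of this kind: estimating spectral band power from a raw EEG channel with the Welch method. The 128 Hz sampling rate, the band limits, and the beta/alpha ratio as an engagement proxy are our assumptions, not the EPOC's actual computation.

```python
# Sketch: band power from one raw EEG channel (illustrative only; this is
# NOT the Emotiv EPOC's proprietary frustration/engagement algorithm).
import numpy as np
from scipy.signal import welch

fs = 128                       # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)   # 10 s of synthetic signal
# A 10 Hz (alpha-like) oscillation buried in noise stands in for real EEG.
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)

def band_power(low_hz, high_hz):
    """Approximate power in a frequency band by summing the PSD bins."""
    mask = (freqs >= low_hz) & (freqs < high_hz)
    return psd[mask].sum() * (freqs[1] - freqs[0])

alpha = band_power(8, 13)
beta = band_power(13, 30)
print("beta/alpha ratio (a crude engagement proxy):", beta / alpha)
```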

Question 4: Who are they: identity and attributes?

Demographic data about the participants, as well as questionnaire data covering attitudes, needs, preferences, etc., can be used to describe the users in relation to the experience they had during their activities in the session. These kinds of instruments generate subjective summative data. As a post-questionnaire, we often use the abridged version of the AttrakDiff2 (Hassenzahl and Monk 2010), developed for evaluating interactive products. It consists of ten seven-point semantic differential items, including four items measuring pragmatic quality (confusing–structured, impractical–practical, unpredictable–predictable, complicated–simple), four items measuring hedonic quality (dull–captivating, tacky–stylish, cheap–premium, unimaginative–imaginative), and two items measuring general product evaluation, i.e. goodness (good–bad) and beauty (beautiful–ugly). Traditionally, subjective summative data are mostly used in studies aiming for generalizations based on large data samples. We, however, use them as indications in product evaluations, in order to see whether the targeted object of the evaluation meets the standards aimed for in the product development. These quantitative data can further be used in mixed analyses, in correlation with qualitative data or with objective summative data, such as eye-tracking measures. This will be further explained in the example below.
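As a simple illustration, the sketch below averages hypothetical responses on these ten seven-point items into pragmatic quality, hedonic quality, goodness, and beauty scores; the item keys and data are our own shorthand, not the official AttrakDiff2 coding.

```python
# Sketch: scoring the abridged AttrakDiff2 (item keys are our own shorthand,
# data are invented for illustration).
import pandas as pd

# One row per participant, responses on a 1-7 semantic differential scale.
ratings = pd.DataFrame({
    # pragmatic quality items
    "structured": [6, 5], "practical": [6, 6], "predictable": [5, 4], "simple": [6, 5],
    # hedonic quality items
    "captivating": [4, 6], "stylish": [5, 6], "premium": [4, 5], "imaginative": [3, 6],
    # general evaluation items
    "good": [6, 6], "beautiful": [5, 6],
})

scores = pd.DataFrame({
    "pragmatic_quality": ratings[["structured", "practical", "predictable", "simple"]].mean(axis=1),
    "hedonic_quality":   ratings[["captivating", "stylish", "premium", "imaginative"]].mean(axis=1),
    "goodness":          ratings["good"],
    "beauty":            ratings["beautiful"],
})
print(scores.mean())  # product-level evaluation averaged across participants
```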

For the non-verbal measurement of mental effort, positive/negative affect, and enthusiasm/serenity, we use the Subjective Mental Effort Questionnaire (SMEQ) and the Self-Assessment Manikin (SAM). The SMEQ is a self-report questionnaire measuring the amount of mental effort invested during task performance. A numeric scale runs parallel to a verbal scale, ranging from 0 (no effort at all) to 220 (very much effort), so the participant can judge his/her mental effort according to both scales (Eilers et al. 1986; Zijlstra 1993). The SAM scale, developed by Lang (1980), is a non-verbal, pictorial questionnaire measuring general emotional states, including valence, arousal, and dominance. (The dominance scale is not included in our research at this point.)

There are a variety of reasons for including these questionnaires in our hands-on system testing. First of all, the questionnaires are applicable to studies conducted in different contexts, on different target groups, and on different media content as well as media technology. As the scales are short and easy to fill in, it is possible to measure test participants' experiences several times during a media encounter.

5.2 Example of Mixed Research for Measuring UX

The integration analysis, i.e. mixed research using both qualitative and quantitative data, should be guided by key issues related to the research questions, rather than separating the analysis by method (Bazeley and Kemp 2012). The following example of an integration analysis is guided by the aim to explore usability and UX of a website (Table 3).

Table 3 Type of data, methods, and research targets in relation to UX key issue

Key issues concern, for instance, first impressions, perceived purpose, and motivation. In order to illustrate the potential of integrating mixed methods in analysis, we will present one key issue targeting the relation between first impression, perceived purpose, and motivation.

The first impression we get of something affects our experience. Therefore, participants' first impressions are crucial in determining UX. They are also important for determining usability, as the functions of the website need to be designed to guide the user in a positive direction from the first point of entry. By using mixed methods, we can look at this key issue from several perspectives:

  • Users’ initial perceptions are recorded from their eye movements

  • Their first thoughts while using the site are recorded from their concurrent think aloud

  • Their physiological reactions (EDA) are recorded, from which the first encounter and first task can be extracted

  • Time to their first fixation on the screen can be measured from the eye-tracking statistics

  • Their impressions of the site are discussed during the stimulated recall interview

Table 4 lists the different methods we can use to approach this key issue. It is an example of how multiple methods are employed in order to answer research questions related to UX. In the table, research targets are separated by the kind of data generated (as described earlier in Table 2).

6 Iterations of Analysis

NVivo nodes (categories) are created for important aspects related to the first impression using the qualitative data. Participants are clustered (fast vs. slow) by how fast they make their first fixation on the website, using the eye-tracking data. They are further clustered by their physiological reactions at the first fixation, for which the EDA data and eye-tracking data are synchronized. These clusters are imported into NVivo as case attributes linked to each participant. All case attributes can be cross-tabulated in a matrix in order to explore the qualitative data, which will be sorted according to the matrix cross-tabulations as shown in Table 4.

Table 4 Matrix illustrating integration in analysis using mixed methods
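A minimal sketch of how the fast/slow and EDA clusters described above might be derived and written out as a participant attribute table follows; the median splits, column names, and values are our assumptions, and the exact import format for NVivo case classifications may differ.

```python
# Sketch: deriving fast/slow and high/low EDA clusters per participant and
# exporting them as a case attribute table (assumed column names and cut-offs).
import pandas as pd

metrics = pd.DataFrame({
    "participant": ["P1", "P2", "P3", "P4"],
    "time_to_first_fixation_ms": [180, 420, 250, 610],
    "mean_eda_first_fixation_uS": [0.42, 0.55, 0.61, 0.38],
})

# Median split on time to first fixation: fast vs. slow.
median_ttff = metrics["time_to_first_fixation_ms"].median()
metrics["first_fixation_cluster"] = metrics["time_to_first_fixation_ms"].apply(
    lambda t: "fast" if t <= median_ttff else "slow")

# Median split on EDA level at the first fixation: high vs. low.
median_eda = metrics["mean_eda_first_fixation_uS"].median()
metrics["eda_cluster"] = metrics["mean_eda_first_fixation_uS"].apply(
    lambda e: "high" if e >= median_eda else "low")

# Written out as a table whose rows can be linked to cases in NVivo.
metrics.to_csv("participant_attributes.csv", index=False)
print(metrics)
```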

Furthermore, our hypothesis is that a person's motivation for, and perceived purpose and value of, a website will influence his/her first impression. The scaled questionnaire variables related to perceived purpose are dichotomized (high vs. low), and participant clusters are made based on these. Thereafter, the clusters are imported into NVivo as case attributes. The interview discussions related to motivation are categorized and coded. NVivo nodes for motivation are then used in a matrix, as presented in Table 3, in the same manner as described above.
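To show what such a matrix amounts to in data terms, the sketch below cross-tabulates hypothetical coded interview segments against a dichotomized perceived-purpose attribute; the node names, the midpoint cut-off, and the data layout are our assumptions, not NVivo's actual export format.

```python
# Sketch: cross-tabulating coded interview segments by a dichotomized
# questionnaire attribute (hypothetical data, not NVivo's export format).
import pandas as pd

# One row per coded interview segment.
coded_segments = pd.DataFrame({
    "participant": ["P1", "P1", "P2", "P3", "P3", "P4"],
    "motivation_node": ["curiosity", "task_focus", "curiosity",
                        "task_focus", "task_focus", "curiosity"],
})

# Perceived-purpose scores dichotomized at an assumed scale midpoint of 4.
purpose = pd.DataFrame({
    "participant": ["P1", "P2", "P3", "P4"],
    "perceived_purpose": [6.0, 3.5, 5.5, 2.0],
})
purpose["purpose_cluster"] = purpose["perceived_purpose"].apply(
    lambda s: "high" if s >= 4 else "low")

# Matrix: how often each motivation category was coded within each cluster.
merged = coded_segments.merge(purpose[["participant", "purpose_cluster"]],
                              on="participant")
print(pd.crosstab(merged["motivation_node"], merged["purpose_cluster"]))
```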

7 Software for Data Manipulation and Synchronization

The rich data of varied types produced during investigations are a concern, as it is very time consuming to analyse mixed data of these kinds. Therefore, we are further developing our software program, eValu8, to assist in the analysing process of data collected with the above-mentioned instruments. Important steps of analysis according to the mixed research process model are: data reduction, data display, data synchronization, data transformation, data comparison, data correlation, data consolidation, as well as data integration. Our aim is to develop the eValu8 software into a useful tool that assists these steps of the analysing process. It is also important that eValu8 can be synchronized with other analysing tools, such as SPSS and the QSR NVivo software, as these are indispensable in, for instance, the data reduction process (Table 5).

Table 5 Examples of qualitative and quantitative data manipulation and synchronization in mixed research
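As one concrete instance of the data transformation and data correlation steps listed above, the sketch below "quantitizes" qualitative code counts per participant and correlates them with a quantitative product evaluation score; the variable names and data are hypothetical, and the sketch is not part of eValu8 itself.

```python
# Sketch: data transformation (quantitizing code counts) followed by data
# correlation with a quantitative measure (hypothetical names and values).
import pandas as pd

# Qualitative side: number of coded "confusion" remarks per participant.
code_counts = pd.DataFrame({
    "participant": ["P1", "P2", "P3", "P4", "P5"],
    "confusion_remarks": [0, 3, 1, 4, 2],
})

# Quantitative side: AttrakDiff2-style pragmatic quality score per participant.
pragmatic = pd.DataFrame({
    "participant": ["P1", "P2", "P3", "P4", "P5"],
    "pragmatic_quality": [6.2, 3.8, 5.5, 3.1, 4.9],
})

merged = code_counts.merge(pragmatic, on="participant")
# Spearman correlation is a cautious choice for small, ordinal-ish samples.
print(merged["confusion_remarks"].corr(merged["pragmatic_quality"],
                                       method="spearman"))
```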

The core of mixed methods research is legitimation, i.e. assessing the trustworthiness of the data. The mixed data are used for validating findings from both the qualitative and quantitative data and the subsequent interpretations. The legitimation process might include additional data collection, data analysis, and/or data interpretation until as many rival explanations as possible have been reduced or eliminated. In Table 5, we use terminology from mixed methods research (Johnson and Onwuegbuzie 2004) to describe the functions we aspire to in developing the data synchronization tool eValu8.

8 Concluding Thoughts on Mixed Methods in UX Research

When exploring UX in, for instance, a computer-based environment, as in the above-mentioned example, for the purpose of improving the user interface, the research questions are not always obvious or even possible to pinpoint on a specific level. Rather, the researcher must be able to shift focus whenever something of interest arises. The research situation needs to remain flexible to the dynamics of human experiences targeting a specific event or process of activities.

The added value of using mixed methods within the field of UX research is that it provides the potential for diving deeper into the analyses, and we are able to ask more intricate questions of the data. Instead of treating methods as incompatible because they generate different types of data, we would rather see them as pieces of a puzzle, in need of a little twisting in order to fit the larger picture. A variety of pieces can be used to validate perspectives, elaborate on findings, and expand the possibilities for asking questions. However, analysing mixed methods data is messy. It is also very much experimental. As a mixed methods researcher you need to have "flexibility and pragmatism about design, openness to data, and a touch of inventiveness in approach to analysis" (Bazeley 2012, p. 825). One aggravating circumstance is that several iterations of analyses need to be made whenever a study includes rich sources of mixed data. However, this is also positive, as the researcher builds deep levels of meaning through an iterative analysing process (di Gregorio and Davidson 2008).

One thing we are certain about is that a media user's experience needs to be explored as an event in which both objective and subjective data illuminate important parts of the puzzle as a whole. Another methodological reflection is that research methods must be flexible enough to explore an unlimited number of experience dimensions. For us, this is especially interesting as we are involved in both development and content testing research in the field of crossmedia and transmedia creation and innovation, in which the essence of the UX makes the difference between failure and success (Hassenzahl 2010). This kind of exploration of UX is limited if the research remains restricted by paradigmatic boxes and divisional thinking. Our challenge lies in looking beyond the norm of what UX is and how it can be measured, observed, and explored. Therefore, we are advocates of mixing methods, team building, and collaboration across all kinds of borders, be they normative, paradigmatic, scientific, or merely psychological.

There are several challenges to UX research as part of a creative development process. These challenges can briefly be summarized as time, money, and asking the right questions at the right phase of development. Every project presents its own challenges, and for the UX researcher this is a never-ending, but nonetheless exciting, learning process.