1 Introduction

Computing technology is becoming increasingly present in our everyday environment. These technologies are often equipped with user interfaces such as keyboards and touch screens: traditional methods of human–computer interaction (HCI) that typically require focused attention during interaction. As a result of these developments, researchers in the field of HCI have foreseen a challenge in fluently embedding computing technologies in people’s everyday routines [1–3]. To address this challenge, Weiser and Brown envisioned calm technology [3], an approach inspired by the observation that many interactions with the physical world take place in the background or periphery of attention, while they may also engage the center of attention when this is relevant or desired. For example, we are aware of what the weather is like, or we can drink coffee from a cup without conscious thought, while we may also intentionally look outside to see if it is raining, or intentionally sip from our cup to check if the temperature is right. These activities are available to be undertaken in the periphery of attention, but can easily shift to the center of attention and back.

The approach of employing the periphery of attention when interacting with computing technology was initially presented as calm technology [3] and later explored under a number of terms such as ambient information systems [4] and peripheral displays [5]. These research areas focus on presenting information that is to be perceived in the periphery of attention. Recently, a few studies have been conducted under the term peripheral interaction [6–9], aiming to broaden the scope of calm technology by designing not only for the perceptual periphery but also enabling users to physically interact with the digital world in their periphery. The authors have been active in this area by developing and evaluating a number of peripheral interaction designs for a primary school context [9, 10]. These and related studies [6–8] have provided preliminary support for the feasibility of interactions with technology taking place in the periphery of attention.

Given the increasing number of interactive systems that support everyday activities, it seems impossible and undesirable for all technology to be in our center of attention. In fact, it appears inevitable that many interactions with everyday interactive systems will at times take place in the periphery of attention. Since traditional methods of HCI are intended for interaction in the center of attention, we believe that the alternative approach of peripheral interaction may be beneficial for many researchers and practitioners in the area of interaction design.

This paper addresses the question: How can HCI researchers and practitioners anticipate, facilitate and evaluate peripheral interaction with the interactive systems they are studying or developing? After addressing background literature, we explore this question through an in-depth analysis of three case studies on peripheral interaction design and evaluation: CawClock [10], NoteLet [10] and FireFlies [9]. Abstracted from both literature and these case studies, this paper first discusses two generalized essential characteristics of peripheral interaction. Next, we discuss how these characteristics may be taken into account in interaction design and research, by presenting considerations for peripheral interaction.

2 Background

This paper presents characteristics of and considerations for peripheral interaction design and evaluation. In this section, we will first address divided attention and multitasking theory, in which peripheral interaction is grounded. Subsequently, we will discuss examples of related research and design in the area of peripheral interaction.

2.1 Divided attention and multitasking theory

The concept of peripheral interaction originates in the observation that in many everyday life situations, multiple activities can be performed at once. This phenomenon is elaborately addressed by divided attention theory [11, 12], which describes attention as a finite amount of mental resources that can be divided over different activities. These activities can be bodily (e.g. walking), cognitive (e.g. thinking) or sensorial (e.g. listening to music). When such activities require only a few resources, multiple activities can be performed at once. The resource demand of activities depends on several factors, such as the difficulty of the operation. Additionally, the automaticity [13] or habituation [14, 15] of activities influences the amount of resources required: activities that have been trained extensively, such as walking, require only a few mental resources. The division of resources over activities is furthermore influenced by the likelihood of activities being performed, which is managed by the supervisory attentional system [16]. For example, when cooking, one is more likely to open the refrigerator than to start typing an email on the laptop at the kitchen table, even though both activities are equally available. Resources are thus more likely to be allocated to certain activities than to others.

While divided attention theory describes attention as the division of mental resources over different activities, these resources cannot be divided arbitrarily: concurrent multitasking [17] is only feasible under certain conditions. This is clarified in the theory of threaded cognition [18], which describes each activity a person is performing as a cognitive thread. Multiple threads can be active at the same time; for example, we can easily drive a car (thread one) and listen to the radio (thread two) at the same time, as is also evident from multiple resource theory [19]. Next to the so-called central procedural resource, which coordinates the execution of multiple threads, these threads can make use of various ‘peripheral resources,’ such as visual resources, motor resources or memory resources [20]. As described by threaded cognition theory [18], each particular resource can only be used by one thread at a time. For example, since one can only look at one visual object at a time, a person who is driving while using a navigation system can only look at either the road or the navigation system’s display. When both require visual attention, a bottleneck [20] occurs and one of these two threads must wait until the visual resource is free. Therefore, the extent to which two activities can be performed in parallel depends on their stage of execution and the particular resources they require.
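To make the notion of resources and bottlenecks more concrete, the toy simulation below (a sketch of our own, not an implementation of threaded cognition theory or of any published cognitive model) represents each activity as a thread consisting of steps, where each step claims one peripheral resource; when two threads need the same resource in the same time step, one of them has to wait.

```python
# Toy illustration of resource bottlenecks in concurrent multitasking
# (a sketch of our own, not an implementation of threaded cognition theory).
from collections import deque

def run(threads):
    """Advance all threads tick by tick; each resource serves only one thread per tick."""
    queues = {name: deque(steps) for name, steps in threads.items()}
    tick = 0
    while any(queues.values()):
        tick += 1
        claimed = set()                       # resources already in use this tick
        progressed = []
        for name, steps in queues.items():
            if not steps:
                continue
            resource = steps[0]
            if resource in claimed:           # bottleneck: the resource is busy, so this thread waits
                continue
            claimed.add(resource)
            steps.popleft()
            progressed.append(f"{name}:{resource}")
        print(f"tick {tick}: " + ", ".join(progressed))

# Driving and reading the navigation display both need the visual resource,
# so the 'navigation' thread has to wait whenever 'driving' is using it.
run({
    "driving": ["visual", "motor", "visual", "motor"],
    "navigation": ["visual", "visual"],
})
```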

In the area of visual perception, the word periphery is often used when referring to the parts of vision that occur outside the center of the visual field [12]. Authors in the area of HCI generally use the term periphery in a broader context, to name ‘what we are attuned to without attending to explicitly’ [3, p. 79]. In line with divided attention theory, we describe the center of attention as the one activity to which most mental resources are currently allocated, while the periphery consists of all other activities (also see [21]). An activity can therefore be performed in the periphery of attention when another activity is being performed simultaneously in the center of attention, which requires more mental resources.

2.2 Related research and design

The observation that traditional implementations of HCI demand focused attention, which prevents them from being seamlessly integrated into the everyday world, was first made by Weiser et al. [2, 3]. They suggested that computing technology should vanish into the background, not only by ‘hiding’ it in the environment, but rather by integrating its use into the everyday routine such that interactions can take place outside the focus of attention. Weiser and Brown [3] later coined the term calm technology, which ‘engages both the center and the periphery of our attention, and in fact moves back and forth between the two’ [3, p. 79]. As they envisioned, when interactions with technology would be available to be undertaken both in the user’s periphery and center of attention, people could be in control of technology without being overburdened by it. Similar to interactions with our everyday environment, calm technology is intended to support technology in becoming a seamless or unremarkable [1] part of everyday routines.

Building on the ideas of Weiser et al. [3], many researchers have aimed to employ the user’s periphery of attention. Although the initial idea of calm technology did not specifically focus on peripheral perception, by far most of the work it inspired aimed to develop and evaluate visual and auditory displays which subtly present information such that people can perceive it in their periphery of attention [4, 22–26]. An early example of a calm technology design is the Dangling String [3], a ‘plastic spaghetti string’ that spins based on the information sent through the Ethernet cable, forming a visual and auditory display which subtly presents the network activity. Pinwheels [27] is a large-scale installation of pinwheels whose physical motion can represent various types of digital data, such as the activity of people in the room in which it is installed. Water lamp [28] shows the heartbeat of a significant other as shadows of water ripples on the ceiling to promote a feeling of connectedness. SnowGlobe [29] also aims to support social connectedness between two remote living rooms, through subtle light changes on a physical artifact. Specific to the office environment, Audio Aura [26] uses background auditory cues to provide office workers with information such as the availability of colleagues. ShareMon [30] is an application that enables computer users to monitor background file sharing events through audio.

Only a few recent studies are known to have explored physical interactions with technology that take place in the periphery of attention. Edge and Blackwell [7] present a design that consists of digitally augmented physical tokens that can be manipulated at the side of the office workspace, outside the visual focus. StaTube [6] is a peripheral interaction design that can be physically manipulated to set and change the user’s instant messaging status, while the status of contacts is subtly presented through colored light. Similarly, Olivera et al. [8] studied physical six- and twelve-sided dice that could be peripherally rotated and placed on one of their sides to set the user’s social network availability status. PinchWatch [31] is a wrist-worn device that recognizes gestures made with the hand and fingers, such as sliding one finger along another finger. These gestures can be performed during other activities, and they can be interpreted as input by PinchWatch, e.g., to adjust the volume of a music player. Similarly, Whack Gestures [32] are ‘inexact and inattentive interactions’ [32, p. 109] through which a user can respond to a cue on his mobile phone or PDA by firmly striking the device while it is in his pocket.

In everyday life situations, both actions and perceptions seem to shift between the center and periphery of attention. The area of peripheral interaction [6–9], which aims to fluently embed meaningful interactive systems into people’s everyday lives, therefore encompasses both perceptions of and interactions with computing technology. Such perceptions and interactions can take place in the periphery of attention and shift to the center of attention when relevant for or desired by the user. In order to cover a broad range of interaction possibilities, the three case studies we discuss in this paper explore three approaches to peripheral interaction: (1) peripheral perception, (2) physical peripheral interaction and (3) a combination of the two.

3 Peripheral interaction case studies

The aim of this paper is to present characteristics of and considerations for peripheral interaction, which may support HCI researchers and practitioners in anticipating, facilitating and evaluating interactions with everyday interactive systems that can shift between center and periphery of attention. We identified these characteristics and considerations based on extensive previous work in the area of peripheral interaction, represented here by three case studies.

Each of these case studies was conducted in the context of a primary school, with the teachers as the main users of the peripheral interaction designs. The reason for selecting this target group is that the everyday routine of primary school teachers is characterized by a large number of activities, such as explaining lessons to the class and giving instructions individually or in groups. Next to these primary tasks, several secondary tasks have to be performed as well, such as handing out assignments, monitoring the children’s progress, keeping track of the time and preparing the next lesson. Although some of these secondary tasks could valuably be supported by technology, the technologies currently present in the classroom, e.g., interactive whiteboards and desktop computers, seem unsuitable since they require focused attention. We therefore believe that primary school teachers are a promising target group for peripheral interaction.

In the case studies presented in this section, we adopted a research-through-design [33, 34] approach, which involved the design, development and evaluation of prototype versions of interactive systems. These prototypes should be considered research tools developed to explore the concept of peripheral interaction, rather than finished products. Since peripheral interaction aims to enable interactive systems to fluently embed into people’s everyday routines, each prototype was evaluated in the real context of a classroom for a few weeks. The first case study explores peripheral perception of information through a design called CawClock [10] while the second case study involves a design called NoteLet [10] intended for physical interaction that is to take place in the periphery of attention. The third, more elaborate, case study combines peripheral perception with physical interaction in an interactive system called FireFlies [9], which builds on the earlier two case studies.

3.1 CawClock

CawClock [10], see Fig. 1, is an interactive clock intended for the first grades of primary school in the Netherlands. These grades consist of 4- to 6-year-old children, many of whom are not yet able to read the clock. CawClock is intended to support time awareness, and it displays the time as a regular analog clock. Furthermore, four physical tokens are available, each with its own color and an image of an animal on it. The teacher can place these tokens on the clock to mark a time frame. For example, when the teacher wants, at 10.30 h, to instruct the children to work on an assignment until 10.45 h, she can place a token next to the 9 of the clock, where the clock’s minute hand will be at 10.45 h. As a result, the part of the clock between the 6 and the 9 (the current time and the end of the time frame) will be colored in the color of the token. While the time frame is ongoing, a background soundscape is played that corresponds to the animal on the token (e.g., cat sounds, bird sounds), informing the teacher and children that the time frame is ongoing. To indicate approximately how much time has passed, the soundscape gradually changes; the number of animals heard increases toward the end of the time frame. The audio of CawClock is intended to provide peripheral awareness of marked time frames.
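To illustrate the kind of logic such a clock could use, the fragment below is a hypothetical sketch (not the actual CawClock implementation) that maps the current time and a token position to the colored arc on the clock face and derives the number of animal sounds from how far the marked time frame has progressed.

```python
# Hypothetical sketch of CawClock-style time-frame logic (not the actual prototype code).
from datetime import datetime

def minute_to_angle(minute):
    """Map a minute value (0-59) to an angle on the clock face in degrees."""
    return (minute % 60) * 6              # 60 minutes spread over 360 degrees

def arc_for_timeframe(start, end):
    """Angles of the colored arc between the current time and the token position."""
    return minute_to_angle(start.minute), minute_to_angle(end.minute)

def animal_sound_count(start, end, now, max_sounds=5):
    """Gradually increase the number of animal sounds toward the end of the time frame."""
    total = (end - start).total_seconds()
    elapsed = min(max((now - start).total_seconds(), 0), total)
    return 1 + int((elapsed / total) * (max_sounds - 1))

start = datetime(2013, 1, 1, 10, 30)      # token placed at 10.30 h
end = datetime(2013, 1, 1, 10, 45)        # time frame ends at 10.45 h
print(arc_for_timeframe(start, end))                                  # (180, 270): between the 6 and the 9
print(animal_sound_count(start, end, datetime(2013, 1, 1, 10, 42)))   # 4: more sounds near the end
```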

Fig. 1 Illustration of peripheral interaction design CawClock (top) and the prototype version of the design deployed in a primary school classroom (bottom)

A fully functioning prototype version of CawClock was deployed for 2 weeks in a primary school classroom, in which the teacher used the design for 6 days. This deployment was evaluated through informal observations in the classroom, an individual interview with the participating teacher and a group interview with the participating teacher and two of her colleagues.

3.2 NoteLet

NoteLet [10], see Fig. 2, is intended to support the teacher in observing children’s behavior, by enabling him or her to take pictures of the classroom through peripheral interactions on a bracelet. An important secondary task of primary school teachers is to keep track of the children’s development over time, in areas such as motor skills, social skills and language. Observations of children’s behavior are used as input for evaluating these developments. For example, when a teacher sees a child collaborating well with another child, a note needs to be taken. Though important, taking these notes often distracts teachers from their main tasks. NoteLet consists of a bracelet that the teacher can wear around the wrist. When the teacher squeezes his or her wrist, a camera located in the corner of the classroom takes a picture. This picture is stored on the teacher’s computer along with the date and time. Alternatively, teachers can use the back of the bracelet, on which the names of all children are listed. When touching the area next to a name, not only a picture but also the child’s name is stored, making the recorded information more detailed. The teacher can use these pictures every few days when entering observations into the computer. Since NoteLet is a wearable design, it can be at hand at any moment. Taking pictures is intended to be a quick and straightforward action that can potentially be performed in the periphery of attention.
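A minimal sketch of how such a squeeze-to-capture interaction could be structured is shown below; the function names, stored fields and child’s name are our own assumptions for illustration and do not reflect the NoteLet prototype code.

```python
# Hypothetical sketch of NoteLet-style capture handling (not the actual prototype code).
from datetime import datetime

observations = []   # stands in for the log stored on the teacher's computer

def take_picture():
    """Placeholder for triggering the camera in the corner of the classroom."""
    return f"classroom_{datetime.now():%Y%m%d_%H%M%S}.jpg"

def on_squeeze(child_name=None):
    """Called when the bracelet is squeezed, optionally next to a child's name."""
    record = {
        "picture": take_picture(),
        "timestamp": datetime.now(),
        "child": child_name,          # None when the plain squeeze area is used
    }
    observations.append(record)
    return record

on_squeeze()              # quick picture of the whole classroom
on_squeeze("Sanne")       # picture additionally tagged with a (hypothetical) child's name
```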

Fig. 2 Illustration of peripheral interaction design NoteLet (top) and pictures of the NoteLet prototype: manipulating the bracelet to take a picture without (middle) or with (bottom) a name

A working prototype version of NoteLet was deployed in a primary school classroom for 2 weeks. Similar to the deployment of CawClock, the teacher used the design for 6 days, and observations and interviews were conducted for evaluation.

3.3 FireFlies

Building on the CawClock and NoteLet designs, we conducted a third case study in which we developed a design called FireFlies [9]. This design was developed for the third, fourth and sixth grades (children’s ages 6–9) of primary schools in the Netherlands and is intended to support various secondary tasks of teachers. FireFlies, see Fig. 3, is an open-ended design which consists of three separate design elements: the light-objects, the soundscape and the teacher-tool.

Fig. 3 Pictures of the FireFlies prototype: a light-object lit in different colors (top); the teacher-tool when selecting a color, selecting a child’s name and clipped to the user’s clothes; and FireFlies deployed in a primary school classroom

As part of the FireFlies design, each child has a light-object on his or her desk, which can light up in red, green, blue or yellow, or the light can be off, see Fig. 3. While one or more light-objects are on, an ongoing background soundscape of nature sounds is played, depending on the colors that are currently in use. Each color is connected to a specific nature sound: bird sounds (yellow), ocean sounds (blue), cricket sounds (green) and owl sounds (red). The soundscape is designed as a peripheral auditory display, which can be used to obtain overall background awareness of the current colors of the light-objects, without having to look at them. The teacher can set the colors of the light-objects and thereby influence the soundscape through interactions on the teacher-tool, see Fig. 3. This is done by first selecting a color using the slider on the top of the tool. Each child is represented by a bead attached to a string on the bottom of the tool. To set a child’s light-object to the selected color, the teacher squeezes the corresponding bead. Alternatively, teachers can set all light-objects to the same color at once using the button labeled ‘everyone’ on the top part of the teacher-tool. The teacher-tool can be clipped to the teacher’s clothes to easily carry it around the classroom. The interactions with the teacher-tool are intended to be quick and easy so that they can be performed during the everyday routine in the periphery of attention. The purpose of FireFlies is open-ended: it is not predefined for which goals and at which moments FireFlies should be used; this can be chosen by the teacher. We thereby aimed to make sure that teachers would be able to use FireFlies for a personally relevant goal.
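The interaction model described above can be summarized in a short sketch; the class, method and children’s names below are hypothetical and merely illustrate the mapping between slider, beads, light-objects and soundscape, not the actual FireFlies implementation.

```python
# Hypothetical sketch of FireFlies-style teacher-tool logic (not the actual prototype code).
COLOR_SOUNDS = {"yellow": "birds", "blue": "ocean", "green": "crickets", "red": "owls"}

class TeacherTool:
    def __init__(self, children):
        self.selected_color = None
        self.lights = {child: None for child in children}   # None means the light-object is off

    def slide_to(self, color):
        """Select a color with the slider on top of the tool."""
        self.selected_color = color

    def squeeze_bead(self, child):
        """Set one child's light-object to the currently selected color."""
        self.lights[child] = self.selected_color

    def press_everyone(self):
        """Set all light-objects to the selected color at once."""
        for child in self.lights:
            self.lights[child] = self.selected_color

    def soundscape(self):
        """Nature sounds for every color currently in use, as peripheral auditory feedback."""
        return {COLOR_SOUNDS[color] for color in self.lights.values() if color is not None}

tool = TeacherTool(["Anna", "Bram", "Chris"])   # hypothetical names
tool.slide_to("green")
tool.squeeze_bead("Anna")                       # give Anna a compliment: her light turns green
print(tool.soundscape())                        # {'crickets'}
```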

A fully functioning prototype of FireFlies was deployed in four different primary school classrooms for 6 weeks each. Participating teachers used FireFlies to indicate what the children were expected to do (e.g., to work independently on a task or to collaborate with their neighbors) and to communicate short messages to individual children (e.g., calling a child to the teacher, sending a child to work on the computer, giving a child a compliment). The deployments of FireFlies were evaluated through formal and informal video analyses and through interviews with teachers and children.

3.4 Connection between case studies

Of the three presented case studies, FireFlies was clearly the most elaborate: the design built on the designs of CawClock and NoteLet, and the evaluation of FireFlies was much more extensive than the evaluations performed in the two earlier case studies. As a result, we gained more elaborate insights into, and more detailed examples of, peripheral interaction in the FireFlies case study, and several of these insights also confirmed findings of the studies with CawClock and NoteLet. While many of the examples we describe in the coming sections therefore come from the FireFlies case study, the generalized characteristics and considerations we present in this paper resulted from all three case studies.

4 Characteristics of peripheral interaction

The aim of this paper is to present generalized insights into peripheral interaction design and evaluation in order to support HCI researchers and practitioners in anticipating and facilitating their designs being used both in the periphery and the center of attention. As a first step toward this objective, this section presents two main characteristics of peripheral interaction: shifts between center and periphery of attention and peripheral interaction’s personal nature. These characteristics are elaborated by discussing how they are grounded in our case studies and by underpinning them with theory. In Sect. 5, we explain how these characteristics can be considered in the design and evaluation of peripheral interaction.

4.1 Shifts between center and periphery of attention

The intention of peripheral interaction is to enable everyday interactive systems to be available in the periphery of attention, from where they may easily shift to the center of attention and back. Such shifts are therefore an important characteristic of peripheral interaction. In our case studies, we gained more specific insights into how such shifts may take place, which we will elaborate on in this section. We start with a detailed look at how single interactions can shift between center and periphery, followed by a contextual look in which we discuss the relation of these shifts to the context in which they take place.

4.1.1 A detailed look

In the evaluations of NoteLet and FireFlies in particular, we found it valuable to split up the interactions into smaller stages of action when discussing whether they took place in the participant’s periphery of attention. In other words, single interactions shifted from the center to the periphery of attention and back, in between different stages of the interaction. We can clarify this by discussing peripheral interaction in the light of Norman’s action cycle [35], see Fig. 4. Norman’s action cycle is a frequently used model to describe interactions with technology (see, for example, [36, 37]), and it seems particularly suitable to describe peripheral interaction as well. In our view, peripheral interaction encompasses both action and perception, and Norman’s action cycle clearly binds these two aspects of interaction in one comprehensive model.

Fig. 4 Norman’s action cycle [35, p. 47]

According to Norman’s action cycle, an action consists of seven stages. In order to discuss how interactions may shift between center and periphery of attention, we will apply this model to example interactions with FireFlies. In Fig. 5, we present three example interactions, which are inspired by the teacher’s interactions we observed during the deployment of FireFlies [9]. For each example, Fig. 5 illustrates the way it complies with Norman’s action cycle, and its potential shifts between center and periphery of attention. The illustrations in Fig. 5 are hypothetical and intended to feed the discussion below rather than to provide an accurate and conclusive overview of how the participating teachers’ interactions with FireFlies shifted between the center and periphery of attention.

Fig. 5 Three example interactions with FireFlies, and the way these examples may shift between the center and periphery of attention at different stages of Norman’s action cycle [35]. The start of each interaction is indicated by a black circle and the end by a short black bar. Stages of interaction are indicated by dotted circles and explained in the text

Figure 5a illustrates a situation in which a teacher uses FireFlies to give a child a compliment by making his light-object green. The interaction starts when the teacher observes that the child is working well and decides to give him a compliment. After forming this goal, the teacher forms the intention to use FireFlies to reach this goal. Next, the teacher specifies an action-sequence and executes this sequence: she grabs the teacher-tool, locates the color green, slides the color slider to this color, locates the correct child’s name and selects this name by squeezing the bead on which it is printed. The teacher then perceives the result of her interaction: she sees a green light on the child’s desk, she hears cricket sounds in the soundscape which reveal that the color green is currently in use and she hears or sees the child’s reaction to the compliment. The teacher can interpret from these perceptions that the light indeed turned green and evaluate that her goal of giving a compliment was reached. The other two examples in Fig. 5 also illustrate interactions with FireFlies, which go through the same seven stages of action, albeit in a slightly different manner. The interaction illustrated in Fig. 5b, for example, starts with a perception rather than with forming a goal, and the example in Fig. 5c shows an interaction that is briefly interrupted by another activity.

As shown in Fig. 5, some stages of an interaction may take place in the periphery, while other stages can be in the center of attention. The interaction in Fig. 5a, for example, started in the center of attention when the teacher consciously decided to give a compliment, but shifted to the periphery of attention when deciding to do this with FireFlies: the teacher automatically grabbed the tool without actively deciding to do so. Later, it shifted to the center of attention to locate the correct child’s name and back to the periphery when evaluating whether the interaction was successful. As shown in this and the other examples in Fig. 5, these shifts can happen quickly and frequently, between different stages of interactions. Even short interactions that may require only a few seconds can shift between center and periphery while the interaction is ongoing.
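The hypothetical trace of Fig. 5a can also be written down explicitly; the annotation below is our own illustration (the center/periphery labels are assumptions made for the sake of the example, not data recorded in the study) and simply lists Norman’s seven stages together with where each stage is taken to occur.

```python
# Illustrative annotation of the interaction of Fig. 5a (hypothetical labels, not recorded data).
NORMAN_STAGES = [
    "forming the goal",
    "forming the intention",
    "specifying an action",
    "executing the action",
    "perceiving the state of the world",
    "interpreting the state of the world",
    "evaluating the outcome",
]

# Assumed location of each stage for the 'give a compliment' example.
compliment_example = {
    "forming the goal": "center",                      # consciously deciding to give a compliment
    "forming the intention": "periphery",              # automatically reaching for the teacher-tool
    "specifying an action": "periphery",
    "executing the action": "center",                  # locating the correct child's name
    "perceiving the state of the world": "periphery",  # green light, cricket sounds
    "interpreting the state of the world": "periphery",
    "evaluating the outcome": "periphery",
}

# Count how often the interaction shifts between center and periphery across consecutive stages.
shifts = sum(1 for a, b in zip(NORMAN_STAGES, NORMAN_STAGES[1:])
             if compliment_example[a] != compliment_example[b])
print(f"shifts between center and periphery: {shifts}")   # -> 3
```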

4.1.2 A contextual look

From the previous section, it becomes clear that interactions can shift back and forth between the center and periphery of attention, at different stages of these interactions. While this gives an interesting detailed view on single interactions, we also realize that interactions with everyday interactive systems do not stand on their own, but strongly depend on their context and the user’s everyday routine.

The main aim of peripheral interaction is to support everyday interactions with technology to meaningfully blend into the daily routine in a real-world environment. In the everyday world, multiple activities are taking place at once. For example, the teacher in the scenario illustrated in Fig. 5c is interacting with FireFlies to call two children to her desk. However, at the same time, she may be explaining what exercise the children need to do, walking to her desk, remembering to have an absent child redo the exercise tomorrow, seeing a child raise his hand to ask a question, hearing two children in the back chatting and seeing a child’s pencil fall on the floor. This scenario seems chaotic, but such ‘chaos’ seems common practice in many everyday situations. All these individual actions and perceptions can be described through the stages of Norman’s action cycle [35]. This means that, in everyday situations, numerous sequences of action are performed at the same time. Though the examples in Fig. 5 each show only one line that represents an activity, in reality, numerous lines are present which move crisscross between center and periphery of attention. The teacher in the previous scenario may briefly discard her interaction with FireFlies to answer the question of the child who raises his hand (also see Fig. 5c); she may continue her explanation while walking to her desk and picking up a pencil that fell on the floor; and she may form the intention of writing down a reminder about the absent child, but discard that activity after hearing two children chat and deciding that correcting them is currently more urgent. As this example illustrates, in real-world situations, multiple activities are being performed at the same time, activities may start and end in the middle of the action cycle, stages of the cycle may be skipped completely and activities may be discarded altogether. Therefore, when discussing how and why interactions may shift between center and periphery of attention, we should not only look at the step-by-step description of such interactions but also realize that these interactions cannot be seen apart from the users’, possibly chaotic, everyday contexts and routines.

4.2 Peripheral interaction’s personal nature

The above discussion reveals that interactions may frequently shift between the periphery and the center of attention and are strongly connected to the contexts and the routines that they are a part of. Based on the theory of divided attention and multitasking (as described in Sect. 2.1), actions may shift between periphery and center of attention as a result of various factors, such as the difficulty [13] of the operation or the habituation [14, 15] of the activity. These factors may clearly differ from person to person: habituation happens only if individuals gain experience in performing an activity, and certain activities may simply be more difficult for one person than for another. Therefore, a second characteristic of peripheral interaction is that it is highly personal. As became clear from our case studies, the personal nature of peripheral interaction is mainly manifested in the observation that it requires both learning and unlearning and in the individual users’ personal mind-sets.

4.2.1 Learning and unlearning

Since interactions can shift to the periphery of attention when they are habituated [14, 15], getting used to an interaction is needed before it can potentially become a peripheral interaction. In the FireFlies case study, we observed that some elements of the design could quickly be learned and potentially become habituated, while this required more time for other elements. This also differed between individual teachers. Most teachers, for example, rather quickly understood how they could manipulate the teacher-tool to change the colors of the light-objects. These color changes also influenced the soundscape, which represented each color that was in use through a specific nature sound. Different from the interactions on the teacher-tool, the mapping between colors and sounds (e.g., yellow was connected to bird sounds and blue to ocean sounds) required some time to get used to: only after using it a couple of times were teachers able to directly interpret that yellow lights were on when hearing bird sounds. The learning process that seemed to require most time was related to the decision to use FireFlies for a certain purpose. Since the purpose for which most participants used FireFlies replaced a way of working that was already habituated, they found it difficult to get used to applying FireFlies rather than the habituated other activity. For example, when a teacher wanted to give a child a compliment, she often had already given it verbally before realizing that she had planned to use FireFlies for that purpose. This example indicates that it may in many cases not only be required to learn to work with an interactive system but also to unlearn another activity.

Similar to interactions with the FireFlies teacher-tool, interactions with the NoteLet bracelet required selecting individual children’s names from a list. Interestingly, in the evaluation of NoteLet, we did not observe situations in which teachers automatically performed a habituated other activity while they were planning to use NoteLet. This can be explained by the fact that interactions with NoteLet, taking quick pictures of the classroom in order to remember events later on, were not directly replacing existing activities in the routine of the participating teacher. Therefore, this teacher only needed to learn to work with NoteLet, without unlearning other activities.

Clearly, habituation of an interaction depends not only on the ease with which an individual user can learn to work with the interactive system but also on existing routines that are replaced by the interaction. Since these routines may differ between users, an activity may easily become habituated for one user while this may require more time for another user.

4.2.2 Personal mind-set

Apart from individual differences in terms of learning and habituation, our case studies also revealed examples in which the personal mind-set of different users influenced the extent to which the designs could be used in the periphery of attention. For example, in the evaluation of CawClock [10], the participating teacher described a situation in which she had used the cat token to set a 20-min time frame on CawClock. In these 20 min, during which cat sounds were heard in the background, the children had to work independently on a task. Although she did not inform the children, she also wanted to use these 20 min to have a quick individual talk with each child. Hearing the cat sounds therefore informed her that she still had some time left for individual talks. The information the teacher gained from hearing the soundscape and seeing the clock (i.e., information about the number of children she could talk to) could clearly only be extracted in that context and by that particular user. Users with another mind-set at that moment, e.g., the children, likely extracted completely different information from the same audio and visuals.

Another example was seen in the case study with FireFlies. After the deployment, we asked the participating teachers about their suggestions for improvements to the teacher-tool design. These discussions revealed that some teachers would have liked the children’s names to be listed according to where the children were sitting in the classroom, as they preferred this spatial layout for easily finding the right name. Other teachers, however, preferred an alphabetical order, which they found easier to remember. This example reveals that one user’s way of reasoning may not correspond to another user’s way of reasoning, influencing the ease with which an activity can become habituated.

Clearly, an interactive system may shift between periphery and center of attention more easily for one user than for another. This holds not only for the purpose for which an interactive system is to be used but also for the exact way the user interacts with it. This means that an interactive system may easily facilitate peripheral interaction for one user, while this will not be achieved as easily for another user.

5 Considerations for peripheral interaction

Since the number of interactive systems in our everyday environment is rapidly increasing, it seems inevitable that not all of our interactions with these systems can take place in our center of attention. Certain interactions will shift to the periphery of attention, where they require fewer mental resources and can be performed in parallel to other activities. The aim of the work presented in this paper is to support HCI researchers and practitioners in anticipating and facilitating peripheral interaction with the interactive systems they are studying or developing. In the previous section, we have laid out two main characteristics of peripheral interaction: its shifts between center and periphery and its personal nature. When developing or evaluating interactive systems that are to facilitate peripheral interaction, it is therefore important to consider to what extent these systems (1) support shifts between center and periphery and (2) support personal differences. Aiming to provide an overview of lessons learned, this section discusses how we approached these challenges in our case studies. We start this section by addressing when to consider peripheral interaction.

5.1 When to consider peripheral interaction?

While we believe that many interactive systems may benefit from peripheral interaction, we also realize that for some systems, it seems undesirable that they shift to the periphery of attention. A fire alarm, for example, is always of such significance that it requires conscious attention. Similarly, interactions that should not go wrong, such as changing your password for an online service, are unsuitable to be performed in the periphery. Other interactions seem highly engaging most of the time, as a result of which a user will likely choose to focus attention on them. For example, a very engaging computer game is preferably played in the center rather than in the periphery of attention.

Different from these examples, most interactions will not always engage the user’s center of attention. Think for example of systems that help you to keep track of relevant but not crucial information (e.g., the weather or the activities of friends and family), systems to support remembering upcoming agenda items and tasks (e.g., keeping a grocery list or remembering to call someone), or systems for everyday tasks at home, such as setting your alarm clock or controlling your lighting system. Interactions with such systems may at moments be very significant (e.g., when an important agenda item is coming up that cannot be forgotten) or engaging (e.g., when finding out that the weather will be beautiful on a day in which you planned to go on vacation), while in other cases, these interactions are relevant but not crucial. In these latter situations, such interactions are typically performed as a part of the everyday routine and form an ‘unremarkable’ part of this routine [1]. Such systems could, in our view, clearly benefit from peripheral interaction, and we describe such systems using the term ‘everyday interactive systems’.

5.2 Supporting shifts between center and periphery

One of the main characteristics of peripheral interaction is the frequent shifting of such interactions between center and periphery of attention. As discussed before, these shifts depend largely on the context in which the interaction takes place. In this section, we discuss what we think is important to consider when aiming to facilitate interactions in shifting between center and periphery of attention.

5.2.1 Taking into account context and routine

In the design and development of everyday interactive systems, a detailed understanding of the context of use is important. This is widely recognized in related literature, which for example states that a primary concern for ubiquitous computing research and practice is ‘the potential relationship between computation and the context in which it is embedded’ [38]. Also for the facilitation of peripheral interaction with everyday interactive systems, a detailed understanding of the context in which these interactions are to take place is important. Several views have been published on what it means to understand context [39–41]. Additionally, more practical approaches to visualizing and communicating context in a design process have been developed [42]. These related studies suggest that understanding the context of use does not only mean having an image of the locations that are involved but also includes understanding other contextual aspects such as the social context and the activities that are part of the everyday routine.

Through the case studies presented, we realized that particularly for peripheral interaction, an understanding of the user’s context should involve a detailed image of the different mental resources that users require in their everyday routines. When gaining an understanding of the classroom context in the process of designing CawClock, NoteLet and FireFlies for example, we quickly realized that many of the teachers’ tasks are visual, such as keeping an eye on the children or using the whiteboard for explanations. These tasks are therefore using the teacher’s visual resources. When multiple visual tasks are performed simultaneously, a bottleneck occurs [17, 18] and one activity needs to wait before it can be executed. To prevent such bottlenecks from happening and thereby to support concurrent multitasking [18], we decided to use audio in CawClock to convey information and use tactile cues in both NoteLet (fabrics with different textures) and FireFlies (beads of different sizes) to potentially enable the teachers to operate the tools without looking at them. Understanding the context of use and the division of the user’s mental resources during his or her everyday routine is important to anticipate whether or not an interaction can shift to the periphery of attention.
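The reasoning behind this design choice can be summarized as a simple heuristic; the sketch below uses made-up resource names and load values purely to illustrate the idea of selecting an output modality whose resource is least occupied by the user’s routine, and is not a method we applied formally in the case studies.

```python
# Sketch of a resource-conflict heuristic for choosing an output modality (illustrative only).
MODALITY_RESOURCE = {"screen": "visual", "audio": "auditory", "tactile": "tactile"}

# Made-up estimate of how heavily a teacher's routine occupies each resource (0 = free, 1 = fully used).
routine_load = {"visual": 0.8, "auditory": 0.3, "motor": 0.6, "tactile": 0.1}

def least_loaded_modality(candidates):
    """Prefer the modality whose underlying resource is least occupied by the routine."""
    return min(candidates, key=lambda m: routine_load.get(MODALITY_RESOURCE[m], 0.0))

print(least_loaded_modality(["screen", "audio"]))              # 'audio': avoids the visual bottleneck
print(least_loaded_modality(["screen", "audio", "tactile"]))   # 'tactile'
```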

5.2.2 Enabling easy-to-initiate and easy-to-discard interaction

The observation that interactions may quickly and frequently shift between the center and periphery of attention entails that interactions may be initiated at any moment, potentially in the periphery of attention. To allow interactions with the peripheral interaction designs developed in our case studies to be initiated easily, we made sure they did not require any start-up time. For example, the interactive devices did not need to be turned on before they could be used. Additionally, this partially motivated our choice of using audio rather than only visual elements. Since audio does not need to be looked at to be perceived, it can be heard whenever it is available. Furthermore, we found it important that our interactive devices could be ‘at hand’ whenever the user wished to interact with them. Since primary school teachers often walk around the classroom during lessons, we decided to enable our designs to be attached to the body or clothes of the teachers. Of course, many other options to make an interactive device available ‘at hand’ are possible. Interesting directions to achieve this could be wearable computing [43], mobile computing [44], whole body interaction [45], gesture interaction [46] or tangible gesture interaction [47].

Apart from the idea that interactions should be available to be initiated any time, we have also seen in Fig. 5c that interactions may easily be discarded, even when an interaction is unfinished. In this example, a child asked the teacher a question while the teacher was interacting with FireFlies. As a result, the teacher temporarily discarded the interaction with the teacher-tool to pick it up later. Although we did not directly anticipate this with our designs, they seemed to function relatively well in such situations; no settings were lost and no errors occurred when the interaction was discarded. We believe that the possibility of users discarding an interaction in the middle of it may be a relevant consideration for peripheral interaction design.

5.2.3 Evaluating in context

The context and routine in which an interactive system is used highly influence how the user interacts with it, what its value is to the user and whether it can shift between the center and periphery of attention. Given that the aim of peripheral interaction is to embed interactive systems in everyday routines, it seems evident that such interactive systems are best evaluated by deploying them in the real context of use for a longer period of time. In this way, users can interact with them in an everyday life setting, and the potential integration of the system into the routine can be experienced by the user and evaluated by the researcher. Although the traditional approach to evaluating how users interact with technology is to observe them in a controlled, laboratory-style setting, the approach of deploying designs ‘in the wild’ is increasingly suggested in literature on interaction design in general [48]. A longitudinal approach to user evaluations is recommended specifically for systems that present information in people’s periphery of attention [22, 49].

In our case studies, we also deployed our designs in the real context of use, and this approach indeed revealed insights that would likely not have been gained otherwise. This for example became clear in the deployment of NoteLet, an interactive bracelet with which teachers could take pictures that could later be viewed to remember and take notes of the children’s behaviors. As part of the iterative design process, we discussed an early concept of NoteLet with three teachers, all of whom imagined that they could valuably apply it in their classroom. We deployed a prototype version of NoteLet in one of these teachers’ classrooms, and, after using it, the participating teacher realized that although it seemed valuable at first, the activity of looking at the pictures after school hours took too much time and would therefore not fit in her routine as well as she had imagined. With FireFlies, we had an opposite experience. Of the nine teachers with whom we discussed an early concept of the design, four were hesitant about its potential usefulness. They had difficulty imagining for which purpose they would use it, and therefore, they were unsure if it would be valuable to them. Three of these four teachers eventually used FireFlies in their classroom, and all three found a relevant purpose for it and were able to integrate it in their routine. Though we realize that there is much in between discussing a conceptual version of a design with users and having them use it in their daily routines, these examples do show that crucial parts of the user experience may only become evident after a design is deployed in the real context of use for a period of time.

Although everyday interactive systems seem best evaluated in long-term studies, this approach also has clear limitations. Such studies require tremendous time and effort, even if only a small number of participants is involved. While studies in which participants use a new design for a few hours or less seem unsuitable to evaluate the integration of the design in the user’s routine, such studies can of course be suitable to reach other evaluation goals. For example, the usability of the design or the extent to which users understand the mapping between visuals and sounds can also be concluded from studies of shorter duration. However, the main goal of peripheral interaction, embedding interactive systems in the everyday context and routine, can only be assessed in a long-term study. The required duration of such studies seems to depend on many aspects, such as the number of times interaction takes place, the difficulty of an interaction and whether or not other activities need to be unlearned. In the six-week deployment of FireFlies, we observed peripheral interactions in the fifth and sixth week of the evaluation. However, we did not find a clear longitudinal effect; for example, we did not find an evidently increasing number of peripheral interactions over the 6 weeks. A longer deployment would likely have been required to observe such effects. Nevertheless, our observation that some interactions with FireFlies can take place in the periphery of attention is promising support for the feasibility of peripheral interaction. We believe that these results would not have been gained without deploying (prototypes of) interactive systems in the context of use for a longer period of time.

5.3 Supporting personal differences

A second main characteristic of peripheral interaction, as discussed before, is its highly personal nature. Through our case studies, we have aimed to support the use of our designs in a personally relevant way, though iterations were clearly required to achieve this. This gave us insight into possible ways to support habituation and into potential ways to support the personal preferences of various users.

5.3.1 Supporting habituation

Before an interactive system can blend into an everyday routine, the user needs to get used to interacting with it: the interaction can then become habituated. Our design CawClock addressed this by involving multiple levels of detail in one information display. CawClock combined the visual display of an analog clock on which colored time frames could be shown, with a soundscape that represented which time frame was ongoing and approximately for how long. The teacher who used CawClock for 2 weeks indicated that she could easily hear which time frame was ongoing (each color was represented by a specific animal sound) but that she needed to look at the clock to find out how much time was left. Although the soundscape also indicated this through the number of animal sounds included, she had not been able to recognize this detail in the two-week period. Although this may very well be due to a lack of sophistication in the sound design, it may also show that 2 weeks was not enough to learn to recognize the subtle differences in the soundscape. Had she used it for longer than 2 weeks, she might eventually have learned to recognize these details in the soundscape.

Two things seem interesting in the above example. First, the combination of two modalities that display the same information could potentially have supported the learning process. Although the details of how much time was left could initially not be heard, the fact that this could easily be seen on the visual display may have helped the teacher in realizing how this information was presented by the audio. Second, the different levels of detail in the audio (the overall information of ‘a time frame is ongoing’ versus the detailed information of ‘the blue time frame is almost finished’) enabled the user to quickly apply the design without much learning time, while after a learning period, she may have been able to use the full potential of the audio. When such different levels of detail are implemented in a design, it is likely that people initially only use the overall information. However, while using the overall layer of information, the user may gradually start understanding the details as well and, little by little, learn to (automatically) recognize them. Although the details are not directly used this way, the process of learning how to use them also barely requires conscious effort. It therefore seems that a design with different levels of detail may support the process of learning how to interact with it, enabling its habituation.

5.3.2 Supporting personal preferences

Interactive systems that facilitate peripheral interaction should support different individuals’ preferences. There may be many ways in which this challenge could be addressed. In our design of FireFlies, we aimed to address it by making FireFlies an open-ended design, which meant that the purpose for which teachers could use FireFlies was not predefined but could be chosen by the teacher. As a result, we indeed found that different teachers used FireFlies for different purposes, while most of them found it a valuable addition to their everyday routine. This seems to indicate a success of our open-ended approach. However, we also recognized that some teachers had difficulty integrating specific elements of the design into their everyday routine, such as the alphabetical order of the names on the teacher-tool as well as the use of audio in general, while this was easier for other teachers. Apart from an open-ended purpose, the design may therefore also have benefitted from an open-ended mapping between input and output. This may be relevant to consider as a means to facilitate peripheral interaction with everyday interactive systems.

A related issue, which applies mainly to information displays, is that the presented information may not be relevant for everyone who can perceive it. We noticed this with our FireFlies design, which played a soundscape that revealed which colors were currently in use. When a color was used to communicate information to the entire class, e.g., instructing the children to work in silence, the soundscape revealed information that could be useful for everyone. However, FireFlies was often used to send messages to individual children, e.g., to give a compliment. In these cases, only one or a few light-objects had a color and the others were off. The audio was at such moments mainly relevant for the child who received the compliment and not for the other children. Since the audio was played from speakers in the back of the classroom, however, all children perceived it, and the audio sometimes distracted those for whom the information was not relevant. To prevent such problems, Eggen and Mensvoort [50] suggested the concept of information decoration, which aims to present information in a decorative way. This way, people to whom the information is not relevant may still benefit from the design as it also serves a decorative function, such as by providing pleasant or relaxing background sounds. This direction seems particularly suitable in situations in which multiple potential users are involved, such as in public spaces.

6 Conclusions

This paper explores interactions with technology that reside in the periphery of attention, but shift to the center of attention when relevant or desired. By discussing the theory underlying peripheral interaction as well as three case studies on peripheral interaction design and evaluation, we presented two main characteristics of peripheral interaction. Following from these characteristics, we presented considerations that can support researchers and practitioners who work on the development of everyday interactive systems in anticipating their designs being used in the user’s periphery of attention.

In our case studies, we realized that interactions with everyday interactive systems very frequently shift between center and periphery of attention, even in between different stages of a single interaction. Such shifts therefore make up an important characteristic of peripheral interaction and highly depend on the contexts and routines in which the interaction takes place. Our discussion furthermore made clear that, as a second key characteristic, peripheral interaction has a highly personal nature. Peripheral interaction seems to require both learning and unlearning: while it takes time to get used to new interactions as part of existing routines, users often need to unlearn existing habits at the same time. Additionally, our case studies made us realize that individual users’ mind-sets influence the extent to which a design can shift to the periphery of attention.

These two main characteristics of peripheral interaction reveal that, in the development of such interactive systems, it is important to consider how to support shifts between center and periphery and how to support personal differences. Generalizing from the ways we approached these challenges in our case studies, we concluded that peripheral interaction can benefit from taking into account context and routine, enabling easy-to-initiate and easy-to-discard interaction, evaluating in context, and supporting both habituation and personal preferences.

We believe that the characteristics and considerations presented can support researchers and practitioners in the area of interaction design in realizing that their designs may be used in their users’ periphery of attention. When such peripheral interactions are anticipated and facilitated, everyday interactive systems can fluently be embedded in people’s daily routines.