1 Introduction

Although the topic of wearables has recently gained immense popularity with the advancement of mobile technologies, the history of wearable technology spans well over four centuries. The earliest examples of wearables date back to the 1500s with the invention of the first timepieces designed to be worn on the person. The invention and distribution of wearable computers, however, began several hundred years later, in the twentieth century. Since then, these devices have grown in parallel with computational power and have become increasingly ubiquitous, with applications ranging from fashionable accessories to, more notably, medical devices.

Similarly, the methods of interaction for these devices, including sensing and feedback, have diversified. Traditional wearables relied on visual or haptic interfaces to gather information from users; however, new technologies allow for more discreet methods of input including, for example, electrical stimulation. These passive methods of data gathering have introduced a new paradigm of anticipatory interfaces that predict and respond to user needs. This diversification and evolution of wearables has led to their development in industries ranging from fashion to assistive devices.

While wearable devices are increasingly integrated as commonplace technology within society, their ultimate role remains uncertain. Many organizations continue to invest in the future of wearable technology, but with varying degrees of success; most notably, the Google Glass project was intended to transform human perceptions of wearables and their potential, but was ultimately deemed unacceptable by the general public.

The evolution of wearable devices has also introduced a new consideration in the design of technology: human-centricity. Society is mandating design that takes into account the needs of the individual and addresses these needs throughout the ideation and development of new devices. Wearables have introduced a mobile context to computing that imposes new restrictions on the function and appearance of devices in order to best support the individual without inhibiting external needs. To adapt to these changes, many developers have introduced personal, social, cultural, and environmental considerations into the decision-making process in the design of new technology. New interaction paradigms have emerged which consider the user's internal and external context in the delivery of information.

2 Definitions

The following definitions serve to clarify and disambiguate several of the terms used in this chapter. The main concepts of “Human-centricity” and “Wearable Computing” are defined in their respective sections below.

  • Mobile Computing: We define Mobile Computing, or “Nomadic Computing” [71], as the design, implementation, and usage of portable devices which can access and transmit information without requiring a fixed location. Mobile devices such as smartphones and laptops are the central focus of the study of mobile computing. This definition includes mobile wearables which travel with the user.

  • Universal Design: The term “universal design” was originally coined by [43] in relation to the construction of buildings. It was intended to describe the process of designing all products and the built environment to be aesthetic and usable to the greatest extent possible by everyone, regardless of their age, ability, or status in life.

  • Assistive Technology: We adhere to the definition of Assistive Technology outlined in the Assistive Technology Act of 2004 (29 U.S.C. Sec 2202(2)): “any item, piece of equipment, or product system, whether acquired commercially, modified, or customized, that is used to increase, maintain, or improve functional capabilities of individuals with disabilities” [2].

  • Usability: We rely on the definition of “usability” set forth by the International Organization for Standardization as the “extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use.”

  • Human Factors: While there are various definitions for this term in recent literature, in this chapter, we follow James H. Stramler’s definition of Human Factors as the “field which is involved in conducting research regarding human psychological, social, physical, and biological characteristics, maintaining the information obtained from that research, and working to apply that information with respect to the design, operation, or use of products or systems for optimizing human performance, health, safety, and/or habitability” [73].

  • Accessibility: We refer to accessibility as the design of products to include access for people with disabilities. This design strategy can facilitate either direct access to the individual or indirect access through assistive technologies.

  • Anticipatory Interface: Robert Rosen defines an anticipatory system as one that “contains an internal, predictive model of itself and its environment, which allows it to change state at an instant in accord with the model's predictions pertaining to a later instant” [70]. Such an interface takes a passive approach to prediction, but ultimately operates in the domain of action in response to predictions of the user's context.

  • Modality/Multimodality: We refer to a “modality” in the sense commonly used in the study of Human–Computer Interaction (HCI): it is a perceptual channel through which information is transmitted between a human and a machine, or between two human beings [33]. Multimodality consequently refers to the use of multiple modalities, as defined above, in a system or interface.

  • Smart Device: A “smart device” can be defined as a multifunctional ubiquitous device which is able to communicate, often wirelessly, with other devices, access remote and locally stored information, and provide access to that information to the user via a mobile user interface [65]. Examples of these devices include smartphones, smartwatches, and modern tablets.

3 What Is a Wearable Computer?

Since it is the primary focus of this chapter, we begin our discussion of wearable computing with a definition of the concept:

Wearable computer is a broad term used to describe any computer that is worn to some degree on or inside a human’s body. Due to the wide scope of devices that this term can encompass, it is more beneficial to characterize it rather than use an explicit definition. In 1997, Rhodes described a wearable computer as having five main characteristics [69]:

  • Portable while operational: The most distinguishing feature of a wearable is that it can be used while walking or otherwise moving around. This distinguishes wearables from both desktop and laptop computers.

  • Hands-free use: Military and industrial applications for wearables especially emphasize their hands-free aspect, and concentrate on speech input and heads-up display or voice output. Other wearables might also use chording keyboards, dials, and joysticks to minimize the use of a user’s hands.

  • Sensors: In addition to user inputs, a wearable should have sensors for the physical environment. Such sensors might include wireless communications, GPS, cameras, or microphones.

  • Proactive: A wearable should be able to convey information to its user even when not actively being used. For example, when a new email arrives, your computer should be able to notify you immediately of its arrival.

  • Always on, always running: By default, a wearable is always on and working, sensing, and acting. This can be contrasted to pen-based PDAs, which normally sit idle in one’s pocket and are only activated when being actively used for a task.

Since these characteristics address form factor, input/sensing, feedback/delivery of information, and operational aspects of devices, they operate within the modern definition of a computer.

4 History of Wearable Computers

The concept of wearable computing dates back to the 1500s, with the invention of wearable timepieces that were transitional in size between clocks and watches. These clock-watches were designed to be worn as jewelry on clothing or around the neck and utilized only an hour hand. Although their calculations were fairly imprecise, and they are not considered computers in the modern sense, these were the first wearable devices that computed time.

These primitive wearables were followed by the development of rings that served as fully functional abacuses in the 1600s and the invention of the wristwatch in the 1800s. However, the first generation of modern wearable computers did not emerge until the twentieth century. For this reason, we begin our timeline in this section at the twentieth century with the development of wearable computing devices.

Fig. 1 Thorp and Shannon's shoe

Edward Thorp and Claude Shannon credited themselves with inventing the first modern wearable computer: a device, built in 1960, that aided in predicting the outcome of a roulette wheel [76]. The device worked by measuring the position and velocity of the ball and rotor to predict their future paths and stopping points. It was concealed within a user's shoe and used radio transmission to inform another individual of the winning number, as shown in Fig. 1.

Fig. 2 HP-01 watch

The 1970s and 1980s yielded the emergence of general purpose wearable computers, and the release of the first wearable computers built for general consumers. Hewlett-Packard released the HP-01, the first algebraic calculator watch, at this time [45]. The watch was released with six interactive functions: time, alarm, timer/stopwatch, date/calendar, calculator, and memory. It included 28 keys, 6 of which were operated by finger and the remainder through a stylus fitted in the watchband clasp (Fig. 2).

The 1990s and 2000s ushered in the integration of new sensors, such as cameras, into wearables, enabling new applications including augmented reality. These decades also introduced the first applications of wearables implanted in the human body. In 2002, as a part of his Project Cyborg, Kevin Warwick implanted an electrode array to measure electrical signals within his nervous system and relay the information to a pendant worn by his wife that would change colors based on the data [77]. The late 2000s also saw the introduction of mobile phones integrated into wristwatches. Due to the diversification of wearables and the introduction of Personal Area Networks (PANs) and Body Area Networks (BANs), the need for standardization of interfaces and communication between these devices became apparent. Thus, the Institute of Electrical and Electronics Engineers (IEEE) and the Internet Engineering Task Force (IETF) began to develop standards for communication protocols such as Bluetooth.

The influence of wearables on technology's evolution was solidified in the 2010s as many companies began developing their own wearable devices, further expanding their ubiquity. A suite of exercise bands (Nike Fuel Band, Fitbit, JawBone, etc.), smart watches (Pebble, Apple Watch, and Galaxy Gear), and assistive devices (exoskeletons, prosthetics, etc.) were rapidly released and adopted by the public within this relatively short timeframe. There were, however, some failures within this relentless expansion. Most notably, Google released the Google Glass as a head-mounted optical display in an effort to introduce seamless augmented reality to the average individual. The release of this device introduced a myriad of privacy and safety concerns, ranging from the ability to record discreetly to the usage of the device while operating a motor vehicle. The technology was ultimately discontinued within a year of its launch.

As indicated above, hundreds of years of history have served to shape the purpose of wearables and their integration with modern ubiquitous computing technology. This history has indicated that wearable technology is subject to the evolution and advancement of society and individuals. The recent technological advances above set the stage for an ongoing discussion of the ability of these devices to facilitate variation in user needs and backgrounds, and the challenges and barriers introduced by these factors.

5 Wearable Device Interfaces

In this section, we review the design of wearable interfaces and the advantages and limitations that various design strategies place on user interaction. Furthermore, we introduce some of the primary challenges in interface design for modern wearables, particularly when considering the individual user.

One of the main barriers to entry for wearable devices is the burden of adaptation. When a new device is deployed, users are often forced to adapt to a new interface and method of interaction to adequately use the technology. This has traditionally been one of the main challenges for new devices, which developers have attempted to address by defining a universal set of human factors design considerations. New interaction styles are often derived from existing methods of interaction to minimize adaptation costs.

Similarly, due to the inherent contextual challenges associated with wearable devices, interfaces have an additional set of considerations beyond traditional Human–Computer Interaction that they must fulfill to be considered user-friendly [12]. There are three main considerations outside of the scope of traditional user experience:

  • Interaction Period: These devices are often used within much smaller interaction periods than conventional devices, and are thus required to be highly efficient with respect to user attention.

  • Context: They are often used in dual-task contexts where the user is simultaneously performing some other task while interacting with the device. The primary task is often some physical task in the real world and thus, interaction with the wearable becomes almost a distraction [78].

  • Interface Simplicity: When considering the broad range and volume of devices developed in this domain, it is important that the interfaces are not overly complex and map to interactions that build on what a user is already familiar with. Although interface simplicity is a consideration made in broader applications, the importance of this consideration is much greater in the wearable domain because of a combination of the above factors.

Thus, the main goal of wearable user interfaces is to support users during their day-to-day tasks while minimizing cognitive load and interaction time. These devices typically incorporate multimodal feedback in user interaction. They often rely on more than one sense to relay information back to the user; however, one feedback channel typically serves as the primary communication modality, with secondary channels that are often redundant. The effectiveness of a feedback channel depends not only on the individual but also on the current context, and it can change as the user's environment varies. As an example, a visual interface on a smartwatch might be a good way to alert a user to an incoming phone call while sitting idle, but may not be as effective when driving. In this scenario, haptic displays may prove more useful since they do not shift the user's vision away from the primary motor task. Incorporating multimodality allows users to prioritize which sense they want to dedicate to the reception of information without completely disrupting their primary task.
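To make the idea of context-dependent modality selection concrete, the following is a minimal sketch in Python; the context fields and the priority ordering are illustrative assumptions, not drawn from any particular device.

```python
# Minimal sketch of context-driven modality selection for a smartwatch
# notification. Context values and priority ordering are assumptions.

from dataclasses import dataclass

@dataclass
class UserContext:
    driving: bool = False
    ambient_noise_db: float = 40.0   # estimated background noise level
    screen_visible: bool = True      # wrist raised / display in view

def choose_modalities(ctx: UserContext) -> list:
    """Return feedback channels ordered from primary to redundant."""
    if ctx.driving:
        # Keep the eyes on the road: lead with haptics, back up with audio.
        return ["haptic", "auditory"]
    if ctx.ambient_noise_db > 80:
        # Loud environment: audio alerts are likely to be missed.
        return ["haptic", "visual"]
    if ctx.screen_visible:
        return ["visual", "haptic"]
    return ["haptic", "visual"]

if __name__ == "__main__":
    print(choose_modalities(UserContext(driving=True)))        # ['haptic', 'auditory']
    print(choose_modalities(UserContext(screen_visible=True)))  # ['visual', 'haptic']
```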

5.1 Wearable Interfaces by Modality

The modality of interaction for a wearable interface varies greatly depending on its intended purpose and application. We identify three main modalities of feedback for wearable interfaces: visual, auditory, and haptic. Examples and descriptions of the usage of each modality are provided below.

5.1.1 Visual

Visual interfaces are the most common method that wearables use to relay information to users, since more than 70% of our sensory receptors are visual and visual processing engages almost 50% of our cortex [21]. Typically, these devices rely on displays mounted on a user's wrist or head, but may occasionally depend on a non-wearable device's display (for example, a smartphone's) to communicate wirelessly transmitted information. There is, however, an explicit limitation on the size of these devices and displays due to an individual's visual acuity and ability to discern minor changes (Fig. 3).

Fig. 3 Wearables with visual interfaces: Pebble Smartwatch, FitBit, Google Glass

Visual displays can be dated back to the earliest wearable devices as clocks rely on a visual display to relay information on time. With the advancement of technology, these primitive interfaces turned to digital screens with varying degrees of resolution. Most digital displays for wearables can be categorized as either head-mounted or wrist-mounted.

Head-Mounted Displays (HMDs) are wearable, lightweight displays mounted on a user’s head and have digital displays in front of at least one of the user’s eyes. These displays can be further separated into monocular (only displays to a single eye) and binocular (displays that cover both eyes) with the former being the more recent approach to HMD development [42]. Within these subcategories exist immersive displays that inhibit an individual’s ability to perceive the real world outside of the HMD and semi-transparent, non-immersive displays. Applications of immersive and non-immersive HMDs range from aviation, where they are used for navigation and to enhance situational awareness for pilots [74], to computer-aided drafting for model understanding and manipulation [7].

Because they hinder an individual's ability to see their surroundings, immersive displays are not often seen on individuals in day-to-day environments [3]. These devices have, however, found many applications within the field of virtual reality. Devices like the Oculus Rift and the HTC Vive have introduced virtual reality to the gaming community, motivating developers to create applications that enhance the experience of the average consumer. These devices were designed to operate with limited mobility, as they obscure vision and fully divert attention from the outer environment, but they can still be used in limited mobile settings by providing virtual representations of the real-world surroundings. Thus, more recently, HMDs have been geared toward non-immersive displays for use in augmented reality applications outside of virtual environments.

Although non-immersive HMDs have design limitations [63, 79], which include reduced vision due to the veiling luminance of the display, more of today's wearables (outside of virtual reality) are shifting toward this approach as it abides relatively well by one of the basic premises of wearable displays: that the user's primary task should not be interrupted [15]. Google Glass was a monocular, non-immersive HMD that aimed to provide contextual information to users through a small projection in the user's peripheral vision. A broad range of applications within industry and research were explored, including a display for pediatric surgeons [52] and high-level activity recognition using blinking and head motion [30].

Wrist-Mounted Displays (WMDs) are digital displays that are worn on the user’s wrist or forearm. The position of these displays presents inherent difficulties for a user as it distracts attention from the surrounding environment, and thus, they often inhibit a user’s primary motor task. These devices typically require the user to lift his or her arm while interacting with the device, which can lead to issues with muscle fatigue after prolonged use; consequently, although they often provide similar contextual information (for example, navigational or environmental data), WMDs typically have shorter interaction cycles than HMDs and have different interruption techniques. WMDs often rely upon secondary modalities such as haptics to direct the user’s attention outside of their visual field.

The main category of WMDs is smartwatches. A smartwatch is defined as “a wrist-worn device with computational power, that can connect to other devices via short-range wireless connectivity; provides alert notifications; collects personal data through a range of sensors and stores them; and has an integrated clock” [8]. Although they have existed in many different forms for decades, smartwatches took off with the launch of the Pebble device. This was the first platform-agnostic watch that allowed users to interact with their smartphones without having to take the device out of their pockets. Although smartwatches have started to gain interest, they still do not offer enough additional functionality when compared with smartphones to allow for mass adoption [8].

5.1.2 Auditory

The second most popular form of feedback occurs through the auditory channel. An auditory display is defined as “any method of communicating information, usually non-textual information, by means of nonspeech sound” [72]. This channel offers the opportunity to receive information while the eyes and hands may be busy performing some other primary task. Auditory interfaces can be subdivided into three main categories: verbal, audification and sonification [72].

  • Verbal: Verbal interfaces use natural speech to present information. Examples include car navigation systems, hands-free smartphone assistants, and automated museum tours. These interfaces are the most common in wearable devices.

  • Audification: This technique is a direct mapping of data points into audio signals to produce sound patterns. It is limited in use to large, periodic data sets where patterns can be explored at a high level rather than looking for granular differences. An example of this would be looking through large sets of financial data to compare trends between years, playing tones that are higher in pitch to denote higher profits and lower in pitch to denote lower profits or losses (a small sketch of both mappings follows this list).

  • Sonification: This is an analogic approach to mapping data to sound. Frequency, harmonicity, and pulse rate are used as variables in the development of associations between sound and data. This allows for more fine-grained exploration of data since mappings can be made to specific characteristics. As an example, a Geiger counter uses the rate of clicking to denote the radiation level in the environment.
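As a rough illustration of the two data-to-sound mappings above, the following sketch maps yearly profits to tone pitch in the spirit of audification and a radiation dose rate to a Geiger-style click rate in the spirit of sonification; the frequencies and scaling constants are arbitrary assumptions chosen only for clarity.

```python
# Illustrative sketch of the two data-to-sound mappings described above.
# Constants are assumptions; a real auditory display would tune them empirically.

def audify_profits(profits, f_min=220.0, f_max=880.0):
    """Audification-style mapping: each yearly profit becomes one tone,
    higher profit -> higher pitch. Returns a list of frequencies (Hz)."""
    lo, hi = min(profits), max(profits)
    span = (hi - lo) or 1.0
    return [f_min + (p - lo) / span * (f_max - f_min) for p in profits]

def sonify_radiation(dose_rate_usv_h, clicks_per_usv=10.0):
    """Sonification-style mapping (Geiger counter analogy): the click rate,
    not the pitch, carries the measurement."""
    return dose_rate_usv_h * clicks_per_usv   # clicks per hour

if __name__ == "__main__":
    print(audify_profits([1.2, 3.5, 2.1, 4.0]))  # rising/falling tone contour
    print(sonify_radiation(0.3))                 # 3.0 clicks per hour
```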

Sound is often used in wearable interfaces to offload visual information. This is vital because, as discussed in the previous section, visual displays on wearables are often small and constrained. Care must be taken in the amount of information presented visually to avoid problems of cognitive overload and confusing, cluttered displays. Thus, audio cues are often used to represent a subset of the information that may be presented through a nonvisual channel. Sound can reliably attract a user’s attention while they perform another task and can, therefore, serve as a method to interrupt the user and redirect his or her attention to a visual display.

5.1.3 Haptic

Haptic interfaces rely on an individual's sense of touch to provide information and are used as both a primary and a secondary modality for feedback in wearable interfaces. Haptic interfaces “generate a feedback to the skin and muscles, including a sense of touch, weight, and rigidity” [31]. Due to restrictions on size, haptic stimulation is most often accomplished through vibrotactile patterns in wearable devices. As more wearable devices are developed, the exploration of haptics becomes more of a necessity than a luxury, since the visual and auditory channels are primarily occupied during navigation. This leaves haptics as an unobstructed channel that can be used without severely impacting day-to-day activities while still allowing the user to receive information from their device.

Fig. 4 Moment smartwatch with haptic interface (https://wearmoment.com/)

Because the sense of touch lacks the spatial acuity of vision and the temporal acuity of hearing, frameworks based on natural human speech have been proposed as building blocks for information delivery using haptics [48]. As noted previously, the most common example is that of a vibrating smartwatch informing a user that he or she is receiving a notification or call. Applications of haptic interfaces for wearables range from navigation, where a 4-by-4 array of micromotors has been used on a user's back to present directional information [16], to emotional therapy, where human touch can be recorded and played back [5]. While tactile displays are most commonly used in conjunction with visual or auditory displays, new devices are starting to explore the value of haptic-only wearables (e.g., Moment) (Fig. 4).
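The directional back display cited above [16] can be pictured with a small sketch; the grid layout, heading range, and intensity scheme below are illustrative assumptions rather than details of that system.

```python
# A minimal sketch of mapping a relative heading into activation of a 4x4
# vibrotactile array worn on the back. Layout and intensities are assumptions.

GRID = 4  # 4x4 array of micromotors

def column_for_heading(heading_deg: float) -> int:
    """Map a relative heading (-90..90 degrees, 0 = straight ahead)
    to one of the four columns, left to right."""
    heading = max(-90.0, min(90.0, heading_deg))
    return min(GRID - 1, int((heading + 90.0) / 180.0 * GRID))

def activation_pattern(heading_deg: float):
    """Return a 4x4 matrix of motor intensities in [0, 1]; the column under
    the target heading vibrates along its full height."""
    col = column_for_heading(heading_deg)
    return [[1.0 if c == col else 0.0 for c in range(GRID)] for _ in range(GRID)]

if __name__ == "__main__":
    for row in activation_pattern(-60):  # target is to the user's left
        print(row)
```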

The true benefit of haptic interfaces lies not only in the small space required for actuators but also in their capability to be personal and discreet. These interfaces have an advantage unlike any other: a user can receive information and interact with their device without alerting those around them to the interaction. This allows haptics-enabled wearable devices to seamlessly integrate into social contexts and augment rather than interrupt.

5.2 Examples of Modern Wearable Interfaces

In the expanse of wearable applications, interfaces for these devices have developed unique characteristics of interaction. Two primary examples illustrating the potential of these characteristics, anticipatory and invisible interfaces, are described below. Although these examples do not comprise the entirety of modern wearable interfaces, they are mentioned here as they have gained significant attention in recent research.

5.2.1 Anticipatory Interfaces

With the rise of mobile devices, we are able to infer more about a user's location, activity, and social setting than ever before. As these devices continue to advance with new sensors, we may see a shift from inference to prediction of context, which will, in parallel, open the door for anticipation within computing applications. Although the principle of anticipation has long been known, most existing approaches to the interaction cycle for assistive devices have been “laissez-faire”: put simply, the device waits for explicit interaction from the user before it processes and provides an output.

In discussing applications, it is important to first differentiate prediction from anticipation since the two are often incorrectly used interchangeably. Predictive applications are those that simply build predictions of the user’s current or future context. Anticipation uses these predictions to impact the future to the benefit of the user. Applications in the field of anticipatory computing rely on two key steps prior to the ability to anticipate. These include sensing the surrounding context and creating a predictive model of this context. Once the predictive model has been created, the system then uses this for anticipation of a user’s future needs [61].
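The sense, predict, anticipate cycle can be summarized in a short schematic sketch; the sensor, model, and action names below are hypothetical, and the "model" is deliberately trivial.

```python
# A schematic sketch of the sense -> predict -> anticipate cycle described
# above [61]. Names are hypothetical; the point is the separation between
# building a predictive model of context and acting on it before the user asks.

def sense_context(sensors):
    """Step 1: gather raw context from wearable sensors."""
    return {name: read() for name, read in sensors.items()}

def predict_next_context(history):
    """Step 2: form a (trivially simple) predictive model; here we just
    assume the next context repeats the most recent one."""
    return history[-1] if history else {}

def anticipate(prediction, actions):
    """Step 3: act on the prediction for the user's benefit."""
    if prediction.get("activity") == "commuting":
        actions["prefetch_transit_schedule"]()

if __name__ == "__main__":
    history = []
    sensors = {"activity": lambda: "commuting", "location": lambda: "home"}
    actions = {"prefetch_transit_schedule": lambda: print("schedule cached")}
    history.append(sense_context(sensors))
    anticipate(predict_next_context(history), actions)
```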

Because the concept of anticipation in wearable computing is so new, few applications exist that are truly anticipatory. The majority of work in this area involves robotics. Within robotics, the principle of anticipation has taken the forefront in navigation [22], perception [27], and human movement characterization [23, 68]. Similarly, authors have explored applications in gaming through eye tracking to predict a player’s actions [35]. These early systems have helped to show the applicability and usefulness of anticipation, but are restricted in context.

Within the wearable domain, there is an abundance of recent literature surrounding predictive applications of internal and external context. One example of an application that emphasizes the usefulness of mobile phones in determining external context is SoundSense [41]. This project explores the use of the microphone to determine information such as activity, location, and social events. The authors proposed a scalable framework for modeling sound events and were able to classify four different activities: walking, driving a car, riding an elevator, or riding a bus (the precision on riding a bus was much lower than other conditions). Similarly, a project that explored internal context, called EmotionSense, was designed to infer a user’s emotional state from microphone data [67].
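To give a flavor of microphone-based inference, the following toy sketch classifies a short audio frame from a couple of hand-picked features; this is emphatically not the SoundSense pipeline, and the thresholds and feature choices are assumptions made only for illustration.

```python
# Toy illustration of microphone-based activity inference (not SoundSense).
# Features and thresholds are assumed values, not learned classifiers.

import numpy as np

def frame_features(frame, sr=8000):
    """Very small feature set: RMS energy and spectral centroid."""
    rms = float(np.sqrt(np.mean(frame ** 2)))
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)
    centroid = float((freqs * spectrum).sum() / (spectrum.sum() + 1e-9))
    return rms, centroid

def classify(frame, sr=8000):
    """Threshold-based stand-in for the learned classifiers used in practice."""
    rms, centroid = frame_features(frame, sr)
    if rms < 0.01:
        return "quiet indoor"
    return "vehicle" if centroid < 500 else "street / walking"

if __name__ == "__main__":
    t = np.linspace(0, 1, 8000, endpoint=False)
    engine_like = 0.1 * np.sin(2 * np.pi * 120 * t)   # low-frequency hum
    print(classify(engine_like))                       # 'vehicle'
```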

Furthermore, one of the other major efforts toward the classification of human behavior and extrapolation of context through mobile phones is Darwin Phones [50]. The authors developed the first framework in the mobile domain which could automate the updating of models over time, pool models that have been created and evolved within other mobile devices, and combine classification results from multiple mobile phones. This methodology is a step above most other work in the literature that relies on the local sensing abilities of a single mobile device rather than crowd-sourcing the classification.

Pejovic and Musolesi have proposed the potential for applications of anticipatory computing within the emerging field of digital behavior change interventions in mobile environments [61, 62]. The authors have referenced UbiFit [13] and BeWell [36] as two applications that have taken very rudimentary steps toward the inclusion of anticipation in mobile applications and have provided potential architectures for applications in this domain [61, 62]. UbiFit is a personal health application designed to monitor weekly activity and provide subtle feedback when users are not active enough. The app displays a garden which thrives when the user is meeting activity goals and remains barren when inactivity persists. BeWell is a mobile application that monitors a user's health along three dimensions: sleep, physical activity, and social interaction. Much like UbiFit, this application provides intelligent feedback to promote better health through an ambient display of an aquatic background which becomes more active the healthier the user is.

5.2.2 Invisible Interfaces

A crucial factor impacting the future expansion of wearable devices is ease of use. With the examples stated above, the authors have embedded the interface for the applications into the existing interface of the mobile phone to create an unobtrusive feedback loop. Pantic et al. take this process one step further and state that the key to anticipatory interfaces is “ease of use” and the ability to “unobtrusively sense certain behavioral cues of the users and to adapt automatically to his or hers typical behavioral patterns and the context in which he or she acts” [58].

It is this ability to unobtrusively sense behavioral cues and to use them as inputs for technology that constitutes an invisible interface. Essentially, the traditional methods of explicit human–computer interaction are abstracted away from the user, and internal and external context instead serve as the primary inputs for the technology. This promotes the principles of ubiquitous computing and makes the technology an extension of the person.

Although most applications in mobile computing still maintain the need for interaction with a physical interface, there has been a major effort toward the development of context-aware applications [10, 64]. These systems are generally split into two major subcategories: external context (physical) and internal context (logical). Context is any information that can be used to characterize the situation of an entity where an entity is a person, place, or object that is considered relevant to the interaction between a user and an application, including location, time, activities, and the preferences of each entity [28]. These systems face many challenges including determining relevant information, dealing with uncertainty, and privacy [4].

The literature in this domain is quite developed and uses context as an input in a variety of different ways. One unique approach looks to combine the influences of internal and external user context to proactively determine recommendations [39]. The system builds a context history and a profile for the user in each of these contexts to accurately predict the user’s needs simply based on past behavior.
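A minimal sketch of this idea, with illustrative field names and a simple frequency count standing in for the profile model described in [39], might look like the following.

```python
# Sketch of a context-history recommender: remember (context, choice) pairs
# per user and recommend the most frequent past choice for the current
# context. Field names and the coarse context key are assumptions.

from collections import Counter, defaultdict

class ContextProfile:
    def __init__(self):
        # context key -> counts of past choices made in that context
        self.history = defaultdict(Counter)

    @staticmethod
    def key(context):
        # Collapse rich context into a coarse key (location + time of day).
        return (context.get("location"), context.get("time_of_day"))

    def record(self, context, choice):
        self.history[self.key(context)][choice] += 1

    def recommend(self, context):
        counts = self.history.get(self.key(context))
        return counts.most_common(1)[0][0] if counts else None

if __name__ == "__main__":
    profile = ContextProfile()
    profile.record({"location": "gym", "time_of_day": "morning"}, "workout playlist")
    profile.record({"location": "gym", "time_of_day": "morning"}, "workout playlist")
    print(profile.recommend({"location": "gym", "time_of_day": "morning"}))
```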

Fenza et al. explore the usefulness of internal context in the healthcare domain by using a network of wearable sensors to determine the individual’s current state of health and provide personalized services. The authors use Fuzzy Logic to automatically characterize context and find healthcare services that approximately meet this context [17].

Muñoz et al. explored the development of a context-aware messaging system in a hospital environment [53]. Users (doctors, nurses, physicians, etc.) were given mobile devices to write messages to each other that are only sent when a specified context is encountered. For instance, a nurse could leave a message for the next doctor entering a given room. The system automates the delivery based upon sensed context across many devices. However, this system does not fully embrace the concept of an invisible interface since the main method of interaction is still physical rather than an automated interaction based solely on context.
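The core mechanism of such context-triggered delivery can be sketched as follows; the roles, rooms, and delivery conditions are hypothetical examples rather than details of the system in [53].

```python
# Hedged sketch of context-triggered message delivery: a message is held
# until a sensed context satisfies its delivery condition.

from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class PendingMessage:
    body: str
    condition: Callable[[Dict], bool]  # predicate over the sensed context

@dataclass
class ContextMessenger:
    pending: List[PendingMessage] = field(default_factory=list)

    def post(self, body, condition):
        self.pending.append(PendingMessage(body, condition))

    def on_context_update(self, context: Dict) -> List[str]:
        delivered = [m.body for m in self.pending if m.condition(context)]
        self.pending = [m for m in self.pending if m.body not in delivered]
        return delivered

if __name__ == "__main__":
    messenger = ContextMessenger()
    messenger.post("Check the chart in room 12",
                   lambda ctx: ctx.get("role") == "doctor" and ctx.get("room") == "12")
    print(messenger.on_context_update({"role": "nurse", "room": "12"}))   # []
    print(messenger.on_context_update({"role": "doctor", "room": "12"}))  # delivered
```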

6 Areas of Application

As the baby boomer population continues to age, the adoption of wearable monitoring technology grows at a considerable rate. New wearable devices come out every day, exploring applications across all facets of life ranging from entertainment to health. Health has specifically become a focal point in this progression due to the exponentially growing demand for these devices. The last few years have introduced a broad variety of exercise bands (Nike Fuel Band, Fitbit, JawBone, etc.), smart watches (Pebble, Apple Watch, and Galaxy Gear), and mobile health apps. These technologies offer access to information that was previously unavailable and provide platforms for a myriad of new assistive technologies through remote monitoring.

6.1 Fashion

One historically significant concern in the adoption rate and usability of wearable technology is whether society considers technology to be “fashionable.” This is a cultural concern dating as far back as the introduction of eyeglasses and watches, and is an integral part of the introduction of wearables to human societies. One of the most important elements of fashion is that it relies heavily on context: different cultures, societies, and regions have different takes on what makes a piece of clothing or accessory “fashionable.” Even within the United States, for example, fashion interests can vary greatly by subregion [26]. With this as the basis for our understanding of fashion, we present a list of integration strategies for the design of fashion-aware wearable technology:

  • Assimilation: Some wearable technologies, particularly “electronic textiles”, can be embedded or woven into, or made to resemble, existing fashionable clothing or accessories in their region of deployment. This strategy favors users who value discretion in the technology they use by hiding the circuitry and interface of the wearable device, essentially rendering the technology “invisible” on the wearer. This strategy can be seen, for example, in Liu et al.’s e-textile pants for stability assessment in the elderly with motion impairments [38].

  • Enhancement/Augmentation: In this strategy, wearables are made to “enhance” the appearance of clothing or accessories either by attaching themselves atop these items or by being embedded in such a way that their existence is obvious, either through exposed circuitry or through an exposed interface (Fig. 5). Often these devices are adopted by users who value the high-tech look and wish for their wearables to stand out. Mistry and Maes’ SixthSense gestural device, for example, uses a worn pendant to project an interface to augment the real world [51].

  • Separation: The final strategy involves designing devices so that they can be worn separately from clothing and other accessories, while remaining fashionable. This is perhaps the most difficult strategy, as it involves designing a wearable that does not conform to the form factor of common accessories and articles of clothing in a particular society, but can be considered “fashionable” within that society or culture. Often this means introducing a new category of fashionable items into a society, which can take time to integrate. One example of this is work on fingertip haptic wearables [66].

Regardless of the integration strategy used, there are several general points to consider when designing fashionable wearables. One is that customizability and adaptability in the look and feel of a device can help improve its fashion awareness, as it can then be molded, either manually or autonomously, to match fashion tastes for a variety of users and cultures. Another is that the device should be usable in a variety of different contexts. For example, glasses can be worn to aid users with visual impairment (eyeglasses), improve visual clarity in bright environments (sunglasses), or simply to enhance one’s look (fashion glasses or clear glasses). The more functions a wearable can serve, the greater its targeted audience, and the greater the likelihood that it can be adopted into the fashion of a particular society.

Fig. 5 Conductive thread used to create embedded electronics within clothing

6.2 Behavior Modification

One popular use for wearable devices, particularly in healthcare, is the modification of problematic behavior. Devices intended for behavior modification can sense and respond to targeted patterns of behavior to promote positive outcomes for the user. We can classify these devices into two main subcategories: facilitators and drivers [60]. Facilitators of behavior change afford greater control to a user over changing their behavior. They may remind the user that a problematic behavior is occurring, and provide steps and options for correction. These devices are intended to inform, but not to directly elicit behavior change. On the other hand, drivers of behavior change are devices which take direct action to change a user’s behavior, often by modifying/constraining the environment. A driver for eye health may, for example, turn off and disable usage of a television screen once it has detected that the user has been watching for a prolonged period in unhealthy conditions.

There are several general requirements which are commonplace for wearable devices aimed at behavior change:

  1. The device should be able to accurately detect problematic behavior. This can often be a nontrivial issue, as the portable nature of wearable devices places limitations on what information they can sense in real time. As a result, some types of behavior are easier to detect and interpret than others. For example, physical activity is a highly studied sensing category for behavior modification devices, and many wearables exist today which can discern this behavior from quantitative indicators such as step count or heart rate [6]. The accelerometer in the average smartphone, for example, can be utilized to estimate step count and obtain a generally reliable measure of an individual's physical activity, under the assumption that the phone travels with its intended user [80] (a minimal step-detection sketch follows this list). However, sleep patterns may be harder to detect, as the mechanisms for automatically detecting sleep patterns without user entry can be complex [11].

  2. The device should be aware of the context in which behavior occurs. Often the environment and the user's goals play a large role as predictors of certain behaviors. A device intended for behavior modification should be aware of these factors to prevent the occurrence of false positives and false negatives in the detection of problematic behavior [34]. As an example, consider a device intended to detect and correct problematic gait patterns in users with Parkinson's disease. Such a device would detect when Freezing-of-Gait (FoG) events occur as the user walks, as these are dangerous symptoms of the disease given that FoG episodes increase the risk of falling [75] (Fig. 6). However, in a crowded environment, a user may freeze his or her gait simply because the path ahead is blocked or the individual is waiting in a line. These FoG events are not attributable to Parkinson's and should not be treated as problematic behavior. The device would, therefore, need to discern between the different causes for frozen gait and should respond to each appropriately.

  3. The device should provide corrective feedback in a way that is intuitive, accessible, and clear. It should be immediately apparent to the user, based on the feedback of the device, what problematic behavior is occurring and, in some cases, how to fix it. Furthermore, the feedback given from the device should not produce any unwanted interference that could affect the safety, comfort, or health of the user. For wearables, this often means that the device should not be a distraction when walking, driving, or interacting in public situations [18].
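As referenced in the first requirement above, a minimal step-detection sketch is shown below; the threshold and timing constants are illustrative assumptions, not validated values.

```python
# Minimal accelerometer-based step detection: count a step whenever the
# acceleration magnitude crosses a threshold after a refractory period.

import math

def count_steps(samples, rate_hz=50, threshold=11.0, min_step_s=0.3):
    """samples: list of (ax, ay, az) in m/s^2. Returns an estimated step count."""
    steps, last_step_i = 0, -10**9
    min_gap = int(min_step_s * rate_hz)
    for i, (ax, ay, az) in enumerate(samples):
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude > threshold and i - last_step_i >= min_gap:
            steps += 1
            last_step_i = i
    return steps

if __name__ == "__main__":
    # Synthetic trace: a spike above the threshold once per second for 10 s.
    trace = []
    for i in range(500):
        bump = 4.0 if i % 50 == 0 else 0.0
        trace.append((0.0, 0.0, 9.81 + bump))
    print(count_steps(trace))  # roughly 10
```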

Behavior modification devices often base their evaluation on the very same metrics they use to detect problematic behavior. If the problematic behavior can be accurately detected and quantified, then it follows that researchers can evaluate the effectiveness of a device based on the change in this problematic behavior produced by usage of the device [54]. The most successful wearables for behavior modification are often able to produce significant change in the targeted behavior under a variety of conditions including users, environments, and contexts.

Fig. 6 Device used to track Parkinsonian gait [75]

6.3 Fitness

Where devices for behavior modification are aimed at eliminating or changing problematic behavior, some devices instead focus on helping users maintain positive, healthy lifestyles. By far the most popular of such wearable devices are fitness trackers. These wearables are often facilitators as defined in the above section; they provide the information a user needs to maintain healthy levels of activity throughout the week. As these are devices intended for tracking physical activity, they require greater attention to durability, comfort, and reliability in design [49]. Inevitably, a wearable for physical activity will be subject to a high amount of movement and shaking, outdoor weather, and potential impact against various surfaces. To account for this, wearable trackers such as accelerometers and heart rate sensors are often well reinforced and insulated against damage to critical circuitry while adding as little extra weight as possible to avoid burdening the user.

Fig. 7 Wearable fitness trackers: Jawbone UP, FitBit, Microsoft Band

Since the mid-2000s, the popularity of wearable fitness tracking devices has taken off (Fig. 7). A vast range of wearable devices, from chest straps to shoe attachments, have been developed that gather more information than ever on an individual's activity and health. The effectiveness of these devices is largely attributed to the benefits that the smartphone has provided as a central communication hub for this otherwise fragmented market. The industry is now looking at embedding fitness trackers into clothing as well to gather even more information on overall health.

6.4 Assistive Devices

Assistive devices are designed, as defined above, to help augment the functional capabilities of individuals with disabilities. A wearable assistive device serves the added benefit of following the user, providing services in many aspects of that user's daily life. One example of a wearable assistive device is the Haptic Belt [47], which can express nonverbal cues to assist in communication for individuals who are blind (Fig. 8). This device uses a pinhole camera embedded into a pair of sunglasses to determine if a person is approaching the visually impaired user. The belt then vibrates, allowing the user to turn and face the person so that they can initiate conversation. Assistive devices often target a specific disability and a specific goal or function, and can assist in overcoming the challenges related to that function caused by the disability through various means, including sensory substitution or augmentation. However, devices that are designed for individuals with disabilities can often have benefits for the population at large as well. As an example, a project called the Note-Taker was developed as a solution for visually impaired students to be able to more accurately take notes in the classroom environment [25]. The device had a camera that communicated with a tablet to allow the student to record the lecture and zoom in on the board to more easily see what was being written. The project was highly successful with students who were visually impaired, but it also saw strong demand from their sighted counterparts, who could benefit from its features as well.

Fig. 8 Haptic Belt device [47]

6.5 Navigation

The problem of navigation often varies wildly by context; as such, wearables intended to assist with navigation can have drastically different requirements and considerations in their design (Fig. 9). To understand the constraints for effective assistance in navigation, we consider a few of the attributes involved in research in this field:

Fig. 9 Navigation on the Apple Watch

Attributes of the Person:

  • Impairment: One of the initial considerations in design focuses on the user's impairment. Does the user have any attributes that make it challenging to navigate in an environment? Typically, these include sensory, cognitive, and/or motor impairments, among others. Often these considerations manifest themselves in the design of the interface, although core functionality may be affected as well. For example, users with visual impairment may need more detailed information about their immediate surroundings [29].

  • Degree of Usage: Some users rely on navigation assistance more than others. An interface designed for heavy usage may require additional steps to increase battery life, particularly if the device is providing real-time assistance during navigation.

Attributes of the Task:

  • Type of Navigation Task: A primary concern related to the navigation task is the type of navigation. Is the user walking or driving? What type of vehicle is being used? Each type of navigation task includes its own concerns for safety and responsiveness. For tasks such as driving, the system may need to respond to changes more quickly than for walking due to the relative difference in speed of motion. Walking interfaces may provide the user with real-time information about landmarks along the route [46] while navigation for driving might focus on points of interest at the user’s destination [40].

  • Degree of On-the-fly Assistance: Wearables for navigation may be designed differently depending on how much the device knows about a navigation route beforehand. Devices with high storage capabilities but low real-time memory might pre-calculate optimum routes and follow a predetermined plan for the user’s navigation while those with little storage may rely instead on using real-time learning and adaptation to develop a route in real-time for the user, with only short-term planning involved. These types of devices are often more flexible to change in the environment but may use more power or suffer from signal interference. Many navigation algorithms for autonomous agents in AI research, such as that of Oriolo et al. [57], often deal with unknown environments and real-time planning.

  • Path Attributes: Devices for navigation may also be concerned with details of the path being calculated. For example, in some contexts, the shortest route is desirable while in others a more scenic route is preferable. Furthermore, there may be milestones or checkpoints along the way to a destination that the system should account for in the production of a path (a small route-scoring sketch follows this list).
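As referenced under Path Attributes, the following small sketch scores candidate routes by length, a scenic bonus, and required checkpoints; the weights and route data are illustrative assumptions.

```python
# Illustrative route scoring over the path attributes discussed above.
# Weights and route data are assumptions; lower score is better.

def route_score(route, prefer_scenic=False, required_stops=()):
    """route: dict with 'length_km', 'scenic' (0..1), 'stops' (set of names).
    Routes missing a required checkpoint are disqualified."""
    if not set(required_stops) <= set(route["stops"]):
        return float("inf")
    score = route["length_km"]
    if prefer_scenic:
        score -= 2.0 * route["scenic"]   # trade up to 2 km for scenery
    return score

if __name__ == "__main__":
    routes = [
        {"name": "direct", "length_km": 3.1, "scenic": 0.1, "stops": {"pharmacy"}},
        {"name": "riverside", "length_km": 4.0, "scenic": 0.9, "stops": {"pharmacy"}},
    ]
    best = min(routes, key=lambda r: route_score(r, prefer_scenic=True,
                                                 required_stops={"pharmacy"}))
    print(best["name"])  # 'riverside'
```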

Attributes of the Environment:

  • Obstructions and Lighting: In general, navigation devices can improve the quality of their assistance based on the amount of detail the environment makes available. This includes lighting attributes, which help the system determine which paths are most visible to the system and the user so that it can recommend the safest route to a destination, particularly at night. For navigation by blind and visually impaired users, information on obstructions and moving objects in the environment may also assist the system in preventing a collision in real time. The system by Mann et al., for example, utilizes real-time detection by a Kinect camera for collision avoidance [44].

  • Scope/Scale of Navigation Environment: Finally, navigation systems may also be concerned with the scale of the navigation environment, as it may impact the type of services offered by the system and the number of available routes. Indoor navigation systems such as the one proposed by Golding and Lesh [20] may require higher accuracy of location and orientation detection of the user than navigation within a city.

7 Key Barriers to the Success of Wearable Computers

Although the wearable market is growing at a rapid rate, barriers exist to the full adoption of these devices. We have already seen this with the failure of the Google Glass in 2015 and the fall of the Pebble smartwatch in 2016, but what are the factors that inhibit a wearable technology's success? We identify four main categories of barriers to the success of these devices:

  1. Cost: Cost has been one of the most important considerations in the rise and fall of wearable technologies. With the rise in popularity of wearable technologies, the markets for these devices mature much more quickly than those for traditional technologies. As a result, two things occur: bottom-line prices are driven down and the demand for innovation goes up. This cuts the margins on newer technologies and often leaves only a few frontrunners. In the case of smartwatches, the Pebble, FitBit, and Apple Watch took over. In order to keep up, these devices must innovate much faster not only on the technical side but also on the manufacturing side. It is the innovations in manufacturing that lead to a decrease in retail price and make the products more affordable for lower income brackets.

  2. Specialization: Too many highly specialized wearables are being designed with single use cases. This is a barrier because there is an explicit limit (body real estate) on the number of devices that a single individual can wear at the same time; if devices are highly specialized, users are forced to prioritize which needs are most important. A survey of user wearing habits indicated that the preferred body parts for various types of wearables are as follows (in descending order): (i) eyes (approximately 72%)—sunglasses, shades, and prescription glasses; (ii) head (approximately 70%)—hats, caps, and scarves; and (iii) hand—wrist watches (68.1%), bracelets (49.7%), and rings (59.4%). Audio earphones and headphones, wearables with which consumers are already familiar, received a preference rate of 64.7% [9].

  3. Social Acceptability: Form factor and methods of interaction have been major drivers for the social acceptance of wearable computers. Because wearable devices are still relatively new, more emphasis is currently being placed on the innovativeness of the technology than on its external appearance and social impacts. As a primary example, Bluetooth headsets were designed to modernize the interaction between people and their smartphones. Iterations of headsets smaller in size began to appear on the market, but ultimately did not achieve mass adoption due to the social awkwardness they created. The devices were so inconspicuous that passersby would not know if the individual was talking to them or to someone on their phone. Similarly, the Google Glass' form factor was deemed pretentious and was scoffed at in public settings. It also introduced questions about privacy that were deemed socially unacceptable. Other factors such as cultural and ethical considerations have an impact on the acceptability of a wearable device within a social setting. For example, in countries which censor social media interactions, devices which augment or enable social media usage by individuals may be deemed unacceptable.

  4. Human-centricity: There is a set of human factors specific to wearable computers which cannot be ignored. These devices are often used in contexts that are entirely different from those of traditional Human–Computer Interaction. Thus, special consideration needs to be given to how these devices are used, when they should interrupt the user, where on the body they should be placed, and what their intended purpose is [9].

8 What Is Human-Centricity?

Interfaces are typically categorized as “user-friendly,” “accessible,” or “intuitive.” Human-centricity is a newer design paradigm that aims to address aspects of each of these terms. The term “human-centered” or “user-centered” was coined by Donald Norman in the 1980s [56], and was expanded upon several years later to include four basic suggestions on design [32, 55]:

  1. Make it easy to determine what actions are possible at any moment.

  2. Make things visible, including the conceptual model of the system, the alternative actions, and the results of actions.

  3. Make it easy to evaluate the current state of the system.

  4. Follow natural mappings between intentions and the required actions; between actions and the resulting effect; and between the information that is visible and the interpretation of the system state.

The basic principle of human-centric design is that the end user is considered in all stages of design and development. Consideration is given to the needs, wants, limitations, uses, benefits, and risks of a device from ideation all the way through development and even marketing [24]. In essence, this philosophy looks to alter the traditional feedback loop between designers and developers to also include the class of end users. That representation must be actively involved in the entire process and constantly give feedback on its desires, since end users are an important stakeholder in the final product.

As the concept has matured, standards have been developed around human-centricity, providing more concrete requirements. Most notably, the International Organization for Standardization created ISO 13407: “Human-centred design processes for interactive systems” in 1999. It was revised as ISO 9241-210: “Ergonomics of human-system interaction—Part 210: Human-centred design for interactive systems” in 2010 and was reconfirmed in 2015 [1]. It describes human-centered design as “an approach to systems design and development that aims to make interactive systems more usable by focusing on the use of the system and applying human factors/ergonomics and usability knowledge and techniques.”

9 Concerns of Human-Centricity

ISO 9241-210: “Ergonomics of human-system interaction—Part 210: Human-centred design for interactive systems” recommends six characteristics for human-centered design [14, 19]:

  • The adoption of multidisciplinary skills and perspectives.

  • Explicit understanding of users, tasks and environments.

  • User-centered evaluation driven/refined design.

  • Consideration of the whole user experience.

  • Involvement of users throughout design and development.

  • Iterative process.

Fig. 10 Consideration categories of human-centric devices

These characteristics outline four basic categories of considerations that need to be made in the design of human-centric devices: personal, social, cultural, and environmental (Fig. 10).

9.1 Personal

This is the most critical consideration in human-centric design. The device should be designed in a way that it is aware of the user’s limitations from a human factors perspective. The device should be aware of the user’s psychological, social, physical, and biological characteristics rather than require the user to attempt to adapt to the device. An example of awareness of physical characteristics would be that if a device is worn on the skin, it should not overheat and potentially burn the user. Similarly, it should support psychological characteristics such as cognitive load. Developing a device with an overly complicated interface that is unintuitive for the intended audience is not human-centric. All of these characteristics vary between individuals and so the device should also consider the spectrum of ability of its intended audience, and address those users through accessibility characteristics.

This consideration also dictates that the device should support users in achieving their goals or tasks rather than inhibit them. These tasks might not be directly related to the use of the device but may be performed in parallel. For example, if a mobile, wearable device is designed for active use throughout the day, it should be lightweight so that it does not inhibit mobility as the individual completes day-to-day tasks. Secondary tasks that are not tied to the direct use of the device are often overlooked, and neglecting them can significantly undermine utility and adoption.

9.2 Social

Typically, when thinking about human-centricity, the focus is placed on considerations made by the individual. However, many factors outside the individual’s immediate control can also dictate design decisions in human-centric devices. Social considerations are among the most critical, as humans are inherently social creatures. If the device is intended for daily use, the design process should ask: “Is the use of this device acceptable in public?” The development team needs to consider the social lives of its users and how they will be affected by the technology. For example, one of the major barriers to the adoption of assistive technology is the social stigma attached to the use of an assistive device [59]; the families of children with disabilities often choose not to adopt assistive devices because of the perceived increase in attention and visibility they bring.

A consideration secondary to social stigma is whether the device inhibits an individual’s ability to communicate effectively with others. This principle has less to do with external appearance and more to do with the paradigm of interaction. Understanding when to interrupt an individual and when to shift into the background is a key factor in designing socially appropriate devices [15]. For example, smartphones now offer settings that let the user define do-not-disturb hours during which notifications are silenced; this mode can be disengaged when the user decides that they are in a social context where interruption is acceptable.
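
As a concrete illustration of this interaction paradigm, the sketch below shows how a wearable might decide whether to interrupt the user, deferring routine notifications during user-defined quiet hours while letting urgent alerts through. It is a minimal Python example under stated assumptions; the names (`InterruptionPolicy`, `should_interrupt`) and the urgency override are illustrative, not a specific product’s API.

```python
from dataclasses import dataclass
from datetime import datetime, time


@dataclass
class Notification:
    message: str
    urgent: bool = False  # urgent alerts may override quiet hours


class InterruptionPolicy:
    """Decides whether the wearable should interrupt the user right now."""

    def __init__(self, quiet_start: time, quiet_end: time):
        self.quiet_start = quiet_start
        self.quiet_end = quiet_end

    def in_quiet_hours(self, now: datetime) -> bool:
        t = now.time()
        if self.quiet_start <= self.quiet_end:
            return self.quiet_start <= t < self.quiet_end
        # Quiet window spans midnight (e.g., 22:00-07:00).
        return t >= self.quiet_start or t < self.quiet_end

    def should_interrupt(self, note: Notification, now: datetime) -> bool:
        if note.urgent:
            return True
        return not self.in_quiet_hours(now)


# Silence routine notifications between 22:00 and 07:00.
policy = InterruptionPolicy(quiet_start=time(22, 0), quiet_end=time(7, 0))
note = Notification("New message from a colleague")
if policy.should_interrupt(note, datetime.now()):
    print("Deliver the notification now")
else:
    print("Queue the notification until quiet hours end")
```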

9.3 Cultural

Culture at any level, whether national, regional, or organizational, can greatly influence the success of device adoption and implementation [37]. Culture plays an important role in the adoption of technology and imposes standards that often underlie or dictate an individual’s personal views. These standards appear both as legal restrictions on technology and, at times, as unspoken norms. In the design of human-centric devices, it is important to consider whether the assumptions being made implicitly or explicitly violate any of these customs. An example of this consideration being overlooked was the launch of Google Glass: although no regulations existed prior to the device’s release, new laws were introduced because the product violated cultural expectations of privacy.

Culture also plays an important role in determining the economic factors around a device, which can in turn dictate its potential for future success. When current cultural trends are taken into account, technology can begin to shape the future direction of societal views. Apple, for example, has successfully created a subculture that readily adopts its new technology with little regard for whether it fits current cultural values; this follows from the company’s success in designing devices that have become iconic trends within contemporary culture.

9.4 Environmental

The user’s external environment is also a major consideration in human-centric design. External context can dictate anything from form factor to actual features and functionality. Obvious questions include: How resistant is the device to the effects of weather, rigorous human activity, crowded environments, dark or bright environments, or low internet signal strength? A less obvious consideration is the choice of materials used in the design. Consider a device destined for a remote area of the world: it would be human-centric to use materials that are readily available in that region in case the device needs immediate repairs on site. Similarly, one should consider the impact the device may have on the user’s environment after it is discarded.

Outside of the natural environment, attention must also be paid to the man-made environment. Indoor and outdoor environments have been largely shaped by buildings, sidewalks, and roadways, which introduce their own set of requirements on human-centric devices. It is important that the device is contextually aware of the user’s surroundings and is designed to adapt to the restrictions set by this environment. Returning to the example of the assistive device for monitoring the gait of individuals with Parkinson’s disease, it is important to distinguish between an individual walking on an incline, who might naturally take shorter steps, and an individual walking on a flat surface, whose short or shuffled steps may indicate a Freezing-of-Gait episode. Similarly, gait differs in indoor settings such as waiting in line at a grocery store, where an individual may only be able to shuffle forward every few minutes.
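
As a rough illustration of such contextual awareness, the sketch below adjusts a short-step threshold based on an estimated ground incline before flagging a possible Freezing-of-Gait episode. The threshold values, the per-degree scaling, and the function names are hypothetical assumptions; a real system would derive them from clinical data and estimate incline from sensors such as a barometer or inertial measurement unit.

```python
def adjust_step_threshold(base_threshold_m: float, incline_deg: float) -> float:
    """Relax the short-step threshold when walking uphill, where shorter
    steps are expected even in healthy gait (illustrative scaling only)."""
    # Assume roughly 2% shorter expected steps per degree of incline,
    # never dropping below half the baseline threshold.
    scale = max(0.5, 1.0 - 0.02 * max(0.0, incline_deg))
    return base_threshold_m * scale


def flag_possible_freezing(step_lengths_m: list[float], incline_deg: float) -> bool:
    """Flag a possible Freezing-of-Gait episode only when the recent mean
    step length falls below an incline-adjusted threshold."""
    if not step_lengths_m:
        return False
    threshold = adjust_step_threshold(base_threshold_m=0.35, incline_deg=incline_deg)
    mean_step = sum(step_lengths_m) / len(step_lengths_m)
    return mean_step < threshold


# The same short steps are flagged on flat ground but not on a steep incline,
# where shorter steps are expected.
print(flag_possible_freezing([0.20, 0.22, 0.19], incline_deg=0.0))   # True
print(flag_possible_freezing([0.20, 0.22, 0.19], incline_deg=25.0))  # False
```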

10 Universal Design Versus Human-Centric Design

The term “human-centric design” is often confused with “universal design,” and the two are frequently, and incorrectly, equated. Both terms describe considerations to be made about the end user during the design process, but they have distinct goals.

Human-centered design attempts to incorporate human interaction and personal, social, and cultural values into the design of an interface. Developers are expected to make informed design decisions with respect to the uses, benefits, and risks associated with the end user. This design ideology takes into account not only the characteristics of the individual using the device but also the context in which the device will be used. This introduces a set of requirements on technology that are entirely external to the user yet still important in the design of the devices he or she will use. Likewise, the consequences of design decisions with respect to these factors must be considered at each stage and factored into the decisions made during development. A device is not truly human-centric if it does not address all of these concerns.

Universal design, on the other hand, focuses on enabling as many individuals as possible to access that interface. The main principle is to design with the intent of usability to the greatest extent possible by everyone, regardless of their age, ability, or status in life. The concept of ability is viewed as a spectrum rather than a binary variable and the goal of this design ideology is to allow for the greatest distribution of access across that spectrum. This can be addressed in various ways, but the most common method of implementing principles of universal design is adaptability. The iPad is one of the best examples of universal design as it includes various settings and features that allow for the device to be used by individuals who may have visual impairments as well as those who may have auditory impairments. Human-centered design focuses on people’s interaction with society and with one another while universal design focuses on their interaction with technology.

A piece of translation software, for example, can be human-centered in that it translates subtle parts of human speech such as idioms, special phrases, metaphors, etc., to match the cultural and stylistic format of another language; however, if it cannot be accessed by individuals who are blind, it would not be universally designed. Similarly, a handheld electronic device may have accessibility options that allow for usability to the majority of its intended population; however, if it is too heavy to comfortably carry by hand, it may not be user-centric.

11 Human-Centric Wearables

To design human-centric wearables, one must be aware of the context in which they are used. These devices are often taken out in public, moved around, and may stay on the body for long periods of time. All of these requirements oblige us to study how humans interact with the world to ensure that wearables can be embedded into those interactions. Within the design and development process, it is vital to include the end user’s feedback and to validate assumptions. The concerns of human-centricity depend largely on the context in which the device is used; this chapter covers some general requirements using the most basic division of scope: external to the user versus internal to the user (Fig. 11).

Fig. 11: Internal versus external considerations of human-centric wearables

11.1 External Considerations

There are many external factors associated with the use of a wearable computer that are often overlooked in the design process. Although the user may be willing to accept the technology, it is sometimes their external context that imposes restrictions. It is crucial to understand these facets, as they often have a large impact on the ultimate use and adoption of the device.

For example, cultural appropriateness is often overlooked during the design process of wearables. Would it be culturally acceptable to wear the device in the environment in which it is intended to be used? Many unsupported assumptions are made about what society will accept as appropriate, without ever consulting the relevant stakeholders. Had such assumptions been validated earlier in the design and development process, the wearable could have been made more fashionable or socially acceptable before release.

Another external factor is a wearable computer’s interference with public interaction. Is it too loud or distracting? Can it lead to awkward social interactions? The Bluetooth earpiece is an apt example: some devices were so loud that people around the wearer could hear both sides of the conversation, which led to privacy issues, while others were so small that people in the immediate vicinity were unsure whether the individual was talking to them or to someone on the phone.

11.2 Internal Considerations

Similarly, considerations need to be made about the internal context of the user in the design of human-centric wearables. It is critical to have a complete understanding of the human factors surrounding the intended audience of the wearable and to use this understanding as a basis for design decisions. It is also important to acknowledge the person’s goals and tasks and ensure that the wearable device supports the completion of those tasks. As stated previously, wearables are often used in a dual-task setting, where interaction with the device interrupts some other primary physical task in the real world. Thus, the wearable should not become a major disruption to this primary task or inhibit the individual from completing it.

Internal considerations should also span the spectrum of ability of users. The device should be adaptable for individuals at various levels of ability, which often means providing redundant interfaces so that the user can decide how to receive information. Redundancy is an important factor in accessibility and, by extension, an important aspect of human-centric design. For example, a watch may show a visual display when an alarm is triggered but may also provide a vibrotactile pattern for haptic feedback and play a tone for auditory feedback.
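
A minimal sketch of this kind of redundancy might route the same alarm over every output channel the user has enabled, as shown below. The `FeedbackProfile` structure and the channel names are illustrative assumptions rather than any specific platform’s API.

```python
from dataclasses import dataclass


@dataclass
class FeedbackProfile:
    """Output channels the user has enabled based on preference and ability."""
    visual: bool = True
    haptic: bool = True
    auditory: bool = True


def deliver_alarm(message: str, profile: FeedbackProfile) -> list:
    """Deliver the same alarm redundantly over every enabled channel;
    returns the channels used, for illustration."""
    channels = []
    if profile.visual:
        channels.append("visual")    # e.g., flash the watch display
    if profile.haptic:
        channels.append("haptic")    # e.g., play a vibrotactile pattern
    if profile.auditory:
        channels.append("auditory")  # e.g., play an audible tone
    return channels


# A user with a hearing impairment might disable the auditory channel
# and still receive the alarm visually and haptically.
profile = FeedbackProfile(auditory=False)
print(deliver_alarm("Wake up", profile))  # ['visual', 'haptic']
```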

12 Conclusion

As the era of modern wearable technology continues to advance, the onus of adaptation continues to shift from the user to the wearable device, necessitating the integration of the human-centric design considerations reviewed in this chapter into the development process for these devices. In a global market and a highly connected world, the considerations of the individual, society, and the world population form layers of influence in wearable design that often conflict, requiring developers to strike a delicate equilibrium in their design decisions. The concepts of customizability and multimodality continue to be explored within this context as the single-purpose wearables of yesterday and today begin to assume a growing number of roles, driven by continuing advances in the size and power of sensors and mobile interfaces. Future research in this field will seek to yield design patterns, frameworks, models, and prototypes that address the many limitations and concerns of human-centricity highlighted above without sacrificing the basic standards of usability, cost-effectiveness, and portability that form the groundwork for the success of this technology.