Abstract
How can Embodied Virtual Agents (EVAs, often misleadingly called “avatars”) facilitate access to modern information and communication technologies for older people? Several studies and theoretical considerations point out their strong potential benefits, as well as their pitfalls and limitations. This chapter surveys current studies, technologies, and applications, and offers guidance as to when and how to employ an EVA for the benefit of older adults. The reviewed studies encompass robotics, EVAs, and specific questions regarding the e-inclusion of the target user group.
1 Introduction
Embodied Virtual Agents (EVAs) are autonomous virtual beings that interact with the virtual and real environment through an expressive virtual body, while pursuing particular goals or fulfilling certain tasks. Usually, they are of human or humanoid appearance, even when they represent animals or objects, as for instance a paper-clip. EVAs interact with people or with other EVAs by means of natural verbal and nonverbal channels, e.g. speech and accompanying facial expressions. Nowadays, EVAs can achieve a remarkable level of visual and behavioral realism due to high quality real-time rendering, appealing movements, and integration of results from artificial intelligence and natural language processing technologies.
From early on, EVAs were employed as service assistants on websites, as health advisors, or as social companions, to name only a few functions. Whereas initial research tended to focus on their benefits, for instance their capacity to communicate naturally or to enhance trust in a technical system, see e.g. [7, 69, 97], negative aspects could soon not be ignored, and some discouraging experiences were reported as well.
The following examples are worth mentioning: In 1995, Microsoft shipped a software package that included Microsoft Bob, an agent supposed to assist users with low computer literacy skills. Bob was later succeeded by Clippy, the Microsoft Office Assistant. Although Clippy was designed to simplify the use of certain programs and to make them more fun, it ultimately elicited mostly negative reactions [116].
The Computers as Social Actors theory and the studies of Reeves and Nass achieved some prominence in this area [105, 116]: they postulate that media automatically evoke social responses because the social-psychological rules governing interaction between humans also extend to human-computer interaction. Accordingly, violating these rules triggers negative feelings. Clippy illustrates that these rules indeed matter for EVAs – a human service assistant behaving like Clippy would provoke the same reactance: Clippy did not follow social norms, did not learn from earlier interactions, did not develop long-term relationships, and was not actually useful [116] – there was no reason to tolerate its presence.
Yet, while Microsoft has given up on embodied interface agents as a main product feature, other EVAs have been more successful in eliciting positive user reactions. In spite of rather limited behavioural capabilities, the continued use of IKEA’s assistant Anna over many years points towards a useful design. Anna’s functionality is not very ambitious: it mainly guides users through IKEA’s website [66].
Anna and Clippy are thus well-known positive and negative examples of designing and employing EVAs. It is important that today’s researchers and developers understand how and when EVAs can be employed usefully for the benefit of older adults, and when they are best avoided. This chapter will help readers make informed decisions about their utilization. The following paragraphs focus on aspects that are particularly informative in the context of using EVAs to ease the life of older adults.
The potential of EVAs should also be viewed in the global context of an ageing population and the increase of age-related health problems – including mental health problems such as dementia – and social issues such as loneliness. In the next sections, we describe the possible functions of EVAs in the context of old age; we present possible application domains for EVAs; we look at how they should look and behave; and we dwell on risks and pitfalls.
2 Functions of EVAs
2.1 Direct Emotional Influence
Often, people unconsciously mimic the emotions and behaviour of interaction partners. This plays a role in understanding the goals, desires, and intentions of the other person [54]. As part of this mechanism, emotions are carried over, so that for example a calm expression of the partner induces similar feelings in the viewer [120]. This emotional contagion effect can be found in several scenarios where EVAs are employed, e.g. serious games and fitness or health assistance. It was shown that EVAs displaying empathy, sympathy, compassion, or humour reduce user frustration and interaction times [63, 74, 85, 93, 102]. For example, Prendinger et al. frustrated users in a mathematical game while simultaneously presenting an EVA that expressed empathy; the presence of the EVA reduced stress, and users evaluated the task as easier [102]. Another example is an EVA designed to accompany physical exercise, which thereby alleviated the pressure felt by the user [67].
2.2 Non-verbal Communication
Non-verbal communication does more than express emotions that influence the user: facial expressions and body gestures also convey information [32]. Non-verbal communication influences decision making [40] and can even override verbal messages [2]. Not surprisingly, this effect becomes more prominent the more difficult it is to understand an interaction partner’s voice, for instance in a noisy environment.
Most human gestures are redundant [49]: they support the spoken message without adding new content to it. In the context of old age, where hearing impairments are prevalent and cognitive capacities often affected, it is plausible that redundant non-verbal information helps in understanding EVAs [12]. Moreover, less effort is needed to follow the conversation [47], partly because important words can be emphasized [46, 76]. Gestures and non-verbal expressions are also important for controlling turn-taking in conversation [35]. A study by Buisine and Martin showed that different gesture-speech combinations influenced memorization, the perceived quality of a talk, and the expressiveness of the EVA in different ways [30]. For example, one study showed that redundant gestures were most effective for recall performance and for several subjective measures such as the perceived quality of the information [31]. These are additional aspects to take into consideration when implementing an EVA.
Another potential function of an EVA’s non-verbal language is to produce sign language for the hard of hearing, cf. e.g. [4, 110]. In a sign language translation system, a hearing person’s spoken or typed words are automatically translated into visual sign language gestures of an EVA. Such a system has been tested in communications between deaf people and office personnel responsible for the renewal of driver’s licenses [110]. The communication was mediated by an EVA that translated the hearing person’s typed words into sign language for the deaf interaction partner. It should be noted that the EVA was not rated very highly by the clients, who were dissatisfied with the naturalness of the signs and the complexity of the interface. However, these issues do not obviously generalize to the application scenario per se (Fig. 7.1).
2.3 Increasing Trust in Computer Systems
Trust in technology can be crucial in areas where privacy is a concern, for instance health [5, 88] or personal hygiene. Since older people tend to be more sceptical when interacting with technology [65], promoting trust in technical systems becomes even more important. There is some research investigating whether trust can be created by giving a face to a technical system, see e.g. [17]. However, one experiment showed that, while social presence effects could be established during net communication, an EVA received surprisingly low trust ratings [13]. Whether EVAs and humans receive different levels of trust was tested in a “trusting game”: behavioural data demonstrated that participants trusted EVAs and humans equally, although the underlying cognitive mechanisms may differ, according to brain imaging data [107]. Another study showed that adding an EVA to e-commerce websites indeed increases both cognitive and emotional trust [103].
In general, EVAs incite people to personalize technology, and this increases trust [75, 96]. The effect is particularly strong with older users [129]. A system that possesses human features suggests consistent behaviour and thus controllability [119]. These results are complemented by studies showing that EVAs are trusted more if they are expressive [84, 92], show empathy [25], and are of similar ethnicity [91]. These findings suggest that not only the bare presence, but also an EVA’s personality, in a broad sense of the word, matters for creating trust in a technical system.
2.4 Increasing Enjoyment of Human-Computer Interaction
In order to ensure that older people accept and use new technological systems, joy and motivation are crucial factors. Studies by Heerink et al. demonstrated that robots and EVAs can boost the joy of interacting with a system [59, 60], which increases the willingness of older adults to use it. For example Steffie, a website interface [114] for delivering information about e.g. the internet, email, or health insurance, added enjoyment to the interaction by creating social presence, which is correlated with the intention to use the system [59]. It was also shown that the robotic pet Paro had beneficial effects on older adults, such as feeling happier and healthier [113]. Other studies demonstrated that EVAs, used for presenting learning material, increased the enjoyment of the learning process [69, 84]. Behaviours of EVAs that are associated with enjoyment are humour [93], smiling, and expressing believable emotions [6]. Further factors that influence enjoyment and the motivation to use a system are (1) social presence, i.e. the feeling that an interaction partner is actually present; (2) that the system displays social behaviour [61]; and (3) that it is able to develop social bonds with the user [90]. Various studies have shown that EVAs can be an effective means for adding these characteristics to technical systems, see e.g. [23, 25, 34, 52, 58, 61, 64, 90, 97, 117].
3 Usage Scenarios of EVAs for Older Adults
3.1 EVAs as Interface Agents to Foster E-Inclusion
With older age, biological, physiological, and cognitive capacities as well as social relationships change [36]. E-Inclusion policies have to take these changes into consideration. E-inclusion means that (a) information technology is made accessible to the entire society and that (b) technology is used to promote inclusion, economic performance, employment opportunities, and the quality of life in general. Although investments are being made to reduce the digital divide, inequalities in terms of access to information technology still exist, particularly with regard to older adults, people with disabilities, and people with low literacy levels [45]. Many technical systems are complex and difficult to use and exclude these groups of people [43].
Accessible systems must be usable, believable, and enjoyable, and must motivate their use [44, 61, 90]. Many approaches to enhancing accessibility and usability rely on an understanding of abstract interfaces, but older users in particular may experience difficulties with abstract interfaces. EVAs can come to the rescue here, since they enable a more natural, human-like communication and therefore reduce how much the older user must adapt to new systems (this was seen as a paradigm shift by Spierling in [89]). Within the GUIDE project [55], adaptable and interactive EVAs are being developed to assist and support older adults. The EVAs developed in GUIDE assist older adults with the configuration and personalization of user interfaces; they offer explanations and guide the older users through a configuration process that leads to individually adapted user interfaces (cf. Fig. 7.2). Thus, for instance, people with impaired cognitive functions could benefit from EVAs as user interfaces because these can translate a more abstract cognitive task, like pushing the correct button on a TV remote control, into a natural social interaction move, like telling the EVA what to do. Certainly, the success of an EVA as a user interface for older adults will depend much on its capacity to understand and express non-verbal, emotional, and communicative signs, because without these it will not be possible to maintain the naturalness of the interaction.
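The idea of translating an abstract interface task into a natural interaction move can be illustrated with a minimal, hypothetical sketch; the phrases, intents, and action names below are invented for illustration and are not part of the GUIDE system:

```python
# Hypothetical sketch of an EVA that maps natural-language requests
# onto abstract interface actions (e.g. TV remote buttons).
# All phrase lists and action names are illustrative assumptions.

INTENTS = {
    "volume_up":    ["louder", "turn it up", "volume up"],
    "volume_down":  ["quieter", "too loud", "volume down"],
    "next_channel": ["next channel", "switch channel", "what else is on"],
}

def interpret(utterance):
    """Return the abstract interface action matching the user's words, or None."""
    text = utterance.lower()
    for action, phrases in INTENTS.items():
        if any(p in text for p in phrases):
            return action
    return None  # the EVA would then ask a clarifying question

def eva_respond(utterance):
    """Produce a social confirmation instead of a button press."""
    action = interpret(utterance)
    if action is None:
        return "I am sorry, could you say that in other words?"
    # In a real system the action would be sent to the TV middleware here.
    return "Of course, I will do that for you. (action: {})".format(action)
```

For instance, `eva_respond("It is far too loud in here")` would confirm and trigger the hypothetical `volume_down` action, replacing the search for the right button with a conversational move.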
3.2 EVAs as Assistants
In old age, reduced cognitive flexibility can make it difficult to cope both with daily tasks and with unexpected emergency situations. EVAs and physically embodied agents have been employed to alleviate this problem. Examples are assistive robots for managing and planning daily activities related to safety, health, and hygiene, and robots that serve as reminders or set alarms [36, 37, 80, 101]. In this context, robots have also been built to establish health-related diagnoses by analysing users’ reactions to robotic systems [111]. Many of the aforementioned tasks, originally devised for robots, could also be delegated to EVAs.
Higher age is also positively related to motor disabilities such as gait and balance disorders, e.g. in post-stroke or Parkinson patients, see e.g. [71] and [109]. Therefore, robots have been developed to serve as walking aids and to support simple motor skills [57]. EVAs have also been envisaged in similar contexts. Examples are a reactive virtual fitness trainer and a virtual physiotherapist: in the first example, the EVA presents exercises that the user is supposed to imitate and provides feedback on the performance [108]. The second example is a virtual teacher that also relies on imitation tasks; the system was shown to improve the condition of stroke patients with chronic health problems [62].
3.3 EVAs for Increasing Compliance and for Motivating Behaviour Change
EVAs have often been employed as coaches and motivators to change negative behaviour and to enhance compliance with instructions or advice. Bickmore et al. showed that EVAs are beneficial for the development of a therapeutic alliance, which is a prerequisite for successful behaviour change, and several robots and virtual interfaces have demonstrated their effectiveness in health behaviour change interventions or medication compliance [18, 20, 22, 23, 25]. For example, a diet-promoting system was shown to be more effective if the interface was an embodied robot rather than a touch screen or a paper diary [73].
Higher age is correlated with physical inactivity [27]. Yet physical activity plays a key role in maintaining functional abilities, independent living, health, and well-being [27, 38]. Robots and EVAs have already been developed to support people with disabilities and to promote motor activity by enticing, scheduling, fostering compliance, and monitoring, see e.g. [48, 51, 126]. For example, a system designed to increase the exercise level of older adults with limited computer literacy was better at motivating behaviour change when an EVA was used as the interface than in a control condition without one [25]. Interestingly, the same effect was present in young adults [19]. Another example is a mobile health counselling agent, a portable EVA designed to promote physical activity. However, a pilot study evaluating its influence on motivation to walk showed that a proactive EVA delivering feedback based on an accelerometer fostered the building of social bonds between EVA and user, but led to less walking compared to users of a passive EVA [24].
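To make the accelerometer-driven feedback idea concrete, here is a minimal, hypothetical sketch of a step-count coaching function in the spirit of the walking scenario above; the goal, thresholds, and messages are assumptions, and the cited study’s actual design may well differ:

```python
# Hypothetical step-count coaching feedback for an EVA.
# The daily goal, thresholds, and wording are invented for illustration.

def coaching_feedback(steps_today, daily_goal=5000, proactive=True):
    """Return a feedback message, or None if the EVA stays passive."""
    if not proactive:
        return None  # a passive EVA speaks only when asked
    ratio = steps_today / daily_goal
    if ratio >= 1.0:
        return "Wonderful, you reached your goal of {} steps!".format(daily_goal)
    if ratio >= 0.5:
        return "You are more than halfway there - keep going!"
    return "How about a short walk later today?"
```

The `proactive` flag mirrors the proactive/passive contrast of the pilot study: the same sensor data can either drive unprompted comments or stay silent until the user asks.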
3.4 EVAs to Facilitate Learning and Cognitive Fitness
From the age of 50 onwards, many cognitive functions decline: the prefrontal cortex shrinks and fluid intelligence decreases [36]. Yet preserving the ability to learn quickly is a requirement for successful ageing [98], and ongoing cognitive activity is also important for the regulation of emotions and behaviour [81]. There are several contexts in which EVAs can be expected to contribute to cognitive activity and help maintain cognitive flexibility, see e.g. [72]. For instance, it was shown that additional visual and auditory cues in a slide show support the memory of people with episodic memory impairment and also reduce caregiver involvement [83]. Moreover, redundant speech-accompanying gestures of EVAs increased both recall and likeability, compared to complementary gestures or no gestures at all [30]. It could also be shown that EVAs enhanced learning transfer [91], memorization [15], and the learning experience [9, 77, 121]. Another example that might inspire systems for older age groups is an intelligent tutoring system that used EVAs and direct instructions to support the learning of young children with learning disabilities [68]. In sum, several usage scenarios of EVAs have already been studied and have shown that EVAs can be beneficial for the cognitively impaired or for learning tasks; specific applications for old age should be easily derived from these experiences.
3.5 EVAs as Virtual Companions
Does an “artificial social accompaniment” for the older population make sense? Older people usually have fewer social relations, but those are perceived as more important, so that the smaller number of relationships does not necessarily have a negative impact on well-being. Nevertheless, there is an elevated risk of loneliness in old age, partly due to the higher mortality among friends and family [36]. Moreover, rewarding social relationships are related to less stress [70] and the maintenance of cognitive capabilities [132], and are predictors of well-being [14] and successful ageing [19]. Can EVAs, in this situation, contribute to happier ageing by acting as some sort of social partner, perhaps in the vein of cats and dogs rather than an ersatz family?
In spite of the rather limited AI capabilities of EVAs, some mimicry of human behaviour might be beneficial. Studies show that the perception of social support is, under certain circumstances, more important than the actual support itself [127]. Several studies have demonstrated that real or robotic pets, which make use of social cues, can elicit user emotions and thus lead to higher levels of well-being [11, 42, 87, 104, 113, 115, 123, 128]. The utilization of the seal robot Paro in an eldercare institution increased the number of social interactions between inhabitants and reduced stress at the same time [124]. Other studies have demonstrated various beneficial social effects of EVAs such as relaxation and the reduction of frustration, stress, and loneliness [16, 25, 63, 74, 100, 102]. A longitudinal study by Bickmore demonstrated that the feeling that the EVA cares for oneself can last over longer periods of time [21]. Thus, research indicates that the doors of the older population should be wide open for novel kinds of virtual “social companions”, albeit the term “companion” might still require replacement by a more appropriate denomination: calling them “companions” easily raises concerns that naïve attempts are being made to “replace” family, friends, and serious care and concern with cheap, insufficient surrogates. It is probably wiser to lower expectations and to initially regard the emerging new kind of “companion” EVA as some sort of pet, or even only as a technological porcelain doll, or maybe a new kind of toy: something that can contribute in its own way to enjoyment and beauty in the life of an older adult, but that will certainly never be able to fully replace human warmth (Fig. 7.3).
4 Designing an EVA
4.1 Choosing Appearance
What should your EVA look like, and how should it behave? In certain cases, this question might be less relevant than expected at first sight. A meta-analysis by Yee et al. revealed that the most important aspect of the use of an EVA was the fact that it was actually present and running; the visual quality of the representation was only of secondary importance [130]. Yet the generalization of this result certainly depends on the behavioural capacities of the EVA, as described in the next paragraphs.
Concerning behaviour and appearance, we will first look at properties that are likely to foster the creation of bonds between an older user and his or her personal EVA, since this is important for long-term acceptance, see e.g. [26].
Humans have a strong need to belong [8] and tend to create bonds with things that display human cues such as speech [105]. Visual characteristics that foster the creation of bonds are attractiveness [41, 79] and similarity [8, 65]. In addition, in line with Schachter’s Emotional Similarity Hypothesis, see e.g. [56], people tend to get closer to interaction partners that are experiencing similar situations and emotional states. Therefore, it might often be worth staging an EVA with a background story and appearance that emphasize the similarity of user and EVA, e.g. when both are rehabilitation patients of the same sex, of similar age, and with similar health issues.
Some researchers maintain that it is crucial to deliver behavioural realism [3, 26, 94], particularly if the EVA has a realistic appearance. Very realistic-looking EVAs give rise to expectations of correspondingly life-like behaviour [65], and subsequent violation of these expectations will reduce likeability, believability, competence, and enjoyment (cf. Mori’s term “uncanny valley” in [49, 105]). Thus, the level of realism of appearance and behaviour should be carefully considered, and overambitious realism can be quite detrimental, see [50] and [112]. An appropriate way to manage this conflict is to design the EVA with sufficient human characteristics for the user to feel motivated to interact socially, while maintaining sufficient non-realistic characteristics to keep expectations about the intelligence of the EVA’s behaviour low [49]. Another possibility is to opt for cartoon or animal characters, in particular when aiming at long-term relationships (cf. the remarks above about the role of an EVA as a very limited “companion”).
4.2 Choosing the Behaviour of an EVA
In order to build effective EVAs, several factors such as the situation, the usage scenario, the context, and the individuality and personality of the user must be considered. The most relevant aspects are explained in the following sections.
Interpersonal differences. Interpersonal differences have to be taken into account because individuals respond to technological systems in different ways [29]: while some users accept robots as social actors, others do not [92]. Behavioural realism, e.g. related to the expression of emotions, should at least match the expectations of the user [105]. Meeting these expectations will foster the likeability, enjoyment [6, 75], and believability of the EVA [7].
Another consideration is that the female or male behaviour of an EVA should be consistent with gender stereotypes [63], and the user’s gender has to be taken into account as well [10]. In contrast to men, women prefer female EVAs and tend to favour more smiling and self-touching behaviours [78]. The same study also found that older people and people with less computer literacy were more nervous during the interaction, and that older people were more attentive if the EVA showed less self-touching behaviour [78]. Moreover, it was shown that personality is a better predictor of the subjective feeling about and evaluation of the EVA than its concrete behaviour [122], indicating that the personality traits of users should figure prominently when deciding about design and implementation, cf. e.g. [31]. For example, highly self-conscious people felt more aggression, and people with high levels of self-efficacy felt less afraid and less distressed after interacting with an EVA [122]. It was also shown that the acceptance of an EVA’s monitoring behaviour differs depending on the personality trait of control orientation: users who think that external factors control their success in life (i.e. an external locus of control) felt more anxious than people who felt responsible for their own success (i.e. an internal locus of control) [106].
In conclusion, there are many individual differences and dependencies, making it difficult to design a single most adequate EVA that suits all kinds of users. Furthermore, the opinions of users assessed by self-report questionnaires do not always correspond to their actual behaviour [99, 106, 130], which dampens the expectation that adequate EVAs can be designed merely by asking their users. Taking this and the interdependencies between personality and the rating of an EVA into account, an ideal system would be highly adaptive to both the user’s personality traits and the interaction history, cf. [28] and [39].
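As a hedged sketch of what such adaptivity could look like in code, the fragment below adjusts a small behaviour profile from assumed personality scores and session ratings; the trait names, profile knobs, and thresholds are invented for illustration and are not taken from any specific system in the literature:

```python
# Hedged sketch: adapting an EVA's behaviour profile to user traits and
# interaction history. All trait names, knobs, and thresholds are
# illustrative assumptions.

DEFAULT_PROFILE = {"smiling": 0.5, "monitoring": 0.5, "self_disclosure": 0.5}

def adapt_profile(traits, history, profile=None):
    """traits: dict of 0..1 scores; history: per-session ratings in 0..1."""
    p = dict(profile or DEFAULT_PROFILE)
    # Users with an external locus of control reacted anxiously to
    # monitoring [106], so tone monitoring down for them.
    if traits.get("external_locus_of_control", 0.0) > 0.6:
        p["monitoring"] = min(p["monitoring"], 0.2)
    # If recent sessions were rated poorly, gradually reduce the
    # expressiveness rather than keep an unpopular behaviour.
    if history and sum(history) / len(history) < 0.4:
        p["smiling"] = max(0.0, p["smiling"] - 0.2)
    return p
```

A real system would of course use validated personality instruments and learn such rules from data rather than hard-code them; the sketch only illustrates the shape of the adaptation loop.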
Context. Belief and trust are reduced when an EVA’s behaviour is inappropriate or unrealistic in a certain context [53]. Therefore, EVAs should display non-verbal behaviour that is consistent with social norms [86] and should for instance smile in appropriate situations [93].
Behaviour can have different meanings, depending on the social or cultural context. Looking into the eyes of the user can be understood as aggression or, on the contrary, be regarded as an affectionate gesture [120], depending on the relationship between user and EVA and on the cultural background [37].
Function. The acceptance of the behaviour of an EVA depends much on its specific role. A study that employed EVAs in a learning context has demonstrated that EVAs displaying emotions were accepted when acting as supportive peers but not when they were tutors [10].
Another important aspect is that non-verbal behaviour such as gestures and emotional expressions can affect the perception of the spoken message [2] and induce emotions in the viewer [120]. These emotions can lead to a reduction of effort, in particular when they are positive [33]. There are thus contexts where a less friendly, not always smiling EVA might be more appropriate: for instance, when a user is supposed to follow the prescription to take an essential medicine, an angry or sad EVA might be more effective.
Most importantly, the designer of an EVA has to distinguish between short- and long-term interactions. If the EVA is created to capture attention and achieve short-term effects, the task will probably be less demanding, see [105].
But when EVAs are to serve as social companions or personal health assistants, the requirements on behaviour design are likely to become demanding, see [49]. Then, a more realistic simulation of emotional expressions [6, 75], interpersonal attitudes, and personality traits [120] becomes important. Particularly in the context of building relationships, “physical” approximation to the user (displayed e.g. by forward-leaning movements), head nods, lively gestures [120], and behaving as if the EVA liked the user [8] should be effective measures, because these behaviours are strongly correlated with a desire for emotional proximity [120]. Moreover, in order to facilitate relationship building, EVAs should certainly provide sufficient support, but they probably should also expect and accept support from the user. This assumption is based on the social-psychological Equity Theory, according to which satisfaction is highest if costs and rewards are equal for both interaction partners [1]. A review of studies on the utilization of EVAs in psychiatry comes to the conclusion that EVAs should express empathy, involve the user in social dialogue, display humour and happiness, talk about past and future, show appropriate social behaviour, and refer to mutual knowledge in order to build a therapeutic alliance, see [19]. Several studies on the use of EVAs in clinical applications suggest that their behaviour should be variable and dynamic, that an EVA should talk about itself (self-disclosure), and that it should refer to knowledge about prior interactions [19]. Bickmore and Cassell have developed a model of social proximity that summarizes the most relevant factors for building a relationship with an EVA; mutual familiarity, similarity, and affect are pillars of this model [18].
Arousal level. Older adults usually cope well with a small number of stressors that do not last too long. However, in the presence of many longer-lasting stressors, older users tend to experience much more stress than younger people, see [36]. Since EVAs can reduce stress (cf. e.g. [102]), there should be situations where EVAs can intervene sensitively and contribute to stress reduction whenever higher arousal levels are registered. Certain findings suggest that the presence of other people promotes performance on easy tasks but impairs accomplishment of more difficult ones. This effect is mediated by the arousal level: the presence of others increases arousal, which is an advantage when accomplishing familiar activities but detrimental when cognitive effort and focus are required [106, 118, 131]. This social presence effect was replicated when the audience was composed of EVAs rather than humans, which strongly suggests that task difficulty must be considered when planning the use of EVAs to reduce stress [106]. These results imply that the most appropriate level of active intrusion and social presence of an EVA depends on the task difficulty of the usage scenario: the more difficult the task, the more cautious and silent the EVA should be.
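The closing rule of this paragraph can be sketched as a simple decision function; the numeric scales, thresholds, and mode names are purely illustrative assumptions:

```python
# Illustrative rule: the harder the task, the more cautious and silent
# the EVA. Scales (0..1), thresholds, and mode names are assumptions.

def intrusion_level(task_difficulty, user_arousal):
    """Map task difficulty and arousal (both 0..1) to an EVA presence mode."""
    if task_difficulty > 0.7:
        return "silent"      # stay in the background, no interruptions
    if user_arousal > 0.7:
        return "calming"     # intervene gently to reduce stress
    if task_difficulty < 0.3:
        return "active"      # social presence helps with easy tasks
    return "responsive"      # speak only when addressed
```

Ordering the checks so that task difficulty dominates reflects the social facilitation findings above: even a stressed user is better left alone while concentrating on a hard task.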
4.3 Pitfalls and Risks of Employing EVAs
In this section, we point out possible dysfunctional aspects of EVAs and explain why it might be better to abandon the idea of using an EVA under certain circumstances.
Distracting effects. The animations of EVAs can become sources of stress, cf. [93], and EVAs may claim attention resources that are more limited at higher ages; as a consequence, they could cause distraction and decrease performance in tasks like recalling information [75, 95, 125]. However, other authors have suggested that the distracting effect of EVAs is likely to disappear after several interactions [95], and there are also studies that do not report any negative effects on recall performance, stress, or the accuracy of answering a questionnaire, see [64, 102, 125].
Overestimation of the capabilities of the system. EVAs employed as interfaces may raise high expectations about the capabilities of the system, cf. [19]. These exaggerated expectations may lead to disappointment, and may even be dangerous under certain circumstances if the older adult does not recognize the system’s true limitations. As an example, consider EVAs used for health monitoring: the user could be at risk of not calling medical support in an emergency because he/she relies on an EVA that is not able to recognize or emotionally reflect the critical situation. Furthermore, the accuracy of EVAs’ advice about e.g. health issues can be low, and there is an additional risk that their messages are misunderstood [19].
Rejecting the ludic aspect. It is not clear when an older adult will reject an EVA because he/she finds the idea ridiculous or awkward. Younger users (16–22 years) were shown to develop closer relationships to virtual pets than older users [82], an indication that some older people may not enjoy interacting with an EVA at all.
Forming dependencies. A risk of employing an EVA as a companion is that it could increase social isolation, because the older person might no longer feel the need to participate in real social interactions, cf. [19]. Considering that one reason why older adults maintain smaller social networks is the desire to avoid conflicts [36], these users could feel drawn to focus on conflict-free interactions with EVAs. Some severely cognitively impaired people may even become confused as to whether an EVA is real or not. Further ethical issues concern confidentiality, privacy, monitoring, and provider liability, see [19].
5 Conclusion
In many situations, the use of an EVA in a technical system makes sense and older people will benefit from it. With recent technological advances and decreasing prices of IT hardware, we can certainly expect many innovative, dedicated applications in the very near future. We have seen that emotional effects, communicative advantages, task simplification, or learning effects can all argue for EVAs. Generalizing the aforementioned findings to different applications and usage scenarios is nevertheless difficult and must be done with care, since many aspects influence their validity in a new context. For example, the exact user group, its culture, possible cognitive disorders or health issues of its members, and the application scenario with its specific goals and interaction logic all determine whether employing an EVA is appropriate, and which properties it should possess.
Some form of user-involving design process is thus required when developing systems with EVAs for older persons, but attention must be paid to the fact that older users’ self-reports may differ considerably from their actual behaviour, and that long-term effects and usefulness may not match those observed in the short term.
References
Adams, J. S. (1965). Inequity in social exchange. Advances in Experimental Social Psychology, 2, 267–299.
Argyle, M., Trower, P., & Kristal, L. (1979). Person to person: Ways of communicating. London: Harper & Row.
Bailenson, J. N., & Blascovich, J. (2004). Avatars. In Encyclopedia of human-computer interaction. Great Barrington: Berkshire Publishing Group.
Barberis, D., Garazzino, N., Prinetto, P., & Tiotto, G. (2011). Improving accessibility for deaf people: An editor for computer assisted translation through virtual avatars. In The proceedings of the 13th international ACM SIGACCESS conference on computers and accessibility (pp. 253–254). New York: ACM.
Barefoot, J. C., Maynard, K. E., Beckham, J. C., Brummett, B. H., Hooker, K., & Siegler, I. C. (1998). Trust, health, and longevity. Journal of Behavioral Medicine, 21(6), 517–526.
Bartneck, C. (2003). Interacting with an embodied emotional character. In Proceedings of the 2003 international conference on designing pleasurable products and interfaces (pp. 55–60). New York: ACM.
Bates, J. (1994). The role of emotion in believable agents. Communications of the ACM, 37(7), 122–125.
Baumeister, R. F., & Leary, M. R. (1995). The need to belong: Desire for interpersonal attachments as a fundamental human motivation. Psychological Bulletin, 117(3), 497.
Baylor, A. L., & Ryu, J. (2003). The effects of image and animation in enhancing pedagogical agent persona. Journal of Educational Computing Research, 28(4), 373–394.
Beale, R., & Creed, C. (2009). Affective interaction: How emotional agents affect users. International Journal of Human Computer Studies, 67(9), 755–776.
Beck, A. M., & Meyers, N. M. (1996). Health enhancement and companion animal ownership. Annual Review of Public Health, 17(1), 247–257.
Benoît, C. (1996). On the production and the perception of audio-visual speech by man and machine. In Multimedia & video coding. New York: Plenum Press.
Bente, G., Rüggenberg, S., Krämer, N. C., & Eschenburg, F. (2008). Avatar-mediated networking: Increasing social presence and interpersonal trust in net-based collaborations. Human Communication Research, 34(2), 287–318.
Berkman, L. F., & Syme, S. L. (1979). Social networks, host resistance, and mortality: A nine-year follow-up study of Alameda County residents. American Journal of Epidemiology, 109(2), 186–204.
Beun, R. J., De Vos, E., & Witteman, C. (2003). Embodied conversational agents: Effects on memory performance and anthropomorphisation. In Intelligent virtual agents (pp. 315–319). Berlin: Springer.
Bickmore, T. W. (2003). Relational agents: Effecting change through human-computer relationships. Cambridge, MA: Massachusetts Institute of Technology.
Bickmore, T., & Cassell, J. (2001). Relational agents: A model and implementation of building user trust. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 396–403). New York: ACM.
Bickmore, T., & Cassell, J. (2005). Social dialogue with embodied conversational agents. In Advances in natural multimodal dialogue systems (pp. 23–54). New York: Kluwer Academic.
Bickmore, T., & Gruber, A. (2010). Relational agents in clinical psychiatry. Harvard Review of Psychiatry, 18(2), 119–130.
Bickmore, T., & Pfeifer, L. (2008). Relational agents for antipsychotic medication adherence. In: CHI’08 workshop on technology in mental health, Florence, Italy.
Bickmore, T. W., & Picard, R. W. (2004). Towards caring machines. In CHI’04 Extended abstracts on human factors in computing systems (pp. 1489–1492). New York: ACM.
Bickmore, T. W., & Picard, R. W. (2005). Establishing and maintaining long-term human-computer relationships. ACM Transactions on Computer-Human Interaction (TOCHI), 12(2), 293–327.
Bickmore, T. W., Caruso, L., & Clough-Gorr, K. (2005). Acceptance and usability of a relational agent interface by urban older adults. In CHI’05 Extended abstracts on human factors in computing systems (pp. 1212–1215). New York: ACM.
Bickmore, T. W., Mauer, D., & Brown, T. (2009). Context awareness in a handheld exercise agent. Pervasive and Mobile Computing, 5(3), 226–235.
Bickmore, T. W., Pfeifer, L. M., & Jack, B. W. (2009). Taking the time to care: Empowering low health literacy hospital patients with virtual nurse agents. In Proceedings of the 27th international conference on human factors in computing systems (pp. 1265–1274). New York: ACM.
Blascovich, J., Loomis, J., Beall, A. C., Swinth, K. R., Hoyt, C. L., & Bailenson, J. N. (2002). Immersive virtual environment technology as a methodological tool for social psychology. Psychological Inquiry, 13(2), 103–124.
Booth, M. L., Owen, N., Bauman, A., Clavisi, O., & Leslie, E. (2000). Social-cognitive and perceived environment influences associated with physical activity in older Australians. Preventive Medicine, 31(1), 15–22.
Bosse, T., Siddiqui, G., & Treur, J. (2010). An intelligent virtual agent to increase involvement in financial services. In Intelligent virtual agents (pp. 378–384). Berlin: Springer.
Brewer, M. B., & Hewstone, M. (2004). Emotion and motivation. Malden: Wiley-Blackwell.
Buisine, S., & Martin, J. C. (2007). The effects of speech-gesture cooperation in animated agents’ behavior in multimedia presentations. Interacting with Computers, 19, 484–493.
Buisine, S., & Martin, J. C. (2010). The influence of user’s personality and gender on the processing of virtual agents’ multimodal behavior. Advances in Psychology Research, 65, 1–14.
Burgoon, J. K. (1994). Nonverbal signals. In M. L. Knapp & G. R. Miller (Eds.), Handbook of interpersonal communication (2nd ed., pp. 229–285). Thousand Oaks: Sage.
Carver, C. S., & Scheier, M. F. (2009). Action, affect, and two-mode models of functioning. In Oxford handbook of human action (pp. 298–327). New York: Oxford University Press.
Cassell, J. (2000). Nudge nudge wink wink: Elements of face-to-face conversation for embodied conversational agents. In Embodied conversational agents (pp. 1–27). Cambridge, MA: MIT Press.
Cassell, J., & Thorisson, K. R. (1999). The power of a nod and a glance: Envelope vs. emotional feedback in animated conversational agents. Applied Artificial Intelligence, 13(4–5), 519–538.
Charles, S. T., & Carstensen, L. L. (2010). Social and emotional aging. Annual Review of Psychology, 61, 383–409.
Cortellessa, G., Koch-Svedberg, G., Loutfi, A., Pecora, F., Scopelliti, M., & Tiberio, L. (2008). A cross-cultural evaluation of domestic assistive robots. In: Proceedings of the AAAI fall symposium on AI and Eldercare, Arlington, VA.
Crombie, I. K., Irvine, L., Williams, B., McGinnis, A. R., Slane, P. W., Alder, E. M., & McMurdo, M. E. T. (2004). Why older people do not participate in leisure time physical activity: A survey of activity levels, beliefs and deterrents. Age and Ageing, 33(3), 287–292.
Dautenhahn, K. (2004). Robots we like to live with?! – A developmental perspective on a personalized, life-long robot companion. In 13th IEEE international workshop on robot and human interactive communication, 2004. ROMAN 2004 (pp. 17–22). Piscataway: IEEE Press.
de Melo, C., Carnevale, P., & Gratch, J. (2010). The influence of emotions in embodied agents on human decision-making. In Intelligent virtual agents (pp. 357–370). Berlin: Springer.
Dion, K., Berscheid, E., & Walster, E. (1972). What is beautiful is good. Journal of Personality and Social Psychology, 24(3), 285.
DiSalvo, C., Gemperle, F., Forlizzi, J., & Montgomery, E. (2003). The Hug: An exploration of robotic form for intimate communication. In The 12th IEEE international workshop on robot and human interactive communication, 2003. Proceedings. ROMAN 2003 (pp. 403–408). Vancouver, Canada.
Eizmendi, G., & Craddock, G. M. (2007). Challenges for assistive technology: AAATE 07 (20th ed.). Amsterdam: IOS Press Inc.
Emiliani, P. L., Stephanidis, C., & Vanderheiden, G. (2011). Technology and inclusion – Past, present and foreseeable future. Technology and Disability, 23(3), 101–114.
European Union. (n.d.). E-Inclusion … what next? Embracing the future of social innovation 2010–2015. http://ec.europa.eu. Accessed 30 May 2012.
Fagel, S. (2006). Emotional McGurk effect. In Proceedings of the international conference on Speech Prosody, Dresden, Germany.
Fagel, S., & Madany, K. (2008). Computeranimierte Sprechbewegungen in realen Anwendungen. Univ.-Verl. der TU, Univ.-Bibliothek.
Feil-Seifer, D., & Mataric, M. J. (2005). Defining socially assistive robotics. In: 9th international conference on rehabilitation robotics, 2005. ICORR 2005, Chicago, IL (pp. 465–468).
Fong, T., Nourbakhsh, I., & Dautenhahn, K. (2003). A survey of socially interactive robots. Robotics and Autonomous Systems, 42(3), 143–166.
Garau, M., Slater, M., Vinayagamoorthy, V., Brogni, A., Steed, A., & Sasse, M. A. (2003). The impact of avatar realism and eye gaze control on perceived quality of communication in a shared immersive virtual environment. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 529–536). New York: ACM.
Gockley, R., & Mataric, M. J. (2006). Encouraging physical therapy compliance with a hands-off mobile robot. In Proceedings of the 1st ACM SIGCHI/SIGART conference on human-robot interaction (pp. 150–155). New York: ACM.
Gockley, R., Bruce, A., Forlizzi, J., Michalowski, M., Mundell, A., Rosenthal, S., Sellner, B., et al. (2005). Designing robots for long-term social interaction. In IEEE/RSJ international conference on intelligent robots and systems, 2005. (IROS 2005) (pp. 1338–1343). Piscataway: IEE.
Gong, L. (2007). Is happy better than sad even if they are both non-adaptive? Effects of emotional expressions of talking-head interface agents. International Journal of Human Computer Studies, 65(3), 183–191.
Gratch, J., Okhmatovskaia, A., Lamothe, F., Marsella, S., Morales, M., van der Werf, R., & Morency, L. P. (2006). Virtual rapport. In Intelligent virtual agents (pp. 14–27). Berlin: Springer.
Guide. (2007). http://www.guide-project.eu/. Accessed 3 Jan 2013.
Gump, B. B., & Kulik, J. A. (1997). Stress, affiliation, and emotional contagion. Journal of Personality and Social Psychology, 72(2), 305.
Hans, M., Graf, B., & Schraft, R. D. (2002). Robotic home assistant Care-O-bot: Past–present–future. In 11th IEEE international workshop on robot and human interactive communication, 2002. Proceedings (pp. 380–385). Piscataway: IEEE.
Heerink, M., Kröse, B., Wielinga, B., & Evers, V. (2006). Studying the acceptance of a robotic agent by elderly users. International Journal of Assistive Robotics and Mechatronics, 7(3), 33–43.
Heerink, M., Kröse, B., Evers, V., & Wielinga, B. (2008). The influence of social presence on enjoyment and intention to use of a robot and screen agent by elderly users. In The 17th IEEE international symposium on robot and human interactive communication, 2008. RO-MAN 2008 (pp. 695–700). Piscataway: IEEE.
Heerink, M., Kröse, B., Wielinga, B., & Evers, V. (2008). Enjoyment, intention to use and actual use of a conversational robot by elderly people. In 3rd ACM/IEEE international conference on human-robot interaction (HRI), 2008 (pp. 113–119). Piscataway: IEEE.
Heerink, M., Kröse, B., Evers, V., & Wielinga, B. (2010). Assessing acceptance of assistive social agent technology by older adults: The almere model. International Journal of Social Robotics, 2(4), 361–375.
Holden, M. K., & Dyar, T. (2002). Virtual environment training: A new tool for neurorehabilitation. Journal of Neurologic Physical Therapy, 26(2), 62.
Hone, K. (2006). Empathic agents to reduce user frustration: The effects of varying agent characteristics. Interacting with Computers, 18(2), 227–245.
Hongpaisanwiwat, C., & Lewis, M. (2003). Attentional effect of animated character. In Proceedings of human-computer interaction (pp. 423–430). Zurich, Switzerland.
Hudlicka, E., Becker-Asano, C., Payr, S., Fischer, K., Ventura, R., Leite, I., & von Scheve, C. (2009). Social interaction with robots and agents: Where do we stand, where do we go? In 3rd international conference on affective computing and intelligent interaction and workshops, 2009. ACII 2009 (pp. 1–6). Amsterdam, Netherlands.
Inter Ikea Systems B.V. (1999–2012). Ikea, welcome. http://www.ikea.com/us/en/. Accessed 12 Dec 2012.
Ijsselsteijn, W. A., Kort, Y. A. W., Westerink, J., Jager, M., & Bonants, R. (2006). Virtual fitness: Stimulating exercise behavior through media technology. Presence: Teleoperators and Virtual Environments, 15(6), 688–698.
Jensen, A., Wilson, D.-M., Jordine, K., & Sakpal, R. (2012). Using embodied pedagogical agents and direct instruction to improve learning outcomes for young children with learning disabilities. Global TIME 2012, 2012(1), 235–239.
Johnson, W. L., Rickel, J. W., & Lester, J. C. (2000). Animated pedagogical agents: Face-to-face interaction in interactive learning environments. International Journal of Artificial Intelligence in Education, 11(1), 47–78.
Kamarck, T. W., Manuck, S. B., & Jennings, J. R. (1990). Social support reduces cardiovascular reactivity to psychological challenge: A laboratory model. Psychosomatic Medicine, 52(1), 42–58.
Kaminsky, T. A., Dudgeon, B. J., Billingsley, F. F., Mitchell, P. H., Weghorst, S. J., et al. (2007). Virtual cues and functional mobility of people with Parkinson’s disease: A single-subject pilot study. Journal of Rehabilitation Research and Development, 44(3), 437.
Kanda, T., Hirano, T., Eaton, D., & Ishiguro, H. (2004). Interactive robots as social partners and peer tutors for children: A field trial. Human Computer Interaction, 19(1), 61–84.
Kidd, C. D. (2007). Engagement in long-term human-robot interaction. PhD thesis in Media Arts & Sciences, MIT, Cambridge, MA.
Klein, J., Moon, Y., & Picard, R. W. (2002). This computer responds to user frustration: Theory, design, and results. Interacting with computers, 14(2), 119–140.
Koda, T., & Maes, P. (1996). Agents with faces: The effect of personification. In 5th IEEE international workshop on robot and human communication, 1996 (pp. 189–194). Tsukuba, Japan.
Krahmer, E. J., & Swerts, M. (2006). Hearing and seeing beats: The influence of visual beats on the production and perception of prominence. In Proceedings of Speech Prosody 2006. Dresden, Germany.
Krämer, N. C., & Bente, G. (2010). Personalizing e-learning. The social effects of pedagogical agents. Educational Psychology Review, 22(1), 71–87.
Krämer, N., Hoffmann, L., & Kopp, S. (2010). Know your users! Empirical results for tailoring an agent’ s nonverbal behavior to different user groups. In Intelligent virtual agents (pp. 468–474). Heidelberg: Springer.
Krämer, N. C., Eimler, S., von der Pütten, A., & Payr, S. (2011). Theory of companions: What can theoretical models contribute to applications and understanding of human-robot interaction? Applied Artificial Intelligence, 25(6), 474–502.
Kriglstein, S., & Wallner, G. (2005). HOMIE: An artificial companion for elderly people. In CHI ’05 (Vol. 5, pp. 02–07). New York: ACM.
Kryla-Lighthall, N., & Mather, M. (2009). The role of cognitive control in older adults’ emotional well-being. In V. L. Bengston, D. Gans, N. M. Putney, & M. Silverstein (Eds.), Handbook of theories of aging. New York: Springer.
Lawson, S. W., & Chesney, T. (2007). The impact of owner age on companionship with virtual pets. In Eighth international conference on information visualisation (IV’04) (pp. 1922–1928). St. Gallen, Switzerland.
Lee, M. L., & Dey, A. K. (2008). Lifelogging memory appliance for people with episodic memory impairment. In Proceedings of the 10th international conference on ubiquitous computing (pp. 44–53). Seoul, South Korea.
Lester, J. C., Converse, S. A., Kahler, S. E., Barlow, S. T., Stone, B. A., & Bhogal, R. S. (1997). The persona effect: Affective impact of animated pedagogical agents. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 359–366). Atlanta, GA, USA.
Looije, R., Cnossen, F., & Neerinex, M. (2006). Incorporating guidelines for health assistance into a socially intelligent robot. In The 15th IEEE international symposium on robot and human interactive communication, 2006. ROMAN 2006 (pp. 515–520). Hatfield, United Kingdom.
Mark, G. (1999). Designing believable interaction by applying social conventions. Applied Artificial Intelligence, 13(3), 297–320.
McCalley, T., & Mertens, A. (2007). The pet plant: Developing an inanimate emotionally interactive tool for the elderly. In Proceedings of the 2nd international conference on persuasive technology (pp. 68–79). Palo Alto, CA, USA.
Mohseni, M., & Lindstrom, M. (2007). Social capital, trust in the health-care system and self-rated health: The role of access to health care in a population-based study. Social Science & Medicine, 64(7), 1373–1383.
Morandell, M., Fugger, E., & Prazak, B. (2007). The Alzheimer Avatar-Caregivers’ Faces used as GUI component. In Challenges for assistive technology AAATE (pp. 180–184). Amsterdam: IOS Press.
Morandell, M., Hochgatterer, A., Fagel, S., & Wassertheurer, S. (2008). Avatars in assistive homes for the elderly. In HCI and usability for education and work (pp. 391–402). Berlin: Springer.
Moreno, R., Mayer, R. E., Spires, H. A., & Lester, J. C. (2001). The case for social agency in computer-based teaching: Do students learn more deeply when they interact with animated pedagogical agents? Cognition and Instruction, 19(2), 177–213.
Nass, C., Isbister, K., & Lee, E. J. (2000). Truth is beauty: Researching embodied conversational agents. In Embodied conversational agents (pp. 374–402). Cambridge, MA: MIT Press.
Nijholt, A. (2002, April 15–16). Embodied agents: A new impetus to humor research. In The April fools’ day workshop on computational humour, Trento, Italy (pp. 101–111). Twente Workshops on Language Technology 20. University of Twente. ISSN 0929-0672.
Nowak, K. L., & Biocca, F. (2003). The effect of the agency and anthropomorphism on users’ sense of telepresence, copresence, and social presence in virtual environments. Presence: Teleoperators & Virtual Environments, 12(5), 481–494.
Okonkwo, C., & Vassileva, J. (2001). Affective pedagogical agents and user persuasion. In Proceedings of the 9th international conference on human-computer interaction (pp. 397–401). New Orleans, LA, USA.
Ortiz, A., del Puy Carretero, M., Oyarzun, D., Yanguas, J., Buiza, C., Gonzalez, M., & Etxeberria, I. (2007). Elderly users in ambient intelligence: Does an avatar improve the interaction? In Universal access in ambient intelligence environments (pp. 99–114). Berlin: Springer.
Pandzic, I. S., Ostermann, J., & Millen, D. (1999). User evaluation: Synthetic talking faces for interactive services. The Visual Computer, 15(7), 330–340.
Papalia, D. E., Camp, C. J., & Feldman, R. D. (1996). Adult development and aging. New York: McGraw-Hill.
Parise, S., Kiesler, S., Sproull, L., & Waters, K. (1999). Cooperating with life-like interface agents. Computers in Human Behavior, 15(2), 123–142.
Picard, R. W., & Klein, J. (2002). Computers that recognise and respond to user emotion: Theoretical and practical implications. Interacting with computers, 14(2), 141–169.
Pollack, M. E., Brown, L., Colbry, D., McCarthy, C. E., Orosz, C., Peintner, B., Ramakrishnan, S., & Tsamardinos, I. (2003). Autominder: An intelligent cognitive orthotic system for people with memory impairment. Robotics and Autonomous Systems, 44(3), 273–282.
Prendinger, H., Mori, J., & Ishizuka, M. (2005). Using human physiology to evaluate subtle expressivity of a virtual quizmaster in a mathematical game. International Journal of Human Computer Studies, 62(2), 231–245.
Qiu, L., & Benbasat, I. (2005). Online consumer trust and live help interfaces: The effects of text-to-speech voice and three-dimensional avatars. International Journal of Human Computer Interaction, 19(1), 75–94.
Raina, P., Waltner-Toews, D., Bonnett, B., Woodward, C., & Abernathy, T. (1999). Influence of companion animals on the physical and psychological health of older people: An analysis of a one-year longitudinal study. Journal of the American Geriatrics Society, 47(3), 323–329.
Reeves, B., & Nass, C. (1996). How people treat computers, television, and new media like real people and places. Stanford/New York: CSLI Publications/Cambridge university press.
Rickenberg, R., & Reeves, B. (2000). The effects of animated characters on anxiety, task performance, and evaluations of user interfaces. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 49–56). The Hague, Netherlands.
Riedl, R., Mohr, P., Kenning, P., Davis, F., & Heekeren, H. (2011, December 6). Trusting humans and avatars: Behavioral and neural evidence. ICIS 2011 Proceedings (Paper 7). Shanghai. http://aisel.aisnet.org/icis2011/proceedings/hci/7
Ruttkay, Z., Zwiers, J., van Welbergen, H., & Reidsma, D. (2006). Towards a reactive virtual trainer. In Intelligent virtual agents (pp. 292–303). Berlin: Springer.
Salzman, B. (2010). Gait and balance disorders in older adults. American Family Physician, 82(1), 61–68.
San-Segundo, R., López, V., Martín, R., Lufti, S., Ferreiros, J., Córdoba, R., & Pardo, J. M. (2010). Advanced speech communication system for deaf people. In Eleventh annual conference of the International Speech Communication Association. Makuhari, Japan.
Scassellati, B. (2007). How social robots will help us to diagnose, treat, and understand autism. In Robotics research (pp. 552–563). Berlin: Springer.
Schroeder, R. (2002). The social life of avatars: Presence and interaction in shared virtual environments. London: Springer.
Shibata, T., Wada, K., & Tanie, K. (2003). Statistical analysis and comparison of questionnaire results of subjective evaluations of seal robot in Japan and UK. In IEEE international conference on robotics and automation, 2003. Proceedings. ICRA’03, 3 (pp. 3152–3157). Taipei, Taiwan.
Stichting Steffie. (2007). www.steffie.nl, zo werkt het. http://steffie.nl. Accessed 3 Jan 2013.
Stiehl, W. D., Breazeal, C., Han, K. H., Lieberman, J., Lalla, L., Maymin, A., Salinas, J., et al. (2006). The huggable: A therapeutic robotic companion for relational, affective touch. In ACM SIGGRAPH 2006 emerging technologies (p. 15). New York: ACM.
Swartz, L. (2003). Why people hate the paperclip: Labels, appearance, behavior, and social responses to user interface agents. Stanford, CA: Stanford University.
Takeuchi, A., & Naito, T. (1995). Situated facial displays: Towards social interaction. In I. Katz, R. Mack, L. Marks, M. B. Rosson, & J. Nielsen (Eds.), Human factors in computing systems: CHI ’95 conference proceedings (pp. 450–455). New York: ACM Press.
Teigen, K. H. (1994). Yerkes-Dodson: A law for all seasons. Theory & Psychology, 4(4), 525–547.
Tognazzini, B. (1992). TOG on interface. Reading: Addison-Wesley.
Vinayagamoorthy, V., Gillies, M., Steed, A., Tanguy, E., Pan, X., Loscos, C., & Slater, M. (2006). Building expression into virtual characters. In: Eurographics conference state of the art reports. Goldsmiths Research Online. London.
Voerman, J. L., & FitzGerald, P. J. (2000). Deictic and emotive communication in animated pedagogical agents. In Embodied conversational agents (p. 123). Cambridge, MA: MIT Press.
von der Pütten, A., Krämer, N., & Gratch, J. (2010). How our personality shapes our interactions with virtual characters-implications for research and development. In Intelligent virtual agents (pp. 208–221). Berlin: Springer.
Vormbrock, J. K., & Grossberg, J. M. (1988). Cardiovascular effects of human-pet dog interactions. Journal of Behavioral Medicine, 11(5), 509–517.
Wada, K., & Shibata, T. (2007). Living with seal robots—Its sociopsychological and physiological influences on the elderly at a care house. IEEE Transactions on Robotics, 23(5), 972–980.
Walker, J. H., Sproull, L., & Subramani, R. (1994). Using a human face in an interface. In Proceedings of the SIGCHI conference on human factors in computing systems: Celebrating interdependence (pp. 85–91). Boston, MA, USA.
Werry, I., & Dautenhahn, K. (1999). Applying mobile robot technology to the rehabilitation of autistic children. In Proceedings of SIRS99, 7th symposium on intelligent robotic systems. Coimbra, Portugal.
Wethington, E., & Kessler, R. C. (1986). Perceived support, received support, and adjustment to stressful life events. Journal of Health and Social Behavior, 27, 78–89.
Wilson, C. C., & Turner, D. C. (1998). Companion animals in human health. Thousand Oaks: Sage Publications Inc.
Wu, P., & Miller, C. (2005). Results from a field study: The need for an emotional relationship between the elderly and their assistive technologies. In: 1st international conference on augmented cognition, Las Vegas.
Yee, N., Bailenson, J. N., & Rickertsen, K. (2007). A meta-analysis of the impact of the inclusion and realism of human-like faces on user experiences in interfaces. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 1–10). San Jose, CA, USA.
Zajonc, R. B. (1965). Social facilitation. Ann Arbor: Research Center for Group Dynamics, Institute for Social Research, University of Michigan.
Zunzunegui, M. V., Alvarado, B. E., Del Ser, T., & Otero, A. (2003). Social networks, social integration, and social engagement determine cognitive decline in community-dwelling Spanish older adults. The Journals of Gerontology Series B: Psychological Sciences and Social Sciences, 58(2), 93–100.
Acknowledgements
This work was partially funded by the GUIDE Project of the European Commission’s Seventh Framework Programme (FP7/2007-2013) under grant agreement 24889.
© 2013 Springer-Verlag London
Noy, D., Ribeiro, P., Iurgel, I.A. (2013). Embodied Virtual Agents as a Means to Foster E-Inclusion of Older People. In: Biswas, P., Duarte, C., Langdon, P., Almeida, L., Jung, C. (eds) A Multimodal End-2-End Approach to Accessible Computing. Human–Computer Interaction Series. Springer, London. https://doi.org/10.1007/978-1-4471-5082-4_7
Print ISBN: 978-1-4471-5081-7
Online ISBN: 978-1-4471-5082-4