Abstract
This paper describes an affect-responsive interactive photo-frame application that offers its user a different experience with every use. It relies on visual analysis of activity levels and facial expressions of its users to select responses from a database of short video segments. This ever-growing database is automatically prepared by an offline analysis of user-uploaded videos. The resulting system matches its user’s affect along dimensions of valence and arousal, and gradually adapts its response to each specific user. In an extended mode, two such systems are coupled and feed each other with visual content. The strengths and weaknesses of the system are assessed through a usability study, where a Wizard-of-Oz response logic is contrasted with the fully automatic system that uses affective and activity-based features, either alone, or in tandem.
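The response-selection idea summarized above — matching the user's estimated affect to a clip in the video database along the valence and arousal dimensions — can be sketched as a nearest-neighbor lookup in the valence-arousal plane. The snippet below is a hypothetical illustration only (the function name, clip annotations, and value ranges are assumptions, not the authors' implementation):

```python
import math

def nearest_clip(user_valence, user_arousal, clips):
    """Pick the clip whose annotated affect is closest to the user's state.

    clips: list of (clip_id, valence, arousal) tuples, with valence and
    arousal assumed to lie in [-1, 1].
    """
    return min(
        clips,
        key=lambda c: math.hypot(c[1] - user_valence, c[2] - user_arousal),
    )[0]

# Toy database of annotated video segments (illustrative values).
library = [("calm", -0.1, -0.6), ("joy", 0.8, 0.7), ("anger", -0.7, 0.8)]
print(nearest_clip(0.6, 0.5, library))  # -> joy
```

In the actual system the database grows over time from user-uploaded videos, so such a lookup would run against continually updated annotations rather than a fixed list.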
References
Agamanolis S (2006) Beyond communication: human connectedness as a research agenda. In: Networked neighbourhoods, pp 307–344
Bailenson J, Pontikakis E, Mauss I, Gross J, Jabon M, Hutcherson C, Nass C, John O (2008) Real-time classification of evoked emotions using facial feature tracking and physiological responses. Int J Hum-Comput Stud 66(5):303–317
Bookstein F (1989) Principal warps: thin-plate splines and the decomposition of deformations. IEEE Trans Pattern Anal Mach Intell 11(6):567–585
Bouguet J (1999) Pyramidal implementation of the Lucas Kanade feature tracker description of the algorithm. Intel Corporation, Microprocessor Research Labs, OpenCV Documents 3
Buchanan R, Margolin V (1995) Discovering design: explorations in design studies. University of Chicago Press, Chicago
Bui T, Zwiers J, Poel M, Nijholt A (2006) Toward affective dialogue modeling using partially observable Markov decision processes. In: Proc workshop emotion and computing, 29th annual German conf on artificial intelligence, pp 47–50
Cao J, Wang H, Hu P, Miao J (2008) PAD model based facial expression analysis. In: Advances in visual computing, pp 450–459
Carver C, White T (1994) Behavioral inhibition, behavioral activation, and affective responses to impending reward and punishment: the BIS/BAS scales. J Pers Soc Psychol 67(2):319–333
Dibeklioğlu H, Kosunen I, Ortega M, Salah A, Zuzánek P (2010) An affect-responsive photo frame. In: Salah A, Gevers T (eds) Proc eNTERFACE, pp 58–68
Ekman P, Friesen W, Hager J (1978) Facial action coding system. Consulting Psychologists Press, Palo Alto
Gilroy S, Cavazza M, Chaignon R, Mäkelä S, Niranen M, André E, Vogt T, Urbain J, Seichter H, Billinghurst M et al. (2008) An affective model of user experience for interactive art. In: Proc int conf on advances in computer entertainment technology. ACM, New York, pp 107–110
Gunes H, Piccardi M (2009) Automatic temporal segment detection and affect recognition from face and body display. IEEE Trans Syst Man Cybern, Part B, Cybern 39(1):64–84
Ijsselsteijn W, de Kort Y, Poels K (in preparation) The game experience questionnaire: development of a self-report measure to assess the psychological impact of digital games. Manuscript
John O, Donahue E, Kentle R (1991) The Big Five Inventory, Versions 4a and 54. University of California, Institute of Personality and Social Research, Berkeley
John O, Naumann L, Soto C (2008) The Big Five trait taxonomy: discovery, measurement, and theoretical issues. In: Handbook of personality: theory and research, pp 114–158
Kaliouby R, Robinson P (2005) Real-time inference of complex mental states from facial expressions and head gestures. In: Real-time vision for human-computer interaction, pp 181–200
Kanade T, Cohn J, Tian Y (2000) Comprehensive database for facial expression analysis. In: Proc AFGR
Lienhart R, Maydt J (2002) An extended set of Haar-like features for rapid object detection. In: IEEE international conference on image processing, vol 1, pp 900–903
Lucas BD, Kanade T (1981) An iterative image registration technique with an application to stereo vision. In: IJCAI, pp 674–679
Mancas M, Chessini R, Hidot S, Machy C, Ben Madhkour R, Ravet T (2009) Morface: face morphing. Q Prog Sci Rep Numediart Res Program 2(2):33–39
Markopoulos P, Bongers B, Alphen E, Dekker J, Dijk W, Messemaker S, Poppel J, Vlist B, Volman D, Wanrooij G (2006) The PhotoMirror appliance: affective awareness in the hallway. Pers Ubiquitous Comput 10(2):128–135
Mehrabian A (1996) Pleasure-arousal-dominance: a general framework for describing and measuring individual differences in temperament. Curr Psychol 14(4):261–292
Okwechime D, Ong E, Bowden R (2009) Real-time motion control using pose space probability density estimation. In: Proc int workshop on human-computer interaction
Russell J (1980) A circumplex model of affect. J Pers Soc Psychol 39(6):1161–1178
Salah A, Schouten B (2009) Semiosis and the relevance of context for the AmI environment. In: Proc European conf on computing and philosophy
Schröder M, Bevacqua E, Eyben F, Gunes H, Heylen D, ter Maat M, Pammi S, Pantic M, Pelachaud C, Schuller B et al. (2009) A demonstration of audiovisual sensitive artificial listeners. In: Proc int conf on affective computing & intelligent interaction
Schröder M (2010) The SEMAINE API: towards a standards-based framework for building emotion-oriented systems. In: Advances in human-computer interaction
Sebe N, Lew M, Sun Y, Cohen I, Gevers T, Huang T (2007) Authentic facial expression analysis. Image Vis Comput 25(12):1856–1863
Shan C, Gong S, McOwan P (2007) Beyond facial expressions: learning human emotion from body gestures. In: Proc of the British machine vision conference
Shi J, Tomasi C (1994) Good features to track. In: Proc computer vision and pattern recognition. IEEE, New York, pp 593–600
Tao H, Huang T (1998) Connected vibrations: a modal analysis approach for non-rigid motion tracking. In: Proc computer vision and pattern recognition, pp 735–740
Valenti R, Sebe N, Gevers T (2007) Facial expression recognition: a fully integrated approach. In: Proc 14th int conf of image analysis and processing-workshops. IEEE Computer Society, New York, pp 125–130
Viola P, Jones M (2001) Rapid object detection using a boosted cascade of simple features. In: Proc computer vision and pattern recognition, vol 1, pp 511–518
Zeng Z, Pantic M, Roisman G, Huang T (2009) A survey of affect recognition methods: audio, visual, and spontaneous expressions. IEEE Trans Pattern Anal Mach Intell 31(1):39–58
Additional information
This paper is based on [9].
Rights and permissions
Open Access This is an open access article distributed under the terms of the Creative Commons Attribution Noncommercial License (https://creativecommons.org/licenses/by-nc/2.0), which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.
About this article
Cite this article
Dibeklioğlu, H., Ortega Hortas, M., Kosunen, I. et al. Design and implementation of an affect-responsive interactive photo frame. J Multimodal User Interfaces 4, 81–95 (2011). https://doi.org/10.1007/s12193-011-0057-5