Abstract
Generating vivid facial expressions by computer has long been an interesting and challenging problem. Some research adopts an anatomical approach, studying the relationships between expressions and the underlying bones and muscles; alternatively, MPEG-4's SNHC (synthetic/natural hybrid coding) provides mechanisms that allow detailed descriptions of facial expressions and animations. Unlike most existing approaches, which require a user to provide 3D head models, a set of reference images, detailed facial feature markers, numerous associated parameters, and/or non-trivial user assistance, our proposed approach is simple, intuitive, and interactive, yet still capable of generating vivid 2D facial expressions. With our system, a user only needs to supply a single photo and spend a few seconds roughly marking the positions of the eyes, eyebrows, and mouth; the system then traces the contours of these facial features more accurately using the active contour technique. Different expressions can then be generated and morphed via the mesh warping algorithm. Another contribution of this paper is a simple music emotion analysis algorithm, which is coupled with our system to further demonstrate the effectiveness of our facial expression generation. Through this integration, the system can identify the emotions of a music piece and display them via the synthesized facial expressions described above. Experimental results show that the end-to-end generation time, from the moment an input photo is given to the moment the final facial expressions are produced, is generally about 1 min.
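The contour-refinement step the abstract describes is the classic active contour ("snake") idea: starting from the user's rough marks, each contour point iteratively moves to a nearby position that balances contour smoothness against attraction to image edges. The sketch below is a minimal greedy formulation in that spirit (not the paper's implementation); the function names, the toy `edge_strength` map, and the two-term energy are illustrative assumptions.

```python
def dist(p, q):
    """Euclidean distance between two 2D points."""
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def greedy_snake_step(points, edge_strength, alpha=1.0, beta=1.0):
    """One greedy iteration: move each contour point to the neighboring
    pixel minimizing continuity energy (even spacing along the contour)
    plus external energy (attraction to strong image edges).

    points        -- list of (x, y) contour points, treated as a closed loop
    edge_strength -- dict mapping (x, y) -> edge magnitude (0 if absent)
    """
    n = len(points)
    # Target spacing: average distance between consecutive contour points.
    d_avg = sum(dist(points[i], points[(i + 1) % n]) for i in range(n)) / n
    new_points = []
    for i, (x, y) in enumerate(points):
        prev = points[(i - 1) % n]
        best, best_e = (x, y), float("inf")
        # Examine the 3x3 pixel neighborhood of the current point.
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                cand = (x + dx, y + dy)
                # Continuity term: penalize deviation from average spacing.
                e_cont = alpha * abs(d_avg - dist(prev, cand))
                # Edge term: reward candidates lying on strong edges.
                e_edge = -beta * edge_strength.get(cand, 0.0)
                e = e_cont + e_edge
                if e < best_e:
                    best_e, best = e, cand
        new_points.append(best)
    return new_points
```

In practice this step would be repeated until the contour stabilizes, with `edge_strength` computed from the photo's gradient magnitude around the marked eye, eyebrow, and mouth regions.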
Yang, C.-K., Chiang, W.-T.: An interactive facial expression generation system. Multimed Tools Appl 40, 41–60 (2008). https://doi.org/10.1007/s11042-007-0184-x