Abstract
To improve full-body interaction in an interactive storytelling scenario, we conducted a study to elicit a user-defined gesture set. 22 users performed 251 gestures while running through the story script; real interaction was disabled, but the application hinted at which set of actions it currently requested. We describe our interaction design process: conducting the study, analyzing the recorded data (including building a gesture taxonomy and selecting gesture candidates), and integrating the resulting gestures into our application.
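A common way to select gesture candidates from elicitation data of this kind is to score, per requested action, how consistently users proposed the same gesture, and then pick the most frequent proposal. The sketch below illustrates that idea with an agreement score; the action names, gesture labels, and counts are purely illustrative assumptions, not data from the study.

```python
# Hypothetical sketch of gesture-candidate selection from elicited data.
# All action names and gesture labels below are invented for illustration.
from collections import Counter

def agreement(proposals):
    """Agreement score for one action: sum over groups of identical
    proposals of (group size / total proposals) squared. Ranges from
    1/n (all different) to 1.0 (everyone proposed the same gesture)."""
    n = len(proposals)
    return sum((count / n) ** 2 for count in Counter(proposals).values())

# Illustrative proposals collected for two story actions:
elicited = {
    "open_door": ["push", "push", "push", "wave"],
    "greet":     ["wave", "wave", "bow", "nod"],
}

for action, gestures in elicited.items():
    score = agreement(gestures)
    candidate = Counter(gestures).most_common(1)[0][0]
    print(f"{action}: agreement={score:.3f}, candidate={candidate}")
```

Actions with high agreement yield obvious candidates; low-agreement actions signal where the designers must arbitrate between competing proposals.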
© 2013 IFIP International Federation for Information Processing
Kistler, F., André, E. (2013). User-Defined Body Gestures for an Interactive Storytelling Scenario. In: Kotzé, P., Marsden, G., Lindgaard, G., Wesson, J., Winckler, M. (eds) Human-Computer Interaction – INTERACT 2013. INTERACT 2013. Lecture Notes in Computer Science, vol 8118. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-40480-1_17
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-40479-5
Online ISBN: 978-3-642-40480-1