Abstract
With the rise of mobile and pervasive computing, applications must adapt to their surrounding environments and present information from those environments to users in an easy and natural manner. In this paper we describe a user interface that integrates multimodal input on a handheld device with external gestures performed on real-world artifacts. The described approach extends reference resolution based on speech, handwriting, and gesture to real-world objects that users may hold in their hands. We discuss the varied interaction channels available to users that arise from mixing and matching input modalities on the mobile device with actions performed in the environment. We also discuss the underlying components required to handle these extended multimodal interactions and present an implementation of our ideas in a demonstrator called the Mobile ShopAssist. This demonstrator then serves as the basis for a usability study, described here, on user interaction within mobile contexts.
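The fusion of on-device ("intra") input with real-world ("extra") gestures described above can be illustrated with a minimal sketch. This is not the paper's implementation: the event types, the time window, and the modality preference order (a physical pick-up outranks an on-screen tap, which outranks a deictic utterance) are all assumptions made for illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class InputEvent:
    # Hypothetical modalities: "speech" and "handwriting" arrive via the
    # handheld; "intra_gesture" is an on-screen tap; "extra_gesture" is a
    # real-world action such as picking a product off an instrumented shelf.
    modality: str
    referent: Optional[str]  # object the event resolves to, if any
    timestamp: float         # seconds; used to group events into one turn

def resolve_reference(events: list[InputEvent], window: float = 2.0) -> Optional[str]:
    """Fuse events that fall inside a short time window and return the
    referent suggested by the most specific modality (assumed ordering)."""
    if not events:
        return None
    latest = max(e.timestamp for e in events)
    turn = [e for e in events if latest - e.timestamp <= window]
    # Assumed preference: physical pick-up > on-screen tap > speech/handwriting.
    priority = {"extra_gesture": 3, "intra_gesture": 2, "speech": 1, "handwriting": 1}
    turn.sort(key=lambda e: priority.get(e.modality, 0), reverse=True)
    for e in turn:
        if e.referent is not None:
            return e.referent
    return None

# Example turn: the user says "tell me about this" while picking up a camera.
events = [
    InputEvent("speech", None, 10.0),            # deictic utterance, no referent itself
    InputEvent("extra_gesture", "camera_A", 10.4),
]
print(resolve_reference(events))  # -> camera_A
```

In this sketch the deictic speech act carries no referent of its own, so the temporally co-occurring pick-up gesture supplies it, mirroring the kind of cross-modal reference resolution the paper discusses.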
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Wasinger, R., Krüger, A., Jacobs, O. (2005). Integrating Intra and Extra Gestures into a Mobile and Multimodal Shopping Assistant. In: Gellersen, H.W., Want, R., Schmidt, A. (eds) Pervasive Computing. Pervasive 2005. Lecture Notes in Computer Science, vol 3468. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11428572_18
Print ISBN: 978-3-540-26008-0
Online ISBN: 978-3-540-32034-0