Abstract
In this paper we propose a model of a robotic storyteller, focusing on its ability to select the most appropriate gestures to accompany a story and to express the emotions conveyed by the sentence being told. The robot is endowed with a repository of stories and with a set of gestures, inspired by those typically used by humans, which the robot learns by observation. The gestures are annotated by a number N of subjects according to their meaning and to a specific typology. The robot then exploits them, according to the story content, to deliver an engaging rendition of the tale.
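The selection step described above, matching a sentence of the story against the annotated meanings of the available gestures, can be sketched as follows. This is only an illustrative sketch, not the authors' implementation: the gesture repository, its labels, and the bag-of-words cosine similarity used here are all assumptions introduced for the example.

```python
import math
from collections import Counter

# Hypothetical annotated gesture repository: each gesture carries the
# meaning labels collected from the annotators (all names invented here).
GESTURES = {
    "wave": ["hello", "greeting", "goodbye"],
    "arms_open": ["joy", "welcome", "happy"],
    "head_down": ["sad", "sorrow", "shame"],
}

def cosine(tokens_a, tokens_b):
    """Cosine similarity between two bags of words."""
    ca, cb = Counter(tokens_a), Counter(tokens_b)
    num = sum(ca[w] * cb[w] for w in set(ca) & set(cb))
    den = (math.sqrt(sum(v * v for v in ca.values()))
           * math.sqrt(sum(v * v for v in cb.values())))
    return num / den if den else 0.0

def select_gesture(sentence):
    """Pick the gesture whose annotation labels best match the sentence."""
    tokens = sentence.lower().split()
    return max(GESTURES, key=lambda g: cosine(tokens, GESTURES[g]))

print(select_gesture("the princess was happy and full of joy"))  # arms_open
```

A real system would replace the word-overlap similarity with a semantic measure (e.g. latent semantic analysis) so that sentences sharing no literal words with the labels can still retrieve an appropriate gesture.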
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this paper
Augello, A., Infantino, I., Maniscalco, U., Pilato, G., Vella, F. (2019). NarRob: A Humanoid Social Storyteller with Emotional Expression Capabilities. In: Samsonovich, A. (eds) Biologically Inspired Cognitive Architectures 2018. BICA 2018. Advances in Intelligent Systems and Computing, vol 848. Springer, Cham. https://doi.org/10.1007/978-3-319-99316-4_2
DOI: https://doi.org/10.1007/978-3-319-99316-4_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-99315-7
Online ISBN: 978-3-319-99316-4
eBook Packages: Intelligent Technologies and Robotics