Abstract
This paper presents a neural approach for generating natural gesticulation movements for a humanoid robot, enriched with other relevant social signals driven by sentiment processing. In particular, we consider simple head postures, voice parameters, and eye colors as elements that enhance expressiveness. A Generative Adversarial Network (GAN) allows the proposed system to extend the variability of basic gesticulation movements while avoiding repetitive and monotonous behavior. By applying sentiment analysis to the text that the robot will pronounce, we derive an emotion valence value and coherently choose suitable parameters for the expressive elements. In this way, the robot adapts its expression generation while talking. Experiments validate the proposed approach by analyzing the contribution of each factor to the perceived naturalness of the robot's behavior.
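The pipeline the abstract describes (text → sentiment valence → expressive parameters) can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: it assumes a valence score already computed in [-1, 1] (e.g. by a VADER-style sentiment analyzer), and the specific color endpoints, pitch/rate multipliers, and head-pitch range are hypothetical placeholders.

```python
def expressive_parameters(valence: float) -> dict:
    """Map a sentiment valence in [-1, 1] to illustrative expressive
    parameters: eye LED color, voice pitch/rate scaling, head pitch.

    The mappings below are placeholders chosen for the sketch, not the
    values used in the paper.
    """
    v = max(-1.0, min(1.0, valence))   # clamp to the valid valence range
    t = (v + 1.0) / 2.0                # 0 = most negative, 1 = most positive

    # Eye color: linear blend from blue (negative) to yellow (positive).
    blue, yellow = (0, 0, 255), (255, 255, 0)
    eye_rgb = tuple(round(b + t * (y - b)) for b, y in zip(blue, yellow))

    return {
        "eye_rgb": eye_rgb,
        "pitch_scale": 1.0 + 0.2 * v,   # slightly higher pitch when positive
        "rate_scale": 1.0 + 0.15 * v,   # slightly faster speech when positive
        "head_pitch_deg": 10.0 * v,     # head up for positive, down for negative
    }
```

In a full system these parameters would be sent to the robot's LED, text-to-speech, and joint controllers alongside the GAN-generated gesture trajectory, so that all expressive channels reflect the same valence estimate.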
Copyright information
© 2019 Springer Nature Switzerland AG
About this paper
Cite this paper
Rodriguez, I., Manfré, A., Vella, F., Infantino, I., Lazkano, E. (2019). Talking with Sentiment: Adaptive Expression Generation Behavior for Social Robots. In: Fuentetaja Pizán, R., García Olaya, Á., Sesmero Lorente, M., Iglesias Martínez, J., Ledezma Espino, A. (eds) Advances in Physical Agents. WAF 2018. Advances in Intelligent Systems and Computing, vol 855. Springer, Cham. https://doi.org/10.1007/978-3-319-99885-5_15
DOI: https://doi.org/10.1007/978-3-319-99885-5_15
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-99884-8
Online ISBN: 978-3-319-99885-5
eBook Packages: Intelligent Technologies and Robotics (R0)