Abstract
Human emotion is an essential channel of nonverbal communication. Granting machines the ability to perceive it would significantly improve our communication with technology, making interaction with machines feel more natural. Software systems should therefore be able to adapt to such nonverbal cues. Our research focuses on incorporating human emotions into co-adaptive software systems, and specifically on how emotionally aware systems should react to human emotions. One of the many application areas for this promising technology is affective robotics. In this paper, we propose a framework for a co-adaptive emotional support robot. The framework adopts facial expression recognition as its main method of detecting emotions. This human-centric framework also places strong emphasis on personalizing the user experience: we adopt a personalized emotion recognition approach, since not all humans express emotions in the same way, and we personalize the system's adaptive reactions through a reinforcement learning approach in which the system assesses its own actions.
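The self-assessed adaptive reactions described above can be sketched as a small reinforcement learning loop: the detected emotion is the state, the robot's support behaviors are the actions, and the reward comes from the system's own assessment of whether the user's emotional state improved after acting. The sketch below is illustrative only; the emotion labels, action names, valence ordering, and tabular Q-learning scheme are assumptions, not details from the paper.

```python
import random
from collections import defaultdict

# Hypothetical emotion states and robot support actions (not from the paper).
EMOTIONS = ["angry", "sad", "neutral", "happy"]
ACTIONS = ["play_music", "tell_joke", "offer_silence", "speak_encouragement"]

class AdaptiveResponder:
    """Tabular Q-learning over (emotion, action) pairs, personalized per user."""

    def __init__(self, alpha=0.3, epsilon=0.1):
        self.q = defaultdict(float)   # Q[(emotion, action)] -> estimated value
        self.alpha = alpha            # learning rate
        self.epsilon = epsilon        # exploration probability

    def choose(self, emotion):
        # Epsilon-greedy: mostly exploit the best-known action for this user,
        # occasionally explore an alternative reaction.
        if random.random() < self.epsilon:
            return random.choice(ACTIONS)
        return max(ACTIONS, key=lambda a: self.q[(emotion, a)])

    def assess(self, emotion, action, before, after):
        # Self-assessment reward: +1 if the user's emotion improved after the
        # action, -1 if it worsened, 0 otherwise (valence order is assumed).
        valence = {"angry": 0, "sad": 1, "neutral": 2, "happy": 3}
        reward = (valence[after] > valence[before]) - (valence[after] < valence[before])
        # Incremental update pulls the estimate toward the observed reward.
        self.q[(emotion, action)] += self.alpha * (reward - self.q[(emotion, action)])

responder = AdaptiveResponder(epsilon=0.0)
responder.assess("sad", "play_music", before="sad", after="happy")
responder.assess("sad", "tell_joke", before="sad", after="sad")
print(responder.choose("sad"))  # favors "play_music" after positive feedback
```

Because each user keeps their own Q-table, the same detected emotion can come to trigger different reactions for different people, which matches the framework's emphasis on personalization.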
Acknowledgment
This work was supported by the Deanship of Scientific Research, Vice Presidency for Graduate Studies and Scientific Research, King Faisal University, Saudi Arabia [Project No. GRANT669].
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Al-Omair, O.M., Huang, S. (2023). An Emotional Support Robot Framework Using Emotion Recognition as Nonverbal Communication for Human-Robot Co-adaptation. In: Arai, K. (eds) Proceedings of the Future Technologies Conference (FTC) 2022, Volume 3. FTC 2022. Lecture Notes in Networks and Systems, vol 561. Springer, Cham. https://doi.org/10.1007/978-3-031-18344-7_30
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-18343-0
Online ISBN: 978-3-031-18344-7
eBook Packages: Intelligent Technologies and Robotics