
Multimodal Emotion Recognition System Using Machine Learning and Psychological Signals: A Review

  • Conference paper
Soft Computing: Theories and Applications

Part of the book series: Advances in Intelligent Systems and Computing ((AISC,volume 1380))

Abstract

In recent years, the study of emotion has gained momentum with the growth of human–machine interaction, since recognizing emotions helps interpret human actions and improves the relationship between humans and machines, enabling software that understands human states and acts accordingly. This paper presents a preliminary review of emotion recognition using various psychological signals. Researchers have investigated parameters including facial expression, eye gaze, pupil size variation, eye movements, and EEG, together with deep learning techniques, to extract emotional features of humans, and several of the proposed detection methods have achieved reliable accuracy. A thorough analysis shows that the best accuracy reported for detecting an individual emotion was 90%; however, that experiment does not classify which specific emotion is present. For classifying the specific emotion, the best reported accuracy was 79.63%, which is a comparable result.
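The review itself reports no implementation, but the multimodal pipeline it surveys, extracting features from each signal, fusing them, and training a classifier, can be sketched briefly. The snippet below is a minimal, hypothetical illustration in Python using scikit-learn and synthetic data; the feature dimensions, modality names, and class labels are assumptions for illustration, not taken from the paper or the studies it reviews.

```python
# Minimal sketch (assumed, not the authors' method): feature-level fusion of
# EEG features and eye-tracking features for multi-class emotion classification.
# Synthetic data stands in for real recordings; a real study would extract
# features such as band power per EEG channel and pupil-size statistics.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials = 200

# Hypothetical features: 62 EEG channels x 5 frequency bands, plus 4 eye metrics
eeg_features = rng.normal(size=(n_trials, 62 * 5))
eye_features = rng.normal(size=(n_trials, 4))   # e.g. pupil size, fixation count
labels = rng.integers(0, 3, size=n_trials)      # 3 emotion classes (illustrative)

# Feature-level (early) fusion: concatenate the two modalities per trial
fused = np.hstack([eeg_features, eye_features])

X_train, X_test, y_train, y_test = train_test_split(
    fused, labels, test_size=0.25, random_state=0, stratify=labels
)

# Standardize fused features, then train a multi-class SVM classifier
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

With random synthetic data the accuracy hovers near chance; the point is only to show the fusion-then-classify structure that most of the surveyed multimodal systems share.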

Author information

Corresponding author

Correspondence to Rishu.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Rishu, Singh, J., Gill, R. (2022). Multimodal Emotion Recognition System Using Machine Learning and Psychological Signals: A Review. In: Sharma, T.K., Ahn, C.W., Verma, O.P., Panigrahi, B.K. (eds) Soft Computing: Theories and Applications. Advances in Intelligent Systems and Computing, vol 1380. Springer, Singapore. https://doi.org/10.1007/978-981-16-1740-9_54
