A Deep Learning Approach for Hand Gestures Recognition

  • Conference paper
  • In: Advances in Machine Intelligence and Computer Science Applications (ICMICSA 2022)

Abstract

Hand gestures are part of the communication tools that allow people to express their ideas and feelings. These gestures can be used not only to ensure communication between people but also to replace traditional devices in human-computer interaction (HCI). This leads us to apply this technology to the e-learning domain. The COVID-19 pandemic has attested to the importance of e-learning. However, practical activities (PA), an important part of the learning process, are absent from the majority of e-learning platforms. Therefore, this paper proposes a convolutional neural network (CNN) method to detect hand gestures so that the user can control and manipulate the virtual objects in the PA environment using a simple camera. To achieve this goal, two datasets were merged. In addition, a skin model and background subtraction were applied to obtain well-prepared training and testing datasets for the CNN. Experimental evaluation shows an accuracy rate of 97.2%.
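The paper itself does not publish code; the following is a minimal sketch of the pipeline the abstract describes (skin-colour masking plus background subtraction to isolate the hand, followed by a small CNN classifier). It assumes OpenCV and TensorFlow/Keras; the YCrCb thresholds, 64x64 input size, number of gesture classes, and the helper names preprocess and build_cnn are illustrative assumptions, not the authors' implementation.

# Sketch (not the authors' code): skin mask + background subtraction, then a small CNN.
import cv2
import numpy as np
import tensorflow as tf

# Commonly used YCrCb skin-colour bounds; tune for your own data.
SKIN_LOW = np.array([0, 133, 77], dtype=np.uint8)
SKIN_HIGH = np.array([255, 173, 127], dtype=np.uint8)

bg_subtractor = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)

def preprocess(frame_bgr):
    """Return a normalised 64x64 grayscale hand region."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    skin_mask = cv2.inRange(ycrcb, SKIN_LOW, SKIN_HIGH)     # skin-colour model
    fg_mask = bg_subtractor.apply(frame_bgr)                # background subtraction
    mask = cv2.bitwise_and(skin_mask, fg_mask)              # keep moving skin pixels
    hand = cv2.bitwise_and(frame_bgr, frame_bgr, mask=mask)
    gray = cv2.cvtColor(hand, cv2.COLOR_BGR2GRAY)
    return cv2.resize(gray, (64, 64)).astype("float32") / 255.0

def build_cnn(num_classes=10):
    """A small CNN classifier; layer sizes are illustrative, not the paper's."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(64, 64, 1)),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])

model = build_cnn()
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(X_train[..., np.newaxis], y_train, epochs=..., validation_data=...)

Frames captured from a simple camera would be passed through preprocess and stacked into the training and testing arrays before fitting the model.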




Author information

Corresponding author

Correspondence to Fatima Zohra Ennaji.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Ennaji, F.Z., El Kabtane, H. (2023). A Deep Learning Approach for Hand Gestures Recognition. In: Aboutabit, N., Lazaar, M., Hafidi, I. (eds) Advances in Machine Intelligence and Computer Science Applications. ICMICSA 2022. Lecture Notes in Networks and Systems, vol 656. Springer, Cham. https://doi.org/10.1007/978-3-031-29313-9_11
