Abstract
Hand gestures are a communication tool that allows people to express their ideas and feelings. Beyond person-to-person communication, gestures can also replace traditional input devices in human-computer interaction (HCI), which motivates their use in the e-learning domain. The COVID-19 pandemic attested to the importance of e-learning; however, Practical Activities (PA), an important part of the learning process, are absent from the majority of e-learning platforms. This paper therefore proposes a convolutional neural network (CNN) method to detect hand gestures so that the user can control and manipulate virtual objects in the PA environment using a simple camera. To achieve this goal, two datasets were merged, and a skin-color model and background subtraction were applied to obtain well-prepared training and testing datasets for the CNN. Experimental evaluation shows an accuracy of 97.2%.
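The preprocessing pipeline mentioned in the abstract (a skin-color model combined with background subtraction to isolate the hand before CNN classification) can be sketched as below. This is a minimal illustration, not the paper's implementation: the YCbCr skin thresholds and the frame-difference threshold are common literature values, and all function names are hypothetical.

```python
import numpy as np

def rgb_to_ycbcr(img):
    """Convert an RGB uint8 image (H, W, 3) to YCbCr (ITU-R BT.601 coefficients)."""
    img = img.astype(np.float32)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

def skin_mask(img, cb_range=(77, 127), cr_range=(133, 173)):
    """Binary skin mask from YCbCr chrominance thresholds.
    The ranges are widely used defaults, not the paper's parameters."""
    ycbcr = rgb_to_ycbcr(img)
    cb, cr = ycbcr[..., 1], ycbcr[..., 2]
    return ((cb >= cb_range[0]) & (cb <= cb_range[1]) &
            (cr >= cr_range[0]) & (cr <= cr_range[1]))

def background_subtract(frame, background, thresh=30):
    """Foreground mask: pixels whose color differs from a static
    background frame by more than `thresh` in any channel."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff.max(axis=-1) > thresh

def hand_region(frame, background):
    """Combine both cues: a pixel is kept only if it is skin-colored
    AND not part of the static background."""
    return skin_mask(frame) & background_subtract(frame, background)
```

The resulting binary mask would then be used to crop or zero out non-hand pixels before the image is fed to the CNN for gesture classification.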
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Ennaji, F.Z., El Kabtane, H. (2023). A Deep Learning Approach for Hand Gestures Recognition. In: Aboutabit, N., Lazaar, M., Hafidi, I. (eds) Advances in Machine Intelligence and Computer Science Applications. ICMICSA 2022. Lecture Notes in Networks and Systems, vol 656. Springer, Cham. https://doi.org/10.1007/978-3-031-29313-9_11
DOI: https://doi.org/10.1007/978-3-031-29313-9_11
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-28845-6
Online ISBN: 978-3-031-29313-9
eBook Packages: Intelligent Technologies and Robotics (R0)