Abstract
Hand tracking is relevant to a wide variety of applications, including human-robot interaction (HRI), human-computer interaction (HCI), virtual reality (VR), and augmented reality (AR). Accurate and robust hand tracking, however, is challenging due to the intricacies of dynamic motion within a small space, the complex interactions with nearby objects, and the difficulty of real-time hand mesh reconstruction. In this paper, we conduct a comprehensive examination and analysis of existing hand tracking technologies. Through a review of major works in the literature, we find that studies employ a diverse array of sensors, which we categorize into seven types: vision, soft wearable, encoder, magnetic, inertial measurement unit (IMU), electromyography (EMG), and the fusion of sensor modalities. Our findings indicate that no single solution surpasses all others, owing to the inherent limitations of any single sensor modality. As a result, we assert that integrating multiple sensor modalities presents a viable path toward a superior hand tracking solution. Ultimately, this survey aims to bolster interdisciplinary research efforts across the spectrum of hand tracking technologies, thereby contributing to the advancement of the field.
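To make the fusion claim concrete, the following is a minimal illustrative sketch (not drawn from the paper itself): per-joint angle estimates from two hypothetical modalities are combined by confidence-weighted averaging, with a vision tracker that can drop out under occlusion and an IMU glove that never occludes. All names and weights here are assumptions for illustration only.

```python
def fuse_joint_angles(vision, imu, vision_conf=0.8, imu_conf=0.5):
    """Fuse two lists of joint angles (degrees) from two modalities.

    Vision entries may be None when a joint is occluded from the camera,
    in which case the IMU estimate is used alone; otherwise the two
    estimates are averaged, weighted by fixed per-modality confidences.
    """
    fused = []
    for v, m in zip(vision, imu):
        if v is None:
            # Occlusion: vision provides no estimate, fall back to IMU.
            fused.append(m)
        else:
            # Both available: confidence-weighted average.
            fused.append((vision_conf * v + imu_conf * m)
                         / (vision_conf + imu_conf))
    return fused

# Example: three joint angles of one finger; the second joint is
# occluded in the camera view, so only the IMU value survives there.
print(fuse_joint_angles([30.0, None, 12.0], [28.0, 45.0, 10.0]))
```

Real fusion systems in the surveyed literature are of course far more sophisticated (e.g., filtering over a kinematic hand model), but even this toy scheme shows why a second modality mitigates the occlusion failure mode of vision alone.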
Ethics declarations
Dongjun Lee is a Senior Editor of the International Journal of Control, Automation, and Systems. Senior Editor status has no bearing on editorial consideration. The authors declare that there is no competing financial interest or personal relationship that could have appeared to influence the work reported in this paper.
Additional information
Publisher’s Note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
This work was supported by the National Research Foundation of Korea (NRF) Grant funded by the Korean Government (MSIT) (RS-2023-00208052).
Jinuk Heo received his B.S. degree in mechanical engineering from Seoul National University, Seoul, Korea in 2019, where he is currently working toward a Ph.D. degree in mechanical engineering. His research interests include hand tracking, human-robot interaction, and interactive simulation.
Hyelim Choi received her B.S. degree in mechanical engineering from Seoul National University, Seoul, Korea in 2020, where she is currently working toward a Ph.D. degree in mechanical engineering. Her research interests include hand tracking, visual perception, and sensor fusion.
Yongseok Lee received his B.S. degree in mechanical and aerospace engineering and a Ph.D. degree in mechanical engineering from Seoul National University, Seoul, Korea in 2021. He is currently a Staff Engineer with Samsung Research, Seoul, Korea. His research interests include hand tracking, human-machine interaction, and generative artificial intelligence.
Hyunsu Kim received his B.S. degree in mechanical engineering and a B.A. degree in psychology from Sungkyunkwan University, Seoul, Korea in 2021. He is currently working toward a Ph.D. degree in mechanical engineering at Seoul National University, Seoul, Korea. His research interests include the design of haptic devices and haptic simulation.
Harim Ji received his B.S. degree in mechanical engineering from Seoul National University, Seoul, Korea in 2023, where he is currently working toward a Ph.D. degree in mechanical engineering. His research interests include hand tracking, computer graphics, and interactive simulation.
Hyunreal Park received his B.S. degree in mechanical engineering from Seoul National University, Seoul, Korea in 2023, where he is currently working toward a Ph.D. degree in mechanical engineering. His research interests include hand tracking and haptics.
Youngseon Lee received her B.S. degree in energy resources engineering from Seoul National University, Seoul, Korea in 2021, where she is currently working toward a Ph.D. degree in mechanical engineering. Her research interests include dexterous manipulation, control of robotic hands, and haptics.
Cheongkee Jung received his B.S. degree in civil engineering from Korea Military Academy, Seoul, Korea, in 2017. He is currently working toward an M.S. degree in mechanical engineering at Seoul National University, Seoul, Korea. His research interests include human-robot interaction, teleoperation, control of swarm robots, and interactive simulation.
Hai-Nguyen Nguyen obtained his B.Eng. degree in mechatronics and an M.Sc. degree in engineering mechanics from the Hanoi University of Science and Technology, Hanoi, Vietnam, in 2008 and 2011, respectively. He received a Ph.D. degree in mechanical and aerospace engineering from Seoul National University, Seoul, Korea, in 2018. He is currently a Senior Researcher with the Department of Mechanical Engineering, Seoul National University, Seoul, Korea. His research interests include the dynamics, control, and planning of mechatronic and robotic systems, with a special emphasis on aerial robotics.
Dongjun Lee received his B.S. degree in mechanical engineering and an M.S. degree in automation and design from the Korea Advanced Institute of Science and Technology, Daejeon, Korea, and a Ph.D. degree in mechanical engineering from the University of Minnesota at Twin Cities, Minneapolis, MN, USA, in 2004. He is currently a Professor with the Department of Mechanical Engineering, Seoul National University, Seoul, Korea. His research interests include the dynamics and control of robotic and mechatronic systems with emphasis on aerial/mobile robots, teleoperation/haptics, physics simulation, multirobot systems, and industrial control applications.
Cite this article
Heo, J., Choi, H., Lee, Y. et al. Hand Tracking: Survey. Int. J. Control Autom. Syst. 22, 1761–1778 (2024). https://doi.org/10.1007/s12555-024-0298-1