Abstract
The paper deals with an experimental assessment of the Leap Motion Controller\(^{\textregistered }\). This device is able to track the user’s hands in a real environment. Due to its low invasiveness and ease of use, it is promising for integration into virtual or augmented reality, research and entertainment scenarios. The assessment is performed in a real context with volunteers who were asked to point with their fingertips at a set of predefined locations in space. A specific test rig has been designed and built, comprising a transparent plate supported by adjustable pillars and mounted over the Leap. The data are processed to assess the errors in tracking the five fingertips of the right hand. Results show that the accuracy and precision of the Leap are suitable for robust tracking of the user’s hand. The results also unveil that there are preferable zones in which the tracking performance is better.
1 Introduction
The Leap Motion Controller is a hand-tracking device produced by an American start-up company and released in 2013. It is a small box (about 12.7 \(\times \) 12.7 \(\times \) 5.1 cm) which can be connected via USB to a desktop computer. The device includes three infrared LEDs and two cameras, and its tracking principle is stereoscopy (the light reflected from the LEDs is seen from two different points of view). Thanks to the proprietary SDK libraries, the user gains access to the tracking information of both hands in the space above the box, in a height range of 15–60 cm. The library functions are able to recognize both hands and retrieve information about the location and pose of each bony segment.
Although quite new, the Leap Motion Controller has raised great interest for its promising capabilities of integration into virtual environments and its affordable cost. The tracking of the human hand is at the base of several implementations involving human–computer interfaces and natural gesture recognition [1]. This need is amplified in virtual and augmented reality implementations, in which new methodologies for exploring and managing the scene are pursued in order to increase realism and interactivity [2, 3], with the purpose of developing applications for supporting design activities [4, 5] and physical simulations [6, 7]. According to this objective, during the last years researchers have begun to develop the so-called “natural interface” concept [8]. The purpose of natural interaction is to achieve communication between the user and the computer without invasive sensors to be worn. In this way, the user is more at ease and can communicate his intent using simple gestures and movements, and the development of interactive environments has been boosted [9, 10].
Recent studies show the increasing interest of the scientific community in the wide range of possible implementations based on the Leap controller. As examples, Katahira and Soga [11] proposed an integration of the Leap Motion Controller into an augmented reality environment for implementing realistic display and gripping. The possibilities of integrating the Leap controller and augmented reality are also witnessed in [12]. In 2015, another research team [13] described the integration of the controller into a virtual reality environment using the Oculus head-mounted display. Their work focused on the interaction between the user and a CAD model by picking and dragging components. The combination of hand tracking and shape rendering has been proposed in [14]. That study aimed to discuss and implement a system for the physical rendering of virtual shapes. Specific applications have also been developed in different scientific fields such as medicine [15, 16], language learning [17, 18], forensics [19] and cultural heritage [20].
One of the key points for any tracking device is its accuracy and precision [21]. Some initial papers demonstrated the effort in assessing these performances also for the Leap Motion Controller. Since the device is able to track the free hands (or hands grabbing a tool) in space, such an assessment is not an easy task. On the one hand, there is the need to limit the presence of external objects in the environment in order to make the recognition of the segments easier. On the other hand, there is the need to perform the tests reaching a specific and known position and/or pose for comparison. For this purpose, in 2013 Weichert and his co-authors [22] assessed the robustness and accuracy of the device by using a Kuka robot holding a tool. Although their assessment was very precise and systematic, it was limited to the tracking of a rigid tool rather than the recognition and tracking of a human hand. One year later, Guna and his co-workers [23] studied the precision and reliability of the Leap by using a plastic hand model which can be fixed in space. Also in this case, the assessment is accurate, but it is limited to a surrogate hand with fixed shape and pose. In 2014 and 2015, three other studies [24–26] compared the pointing capabilities of the standard mouse and the Leap. These studies focused only on simple interactions (pointing and/or manipulation) with pictorial shapes projected on a monitor, and no three-dimensional quantitative evaluations were included.
All these studies (except the last two) about the assessment of the Leap controller’s accuracy consider the device as a standard measurement tool. Actually, it is conceived as an interface between the user and the computer, so its main requirement is to capture and interpret the pose of the hand and fingers that the user wants to communicate. For this reason, it seems more appropriate to measure the accuracy with a different, experimental procedure that includes the contribution of a real user.
The studies in the literature testify to an increasing interest in developing applications and systems using a compact and low-cost hand-tracking device, and a scientific curiosity about its actual performances in order to enlarge its potential. Fuelled by this motivation, the current study aims to perform a direct assessment of the accuracy of the Leap controller in fingertip tracking in a real context, including the user’s contribution. This means that the study is performed using real hands in a real environment, taking into account the variability of a group of male and female subjects. For this purpose, a specific experimental setup has been designed and built, and the experimental tests have involved a group of volunteers. To the authors’ best knowledge, this specific assessment is original.
The paper is organized as follows. In the first part, the experimental setup and the testing procedure are described; in the second part, the results of the tests are presented and discussed using summarizing graphs and tables; in the third part, the conclusions are drawn.
2 Experimental setup and preliminary consideration for assessing the accuracy
The experimental procedure is conceived with the idea of testing the Leap controller while it is tracking the real hand of a human user in a real context. This is not a simple task because, on the one hand, we need to perform the assessment considering precise (and repeatable) positions in space and, on the other hand, we have to disturb the real context in a very limited way. The presence of external objects may alter the recognition or produce occlusions, undermining the efficiency of the device. For this reason, we designed a specific test rig.
With reference to Fig. 1, the rig comprises a transparent Plexiglas plate (4 mm thick) mounted on four adjustable supports (pillars in white and red). The height of the supports can be adjusted with four screws, and they behave in a telescopic way, varying the distance between the plate and the base. A matrix of blue points (circles 2 mm in diameter) is printed on one side of the plate. The points are spaced 30 mm apart in both the x and y directions (11 points along the x direction, 9 along the y direction). The Leap controller is placed on the base, under the middle of the plate. The reference systems of the plate (with the superscript “p”) and the Leap (with the superscript “l”) are depicted in Fig. 1 and are used for the subsequent evaluations.
With this arrangement, the user can freely move the hands in the space above the plate without obstacles and can touch the plate with the fingertips at precise and repeatable locations.
The exact position of the points with respect to the Leap controller has been measured using the Microscribe G2X digitizer (Fig. 2, on the left). According to the manufacturer’s specifications, its probe is able to acquire positions in space with an accuracy of 0.23 mm [27].
The collimation between the Leap controller and the matrix of points is necessary to take into account small positioning or misalignment errors, in order to fully map the position of the points \(\mathbf{P}_i \) in the Leap controller’s acquisition frame, \(\mathbf{P}_i =\left\{ x_i \; y_i \; z_i \right\} _l^T \).
With reference to Figs. 1 and 2, there are three reference systems: that of the Leap, that of the digitizer and that of the plate on which the mesh is printed. The coordinate transformation of a generic point \(\mathbf{P}_i \) can be computed as:

\(\left\{ \mathbf{P}_i \right\} _l =\left[ T \right] _d^l \left[ T \right] _p^d \left\{ \mathbf{P}_i \right\} _p \)    (1)

where the subscript l refers to the Leap reference system, d to the digitizer reference system and p to the plate reference system. The matrices \(\left[ T \right] _j^k \) are the 4 by 4 transformation matrices from reference system j to reference system k. The use of 4 by 4 transformation matrices allows both rotation and translation components to be taken into account.
By using Eq. (1), it is then possible to compute the absolute coordinates of a generic point \(\mathbf{P}_i \) tracked by the Leap controller in its own reference system. Another important assessment, before proceeding to the tests, concerns the effect of refraction on the measurement. The Leap controller works by optical triangulation. In common applications, it is expected that there is only air between the Leap and the hands. The plate, however, introduces a discontinuity in the optical path that may affect the measurement. For this reason, the contribution of refraction has been assessed as a preliminary activity, in order to compute the correction to be applied to the measured coordinates.
With reference to Fig. 3 (depicted as a two-dimensional drawing for simplicity, although the computation remains valid in three dimensions), when a real point \(\mathbf{P}\) is triangulated from two points of view (\(\mathbf{O}_\mathbf{1} \) and \(\mathbf{O}_\mathbf{2} \)), it appears at a different location (the apparent location \(\mathbf{P}_\mathbf{a} \)) because of the bending of the rays as they pass through different materials. In order to correct the measurement, it is necessary to compute the distance between \(\mathbf{P}\) and \(\mathbf{P}_\mathbf{a} \). This computation can be accomplished with the following geometrical considerations.
We assume the location of the triangulation points \(\mathbf{O}_\mathbf{1} \) and \(\mathbf{O}_\mathbf{2} \) to be known, since they depend on the device embodiment (they can be retrieved from the manufacturer’s specifications). We also know the location of the real point \(\mathbf{P}\) (since it is measured before the test begins), the distance H between the triangulation points and the bottom surface of the plate, and the plate thickness T.
In order to compute the apparent location \(\mathbf{P}_\mathbf{a} \), we have to impose the well-known condition stated by Snell’s refraction law: “the ratio of the sines of the angles of incidence (\(\alpha \) and \(\beta \)) and refraction (\(\alpha ^{\prime }\) and \(\beta ^{\prime }\)) is equivalent to the reciprocal of the ratio of the indices of refraction”. In terms of vectors, it can be written, considering the cross products, as:

\(\dfrac{\left\| \left( \mathbf{O}_\mathbf{1} -\mathbf{A} \right) \times \mathbf{n} \right\| }{\left\| \mathbf{O}_\mathbf{1} -\mathbf{A} \right\| } = IR \, \dfrac{\left\| \left( \mathbf{P}-\mathbf{A} \right) \times \mathbf{n} \right\| }{\left\| \mathbf{P}-\mathbf{A} \right\| }\)    (2)

\(\dfrac{\left\| \left( \mathbf{O}_\mathbf{2} -\mathbf{B} \right) \times \mathbf{n} \right\| }{\left\| \mathbf{O}_\mathbf{2} -\mathbf{B} \right\| } = IR \, \dfrac{\left\| \left( \mathbf{P}-\mathbf{B} \right) \times \mathbf{n} \right\| }{\left\| \mathbf{P}-\mathbf{B} \right\| }\)    (3)
where \(\mathbf{n}\) is the normal vector to the plate surface (supposed constant due to planarity) and IR is the index of refraction of Plexiglas with respect to the air (assumed 1.49).
Formulas (2) and (3) are two scalar equations with six scalar unknowns (the spatial coordinates of points \(\mathbf{A}\) and \(\mathbf{B}\)). In order to make the system solvable, we need to add four more scalar equations. The first two involve the y coordinate of both points \(\mathbf{A}\) and \(\mathbf{B}\), constraining the two points to lie on the bottom surface of the plate:

\(y_A =H\)    (4)

\(y_B =H\)    (5)
The other two equations express the condition that the two groups of points \((\mathbf{A}-\mathbf{P}-\mathbf{O}_\mathbf{1} -\mathbf{O}_\mathbf{2} )\) and \((\mathbf{B}-\mathbf{P}-\mathbf{O}_\mathbf{1} -\mathbf{O}_\mathbf{2} )\) lie on two planes. These conditions can be written as:

\(\left( \mathbf{A}-\mathbf{P} \right) \cdot \left[ \left( \mathbf{O}_\mathbf{1} -\mathbf{P} \right) \times \left( \mathbf{O}_\mathbf{2} -\mathbf{P} \right) \right] =0\)    (6)

\(\left( \mathbf{B}-\mathbf{P} \right) \cdot \left[ \left( \mathbf{O}_\mathbf{1} -\mathbf{P} \right) \times \left( \mathbf{O}_\mathbf{2} -\mathbf{P} \right) \right] =0\)    (7)
The system of Eqs. (2)–(7) can then be solved for \(\mathbf{A}\) and \(\mathbf{B}\) (six equations in six unknowns).
The location of the apparent point \(\mathbf{P}_\mathbf{a} \) can be found as the intersection of two lines \(\mathbf{AO}_\mathbf{1} \) and \(\mathbf{BO}_\mathbf{2} \).
As a final step, the correction \(\mathbf{C}_\mathbf{P} \) to be applied to the measurement coming from the device can be computed as:

\(\mathbf{C}_\mathbf{P} =\mathbf{P}-\mathbf{P}_\mathbf{a} \)    (8)
It is important to underline that the correction \(\mathbf{C}_\mathbf{P} \) is a spatial vector, so it modifies the x, y and z coordinates of the measured point. Moreover, the correction varies from point to point, depending on the distance from the observation points and the attitude with respect to the segment \(\mathbf{O}_\mathbf{1} \mathbf{O}_\mathbf{2} \). For the evaluation of the corrections, the system of equations has been solved numerically.
Concerning the experimental procedure (see Fig. 2, on the right), a group of ten users (5 male and 5 female) was involved in the testing. Every subject was asked to touch each point on the plate in sequence with the five fingertips of the right hand. The tests were repeated five times (50 overall assessments for each finger). The entire test was then repeated varying the height between the plate and the Leap among 200, 400 and 600 mm. The data were then collected and analysed statistically, by computing mean values and standard deviations. Appendix 1 reports the main part of the C\(++\) code used for retrieving the location of the fingertips from the Leap Motion Controller. The code makes use of the free Leap Motion SDK [28].
3 Results and discussion
Figures 4, 5 and 6 show the results of the accuracy study in a graphical way. In all the figures, the green dots represent the physical points printed on the Plexiglas plate. The blue points are the mean values of the 50 assessments and the blue circles are the zones delimited by the 3\(\sigma \) deviation.
All the tracking acquisitions show robustness, and the user’s hand is always recognized. Considering that none of the subjects involved in the tests had previous experience with the Leap, no training or repetition of mistaken tests was necessary. No relevant differences were observed between male and female tracked subjects.
As an overall consideration, the tracking is rather accurate and the errors of the mean values are bounded within 4–5 mm, which is acceptable for finger-tracking purposes and for this device typology. The precision is acceptable as well, as testified by the limited amplitude of the standard deviations, which is of the same order as the error amplitude. There is no systematic error in tracking, and the refraction is properly corrected. Closer assessments (Plexiglas plate at 200 mm) are slightly more accurate than those at 400 and 600 mm. The index fingertip is acquired with higher accuracy than the other fingers; the thumb shows the worst accuracy.
In general, there is no evident accuracy difference between points near to and far from the Leap centre. On the other hand, it is interesting to note that, if we decompose the testing region into four quadrants, the portion of points with positive x and negative y coordinates (the quadrant at the bottom right) shows greater tracking errors, especially for the farthest points. This is probably due to the fact that, when pointing to such locations, the right hand is partially out of (or at the limit of) the field of view of the Leap, so its recognition is less accurate. A similar (and opposite) behaviour occurs when the points of the second quadrant (negative x and y coordinates) are touched with the left-hand fingertips.
Table 1 summarizes the results of the investigations in a numerical aggregate way, distinguishing between the four quadrants. Results are presented for the three different distances from the device.
4 Conclusion
An original experimental procedure for assessing the accuracy of the Leap Motion Controller in the tracking of fingertips is presented. The methodology makes use of a specific test rig, designed and built for the purpose of the test. The assessment is performed in the field, using human subjects in a real context. The accuracy of all five fingers of the right hand is assessed, considering pointing at a transparent grid of reference points placed at three different distances from the device.
Results show that the tracking is robust and stable, and errors in fingertip estimation are bounded within 4–5 mm. This performance is very suitable for the use of the device in interactive virtual applications such as virtual and augmented reality, virtual object manipulation and virtual prototyping. Although no systematic tracking error is experienced, it is found that, when tracking the right hand, pointing at the lower right quadrant gives the worst results. Vertical distances from 200 to 600 mm from the device do not produce relevant differences in tracking, although closer distances allow slightly more accurate tracking. No differences between male and female tracked subjects have been experienced in the tests. These results encourage the use of the Leap Motion device for implementing interactive applications also in the field of geometrical modelling, design and manufacturing, and for integration with both virtual and augmented reality methodologies.
References
Mitra, S., Acharya, T.: Gesture recognition: a survey. IEEE Trans. Syst. Man Cybern. Part C Appl. Rev 37, 311–324 (2007)
Merienne, F.: Human factors consideration in the interaction process with virtual environment. Int. J. Interact. Des. Manuf. 4(2), 83–86 (2010)
Bowman, D.A., Kruijff, E., LaViola, J., Poupyrev, I.: 3D User Interfaces: Theory and Practice. Addison-Wesley, New York (2005)
Valentini, P.P.: Interactive virtual assembling in augmented reality. Int. J. Interact. Des. Manuf. 3, 109–119 (2009)
Valentini, P.P.: Interactive cable harnessing in augmented reality. Int. J. Interact. Des. Manuf. 5, 45–53 (2011)
Valentini, P.P., Pezzuti, E.: Design and interactive simulation of cross-axis compliant pivot using dynamic splines. Int. J. Interact. Des. Manuf. 7(4), 261–269 (2013)
Mariti, L., Valentini, P.P.: Efficiency and precise interaction for multibody simulations in augmented reality. Multibody Dyn. Comput. Methods Appl. Ser. Comput. Methods Appl. Sci. (Springer) 28, 173–192 (2013)
Valentini, P.P.: Natural interface in augmented reality interactive simulations. Virtual Phys. Prototyp. 7, 137–151 (2012)
Valentini, P.P.: Human Factors in Augmented Reality Environments. Enhancing user role in augmented reality interactive simulations. Springer, New York (2013)
Kim, M., Lee, J.Y.: Touch and hand gesture-based interactions for directly manipulating 3D virtual objects in mobile augmented reality. Multimed. Tools Appl. (in press) (2016)
Katahira, R., Soga, M.: Development and evaluation of a system for AR enabling realistic display of gripping motions using Leap Motion Controller. Proc. Comput. Sci. 60, 1595–1603 (2015)
Regenbrecht, H., Collins, J., Hoermann, S.: A leap-supported, hybrid AR interface approach. In: Proceedings of the 25th Australian Computer–Human Interaction Conference: Augmentation, Application, Innovation, Collaboration, Adelaide, 25–29 November 2013, pp. 281–284 (2013)
Beattie, N., Horan, B., McKenzie, S.: Taking the LEAP with the Oculus HMD and CAD—plucking at thin air? Proc. Technol. 20, 149–154 (2015)
Covarrubias, M., Bordegoni, M., Cugini, U.: A hand gestural interaction system for handling a desktop haptic strip for shape rendering. Sens. Actuators A 233, 500–511 (2015)
Oropesa, I., de Jong, T.L., Sánchez-González, P., Dankelman, J., Gómez, E.J.: Feasibility of tracking laparoscopic instruments in a box trainer using a Leap Motion Controller. Measurement 80, 115–124 (2016)
Bizzotto, N., Costanzo, A., Bizzotto, L., Regis, D., Sandri, A., Magnan, B.: Leap motion gesture control with OsiriX in the operating room to control imaging: first experiences during live surgery. Surg. Innov. 21(6), 655–656 (2014)
Potter, L.E., Araullo, J., Carter, L.: The Leap Motion Controller: a view on sign language. In: Proceedings of the 25th Australian Computer–Human Interaction Conference: Augmentation, Application, Innovation, Collaboration, Adelaide, 25–29 November 2013, pp. 175–178 (2013)
Vikram, S., Li, L., Russell, S.: Writing and sketching in the air, recognizing and controlling on the fly. In: CHI ’13 Extended Abstracts on Human Factors in Computing Systems. ACM, New York, pp. 1179–1184 (2013)
Ebert, L.C., Flach, P.M., Thali, M.J., Ross, S.: Out of touch—a plugin for controlling OsiriX with gestures using the Leap Controller. J. Forensic Radiol. Imaging 2, 126–128 (2014)
Ridel, B., Reuter, P., Laviole, J., Mellado, N., Couture, N., et al.: The revealing flashlight: interactive spatial augmented reality for detail exploration of cultural heritage artifacts. J. Comput. Cult. Herit. Assoc. Comput. Mach. 7(2), 1–18 (2014)
MacKenzie, I.S., Kauppinen, T., Silfverberg, M.: Accuracy measures for evaluating computer pointing devices. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, New York, pp. 9–16 (2001)
Weichert, F., Bachmann, D., Rudak, B., Fisseler, D.: Analysis of the accuracy and robustness of the Leap Motion Controller. Sensors 13(5), 6380–6393 (2013)
Guna, J., Jakus, G., Pogačnik, M., Tomažič, S., Sodnik, J.: An analysis of the precision and reliability of the leap motion sensor and its suitability for static and dynamic tracking. Sensors 14(2), 3702–3720 (2014)
Apostolellis, P., Bortz, B., Peng, M., Polys, N., Hoegh, A.: Exploring the integrality and separability of the Leap Motion Controller for direct manipulation 3D interaction. In: Proceedings of the IEEE Symposium on 3D User Interfaces (3DUI), Minneapolis, 29–30 March 2014, pp. 153–154 (2014)
Bachmann, D., Weichert, F., Rinkenauer, G.: Evaluation of the Leap Motion Controller as a new contact-free pointing device. Sensors 15(1), 214–233 (2015)
Coelho, J.C., Verbeek, F.J.: The leap motion controller in a 3D virtual environment: explorations and evaluations of pointing tasks. In: Proceedings of the 8th International Conference on Interfaces and Human Computer Interaction—Part of the Multi Conference on Computer Science and Information Systems, MCCSIS, 15–17 July 2014, Lisbon (2014)
Microscribe: Microscribe Specifications. RevWare, Raleigh (2012)
Leap Motion: Documentation Developers’ Guide—API Overview. The Leap Motion Inc., San Francisco (2014)
Appendix 1: Useful numerical code for fingertip tracking
In this Appendix, a portion of the C\(++\) code used for retrieving the fingertip positions from the Leap Motion Controller is reported. The code is based on the Leap Motion SDK [28].
Valentini, P.P., Pezzuti, E. Accuracy in fingertip tracking using Leap Motion Controller for interactive virtual applications. Int J Interact Des Manuf 11, 641–650 (2017). https://doi.org/10.1007/s12008-016-0339-y