1 Introduction

The Leap Motion Controller is a hand-tracking device produced by an American start-up company and released in 2013. It is a small box (about 12.7 \(\times \) 12.7 \(\times \) 5.1 cm) which can be connected via USB to a desktop computer. The device includes three infrared LEDs and two cameras, and its tracking principle is stereoscopy: the light reflected from the LEDs is observed from two different points of view. Thanks to the proprietary SDK libraries, the user gains access to the tracking information of both hands in the space above the box, in a height range of 15–60 cm. The library functions are able to recognize both hands and retrieve information about the location and pose of each bony segment.
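As a minimal illustration of how such tracking data can be read, the following sketch polls one frame and prints the fingertip positions; it assumes the C++ API of the Leap Motion SDK v2 (Leap::Controller, Frame, Hand, Finger) and is not the code used in this study.

```cpp
#include <cstdio>
#include "Leap.h"   // Leap Motion SDK v2 header

int main() {
    Leap::Controller controller;             // connects to the Leap service
    while (!controller.isConnected()) {}     // naive wait; real code would use a Listener

    Leap::Frame frame = controller.frame();  // latest tracking frame
    for (const Leap::Hand& hand : frame.hands()) {
        for (const Leap::Finger& finger : hand.fingers()) {
            Leap::Vector tip = finger.tipPosition();  // mm, in the Leap reference frame
            std::printf("%s hand, finger type %d, tip (%.1f, %.1f, %.1f)\n",
                        hand.isRight() ? "right" : "left",
                        static_cast<int>(finger.type()),
                        tip.x, tip.y, tip.z);
        }
    }
    return 0;
}
```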

Although quite new, the Leap Motion Controller has raised great interest for its promising capabilities of integration into virtual environments and for its affordable cost. The tracking of the human hand is at the basis of several implementations involving human–computer interfaces and natural gesture recognition [1]. This need is amplified in virtual and augmented reality implementations, in which new methodologies for exploring and managing the scene are pursued in order to increase realism and interactivity levels [2, 3], with the purpose of developing applications for supporting design activities [4, 5] and physical simulations [6, 7]. With this objective, in recent years researchers have begun to develop the so-called “natural interface” concept [8]. The purpose of natural interaction is to achieve communication between the user and the computer without invasive sensors to be worn. In this way, the user is more at ease and can communicate intent using simple gestures and movements, and the development of interactive environments has been boosted [9, 10].

Recent studies have shown the increasing interest of the scientific community in the wide range of possible implementations around the Leap controller. As examples, Katahira and Soga [11] proposed an integration of the Leap Motion Controller into an augmented reality environment for implementing realistic display and gripping. The possibilities of integrating the Leap controller with augmented reality are also documented in [12]. In 2015, another research team [13] described the integration of the controller into a virtual reality environment using the Oculus head-mounted display. Their work focused on the interaction between the user and a CAD model by picking and dragging components. The combination of hand tracking and shape rendering was proposed in [14]. That study aimed to discuss and implement a system for the physical rendering of virtual shapes. Specific applications have also been developed in different scientific fields such as medicine [15, 16], language learning [17, 18], forensics [19] and cultural heritage [20].

One of the key points for any tracking device is its accuracy and precision [21]. Some initial papers have demonstrated the effort to assess these performances for the Leap Motion Controller as well. Since the device tracks free hands (or hands grabbing a tool) in space, such an assessment is not an easy task. On the one hand, the presence of external objects in the environment has to be limited in order to ease the recognition of the hand segments. On the other hand, the tests have to be performed by reaching specific, known positions and/or poses for comparison. For this purpose, in 2013 Weichert and his co-authors [22] assessed the robustness and accuracy of the device by using a Kuka robot holding a tool. Although their assessment was very precise and systematic, it was limited to the tracking of a rigid tool rather than the recognition and tracking of a human hand. One year later, Guna and his co-workers [23] studied the precision and reliability of the Leap by using a plastic hand model that could be fixed in space. Also in this case the assessment is accurate, but it is limited to a surrogate hand with fixed shape and pose. In 2014 and 2015, three other studies [24–26] compared the pointing capabilities of the standard mouse and of the Leap. These studies focused only on simple interactions (pointing and/or manipulation) with pictorial shapes projected on a monitor, and no three-dimensional quantitative evaluations were included.

All these studies (except the last two) about the assessment of the Leap controller’s accuracy treat the device as a standard measurement tool. Actually, it is conceived as an interface between the user and the computer, so its main requirement is to capture and interpret the pose that the user’s hand and fingers intend to communicate. For this reason, it seems more appropriate to measure the accuracy with a different experimental procedure, one able to include the contribution of the real user.

The studies in the literature testify to an increasing interest in developing applications and systems based on a compact, low-cost hand-tracking device, and to a scientific curiosity about its actual performance and potential. Fuelled by this motivation, the current study aims to perform a direct assessment of the accuracy of the Leap controller in fingertip tracking in a real context, including the user’s contribution. This means that the study is performed using real hands in a real environment, taking into account the variability of a group of male and female subjects. For this purpose, a specific experimental setup has been designed and built, and the experimental tests have involved a group of volunteers. To the authors’ best knowledge, this specific assessment is original.

The paper is organized as follows. In the first part, the experimental setup and the testing procedure are described; in the second part, the results of the tests are presented and discussed using summarizing graphs and tables; in the third part, the conclusions are drawn.

2 Experimental setup and preliminary considerations for assessing the accuracy

The experimental procedure is conceived with the idea of testing the Leap controller while it is tracking the real hand of a human user in a real context. This is not a simple task because, on the one hand, we need to perform the assessment at precise (and repeatable) positions in space and, on the other hand, we have to disturb the real context as little as possible. The presence of external objects may alter the recognition or produce occlusions, undermining the efficiency of the device. For this reason, we designed a specific test rig.

With reference to Fig. 1, the rig comprises a transparent Plexiglas plate (4 mm thick) mounted on four adjustable supports (the white and red pillars). The height of the supports can be adjusted with four screws, and they behave telescopically to vary the distance between the plate and the base. A matrix of blue points (circles 2 mm in diameter) is printed on one side of the plate. The points are spaced 30 mm apart in both the x and y directions (11 points along the x direction, 9 along the y direction). The Leap controller is placed on the base, under the middle of the plate. The reference systems of the plate (with the superscript “p”) and of the Leap (with the superscript “l”) are depicted in Fig. 1 and are used for the subsequent evaluations.
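For illustration, the nominal grid can be tabulated in plate coordinates as in the sketch below, assuming the matrix is centred on the plate origin (the actual point positions were measured with the digitizer, as described next).

```cpp
#include <cstdio>

// Nominal target grid in plate coordinates: 11 x 9 points spaced 30 mm,
// assumed centred on the plate reference system (illustrative only).
int main() {
    const int nx = 11, ny = 9;
    const double step = 30.0;  // mm
    for (int i = 0; i < nx; ++i) {
        for (int j = 0; j < ny; ++j) {
            double x = (i - (nx - 1) / 2.0) * step;  // -150 ... +150 mm
            double y = (j - (ny - 1) / 2.0) * step;  // -120 ... +120 mm
            std::printf("P[%2d][%2d] = (%6.1f, %6.1f) mm\n", i, j, x, y);
        }
    }
    return 0;
}
```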

Fig. 1 Experimental setup for assessing the Leap Motion Controller’s performances

Fig. 2 Accurate measurement of the reference points on the plate (on the left) and a snapshot taken during the experimental tests (on the right)

With this arrangement, the user can freely move the hands in the space above the plate without obstacles and can touch the plate with the fingertips at precise and repeatable locations.

The exact position of the points with respect to the Leap controller has been measured by using the MicroScribe G2X digitizer (Fig. 2, on the left). According to the manufacturer’s specifications, its probe is able to acquire positions in space with an accuracy of 0.23 mm [27].

The collimation between the Leap controller and the matrix of points is necessary to take into account even small positioning or misalignment errors, in order to fully map the position of each point \(\mathbf{P}_i = \left\{ x_i \; y_i \; z_i \right\} _l^T \) in the Leap controller’s acquisition frame.

With reference to Figs. 1 and 2, there are three reference systems: that of the Leap, that of the digitizer and that of the plate on which the point matrix is printed. The coordinate transformation of a generic point \(\mathbf{P}_i \) can be computed as:

$$\begin{aligned} \mathbf{P}_{i,l}&= \left[ T \right] _l^p \, \mathbf{P}_{i,p} \nonumber \\ \mathbf{P}_{i,d}&= \left[ T \right] _d^p \, \mathbf{P}_{i,p} \nonumber \\ \mathbf{P}_{i,d}&= \left[ T \right] _d^p \left( \left[ T \right] _l^p \right) ^{-1} \mathbf{P}_{i,l} \end{aligned}$$
(1)

where the subscript l refers to the Leap reference system, d to the digitizer reference system and p to the plate reference system. Each matrix \(\left[ T \right] _j^k \) is the 4 \(\times \) 4 homogeneous transformation matrix that maps coordinates expressed in reference system k (the superscript) into reference system j (the subscript). The use of 4 \(\times \) 4 transformation matrices allows both rotation and translation components to be taken into account.
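As a worked sketch of Eq. (1), the following example composes 4 \(\times \) 4 homogeneous transforms; it assumes the Eigen library, and the numerical entries are placeholders rather than the calibration data of the actual rig.

```cpp
#include <Eigen/Dense>
#include <iostream>

int main() {
    using Eigen::Matrix4d;
    using Eigen::Vector4d;

    // [T]^p_l : plate coordinates -> Leap coordinates.
    // Rotation block left as identity for brevity; only a translation is set.
    Matrix4d T_lp = Matrix4d::Identity();
    T_lp.block<3, 1>(0, 3) << 0.0, 4.0, 0.0;    // e.g. plate bottom 4 mm above the Leap origin

    // [T]^p_d : plate coordinates -> digitizer coordinates (placeholder pose).
    Matrix4d T_dp = Matrix4d::Identity();
    T_dp.block<3, 1>(0, 3) << 120.0, -35.0, 80.0;

    Vector4d P_p(30.0, 0.0, 60.0, 1.0);          // grid point in the plate frame (homogeneous)
    Vector4d P_l = T_lp * P_p;                   // first line of Eq. (1)
    Vector4d P_d = T_dp * T_lp.inverse() * P_l;  // third line of Eq. (1)

    std::cout << P_d.transpose() << std::endl;
    return 0;
}
```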

By using Eq. (1), it is then possible to compute the absolute coordinates of a generic point \(\mathbf{P}_i \) tracked by the Leap controller in its own reference system. Another important assessment, before proceeding to the tests, concerns the effect of refraction on the measurement. The Leap controller works by optical triangulation. In common applications, only air is expected between the Leap and the hands. Here, however, the plate introduces a discontinuity in the optical path that may affect the measurement. For this reason, the contribution of refraction has been assessed as a preliminary activity, in order to compute the correction to be applied to the measured coordinates.

Fig. 3 Geometrical nomenclature for assessing the correction to apply to the measurement due to refraction. The figure represents a planar case for simplicity, but the parameters can be extended to the three-dimensional case

With reference to Fig. 3 (the figure is depicted as a two-dimensional drawing for simplicity, but the computation remains valid in three dimensions), when a real point \(\mathbf{P}\) is triangulated from two points of view (\(\mathbf{O}_\mathbf{1} \) and \(\mathbf{O}_\mathbf{2} \)), it appears at a different location (the apparent location \(\mathbf{P}_\mathbf{a} \)) because of the bending of the rays as they pass through different materials. In order to correct the measurement, it is necessary to compute the distance between \(\mathbf{P}\) and \(\mathbf{P}_\mathbf{a} \). This computation can be accomplished with the following geometrical considerations.

We assume the locations of the triangulation points \(\mathbf{O}_\mathbf{1} \) and \(\mathbf{O}_\mathbf{2} \) to be known, since they depend on the device embodiment (they can be retrieved from the manufacturer’s specifications). We also know the location of the real point \(\mathbf{P}\) (since it has been measured before the test begins), the distance H between the triangulation points and the bottom surface of the plate, and the plate thickness T.

In order to compute the apparent location \(\mathbf{P}_\mathbf{a} \), we have to impose the well-known condition stated by Snell’s law of refraction: the ratio of the sines of the angles of incidence (\(\alpha \) and \(\beta \)) and refraction (\(\alpha ^{\prime }\) and \(\beta ^{\prime }\)) is equal to the reciprocal of the ratio of the indices of refraction. In terms of vectors, using cross products, it can be written as:

$$\begin{aligned} \frac{\left\| \mathbf{AO}_\mathbf{1} \times \mathbf{n} \right\| }{\left\| \mathbf{AO}_\mathbf{1} \right\| \left\| \mathbf{n} \right\| } - IR\,\frac{\left\| \mathbf{PA} \times \mathbf{n} \right\| }{\left\| \mathbf{PA} \right\| \left\| \mathbf{n} \right\| } = 0 \end{aligned}$$
(2)
$$\begin{aligned} \frac{\left\| \mathbf{BO}_\mathbf{2} \times \mathbf{n} \right\| }{\left\| \mathbf{BO}_\mathbf{2} \right\| \left\| \mathbf{n} \right\| } - IR\,\frac{\left\| \mathbf{PB} \times \mathbf{n} \right\| }{\left\| \mathbf{PB} \right\| \left\| \mathbf{n} \right\| } = 0 \end{aligned}$$
(3)

where \(\mathbf{n}\) is the vector normal to the plate surface (constant, thanks to planarity) and IR is the index of refraction of Plexiglas with respect to air (assumed to be 1.49).

Formulas (2) and (3) are two scalar equations with six scalar unknowns (the spatial coordinates of the points \(\mathbf{A}\) and \(\mathbf{B}\)). In order to make the system solvable, we need to add four more scalar equations. The first two involve the y coordinates of the points \(\mathbf{A}\) and \(\mathbf{B}\), constraining the two points to lie on the bottom surface of the plate:

$$\begin{aligned} \mathbf{A}_y = H \end{aligned}$$
(4)
$$\begin{aligned} \mathbf{B}_y = H \end{aligned}$$
(5)

The other two equations express the condition that each of the two groups of points \((\mathbf{A}-\mathbf{P}-\mathbf{O}_\mathbf{1}-\mathbf{O}_\mathbf{2})\) and \((\mathbf{B}-\mathbf{P}-\mathbf{O}_\mathbf{1}-\mathbf{O}_\mathbf{2})\) is coplanar. These conditions can be written as:

$$\begin{aligned} \det \left[ \mathbf{PA} \;\; \mathbf{PO}_\mathbf{1} \;\; \mathbf{PO}_\mathbf{2} \right] = 0 \end{aligned}$$
(6)
$$\begin{aligned} \det \left[ \mathbf{PB} \;\; \mathbf{PO}_\mathbf{1} \;\; \mathbf{PO}_\mathbf{2} \right] = 0 \end{aligned}$$
(7)

The system of Eqs. (2)–(7) can then be solved for \(\mathbf{A}\) and \(\mathbf{B}\) (six equations in six unknowns).

The location of the apparent point \(\mathbf{P}_\mathbf{a} \) can then be found as the intersection of the two lines \(\mathbf{AO}_\mathbf{1} \) and \(\mathbf{BO}_\mathbf{2} \).

As a final step, the correction \(\mathbf{C}_\mathbf{P} \) to be applied to the measurement coming from the device can be computed as:

$$\begin{aligned} \mathbf{C}_\mathbf{P} =-\mathbf{P}_\mathbf{a} \mathbf{P} \end{aligned}$$
(8)

It is important to underline that the correction \(\mathbf{C}_\mathbf{P} \) is a spatial vector, so it modifies the x, y and z coordinates of the measured point. Moreover, the correction varies from point to point, depending on the distance from the observation points and on the attitude with respect to the segment \(\mathbf{O}_\mathbf{1} \mathbf{O}_\mathbf{2} \). For the evaluation of the corrections, the system of equations has been solved numerically.
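For illustration, the sketch below solves the planar version of the single-interface model of Eqs. (2)–(8), finding the refraction points by bisection and the apparent point by ray intersection; O1, O2, H and P are illustrative placeholders, not the actual device parameters.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec2 { double x, y; };

// Signed sine of the angle between the segment from->to and the plate
// normal n = (0, 1): the planar analogue of ||v x n|| / (||v|| ||n||).
static double sinToNormal(Vec2 from, Vec2 to) {
    double dx = to.x - from.x, dy = to.y - from.y;
    return dx / std::sqrt(dx * dx + dy * dy);
}

// Refraction point A = (ax, H) for one camera O: solve
// sin(angle(A->O, n)) - IR * sin(angle(P->A, n)) = 0 by bisection (Eqs. 2/3).
static Vec2 refractionPoint(Vec2 O, Vec2 P, double H, double IR) {
    double lo = std::min(O.x, P.x), hi = std::max(O.x, P.x);
    auto f = [&](double ax) {
        Vec2 A{ax, H};
        return sinToNormal(A, O) - IR * sinToNormal(P, A);
    };
    for (int i = 0; i < 80; ++i) {
        double mid = 0.5 * (lo + hi);
        if (f(lo) * f(mid) <= 0.0) hi = mid; else lo = mid;
    }
    return {0.5 * (lo + hi), H};
}

// Apparent point P_a: intersection of the rays O1->A and O2->B (Fig. 3).
static Vec2 intersect(Vec2 O1, Vec2 A, Vec2 O2, Vec2 B) {
    double d1x = A.x - O1.x, d1y = A.y - O1.y;
    double d2x = B.x - O2.x, d2y = B.y - O2.y;
    double den = d1x * d2y - d1y * d2x;
    double t = ((O2.x - O1.x) * d2y - (O2.y - O1.y) * d2x) / den;
    return {O1.x + t * d1x, O1.y + t * d1y};
}

int main() {
    const double IR = 1.49;               // Plexiglas vs. air, as in the paper
    const double H  = 200.0;              // plate bottom above the cameras [mm]
    Vec2 O1{-20.0, 0.0}, O2{20.0, 0.0};   // hypothetical stereo baseline
    Vec2 P{45.0, 215.0};                  // real fingertip location [mm]

    Vec2 A  = refractionPoint(O1, P, H, IR);
    Vec2 B  = refractionPoint(O2, P, H, IR);
    Vec2 Pa = intersect(O1, A, O2, B);

    // Eq. (8): C_P = -(vector from P_a to P), i.e. P_a - P componentwise.
    std::printf("P_a = (%.3f, %.3f), C_P = (%.3f, %.3f)\n",
                Pa.x, Pa.y, Pa.x - P.x, Pa.y - P.y);
    return 0;
}
```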

Concerning the experimental procedure (see Fig. 2, on the right), a group of ten users (5 male and 5 female) was involved in the testing. Every subject was asked to touch each point on the plate in sequence with the five fingertips of the right hand. The tests were repeated five times (50 assessments overall for each finger). The entire test was then repeated with the height between the plate and the Leap set to 200, 400 and 600 mm. The data were then collected and analysed statistically, by computing mean values and standard deviations. Appendix 1 reports the main part of the C++ code used for retrieving the location of the fingertip from the Leap Motion Controller. The code makes use of the free Leap Motion SDK [28].
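The aggregation step can be sketched as follows; meanAndSigma is an assumed helper for the mean and sample standard deviation of one coordinate, not the authors’ appendix code.

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Mean and sample standard deviation of one coordinate of the tracked
// fingertip over the repeated touches of a single target point.
static void meanAndSigma(const std::vector<double>& v, double& mean, double& sigma) {
    mean = 0.0;
    for (double s : v) mean += s;
    mean /= v.size();
    double var = 0.0;
    for (double s : v) var += (s - mean) * (s - mean);
    sigma = std::sqrt(var / (v.size() - 1));
}

int main() {
    // 50 hypothetical x coordinates (mm) acquired for one target point.
    std::vector<double> x(50, 30.0);
    x[0] = 31.2; x[1] = 28.9;  // example scatter
    double mean, sigma;
    meanAndSigma(x, mean, sigma);
    std::printf("mean = %.2f mm, 3*sigma = %.2f mm\n", mean, 3.0 * sigma);
    return 0;
}
```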

3 Results and discussion

Figures 4, 5 and 6 show the results of the accuracy study graphically. In all the figures, the green dots represent the physical points printed on the Plexiglas plate, the blue points are the mean values of the 50 assessments, and the blue circles delimit the zones of 3\(\sigma \) deviation.

Fig. 4 Experimental results on fingertip tracking with the transparent plate at 200 mm from the Leap (green dots are the targets, blue dots the mean values among all the tests, and the blue dotted circles the 3\(\sigma \) intervals)

Fig. 5 Experimental results on fingertip tracking with the transparent plate at 400 mm from the Leap (green dots are the targets, blue dots the mean values among all the tests, and the blue dotted circles the 3\(\sigma \) intervals)

Fig. 6 Experimental results on fingertip tracking with the transparent plate at 600 mm from the Leap (green dots are the targets, blue dots the mean values among all the tests, and the blue dotted circles the 3\(\sigma \) intervals)

All the tracking acquisitions proved robust, and the user’s hand was always recognized. Even though none of the subjects involved in the tests had previous experience with the Leap, no training or repetition of failed tests was necessary. No relevant differences emerged between male and female subjects.

As an overall consideration on the results, the tracking is rather accurate and the errors of the mean values are bounded within 4–5 mm, which is acceptable for finger-tracking purposes and for this device typology. The precision is acceptable as well, as testified by the limited amplitude of the standard deviations, which is of the same order as the error amplitude. There is no systematic error in tracking, and the refraction is properly corrected. The closer assessment (Plexiglas plate at 200 mm) is slightly more accurate than those at 400 and 600 mm. The index fingertip is acquired with higher accuracy than the other fingers; the thumb shows the worst accuracy.

Table 1 Numerical results of the investigations

In general, there is no evident difference in accuracy between points near to and far from the Leap centre. On the other hand, it is interesting to note that, if we decompose the testing region into four quadrants, the points with positive x and negative y coordinates (the bottom-right quadrant) show greater tracking errors, especially the farthest ones. This is probably because, when pointing at such locations, the right hand is partially outside (or at the limit of) the Leap’s field of view, so its recognition is less accurate. A similar (and opposite) behaviour occurs when the points of the second quadrant (negative x and y coordinates) are touched with the left-hand fingertips.

Table 1 summarizes the results of the investigations in aggregate numerical form, distinguishing between the four quadrants. Results are presented for the three different distances from the device.

4 Conclusion

An original experimental procedure for assessing the accuracy of the Leap Motion Controller in the tracking of fingertips has been presented. The methodology makes use of a specific test rig, designed and built for the purpose of the tests. The assessment is performed in the field, using human subjects in a real context. The accuracy of all five fingers of the right hand is assessed, considering the pointing at a transparent grid of reference points placed at three different distances from the device.

Results show that the tracking is robust and stable, and that errors in fingertip estimation are bounded within 4–5 mm. This performance is very suitable for the use of the device in interactive virtual applications such as virtual and augmented reality, virtual object manipulation and virtual prototyping. Although no systematic tracking error was experienced, it was found that, when tracking the right hand, pointing at the lower-right quadrant gives the worst results. Vertical distances from 200 to 600 mm from the device do not produce relevant differences in tracking, although closer distances allow slightly more accurate tracking. No differences between male and female subjects were experienced in the tests. These results encourage the use of the Leap Motion device for implementing interactive applications also in the field of geometric modelling, design and manufacturing, and its integration with both virtual and augmented reality methodologies.