
1 Introduction

Recently, image-based surgical robot systems have developed rapidly as a result of advances in computer and optical technology [1]. With image-guided systems applied to robotic surgery, sensor-based accurate robot control, and tracking of the positions of surgical tools and the patient, navigation for robotic surgery is now possible. A typical image-based optical tracking system uses a rigid-body marker with retro-reflective spheres attached to it. The system tracks the marker by calculating its position and pose through geometric analysis of the marker spheres observed by a stereo camera [2]. Compared with earlier systems, the rigid-body marker is small. Nevertheless, it noticeably hinders the user when attached to a small tool, especially in delicate work such as surgery. If rigid-body markers attached to the patient or surgical instruments overlap each other, identifying each marker becomes difficult. In addition, as the distance between the marker and the camera increases, the accuracy of the position and pose measurement decreases. Because most tracking systems use similar methods, they all share these problems, and because the underlying technique is essentially the same, the weaknesses of the optical tracker are hard to overcome. Surgical robot systems, however, require a more accurate tracking system to improve performance and safety. Therefore a new tracking method that can overcome these weaknesses is required, and such a method is introduced in this paper.

The idea comes from the bokeh phenomenon used in the bokode [3]. A bokode exploits the bokeh effect with two separate optical elements: a lenslet is placed in front of a pattern containing datamatrices, and the camera observes it with its lens out of focus. The bokode pattern appears as a tiny dot to the human eye and to a camera in sharp focus, but appears much larger to the out-of-focus camera. Therefore the pattern can be identified accurately even at a distance of a few meters. The advantage of the bokode is that it can carry much more data in a smaller area than barcodes or other tags. There is also an application that uses the bokode to measure its position and angle. However, that method is not accurate, and the measured position and angle depend on each other; in effect it measures only a relative position and angle. Therefore it cannot be used directly as a tracking method.

In this paper, an afocal tracking system that uses the bokeh phenomenon [4] together with a coded pattern is introduced. The system consists of two cameras as the sensor and an afocal marker attached to the tracking target. The afocal tracking system measures the absolute position and pose of the marker, so it can be used as a tracking system. It can use smaller markers than traditional optical trackers and maintains measurement accuracy independent of distance. Moreover, because the method is new, it has strong potential for further improvement.

2 Method

2.1 Basic Idea

Before describing the measurement method in detail, an understanding of the bokode is helpful. It is based on the bokeh effect of ordinary camera lenses. The concept of the bokode is to carry more data in a smaller code pattern, and the data can be read from a greater distance. The most important point for our purposes, however, is that the pattern visible in the bokeh changes with the position and angle of the bokode. In the original study, the pattern fades as the angle between the camera and the bokode increases, because of lens aberration. The bokode study also includes an application for measuring the angle and position of the bokode, but the measured position and angle are coupled: if the position and the angle change at the same time, the true position and rotation of the bokode cannot be recovered. For example, if the position of the bokode changes without any rotation, the measured angle changes too, because the measured angle is simply the angle between the optical axis of the camera lens and that of the bokode lenslet. In addition, depth is estimated from the diameter of the bokeh, but the diameter changes only slightly with depth, so the depth resolution is low. Nevertheless, the bokode study is significant for tracking because it introduces a new way of measuring positions and angles. In this paper, a tracking method that uses the bokeh concept to measure the accurate position and pose of a marker is introduced. We call it an afocal tracking system.

The key difference in the measuring method is that the afocal tracking system measures the absolute position and pose of the marker using two cameras. The viewable angle is also improved by placing a fisheye lens in front of the marker pattern; this assembly is referred to below as the afocal marker.

Before configuring the system, three tracking configurations were considered, as shown in Fig. 15.1: one camera with a stereo afocal marker (A), a stereo camera with a stereo afocal marker (B), and a stereo camera with a single afocal marker (C). System (A) has a cost advantage because it requires only one camera, but the marker size is doubled because it uses two afocal markers. System (B) has an accuracy advantage: the afocal tracking system measures the vertical and horizontal directions more accurately than the depth direction, and system (B) calculates depth from the vertical and horizontal measurements, so it achieves higher depth accuracy. However, it has disadvantages in cost and marker size because it uses two cameras and two afocal markers. System (C) has the smallest marker because it uses a single afocal marker; its cost is higher than system (A) but lower than system (B). Considering surgical application, where a small target marker is important for convenience during surgery, system (C) was chosen and developed.

Fig. 15.1 Configurations of tracking systems for the afocal tracking system [(A) one camera with stereo afocal marker, (B) stereo camera with stereo afocal marker, (C) stereo camera with one afocal marker]

2.2 Hardware

The hardware configuration of the afocal tracking system is very important for accurate tracking over the intended working volume. The afocal marker consists of a fisheye lens at the front, a glass pattern with datamatrices printed on it, and an infrared LED behind the glass pattern. The design of the afocal marker is shown in Fig. 15.2.

Fig. 15.2 Concept of the afocal marker for the afocal tracking system (a fisheye lens, b glass pattern, c infrared LED)

Infrared light is used to improve the accuracy and speed of the image processing. An infrared filter is mounted in front of each camera lens, so the images obtained from the cameras contain only the bokeh and the pattern from the afocal marker. To cover the intended tracking area, the aperture and focal length of the camera lenses, the focal length of the fisheye lens, and the geometrical placement of the two cameras must be carefully chosen. The printed size of the pattern on the glass and the pixel size of the camera CCD must also be considered, because they determine the resolution of the pattern in the captured images; this resolution has to be high enough to decode the datamatrices. In this study, for high resolution, a 1 mm by 1 mm glass pattern with 91,809 datamatrices printed on it was used. A photomicrograph of the datamatrices printed on the glass pattern is shown in Fig. 15.3.

Fig. 15.3 Photomicrograph of datamatrices printed on the glass pattern
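As a rough check of the pattern and CCD sizing considerations above, the short sketch below computes the physical size of one datamatrix on the 1 mm by 1 mm glass (assuming the 91,809 matrices are laid out as a square grid) and how many CCD pixels one datamatrix would span in the bokeh image. The pixel pitch and bokeh magnification are hypothetical placeholder values, not taken from the paper.

```python
import math

# Values from the paper: 1 mm x 1 mm glass pattern with 91,809 datamatrices.
pattern_side_mm = 1.0
n_matrices = 91_809
per_side = math.isqrt(n_matrices)                     # 303 if the grid is square
matrix_size_um = pattern_side_mm * 1000 / per_side    # ~3.3 um per datamatrix

ccd_pixel_um = 3.45          # hypothetical CCD pixel pitch
bokeh_magnification = 25.0   # hypothetical magnification of the defocused bokeh
pixels_per_matrix = matrix_size_um * bokeh_magnification / ccd_pixel_um
print(f"{matrix_size_um:.1f} um per datamatrix, "
      f"~{pixels_per_matrix:.0f} px per datamatrix in the bokeh image")
```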

The system developed in this work is configured for a field of view (FOV) of 200 mm by 200 mm at a distance of 300 mm from the stereo camera. The system geometry considering the FOV is shown in Fig. 15.4.

Fig. 15.4 Tracking system considering the FOV
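The angular field of view that this configuration implies can be checked with a one-line calculation; this is only a rough check that ignores the stereo baseline and assumes a camera centered on the working volume.

```python
import math

# Full viewing angle needed to cover a 200 mm x 200 mm FOV at 300 mm distance.
fov_mm, distance_mm = 200.0, 300.0
full_angle_deg = math.degrees(2 * math.atan((fov_mm / 2) / distance_mm))
print(f"required full FOV angle: {full_angle_deg:.1f} deg")   # ~36.9 deg
```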

2.3 Measuring of Position and Pose of Afocal Marker

Two methods for measuring position were developed. The first is based on a stereo camera system, and the second uses the geometrical relations between the patterns and the camera CCDs.

Using a stereo camera system in the afocal tracking system makes calibration somewhat difficult: because the camera lenses are deliberately set out of focus, ordinary camera calibration with a chessboard cannot be used. Therefore a calibration board was designed whose corner points are formed by pinholes, with an LED light illuminating them from behind. The LED light passing through each pinhole blurs into a circle in the image; the center points of these blurred circles are found by circle fitting and then used for Zhang's calibration [5] (a sketch of this center-extraction step is given after Eq. (15.1)). This calibration board has not yet been manufactured, so as a temporary method, linear motor stages are used to move the afocal marker to all calibration points, and the center points are extracted by finding the center of the bokeh. After calibration, the position of the afocal marker is measured by the stereo camera system, and the pose of the afocal marker is obtained from the geometrical relation between one afocal marker and the camera, which is shown in Fig. 15.5 and can be expressed as the following equation.

$$ \begin{aligned} s\begin{bmatrix} u' \\ v' \\ 1 \end{bmatrix} = \left[ A \right]\left[ R \right]\left[ C \right]\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} &= \begin{bmatrix} -\dfrac{f_c}{p_w} & 0 & u'_c \\ 0 & -\dfrac{f_c}{p_h} & v'_c \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix} \begin{bmatrix} 1 & 0 & -u_c \\ 0 & 1 & -v_c \\ 0 & 0 & f_b \end{bmatrix} \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} \\ (s &= r_{31} u + r_{32} v + r_{33} f_b) \end{aligned} $$
(15.1)
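The calibration procedure described above requires finding the centers of the blurred circles produced by the pinhole board (or by the marker bokeh in the temporary stage-based method). The following is a minimal sketch of that step, assuming 8-bit grayscale infrared images and OpenCV 4.x; the function name, threshold choices, and the use of cv2.minEnclosingCircle to stand in for circle fitting are illustrative assumptions, not the authors' implementation.

```python
import cv2
import numpy as np

def blurred_circle_centers(ir_image, min_area=50.0):
    """Find the centers of blurred bright circles (pinholes or marker bokeh)
    in an infrared image (8-bit, single channel)."""
    # With the infrared filter in front of the lens, only the LED light should
    # be bright, so a global Otsu threshold is usually sufficient.
    _, mask = cv2.threshold(ir_image, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for c in contours:
        if cv2.contourArea(c) < min_area:            # reject noise specks
            continue
        (cx, cy), _r = cv2.minEnclosingCircle(c)     # circle fit to the blob
        centers.append((cx, cy))
    return np.array(centers, dtype=np.float32)

# The detected centers, paired with the known 3-D pinhole positions on the
# calibration board over several views, can then be fed to a Zhang-style
# calibration, e.g.:
#   rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
#       object_points, image_points, image_size, None, None)
```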
Fig. 15.5 Geometrical relation of the afocal marker and camera

Equation (15.1) has the same form as the model used in Zhang's calibration [5]. By obtaining a few images of the afocal marker at arbitrary positions and poses, and then applying a calculation similar to Zhang's calibration, the matrices H, A and C can be determined. Using Eqs. (15.2) and (15.3), the pose of the afocal marker is then obtained.

$$ \left[ H \right] = \left[ A \right]\left[ R \right]\left[ C \right] $$
(15.2)
$$ \left[ R \right] = \left[ A \right]^{ - 1} \left[ H \right]\left[ C \right]^{ - 1} $$
(15.3)
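A minimal sketch of the pose recovery in Eq. (15.3): given the estimated H and the known matrices A and C from Eq. (15.1), R follows by matrix inversion. The SVD orthonormalization at the end is not part of the paper; it is added here as a common way to handle noise and the unknown scale in the estimated H.

```python
import numpy as np

def pose_from_homography(H, A, C):
    """Recover the rotation matrix R from Eq. (15.3): R = A^-1 H C^-1.

    H : 3x3 matrix estimated from the marker observations,
    A : 3x3 camera-side matrix from Eq. (15.1),
    C : 3x3 marker-side matrix from Eq. (15.1).
    """
    R = np.linalg.inv(A) @ H @ np.linalg.inv(C)
    # Because of noise and scale, R is generally not an exact rotation;
    # project it onto SO(3) with an SVD.
    U, _, Vt = np.linalg.svd(R)
    R_orth = U @ Vt
    if np.linalg.det(R_orth) < 0:                    # keep a right-handed frame
        R_orth = U @ np.diag([1.0, 1.0, -1.0]) @ Vt
    return R_orth
```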

Another method uses the geometrical relations of the afocal tracking system directly and measures the position and the pose simultaneously. The geometrical relations of the afocal tracking system are shown in Fig. 15.6. For better understanding, the method for obtaining the position and pose is described in two-dimensional coordinates. The small lens (the fisheye lens of the afocal marker) and the dotted line (the glass pattern of the afocal marker) are rotated by θ in the positive direction. L1 and L2 are the centers of the camera lenses of camera1 and camera2. P1 is the position of the datamatrix seen at the center of the bokeh of camera1, and C1 is the CCD pixel at the bokeh center of camera1; P2 is the position of the datamatrix seen at the center of the bokeh of camera2, and C2 is the CCD pixel at the bokeh center of camera2.

Fig. 15.6 Geometrical relations in the afocal tracking system (a without rotation of the afocal marker, b with rotation of the afocal marker)

The center of the fisheye lens is (X, Y). Point P1 is \( (f_f \cos\theta - u'_2 \sin\theta + X,\; f_f \sin\theta + u'_2 \cos\theta + Y) \), point P2 is \( (f_f \cos\theta - u'_1 \sin\theta + X,\; f_f \sin\theta + u'_1 \cos\theta + Y) \), L1 is (0, 0), L2 is (0, L), C1 is \( (-f_c, -u_1) \) and C2 is \( (-f_c, -u_2) \). Here \( f_c \) is the focal length of the camera lens, \( f_f \) is the focal length of the fisheye lens, \( u_1 \) is the bokeh-center pixel of camera1, \( u_2 \) is the bokeh-center pixel of camera2, \( u'_1 \) is the distance between the glass pattern center and the datamatrix at point P1, \( u'_2 \) is the distance between the glass pattern center and the datamatrix at point P2, and L is the distance between the two camera lenses. Point P1, the center of the fisheye lens, L1 and C1 all lie on line1; point P2, the center of the fisheye lens, L2 and C2 all lie on line2. Therefore the equations of the straight line defined by the four points on line1 must be identical, and likewise for the four points on line2. Solving this system of equations yields the angle θ and the coordinates X and Y.

$$ \tan\theta = \frac{-f_f\,(u_2 - u_1) - f_c\,(u'_2 - u'_1)}{u'_1 u_1 - u'_2 u_2} $$
(15.4)
$$ X = \frac{\left[(u_1 + u_2)\,f_f - (u'_1 - u'_2)\,f_c\right]\cos\theta - (u'_1 u_1 - u'_2 u_2)\sin\theta - L f_c}{u_1 - u_2} $$
(15.5)
$$ Y = \frac{\left[(u'_1 u_2 - u'_2 u_1)\,f_c - 2 u_1 u_2 f_f\right]\cos\theta + \left[(u'_1 + u'_2)\,u_1 u_2 - (u_1 + u_2)\,f_f f_c\right]\sin\theta + L f_c u_1}{(u_1 - u_2)\,f_c} $$
(15.6)

By carrying out the same process in three-dimensional coordinates, the position and pose of the afocal marker are obtained.
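The following sketch is a direct transcription of Eqs. (15.4)-(15.6) for the two-dimensional case. Variable names mirror the symbols defined above (up1 and up2 stand for u'_1 and u'_2); atan2 is used only to make the quadrant of θ explicit, since Eq. (15.4) itself only fixes tan θ.

```python
import math

def afocal_2d(u1, u2, up1, up2, fc, ff, L):
    """Two-dimensional position and angle of the afocal marker, Eqs. (15.4)-(15.6).

    u1, u2   : bokeh-center CCD coordinates of camera1 and camera2,
    up1, up2 : pattern offsets u'_1, u'_2 decoded from the datamatrices,
    fc, ff   : focal lengths of the camera lens and the fisheye lens,
    L        : distance between the two camera lens centers.
    """
    # Eq. (15.4): rotation angle of the afocal marker
    theta = math.atan2(-ff * (u2 - u1) - fc * (up2 - up1),
                       up1 * u1 - up2 * u2)
    c, s = math.cos(theta), math.sin(theta)
    # Eq. (15.5): X coordinate of the fisheye-lens center
    X = (((u1 + u2) * ff - (up1 - up2) * fc) * c
         - (up1 * u1 - up2 * u2) * s - L * fc) / (u1 - u2)
    # Eq. (15.6): Y coordinate of the fisheye-lens center
    Y = (((up1 * u2 - up2 * u1) * fc - 2 * u1 * u2 * ff) * c
         + ((up1 + up2) * u1 * u2 - (u1 + u2) * ff * fc) * s
         + L * fc * u1) / ((u1 - u2) * fc)
    return theta, X, Y
```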

3 Experiment

3.1 Position Measurement

Experiments were performed for the first method proposed in Sect. 15.2. The experimental setup for the second method is still in progress, and its experiment and analysis are left for future work.

For this experiment, step-motor stages were used to change the position of the afocal marker. First, two step-motor stages were mounted for the vertical and horizontal directions on a straight rail aligned with the depth direction, as shown in Fig. 15.7. The vertical and horizontal positions were then measured by the afocal tracking system at various depths, with the depth changed along the rail.

Fig. 15.7 Experimental setup for position measurement

After that, the two step-motor stages were mounted for the vertical and depth directions on a straight rail aligned with the horizontal direction, and the depth position was measured by the afocal tracking system at various horizontal and vertical positions. The measurement in each direction was repeated 150 times. The error between the actual displacement of the marker (given by the motor stages) and the displacement measured by the afocal system was calculated, and the standard deviation of this error is shown in Table 15.1, where 'y' denotes the vertical direction, 'x' the horizontal direction, and 'z' the depth direction. A comparison with the commercial optical tracker Polaris Vicra from NDI (http://www.ndigital.com) is shown in Table 15.2.

Table 15.1 Standard deviation of position measurements of afocal system
Table 15.2 Comparison of position measurements with commercialized optical tracker
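As a note on how the values in Tables 15.1 and 15.2 can be produced, the sketch below computes per-axis error statistics from the repeated measurements; the array names and the choice of signed error for the standard deviation are assumptions, not the authors' exact processing.

```python
import numpy as np

def position_error_stats(measured, commanded):
    """Per-axis error statistics for repeated position measurements.

    measured, commanded : (N, 3) arrays of marker displacements in mm,
    e.g. N = 150 moves per axis as in the experiment.
    """
    error = measured - commanded                  # signed error per move
    mean_abs = np.mean(np.abs(error), axis=0)     # average position error
    std = np.std(error, axis=0)                   # standard deviation (Table 15.1)
    return mean_abs, std
```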

For the afocal system, the average position error is 200 μm and the standard deviation of the error is 219 μm, whereas the standard deviation of the position error of the Polaris Vicra is 452 μm. This indicates that the afocal system measures position more accurately than the Polaris Vicra.

3.2 Pose Measurement

For this experiment, a rotation stage was used as shown in Fig. 15.8. Because it was difficult to build a stage controlling roll, pitch and yaw simultaneously, the measurement experiment was performed about a single rotation axis. The rotation stage was rotated about the x-axis from 11° to −15° in 1° increments. As the manual rotation stage was operated by hand, its rotational accuracy was assumed to be approximately 0.03°. The results are shown in Table 15.3 and Fig. 15.9.

Fig. 15.8 Experimental setup for pose measurement

Table 15.3 Results of pose measurements
Fig. 15.9 Result of pose measurement with x-axis rotation (a x-axis, b y-axis, c z-axis)

In Fig. 15.9, because only the x-axis is rotated by the manual rotation stage, the values for the y- and z-axes remain almost constant. The rotation angle about the x-axis is changed consistently in 1° increments at every sensing step, and the sensing tests are performed after every change.

The average error and the standard deviation of the rotation measurements about the x-axis are 0.02° and 0.15°, respectively. However, the results for the y-axis show a somewhat larger error, which results from misalignment between the coordinate frames of the sensor and the rotation stage.

The position and pose measurement results show that the proposed system tracks with reasonable accuracy. In the current pose measurement, only one camera is used for the calculation; the accuracy of pose measurement is expected to improve when two cameras are used, and this work is in progress. After the whole system is implemented, experiments will be performed to compare its accuracy with other commercial tracking systems [6] and to verify that the proposed system is usable for surgical applications.

4 Conclusion

Expectations of and demand for surgical robot applications are steadily increasing. Accurate image-based robot control for safe and precise robotic surgery is therefore becoming more and more important, and the markers attached to patients or surgical instruments need to become smaller for convenience during surgery. In this paper, a novel tracking method that allows the marker to be miniaturized while providing accurate measurement of position and pose has been proposed. The afocal tracking system also maintains its measurement accuracy independent of the distance to the target. However, manufacturing a smaller afocal marker is left for future work: the fisheye lens used in the proposed system is a commercial lens and takes up almost half of the afocal marker, and the development of a much smaller fisheye lens that can be mounted in the afocal marker is in progress. The accuracy of pose measurement is expected to improve further once two cameras are used for the measurement. The significance of this work is that it is a novel tracking method and therefore has strong potential for improvement.