Abstract
Described here is a novel method for automatic detection and enhancement of needles under 3D ultrasound guidance. We develop a detector consisting of a linear learning-based pixel classifier that utilizes Histogram of Oriented Gradients descriptors extracted from local phase projections. The detector automatically identifies slices of the volume that contain needle data, reducing the needle search space. Needle tip enhancement is performed on a projection of the extracted sub-volume, followed by automatic tip localization using spatially distributed image statistics within the trajectory constrained region. Evaluation of the proposed method on 40 volumes of ex vivo bovine tissue shows \(88\%\) detection precision, \(98\%\) recall rate, a mean classification time per slice of 0.06 s and a mean tip localization error of \(0.44\pm 0.13\) mm. These promising results indicate the method's potential for further evaluation on clinical pain management procedures.
1 Introduction
Ultrasound (US) guidance for regional anesthesia has gained popularity in clinical practice because of its radiation-free, low-cost and real-time nature. With two-dimensional (2D) US, which is the current standard, it is often difficult to align the needle with the scan plane. Needle localization is even more difficult for deep or steep insertions. This may impair therapeutic efficacy or cause injury. To address this challenge, three-dimensional (3D) US has emerged as a viable alternative [1]. 3D US permits simultaneous multi-planar visualization of the needle without probe adjustment, hence orientation of the needle with respect to the scan plane need not be perfect. However, needle visibility in 3D US is affected by the needle's small size relative to the US volume, signal attenuation, high intensity artifacts and speckle noise.
Previously, algorithms for needle enhancement and localization in 3D US were reported. These include: the 3D Hough transform (HT) [2], projection-based methods such as parallel integration projection (PIP) [3] and iterative model-fitting methods based on the random sample consensus (RANSAC) algorithm [4]. These methods generally suffer from computational complexity due to the large amount of volume data that must be processed [5]. Further, since these methods are intensity-based, challenges may arise under difficult imaging conditions or in the presence of high intensity US imaging artifacts.
Although the RANSAC-based ROI-RK method proposed in [4, 5] reduces calculation time, it is not robust to high intensity artifacts and steep insertion angles. The limitations of intensity-based methods can be overcome with the use of local phase features. A qualitative comparison of local phase, HT and RANSAC based needle-axis localization is presented in Fig. 1: when the only high intensity feature present is the needle, all methods give accurate localization; otherwise, only local phase features consistently yield accurate localization. In [6], oscillation of a needle stylet was modeled into a projection-based localization framework, providing a more robust solution. However, oscillating the stylet during US guided needle insertion is difficult in a single operator scenario, especially for shallow angles.
Recently, a robust, intensity-invariant algorithm for needle enhancement and localization in 2D US was proposed [7]. Needle shaft and tip were enhanced by incorporating US signal transmission models in an optimization problem. The needle trajectory was estimated from local phase-based projections of the enhanced B-mode image [8]. However, incorrect tip localization arose when high intensity soft tissue artifacts were present along the needle trajectory. The algorithm also required proper alignment of the needle with the scan plane. In this paper, we address the limitations in [7] by extending this promising method to 3D. Our main contributions are: (1) A learning based classifier that utilizes local phase descriptors to detect needle-containing slices in the US volume. (2) A technique that computes multi-planar reconstructions for needle tip localization in 3D. Our specific clinical focus is needle guidance in spinal injections such as lumbar facet joint and medial branch blocks in obese patients. Preliminary qualitative and quantitative validation results on ex vivo volumes demonstrate that our method is robust and has a low execution time, making it suitable for clinical evaluation in these pain management procedures.
2 Methods
We propose a two-stage framework illustrated in Fig. 2. We first detect slices (2D frames acquired from a motorized 3D transducer) with needle data. This is followed by needle enhancement and multi-planar tip localization. The following sub-sections describe this process in detail.
2.1 Needle Detection
Previously, locally normalized histograms of oriented gradients (HOG) descriptors were shown to be efficient at capturing gradient information [9]. They are also robust to small translations and rotations, demonstrating performance similar to Scale Invariant Feature Transform (SIFT) descriptors. As such, locally normalized HOG descriptors make robust feature sets for needle detection. In our design, we extract intensity-invariant local phase descriptors and use them to derive HOG descriptors.
Local Phase Descriptors for Needles: We apply orientation tuned intensity-invariant local phase filter banks to each slice of the 3D volume (hereafter denoted as \(US_{volume}\)) to extract a needle phase descriptor, hereafter denoted as NPD(x, y). The filter banks are constructed from 2D Log-Gabor filters, whose parameters are selected automatically using the framework proposed in [8]. It is assumed that the insertion side of the needle is known a priori, and the calculation is limited to an automatically selected region of interest (ROI) on the insertion side. It is expected that the ROI contains a visible part of the shaft. The output of the filter operation gives a phase-based descriptor called phase symmetry, PS(x, y), which is used as an input to the Maximum Likelihood Estimation SAmple Consensus algorithm (MLESAC) [10]. We use MLESAC to prune false positive pixels and connect inliers to yield NPD(x, y). Figure 3 shows examples of slices with and without NPD(x, y). Investigating Fig. 3 (first and last columns), we note that slices without needle data do not contain NPD(x, y), while slices with needle data (middle 7 columns) possess NPD(x, y), which appears as bright, straight features consistent with a rigid needle.
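As an illustration of the phase symmetry computation, the following is a minimal single-scale, single-orientation sketch. The actual method uses filter banks whose parameters are selected automatically per [8] and prunes the result with MLESAC, both omitted here; the parameter values (\(f_{0}\), bandwidth and angular spread) below are illustrative assumptions, not the paper's tuned values.

```python
import numpy as np

def log_gabor_2d(shape, f0=0.1, sigma_ratio=0.55, theta=0.0, sigma_theta=0.4):
    """Frequency-domain 2D Log-Gabor filter (single scale and orientation).

    One-sided in orientation, so filtering yields a quadrature (even + odd) pair.
    """
    rows, cols = shape
    u = np.fft.fftfreq(cols)
    v = np.fft.fftfreq(rows)
    U, V = np.meshgrid(u, v)
    radius = np.hypot(U, V)
    radius[0, 0] = 1.0                       # avoid log(0) at the DC term
    radial = np.exp(-(np.log(radius / f0) ** 2) / (2 * np.log(sigma_ratio) ** 2))
    radial[0, 0] = 0.0                       # zero DC response
    angle = np.arctan2(V, U)
    dtheta = np.arctan2(np.sin(angle - theta), np.cos(angle - theta))
    angular = np.exp(-(dtheta ** 2) / (2 * sigma_theta ** 2))
    return radial * angular

def phase_symmetry(image, **kwargs):
    """Single-scale phase symmetry: even response dominating odd response."""
    F = np.fft.fft2(image)
    G = log_gabor_2d(image.shape, **kwargs)
    response = np.fft.ifft2(F * G)           # complex: even (real) + odd (imag)
    even, odd = response.real, response.imag
    amplitude = np.hypot(even, odd)
    return np.maximum(np.abs(even) - np.abs(odd), 0.0) / (amplitude + 1e-6)
```

A bright ridge (such as a needle shaft) has a locally even-symmetric intensity profile, so its phase symmetry response is high regardless of the absolute image intensity, which is the intensity invariance exploited by the detector.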
Detector Architecture: For details of the HOG algorithm, we refer the reader to [9]. Specifically, we use \(L_{2}\)–Hys (Lowe-style clipped \(L_{2}\)–norm) contrast normalization on overlapping \(3\times 3\) cell blocks of \(4\times 4\) pixel cells: the unnormalized descriptor vector \(\mathbf v \) is first \(L_{2}\)-normalized, \(\mathbf v \rightarrow \mathbf v /\sqrt{\parallel \mathbf v \parallel ^2_{2}+ {\epsilon }^2} \) where \(\epsilon \) is a small constant, then clipped at a fixed threshold and renormalized. This normalization achieves invariance to illumination and contrast changes. HOG computation is performed using a \(64\times 128\) sliding detection window, and the resulting descriptor is fed to a linear support vector machine (SVM) baseline classifier.
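The \(L_{2}\)–Hys block normalization above can be sketched as follows; the clipping threshold of 0.2 is the common Dalal–Triggs default and is an assumption, not a value stated in this paper.

```python
import numpy as np

def l2_hys(v, clip=0.2, eps=1e-5):
    """L2-Hys block normalization: L2-normalize, clip, then renormalize."""
    v = np.asarray(v, dtype=float)
    v = v / np.sqrt(np.sum(v ** 2) + eps ** 2)    # initial L2 normalization
    v = np.minimum(v, clip)                       # clip large gradient magnitudes
    return v / np.sqrt(np.sum(v ** 2) + eps ** 2) # renormalize to unit length
```

Clipping prevents a single strong gradient (e.g. a bright specular reflection) from dominating the block descriptor, which is what makes the normalized feature robust for needle detection.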
The detector is applied to each of the slices in \(US_{volume}\) after preprocessing to elicit needle phase descriptors similar to those used in training the detector. The resulting sub-volume, \(US^{*}_{volume}\), consists of only slices that contain needle data. Volume reduction saves computing load in the needle enhancement and localization steps that follow. It also removes slices that have artifacts which would degrade needle enhancement. Figure 3 (bottom row) illustrates an example of needle detection from volume data. Detected needles are shown with rectangular annotation.
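The slice-filtering step can be sketched as below. The `describe` callable stands in for the HOG-of-NPD descriptor pipeline, and `(w, b)` for the trained linear SVM parameters; all three are hypothetical placeholders for this paper's trained detector.

```python
import numpy as np

def detect_needle_slices(volume, describe, w, b):
    """Return the sub-volume US*_volume: slices with positive SVM score.

    volume   -- 3D array indexed as (slice, y, x)
    describe -- maps a 2D slice to its descriptor vector (hypothetical
                stand-in for HOG of the needle phase descriptor)
    w, b     -- trained linear SVM weight vector and bias
    """
    keep = [i for i in range(volume.shape[0])
            if np.dot(w, describe(volume[i])) + b > 0]
    return volume[keep], keep
```

For example, with a toy descriptor `describe = lambda s: np.array([s.mean()])` and weights `w = [1.0]`, `b = -0.5`, only slices with mean intensity above 0.5 are retained, mirroring how the real detector discards slices without needle data before enhancement.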
2.2 Needle Enhancement
The goal of this step is to remove speckle, reverse attenuation effects, and minimize the effect of artifacts in the sub-volume \(US^{*}_{volume}\) so as to improve visibility of the needle shaft and tip. We design our approach to suit tip localization for in-plane insertion. In [7], it was shown that the needle tip and shaft can be enhanced by modeling US signal transmission using \(L_{1}\)-norm based contextual regularization. We follow a similar approach, where US signal transmission in each slice is modeled as \({ S(x,y)=S_{t}(x,y)S_{e}(x,y)+(1-S_{t}(x,y))\kappa .}\) Here, S(x, y) is a slice in \(US^{*}_{volume}\), \(S_{t}(x,y)\) is the signal transmission map, \(S_{e}(x,y)\) is the desired enhanced image while \(\kappa \) is the average intensity of the tissue surrounding the needle in attenuated regions. \(S_{t}(x,y)\) is obtained by minimizing the objective function:
\(\min \limits _{S_{t}(x,y)} \frac{\lambda }{2}\parallel S_{t}(x,y)-S_{a}(x,y)\parallel ^{2}_{2}+\sum \limits _{i\in \zeta }\parallel \varGamma _{i}\circ (R_{i}\star S_{t}(x,y))\parallel _{1} \qquad (1)\)

Here, \(\lambda \) is a regularization weight, \(S_{a}(x,y)\) is a patch-wise transmission function representing boundary constraints imposed on the image by attenuation and orientation of the needle, \(\zeta \) is an index set of image pixels, \(\circ \) is element-wise multiplication, and \(\star \) is a convolution operator. \(R_{i}\) is a bank of high order differential filters consisting of eight Kirsch filters and one Laplacian filter, and \(\varGamma _{i}\) is a weighting matrix calculated from \(\varGamma _{i}(x,y)=exp(-\mid R_{i}(x,y) \star S(x,y)\mid ^2)\). Details of how \(S_{a}(x,y)\) is obtained are presented in [7]. After calculating \(S_{t}(x,y)\) using (1), \(S_{e}(x,y)\) is extracted from:
\(S_{e}(x,y)=\frac{S(x,y)-\kappa }{[\max (S_{t}(x,y),\varepsilon )]^{\rho }}+\kappa \qquad (2)\)

Here, \(\varepsilon \) is a small constant and \(\rho \) is related to the attenuation coefficient of the tissue. To minimize the effect of high intensity artifacts aligned with the needle trajectory, each enhanced slice is subjected to a Top-hat filter operation using a linear structuring element. The final enhanced slices constitute the enhanced sub-volume denoted as \(USE^{*}_{volume}\).
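The recovery of \(S_{e}(x,y)\) from a given transmission map can be sketched as follows (the estimation of \(S_{t}\) itself via the \(L_{1}\)-regularized objective and the Top-hat filtering are omitted; the default parameter values follow the ones reported in Sect. 2.4):

```python
import numpy as np

def enhance_slice(S, S_t, kappa=None, rho=2.0, eps=5e-4):
    """Recover the enhanced slice S_e from the transmission model
    S = S_t * S_e + (1 - S_t) * kappa.

    The power rho boosts low-transmission (attenuated) regions, and eps
    guards against division by a near-zero transmission value.
    """
    S = np.asarray(S, dtype=float)
    if kappa is None:
        kappa = 0.5 * S.max()   # kappa = 0.5 * I_max, as used in the experiments
    return (S - kappa) / np.maximum(S_t, eps) ** rho + kappa
```

Note that where the transmission map equals one (no attenuation) the slice is returned unchanged, while attenuated regions containing the needle are amplified relative to the surrounding tissue intensity \(\kappa \).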
2.3 Tip Localization
In our workflow, the needle tip location is displayed in two planar visualizations, parallel and normal to the needle insertion direction. We consider a 3D US volume where x, y, z denote the lateral, axial and elevation directions respectively (Fig. 4). Our interest is determining \(\varOmega (x',y',z',\chi )\), the 3D tip location, where \(\chi \) is the characteristic intensity of the tip in \(USE^{*}_{volume}\).
2D Tip Localization: If needle insertion is in the y–z plane, then the x–y plane is parallel to the needle insertion direction. We determine \(x'\) and \(y'\) from a projection \(P_{x,y}\) since \(x'\) and \(y'\) have the same value in all slices. \(P_{x,y}\) is calculated as the maximum intensity projection (MIP) of \(USE^{*}_{volume}\), by extracting maximum intensity values along optical paths in the z direction. From this projection, the needle tip is localized following the algorithm in [8]. In summary, we determine the phase symmetry PS(x, y) of \(P_{x,y}\) in a region limited to the needle trajectory, apply the MLESAC algorithm for inlier detection and geometrical optimization, followed by feature extraction on the resultant point cloud using a combination of spatially distributed image statistics which enhance the needle tip. This yields the projection enhanced needle image denoted as PE(x, y). \((x',y')\) is determined from the first maximum intensity pixel at the distal end of the needle trajectory in PE(x, y).
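The projection and tip-selection steps above can be sketched as below; the phase symmetry, MLESAC and statistical enhancement stages that produce PE(x, y) are omitted, and the trajectory mask and the 0.95 near-maximum tolerance are illustrative assumptions.

```python
import numpy as np

def maximum_intensity_projection(volume):
    """MIP along the elevation (z) axis; volume indexed as (y, x, z)."""
    return volume.max(axis=2)

def tip_from_projection(pe, trajectory_mask, insertion_from_left=True):
    """Pick the tip as the first maximum-intensity pixel at the distal
    end of the needle trajectory in the projection-enhanced image pe."""
    ys, xs = np.nonzero(trajectory_mask)
    vals = pe[ys, xs]
    near_max = vals >= 0.95 * vals.max()   # tolerate small intensity spread
    ys, xs = ys[near_max], xs[near_max]
    idx = np.argmax(xs) if insertion_from_left else np.argmin(xs)
    return int(xs[idx]), int(ys[idx])
```

Restricting the search to the trajectory mask is what prevents bright off-trajectory artifacts in the projection from being mistaken for the tip.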
Scan Plane Determination: In this context, scan plane means the slice containing the needle tip, which is the most advanced portion of the needle in the elevation (z) direction of the volume. The scan plane is determined by calculating \(\sum _{i=-\gamma }^{+\gamma }\sum _{j=-\gamma }^{+\gamma }I(x'+i, y'+j)\), the sum of pixel intensities in a bounded square patch of length \(2\gamma \) centered at \((x',y')\) in each slice within \(USE^{*}_{volume}\). The scan plane is estimated as the slice with the maximum intensity sum. The result gives us \(z'\). Figure 4 shows the tip localization process and qualitative results for different imaging conditions as well as the imaging coordinates used during tip localization.
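The scan plane search can be sketched as follows, interpreting the bounded square patch of length \(2\gamma \) as a \((2\gamma +1)\times (2\gamma +1)\) pixel window centered at \((x',y')\); the value of \(\gamma \) is not specified in the text, so the default here is an assumption.

```python
import numpy as np

def scan_plane(volume, x_tip, y_tip, gamma=3):
    """Return the slice index z' maximizing the intensity sum in a square
    patch centered at (x', y'); volume indexed as (y, x, z)."""
    y0, y1 = max(y_tip - gamma, 0), y_tip + gamma + 1
    x0, x1 = max(x_tip - gamma, 0), x_tip + gamma + 1
    sums = volume[y0:y1, x0:x1, :].sum(axis=(0, 1))  # one sum per slice
    return int(np.argmax(sums))
```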
2.4 Data Acquisition and Experimental Validation
3D US volumes were acquired using the SonixTouch system (Analogic Corporation, Peabody, MA, USA) equipped with a 4DL14-5/38 broadband volumetric probe. A 17-gauge (1.5 mm diameter, 90 mm length) Tuohy epidural needle (Arrow International, Reading, PA, USA) was inserted into freshly excised bovine tissue. The transducer motor was automatically controlled during insertion to achieve a Field of View (FOV) of \(10^\circ \) for sweeps of \(0.244^\circ \) per frame and 41 frames per volume. Multiple experiments were performed at various needle depths (40–80 mm) and orientations (\(30^\circ \)–\(70^\circ \)) with the needle in a native axial/elevation (y–z) direction of the volume. A total of 80 volumes were collected. The US system settings were fixed for all imaging sessions. The volumes were divided into 2 sets without overlap: 40 for training and 40 for validation.
The proposed method was implemented in MATLAB on a 3.6 GHz Intel(R) \(\mathrm {Core^{TM}}\) i7 CPU, 16 GB RAM Windows PC. The Log-Gabor filter parameters were determined automatically using the method proposed in [8]. In (2), \(\kappa =0.5\times I_{max}\), where \(I_{max}\) is the maximum intensity in S(x, y), \(\rho =2\) and \(\varepsilon =0.0005\). These values were empirically determined and fixed during validation. For the training dataset, 150 positive and 100 negative samples for NPD(x, y) were manually selected. Performance of the needle detector was evaluated by calculating Precision (P) and Recall Rate (RR), where \(P = True \ Positive/(True\ Positive + False\ Positive)\) and \(RR = True\ Positive/(True\ Positive + False\ Negative)\). To determine localization accuracy, the ground truth tip location was segmented manually by an expert user in volumes where the tip was visible. Tip localization error was determined by calculating the Euclidean Distance (ED) between the automatically localized tip and the manually segmented tip.
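The evaluation metrics reduce to a few lines; the voxel-spacing argument in the distance computation is an implementation assumption, since tip coordinates in voxel units must be scaled to millimetres before computing the Euclidean distance.

```python
import numpy as np

def precision_recall(tp, fp, fn):
    """Detector precision P and recall rate RR as defined in the paper."""
    return tp / (tp + fp), tp / (tp + fn)

def tip_error_mm(est, gt, spacing_mm):
    """Euclidean distance between estimated and ground-truth tip locations,
    given per-axis voxel spacing in millimetres."""
    d = (np.asarray(est, dtype=float) - np.asarray(gt, dtype=float)) * np.asarray(spacing_mm)
    return float(np.linalg.norm(d))
```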
3 Results
Qualitative results (Fig. 4) show that our method gives accurate tip localization for moderate to steep insertion angles, including cases when the shaft is discontinuous (Fig. 4 middle and bottom rows). Quantitative results revealed average precision of \(88\%\), recall rate of \(98\%\), detector execution time (per slice) of 0.06 s, overall execution time (for both slice detection and tip localization) of 3.5 s, tip localization error of \(0.44\,\pm \,0.13\) mm and maximum localization error of 1.62 mm.
4 Discussion and Conclusions
We have proposed a novel learning-based method for automatic detection and localization of needles in US volumes. The low slice classification time potentially suits real-time applications and can complement previous approaches such as those reported in [2,3,4,5,6]. Considering the anatomy of our focus application (medial branch nerves are typically about 1 mm in diameter), a tip localization error of less than 1 mm is clinically acceptable. In [7], analysis of US data from porcine, bovine, kidney and liver tissue showed that local phase features are not affected by the intensity variations caused by different tissue types. Since the detector uses HOG descriptors derived from local phase features, detection accuracy is independent of tissue type. Because the detector retains needle data from all pertinent slices, accurate tip localization remains possible even when the needle is misaligned with the scan plane. The high recall rate demonstrates that the detected sub-volume reliably contains enough needle data to support the localization process.
The method is validated on epidural needles with minimal bending. For enhancement of bending needles, the proposed model can be updated by incorporating bending information into the framework. In future, we will investigate automating parameter selection for the algorithm, performance of the proposed method on needles of different gauges, real-time implementation of the proposed method, and a 3D classifier, in which needle detection is performed in a single extraction step applied to the entire volume.
References
Clendenen, S.R., Riutort, K., Ladlie, B.L., Robards, C., Franco, C.D., Greengrass, R.A.: Real-time three-dimensional ultrasound-assisted axillary plexus block defines soft tissue planes. Anesth. Analg. 108, 1347–50 (2009)
Zhou, H., Qiu, W., Ding, M., Zhang, S.: Automatic needle segmentation in 3D ultrasound images using 3D improved Hough transform. In: Proceedings of SPIE Medical Imaging, vol. 6918, pp. 691821-1–691821-9 (2008)
Barva, M., Uhercik, M., Mari, J.M., Kybic, J., Duhamel, J.R., Liebgott, H., Hlavac, V., Cachard, C.: Parallel integral projection transform for straight electrode localization in 3-D ultrasound images. IEEE Trans. Ultrason. Ferroelectr. Freq. Control 55(7), 1559–1569 (2008)
Zhao, Y., Bernard, A., Cachard, C., Liebgott, H.: Biopsy needle localization and tracking using ROI-RK method. Abstr. Appl. Anal. 2014, 1–7 (2014). Article ID 973147. doi:10.1155/2014/973147
Zhao, Y., Shen, Y., Bernard, A., Cachard, C., Liebgott, H.: Evaluation and comparison of current biopsy needle localization and tracking methods using 3D ultrasound. Ultrasonics 73, 206–20 (2017)
Beigi, P., Rohling, R., Salcudean, T., Lessoway, V.A., Ng, G.C.: Needle trajectory and tip localization in real-time 3-D ultrasound using a moving stylus. Ultrasound Med. Biol. 41(7), 2057–2070 (2015)
Mwikirize, C., Nosher, J.L., Hacihaliloglu, I.: Enhancement of needle tip and shaft from 2D ultrasound using signal transmission maps. In: Ourselin, S., Joskowicz, L., Sabuncu, M.R., Unal, G., Wells, W. (eds.) MICCAI 2016. LNCS, vol. 9900, pp. 362–369. Springer, Cham (2016). doi:10.1007/978-3-319-46720-7_42
Hacihaliloglu, I., Beigi, P., Ng, G., Rohling, R.N., Salcudean, S., Abolmaesumi, P.: Projection-based phase features for localization of a needle tip in 2D curvilinear ultrasound. In: Navab, N., Hornegger, J., Wells, W.M., Frangi, A.F. (eds.) MICCAI 2015. LNCS, vol. 9349, pp. 347–354. Springer, Cham (2015). doi:10.1007/978-3-319-24553-9_43
Dalal, N., Triggs, B.: Histograms of oriented gradients for human detection. In: IEEE CVPR (2005)
Torr, P.H.S., Zisserman, A.: MLESAC: a new robust estimator with application to estimating image geometry. Comput. Vis. Image Underst. 78(1), 138–156 (2000)
Mwikirize, C., Nosher, J.L., Hacihaliloglu, I.: Local phase-based learning for needle detection and localization in 3D ultrasound. In: Cardoso, M.J., et al. (eds.) Computer Assisted and Robotic Endoscopy and Clinical Image-Based Procedures. CARE CLIP 2017. LNCS, vol. 10550. Springer, Cham (2017). doi:10.1007/978-3-319-67543-5_10