Abstract
Orthopedic surgery is a widely performed clinical discipline that addresses problems of the bones, joints, and ligaments of the human body, such as musculoskeletal trauma, spine diseases, sports injuries, degenerative diseases, infections, tumors, and congenital disorders. Surgical navigation is generally recognized as the next-generation technology of orthopedic surgery. Orthopedic navigation systems aim to analyze pre-, intra-, and/or postoperative data in multiple modalities and provide an augmented reality 3-D visualization environment to improve the clinical outcomes of orthopedic procedures. This chapter investigates surgical navigation techniques and systems currently available for orthopedic procedures. In particular, optical tracking, electromagnetic localizers, and stereoscopic vision, as well as commercialized orthopedic navigation systems, are thoroughly discussed, along with advances and development trends in orthopedic navigation. While current orthopedic navigation systems enable surgeons to make precise decisions in the operating room by integrating surgical planning, instrument tracking, and intraoperative imaging, orthopedic navigation remains an active research field that draws on various technical disciplines, e.g., medical imaging, computer science, sensor technology, and robotics, to further develop current methods and systems.
4.1 Introduction
The human body consists of muscular and skeletal systems that are composed of bones, cartilage, ligaments, and other tissues connecting all anatomical structures together [14]. These structures are responsible for support, balance, and stamina. The human body relies on the skeleton and muscles to maintain shape, posture, movement, and rigidity [19]. Various deficiencies can impair the ability of the skeletal and muscular systems to meet the requirements of the human body; such impairment is one of the leading causes of several types of disability, both long-term and short-term.
Orthopedic surgery is a field of surgery essentially devoted to the appendicular and axial skeletons and the anatomical structures related to them. This surgical field contains several subfields or subspecialties covering arthritis, tumors, metabolic conditions, inherited and congenital conditions, soft tissue processes, fractures, and others [28]. More specifically, orthopedic surgery treats conditions arising from the malfunctioning of muscular and skeletal tissues. Treatments typically include implant placement, reduction of fractures, reconstruction of torn ligaments, resection of tumors, osteotomy, drilling of bones, amputations, and others [22]. Unfortunately, these procedures carry surgical risks that can lead to permanent conditions, such as loss of body parts or even loss of life, if proper precautions are not taken to avoid human error. Over the past two decades, to reduce or avoid such risks, various medical instruments and surgical methods have been developed to help surgeons perform orthopedic surgery successfully, especially by providing real-time, 3-D visualization and monitoring of the surgical procedure [10].
Computer-assisted orthopedic surgery is a relatively new clinical practice built on various computer technologies such as computer vision and computer graphics. It supports surgical planning and simulation, navigation or guidance, monitoring, and visualization [12]. Surgical navigation, in particular, is the core element of computer-assisted orthopedic surgery systems. When surgeons manipulate surgical instruments during orthopedic intervention, surgical navigation can accurately track and intuitively visualize these instruments in real time relative to anatomical targets. Furthermore, such navigation provides surgeons with useful information such as the location of anatomical structures, identification of regions of interest, real-time feedback, and monitoring of a surgical instrument's position and orientation in six degrees of freedom (6DoF).
This chapter reviews various aspects of surgical navigation in orthopedics. First, the concept of navigation in surgery is defined. Next, we thoroughly discuss the surgical workflow of orthopedic navigation, followed by showing currently available systems. Finally, the future development of orthopedic navigation is described.
4.2 Navigation in Surgery
The initial idea of effectively and precisely locating surgical instruments relative to anatomical structures in the human body during surgery can be traced back to the late nineteenth century [8]. With the many technological advances of the past three decades, this idea has evolved into navigation in surgery. This evolution has been greatly aided by the exponential growth of computer processing power, coupled with the availability of specialized, dedicated systems capable of providing real-time processing and feedback during surgical procedures. Additionally, clinicians have constantly pushed for the development of new technologies to solve the challenges faced in surgical procedures.
Navigation in surgery is a relatively new concept for improving various surgical and therapeutic procedures, e.g., neurosurgery [9], endoscopic interventions [17], and image-guided radiation therapy [3]. The concept can be decomposed into two major problems arising in various surgical procedures [18]: (1) Where are the anatomical structures in the body? (2) How do surgeons safely and quickly reach them? Surgical navigation solves both problems by precisely and simultaneously localizing internal structures, organs, and surgical tools in a safe and minimally invasive way. Surgical navigation is now widely used in orthopedic surgery to locate anatomical targets and to let surgeons recognize where they are and how to manipulate surgical instruments to accurately reach those targets.
4.3 Orthopedic Navigation: Methodology and Systems
According to a report from Datamonitor Healthcare,Footnote 1 approximately 5.3 million orthopedic surgeries were performed in 2010, and that number is estimated to reach 6.6 million in 2020. To meet this large increase, innovative medical approaches and systems are steadily required worldwide in the field of orthopedics. At the same time, surgeons expect precise and efficient placement and alignment during orthopedic surgery such as knee or hip implantation. Moreover, several main technical problems, such as length restoration, bone offset, and accurate implant positioning, remain challenging in the scientific community. All these aspects require novel approaches and medical devices to improve orthopedic surgery.
Orthopedic navigation is generally recognized as the new generation of orthopedic surgery. It provides surgeons with preoperative planning and accurate positioning of implants during intraoperative intervention [32]. Furthermore, surgical navigation in orthopedics enables data collection for specific structural replacements (e.g., the knee joint). These collected data, such as discrete measurements and the laxity of the knee joint as it is moved over a range of positions, are vital for assisting surgeons in releasing soft tissues and balancing the joint in real time during surgery.
This section discusses the surgical workflow of orthopedic navigation and reviews currently available orthopedic navigation systems.
4.3.1 Surgical Workflow
Orthopedic navigation surgery generally involves several essential steps (Fig. 4.1): (1) preoperative data collection and processing, (2) patient-to-model registration, (3) intraoperative tracking, and (4) direct visualization. In addition, postoperative validation is also very important. All these steps are discussed in the following.
4.3.1.1 Multimodal Modeling
Orthopedic navigation surgery usually involves multiple information modalities, including preoperative and intraoperative data. Computed tomography (CT) scans and magnetic resonance (MR) images are the preoperative data most commonly used to analyze orthopedic abnormalities before surgery. Intraoperative data such as fluoroscopic images and external tracking data are also very important for surgical navigation.
While CT provides excellent bony detail, MR shows very useful soft tissue information. Both CT and MR images are segmented to obtain regions of interest. Based on the segmentation results, surface models of orthopedic structures such as bones and knee joints can be reconstructed in 3-D [31]. These models are used not only for preoperative planning but also as 3-D maps during orthopedic navigation.
4.3.1.2 Spatial Registration
Spatial registration refers to the transformation of coordinates in one space into those of another so as to establish correspondence between the two spaces. In other words, it estimates the transformation relationship between the physical spaces of preoperative and intraoperative data, mapping the virtual and real coordinate systems onto each other so that anatomical structures in the virtual space correspond to the same structures in the real space.
In particular, this procedure aims to associate the preoperative 3-D model representation of the internal anatomy with intraoperative imaging, which usually uses fluoroscopy to directly visualize the operative sites in the patient. The registration step is crucial for orthopedic navigation. Before 3-D CT or MR images acquired preoperatively can be used for surgical guidance, these images or prebuilt 3-D anatomical models must be registered to the patient's anatomy as defined in the surgical field. In this respect, the step is also called image-to-patient registration. Spatial registration is usually performed with the aid of geometric features such as landmarks, frames, and surfaces. Two main approaches have been reported in the literature: (1) fiducial-based registration and (2) fiducial-free registration.
(1) Fiducial-Based Registration. As the most commonly used approach, it essentially performs a surface-matching procedure, requiring sufficient bone surface to be reconstructed from preoperative images and fiducials or markers to be predefined.
There are two kinds of fiducials that can be used for registration. Anatomical or natural markers are determined manually. Surgeons usually select 4–5 points or markers from the preoperative images or the 3-D surface model and align these points with their corresponding points on the patient during the intervention. The reported point-pair matching accuracy ranges from 0.3 to 0.8 mm [16, 29]. CT-fluoroscopy matching has also been proposed: two intraoperative fluoroscopic images were automatically aligned to CT slices after a manual adjustment [25]. Alternatively, artificial markers can be placed before the CT scan. Cho et al. [4] proposed placing more than three K-wires around the bones before CT scanning; these K-wires can also be easily visualized on fluoroscopic images. Subsequently, Cho et al. [5] used resorbable pins as artificial markers with an MR scanner to avoid radiation. The reported registration error ranged from 0.3 to 1.7 mm.
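The paired-point alignment described above can be illustrated with a minimal sketch: given 4–5 corresponding fiducials picked in image space and digitized on the patient, a rigid transform is recovered with the SVD-based (Kabsch) method, and the fiducial registration error (FRE) quantifies the fit. Function names and coordinates are illustrative, not taken from any cited system.

```python
import numpy as np

def paired_point_registration(src, dst):
    """Estimate the rigid transform (R, t) mapping fiducials `src`
    (e.g., points picked in CT/MR image space) onto `dst` (the same
    landmarks digitized on the patient), via SVD (Kabsch method)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

def fiducial_registration_error(src, dst, R, t):
    """RMS distance (in mm) between transformed source fiducials
    and their target counterparts."""
    mapped = (R @ np.asarray(src, float).T).T + t
    return float(np.sqrt(np.mean(np.sum((mapped - dst) ** 2, axis=1))))
```

In clinical use, the more relevant figure is the target registration error at the surgical site, which FRE only approximates.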
(2) Fiducial-Free Registration. Intraoperative 3-D imaging can directly generate the 3-D shape or surface of the bony structure. This offers an alternative that performs 3-D surface matching between intraoperatively generated and preoperatively reconstructed surfaces without any anatomical or artificial fiducials. Intraoperative 3-D surface generation commonly uses tracked and calibrated fluoroscopic or ultrasound images. However, the low quality of fluoroscopic and ultrasound images potentially limits the accuracy of the generated 3-D surface.
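Fiducial-free surface matching is commonly solved with variants of the iterative closest point (ICP) algorithm. The following brute-force sketch is illustrative only; clinical systems use robust variants with k-d trees, subsampling, and outlier rejection.

```python
import numpy as np

def icp(moving, fixed, iters=30):
    """Minimal ICP sketch: rigidly align an intraoperatively generated
    surface point cloud (`moving`) to a preoperatively reconstructed one
    (`fixed`) without fiducials. Returns accumulated (R, t)."""
    pts = np.asarray(moving, float)
    fixed = np.asarray(fixed, float)
    R, t = np.eye(3), np.zeros(3)
    for _ in range(iters):
        cur = (R @ pts.T).T + t
        # brute-force closest-point correspondences
        d2 = ((cur[:, None, :] - fixed[None, :, :]) ** 2).sum(-1)
        corr = fixed[d2.argmin(axis=1)]
        # solve the paired-point subproblem for the matched pairs (Kabsch)
        c_m, c_f = cur.mean(0), corr.mean(0)
        H = (cur - c_m).T @ (corr - c_f)
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))
        dR = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = dR @ t + (c_f - dR @ c_m)   # compose incremental transform
        R = dR @ R
    return R, t
```

ICP converges only locally, which is why practical systems start from a coarse initial alignment (e.g., a few anatomical landmarks).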
Additionally, multimodal image registration plays an important role in orthopedic navigation surgery. As mentioned above, CT and MR images each provide different useful information, so it is natural to fuse them to improve the visualization of target regions during orthopedic surgery. Figure 4.2 illustrates a software interface that performs CT-MR fusion for navigation.
In general, spatial registration is a prerequisite for various image-guided navigation systems and a critical step in realizing orthopedic navigation. The registration accuracy acceptable for clinical applications has not been formally defined and differs among orthopedic navigation systems; currently reported registration accuracy is generally better than 2.0 mm.
4.3.1.3 Real-Time Tracking
Orthopedic navigation systems must track surgical tools in real time during surgery. Such tracking aims to temporally and continuously relate the surgical instrument to target regions in the patient. Based on real-time tracking, surgeons can visualize and synchronize the surgical instrument within the 3-D surface models reconstructed from preoperative data. To this end, various external trackers have been developed [18].
External tracking devices usually consist of a control unit, a sensing volume generator, and sensors (e.g., coils or reflective marker spheres). The sensor is usually fixed to the surgical instrument. The control unit and sensing volume generator work together to estimate the position of the sensor. After a calibration between the sensor and the surgical tool, the surgeon can navigate the surgical tool automatically on the basis of continuous sensor measurements.
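A standard way to calibrate the offset between the tracked sensor and the tool tip is pivot calibration: the tool is pivoted about a fixed point while poses are recorded, and the tip offset is recovered by least squares. The sketch below assumes each pose is given as a rotation matrix and translation vector reported by the tracker; the names are illustrative.

```python
import numpy as np

def pivot_calibration(rotations, translations):
    """Estimate the tool-tip offset in the sensor frame by pivoting the
    tracked tool about a fixed point. For each pose i:
        R_i @ p_tip + t_i = p_pivot,
    stacked into one linear least-squares system in (p_tip, p_pivot)."""
    n = len(rotations)
    A = np.zeros((3 * n, 6))
    b = np.zeros(3 * n)
    for i, (R, t) in enumerate(zip(rotations, translations)):
        A[3*i:3*i+3, :3] = R            # unknown p_tip (sensor frame)
        A[3*i:3*i+3, 3:] = -np.eye(3)   # unknown p_pivot (tracker frame)
        b[3*i:3*i+3] = -np.asarray(t, float)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3], x[3:]
```

At least two poses with distinct orientations are needed; in practice many poses are recorded, and the least-squares residual indicates calibration quality.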
Currently, commonly used external trackers include optical tracking and electromagnetic tracking (Fig. 4.3). The former senses infrared-emitting or retroreflective markers affixed to a surgical tool or object and can only be used for rigid surgical instruments outside the body. The latter uses embedded sensor coils to perceive the location of the surgical instrument and can be employed to track flexible instruments inside the body. More recently, a stereovision tracking device has been commercialized. This technology perceives depth information and 3-D structure from the video streams of two or more cameras (Fig. 4.4).
Additionally, motion analysis cameras can be introduced for tracking. These cameras provide kinematic measurements used to mathematically calculate rotation centers from the relative movement of skeletal structures, usually instrumented with fixed markers; marker information on the anatomy of the bone structures can also be collected with mobile marker pointers using triangulation.
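Triangulation from two calibrated cameras, as used in stereovision and motion-analysis tracking, can be sketched with the linear (DLT) method: each camera contributes two equations relating the marker's pixel coordinates to its 3-D position. The projection matrices and pixel values below are illustrative.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one 3-D marker from its pixel
    coordinates (uv1, uv2) in two calibrated cameras with 3x4
    projection matrices P1 and P2."""
    u1, v1 = uv1
    u2, v2 = uv2
    # Each image measurement yields two homogeneous linear equations.
    A = np.array([u1 * P1[2] - P1[0],
                  v1 * P1[2] - P1[1],
                  u2 * P2[2] - P2[0],
                  v2 * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                 # null-space vector = homogeneous point
    return X[:3] / X[3]
```

With noisy measurements, the SVD solution minimizes an algebraic (not geometric) error; tracking systems typically refine it with a nonlinear reprojection-error step.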
Real-time tracking is a vital component that synchronizes the various information streams in orthopedic navigation. It generally uses external tracking devices whose sensors are integrated with the surgical tools, navigating them relative to the skeletal structures. The tracking devices discussed above have their own advantages and disadvantages; how to select a proper tracking device is an open issue that depends partly on the specific application.
4.3.1.4 Direct Visualization
Preoperative and intraoperative data should be presented properly to surgeons during orthopedic intervention. Multimodal data representation approaches in orthopedic navigation must provide surgeons with critical structural information in real time while minimally interrupting the surgical workflow. In particular, volumetric data (e.g., CT and MR) must be presented to the surgeon in an intuitive manner that shows important and understandable anatomical structures in the patient.
Currently, there are generally three ways to visualize 3-D volumetric data: (1) planar visualization, (2) surface rendering, and (3) volume rendering. Planar visualization approaches display orthogonal slices along the axial, sagittal, and coronal axes. Volume rendering employs a transfer function to assign an opacity and color, associated with the voxel intensity, to each voxel in the volumetric data [20], while surface rendering depends on preoperatively segmented 3-D data that are further processed by an intermediate step (e.g., the marching cubes algorithm) to generate anatomical 3-D models [21].
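The transfer-function idea behind volume rendering can be sketched as front-to-back alpha compositing along a single ray: each voxel's intensity is mapped to an opacity and a color, which are then accumulated. The thresholds and colors below are illustrative toy values, not clinical presets.

```python
import numpy as np

def composite_ray(intensities, opacity_tf, color_tf):
    """Front-to-back alpha compositing of one ray through a volume:
    map each sampled voxel intensity to opacity and color via the
    transfer functions, accumulate until the ray is (nearly) opaque."""
    color = np.zeros(3)
    alpha = 0.0
    for v in intensities:
        a = opacity_tf(v)
        color += (1.0 - alpha) * a * color_tf(v)
        alpha += (1.0 - alpha) * a
        if alpha > 0.99:        # early ray termination
            break
    return color, alpha

# Illustrative transfer functions: voxels above a "bone" threshold are
# rendered opaque white, soft tissue faint red, air fully transparent.
def opacity_tf(v):
    return 0.9 if v > 300 else (0.02 if v > 50 else 0.0)

def color_tf(v):
    return np.array([1.0, 1.0, 1.0]) if v > 300 else np.array([1.0, 0.3, 0.3])
```

Real renderers evaluate this per ray on the GPU with interpolated samples; the sketch only shows how the transfer function shapes the final pixel.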
All the data used in orthopedic navigation, represented by any of the three approaches above, are eventually shown on screens or monitors. This lets surgeons intuitively monitor the movement of the surgical tools relative to the preoperatively reconstructed 3-D models and the patient's anatomical structures in the operating room. In addition, direct visualization allows the orthopedic navigation system to efficiently calculate an initial optimal implant or repair position, supporting effective bone preparation with navigated tools while analyzing soft tissues for proper ligament balancing over a range of rotation based on physical or simulated kinematic curves.
4.3.1.5 Postoperative Validation
Postoperative validation is necessary to evaluate the surgical outcome of orthopedic navigation. Successful orthopedic guidance should properly restore the function of the patient's orthopedic structures, in particular replacing or resecting bones or joints with high precision. Unsuccessful procedures can cause physical suffering or even disability and increase economic burden and hospital stay.
Although no gold standard has been reported in the literature, postoperative assessment of orthopedic navigation usually covers several aspects: (1) surgical (e.g., replacement or resection) accuracy, (2) recovery of orthopedic function, and (3) surgical complications. After surgery, CT or MR images may be acquired for postoperative validation, allowing surgeons to inspect the surgical outcome. Registration of postoperative to preoperative images is also an effective way to evaluate replacement or resection accuracy after orthopedic navigation surgery. Moreover, the recovery of the physical function of the orthopedic structures is the most important outcome of orthopedic surgery: after restoration of the bones, muscles, and joints of the musculoskeletal system, orthopedic patients are expected to perform everyday physical activities just as healthy people do. Additionally, orthopedic surgery can generate complications after the procedure; these unanticipated problems should be minimized to avoid additional therapy.
4.3.2 Available Systems
Orthopedic navigation is an area of surgical technology that is being widely developed in the scientific community, and many commercial systems are available to navigate orthopedic procedures. Various approaches and systems are used in orthopedic navigation for preoperative, intraoperative, and postoperative procedures. All of them aim to maximize the surgical outcome while minimizing surgical risk.
Technically, these approaches are categorized into two main types: (1) fluoroscopy-based navigation and (2) dynamic CT-based navigation.
4.3.2.1 Fluoroscopy-Based Navigation
In fluoroscopy-based navigation, preoperative CT or MR images must be acquired. These images are processed and reconstructed into 3-D models used to localize surgical tools and implants during surgery (Fig. 4.5). External trackers are placed on specific anatomical reference points on the skeletal structure of interest, and these points are indicated on a small series of fluoroscopic images.
A C-arm fluoroscopy-based navigation system usually contains a workstation and a monitor, an external tracking system, a fluoroscope, a clamp with a dynamic reference frame, and several optically marked instruments. The calibration devices are usually two parallel plates containing radiopaque metal spheres. Optical tracking is most commonly used to determine the position and orientation of surgical instruments, thus providing real-time navigation.
Fluoroscopy-based navigation has several specific features. First, its virtual fluoroscopy provides surgeons with virtual images in real time and allows the most appropriate view to be chosen from a gallery of these images. Next, this navigation does not reflect real-time changes in the patient, since it depends entirely on the images chosen from the virtual image gallery. Furthermore, it has evolved into 3-D fluoroscopy-based navigation, which provides additional benefits such as axial plane visualization, acting like a real-time CT scanner. Figure 4.6 shows a real-time view of CT and 3-D fluoroscopy matching.
Fluoroscopy-based navigation still has some limitations. The 3-D virtual images are generated from specific 2-D X-ray projections rather than from optimized 3-D images, which means that reconstructing 3-D virtual fluoroscopic images depends heavily on interpreting 2-D projected information and, in turn, on the knowledge and skills of the surgeons. Furthermore, fluoroscopic images are rather difficult to obtain in certain patients, such as obese patients, and parallax effects must be reduced by carefully aligning the surgical field of interest with the fluoroscopic image view regardless of patient size. Additionally, radiation exposure is unavoidable during surgery.
4.3.2.2 Dynamic CT-Based Navigation
Cone-beam CT (CBCT), or the O-arm, is an intraoperative imaging technique introduced for dynamic, real-time visualization of the patient's anatomical structures in the operating room; hence, it can also be used for navigation. While CBCT images are used to reconstruct 3-D models of regions of interest, they can also directly visualize structures, implants, and surgical tools in a dynamic state during surgical navigation. Two primary modes can be employed to collect and use 3-D CBCT images. The supervision mode requires planning the steps to be performed in the surgical procedure, and it guides surgeons by matching each surgical step with the initially planned and simulated steps on a monitor during surgery. The real-time mode monitors the surgical tools and skeletal structures of interest (implants or repair targets) as they are moved directly to the surgical site.
CBCT- or O-arm-driven navigation systems (Figs. 4.7 and 4.8) give surgeons an accurate representation of the anatomical structures and effective administration of orthopedic surgical procedures, such as implantation and bracing, without requiring surgical planning before surgery. While C-arm fluoroscopy provides orthopedic navigation with continuously updated 2-D images in real time, it gives only one planar image at a time until repositioned and does not permit simultaneous 3-D imaging. Compared with radiographs or C-arm fluoroscopy alone, the 3-D O-arm mode provides enhanced imaging information and accurate intraoperative visualization of the position of bones and/or implants. Therefore, dynamic CT-based navigation is a promising way to reduce intraoperative complications and simultaneously improve surgical outcomes.
In general, no matter which type is used in orthopedic navigation, CT or MR images are preoperatively acquired for surgical planning. The major difference between the fluoroscopy-based and dynamic CT-based approaches is that the former offers 2-D planar images, while the latter provides 3-D volumetric imaging data like a CT system. These types are widely used by medical device companies such as BrainLAB and Stryker, as discussed in the following.
4.3.2.3 BrainLAB System
BrainLAB produces orthopedic navigation systems to meet the needs of today's active patients, who demand well-functioning joints (e.g., hip and knee) and require the precise alignment and individual soft tissue balancing essential in various total arthroplasty surgeries.
BrainLAB's HIP 6 enables precise, navigation-based hip arthroplasty that can reduce outliers, improve acetabular placement, and achieve more consistent leg length restoration. With five easy steps, HIP 6 can efficiently calculate cup or stem position and determine leg length and offset without repositioning the patient. KNEE3 from BrainLAB is smart image-free navigation software that visualizes and summarizes the complex interaction between 3-D kinematics, joint stability, and implant alignment. Compared with conventional surgical techniques, BrainLAB's spinal navigation enables accurate screw placement and reduced fluoroscopy exposure, as well as enhanced visualization of instruments, skin incisions, and trajectories. Figure 4.9 shows various BrainLAB orthopedic navigation systems.
4.3.2.4 Stryker System
As one of the leading companies in orthopedic navigation, Stryker also provides hip and knee navigation software for precise orthopedic surgery with high accuracy and control in total hip and knee replacements. Figure 4.10 displays Stryker's Navigation System II, which is equipped with the OrthoMap software systems.
The Navigation System II has some specific features. First, dual articulating arms provide surgeons and staff with maximum flexibility for positioning during surgery. More importantly, it employs Stryker's own highly accurate and reliable digital cameras as the tracking devices, with a large working volume and virtually no jitter. Moreover, the Navigation System II can import multimodal information such as fluoroscopic, microscopic, and endoscopic images, enabling surgeons to compare unaffected and affected anatomy with automatic symmetrical visualization for preoperative and intraoperative guidance. Additionally, it contains three OrthoMap software systems: Precision Knee Navigation Software, Express Knee Navigation Software, and Versatile Hip Navigation Software.
More recently, Stryker has developed Mako, a robotic arm-assisted system (Fig. 4.11). Mako can be used for total hip or knee surgery and provides patients with a personalized and more predictable surgical experience during joint replacement surgery.
4.3.2.5 NAVIO System
Smith & Nephew PLC is an international producer of advanced wound management, arthroscopy, and orthopedic reconstruction products. It has developed the NAVIO surgical system, which is widely used for orthopedic surgery including knee replacement, joint replacement, and bone resection (Fig. 4.12).
The NAVIO surgical system is designed to assist surgeons with component positioning, ligament balancing, and bone preparation. Most interestingly, NAVIO follows an image-free navigation workflow, in contrast to the BrainLAB and Stryker systems. While an image-based navigation workflow usually includes diagnosis, preoperative data acquisition (e.g., CT or MR scans), preoperative data analysis and planning, intraoperative planning, and surgery, the image-free workflow contains only diagnosis, intraoperative planning, and surgery.
The image-free navigation strategy gives NAVIO several specific advantages. First, intraoperative real-time imaging and computing eliminate the time and cost associated with preoperative imaging and planning, which not only simplifies the surgical procedure but also reduces radiation exposure for patients. Moreover, NAVIO performs navigation without a CT or MR scan and provides surgical personnel and patients with a patient-specific plan, avoiding the additional procedures of image-based workflows that can increase cost or delay surgery. In this respect, a virtual representation of the patient's anatomy is generated by direct anatomic mapping and kinematic registration. NAVIO also employs a handheld instrument to precisely resect bone, as approved by the surgeon within the patient-specific plan, offering a uniquely flexible approach to arthroplasty. Additionally, the NAVIO surgical system offers a cost-effective route to cutting-edge surgical practice and achieves outcomes predictable from the plan.
4.4 Future Development
Orthopedic navigation techniques are widely developed, and many commercial systems are available for clinical applications. However, different orthopedic patients have unique external and internal physical structures, and such patient variation leads to various limitations in current orthopedic navigation. The future development of orthopedic navigation includes multimodal fusion, shape sensing, robotics, oncology, simulation and validation, and academic collaboration, discussed in the following.
4.4.1 Multimodal Fusion
Various imaging modalities, such as CT, MR, PET, and fluoroscopy, provide surgeons with different structural and functional information. Because this information is obtained in different imaging spaces, multimodal information synchronization is essential to boost orthopedic navigation [2]. Unfortunately, seamless fusion of such information remains challenging for surgical navigation in various clinical procedures, including orthopedic surgery. A future direction is the development of accurate and robust fusion of preoperative and intraoperative information.
4.4.2 Shape Sensing
The unique shape of the bone is used to register the preoperative plan with the position of the patient in the operating room. At the same time, intraoperatively locating surgical tools is very important for orthopedic navigation. The use of shape sensing for tracking anatomical structures and surgical instruments is a novel idea to improve the accuracy and precision of orthopedic navigation [11]. Shape sensing techniques for minimally invasive surgery were recently reviewed in [24]. In addition, orthopedic navigation systems with sensorized devices are a promising way to report anatomical alignment and provide feedback in real time during surgery [23].
4.4.3 Orthopedic Robotics
Orthopedic surgery was first combined with robotic technology for the planning and performance of total hip replacement in 1992 [15]. Subsequently, robotic-assisted surgery has become most popular in unicompartmental arthroplasty, where it yields greater accuracy of implant placement than conventional techniques. Although robotic-assisted orthopedic surgery has demonstrated superior accuracy in most short-term studies, it still suffers from various limitations [15] and warrants further research and development.
4.4.4 Orthopedic Oncology
Surgical navigation in orthopedic oncology is a relatively new research area [29] and an extension of computer-assisted orthopedic navigation. While orthopedic navigation offers real-time direct visualization and accurate tracking of anatomical and pathologic structures and surgical instruments, it is limited by bulky navigation facilities, long setup times, and a lack of stable cutting instruments [30]. Further development should address these limitations and improve its efficiency and robustness.
4.4.5 Simulation and Validation
Although current orthopedic navigation systems incorporate preoperative images for resection planning, they offer no support for simulation, such as virtual bone resection, assessment of the resection defect, or bone allograft selection from a 3-D virtual bone bank. Building training simulators for orthopedic surgery remains a unique and largely unexplored engineering challenge; for example, it is difficult to simulate both the large forces and the subtle haptic feedback involved [27]. In addition, errors generated by the navigation system itself (hardware and software) have generally been disregarded [13]. Unbiased validation of the accuracy and precision of orthopedic navigation requires investigating the impact of extra-articular deformity on the system-level errors generated during intraoperative resection measurement [1].
4.4.6 Academic Collaboration
Recently, Conway et al. [7] proposed a model for academic collaboration in orthopedic surgery that aims to solve major problems in orthopedic surgery worldwide. Although this model specifies many methodological aspects, many challenges remain, e.g., the relative lack of objective, measurable outcomes for its interventions and its financial sustainability [7]. A future recommendation is to extend this model, which will demand new partnerships and efforts.
4.5 Closing Remarks
This chapter has discussed surgical navigation in orthopedics. Computer-assisted orthopedic navigation is generally considered the next generation of orthopedic surgery. Various techniques that realize orthopedic navigation were reviewed, including multimodal modeling, spatial registration, real-time tracking, and direct visualization, along with several commercially available orthopedic navigation systems. Although orthopedic navigation is a relatively mature field, it still requires further development in multimodal fusion, shape sensing, robotics, oncology, simulation and validation, and academic collaboration. These developments will advance current orthopedic navigation and surgery approaches and systems toward a new stage of intelligent orthopedics.
References
Angibaud LD, Dai Y, Liebelt RA, Gao B, Gulbransen SW, Silver XS (2015) Evaluation of the accuracy and precision of a next generation computer-assisted surgical system. Clin Orthop Surg 7(2):225–233
Aponte-Tinao LA, Ritacco LE, Milano FE, Ayerza MA, Farfalli GF (2015) Techniques in surgical navigation of extremity tumors: state of the art. Curr Rev Muscoskelet Med 8(4):319–323
Bostel T, Nicolay NH, Grossmann JG, Mohr A, Delorme S, Echner G, Haring P, Debus J, Sterzing F (2014) MR-guidance – a clinical study to evaluate a shuttle-based MR-linac connection to provide MR-guided radiotherapy. Radiat Oncol 9:12
Cho HS, Oh JH, Han I, Kim HS (2009) Joint-preserving limb salvage surgery under navigation guidance. J Surg Oncol 100(3):227–232
Cho HS, Park IH, Jeon IH, Kim YG, Han I, Kim HS (2011) Direct application of MR images to computer-assisted bone tumor surgery. J Orthop Sci 16(2):190–195
Chowdhary A, Drittenbass L, Dubois-Ferrière V, Stern R, Assal M (2016) Intraoperative 3-dimensional computed tomography and navigation in foot and ankle surgery. Orthopedics 39(5):e1005–e1010
Conway DJ, Coughlin R, Caldwell A, Shearer D (2017) The Institute for Global Orthopedics and Traumatology: a model for academic collaboration in orthopedic surgery. Front Public Health 5:146
Enchev Y (2009) Neuronavigation: genealogy, reality, and prospects. Neurosurg Focus 27(3):E11
Golby AJ (2015) Image-guided neurosurgery. Elsevier, Amsterdam
Harijan A, Halvorson EG (2011) Eponymous instruments in plastic surgery. Plast Reconstr Surg 127(1):456–465
He X, Popovic A, Flexman ML, Thienpharapa P, Noonan DP, Kroon R, Reinstein AL (2017) Shape sensing for orthopedic navigation. US Patent US20170281281A1, 5 Oct 2017
Hernandez D, Garimella R, Eltorai AEM, Daniels AH (2017) Computer-assisted orthopaedic surgery. Orthop Surg 9(2):152–158
Hsu HM, Chang IC, Lai TW (2016) Physicians perspectives of adopting computer-assisted navigation in orthopedic surgery. Int J Med Inform 94(10):207–214
Hutchinson M (2006) A brief atlas of the human body. Benjamin Cumming, San Francisco
Lang JE, Mannava S, Floyd AJ, Goddard MS, Smith BP, Mofidi A, Seyler TM, Jinnah RH (2011) Robotic systems in orthopaedic surgery. Bone Jt J 93(10):1296–1299
Li J, Wang Z, Guo Z, Chen GJ, Yang M, Pei GX (2014) Precise resection and biological reconstruction under navigation guidance for young patients with juxta-articular bone sarcoma in lower extremity: preliminary report. J Pediatr Orthop 34(1):101–108
Luo X, Wan Y, He X, Mori K (2015) Observation-driven adaptive differential evolution and its application to accurate and smooth bronchoscope three-dimensional motion tracking. Med Image Anal 24(1):282–296
Luo X, Mori K, Peters T (2018) Advanced endoscopic navigation: surgical big data, methodology, and applications. Annu Rev Biomed Eng 20:221–251
Marieb EN, Hoehn KN (2015) Human anatomy & physiology. Pearson, Harlow
Moreland K (2013) A survey of visualization pipelines. IEEE Trans Vis Comput Graph 19(3):367–378
Nielson G (2003) On marching cubes. IEEE Trans Vis Comput Graph 9(3):283–297
Resnick D, Kransdorf M (2004) Bone and joint imaging. Elsevier-Saunders, Philadelphia
Roche M, Boillot M, McIntosh J (2015) Orthopedic navigation system with sensorized devices. US Patent US9011448, 21 Apr 2015
Shi C, Luo X, Qi P, Li T, Song S, Najdovski Z, Fukuda T, Ren H (2017) Shape sensing techniques for continuum robots in minimally invasive surgery: a survey. IEEE Trans Biomed Eng 64(8):1665–1678
So TY, Lam YL, Mak KL (2010) Computer-assisted navigation in bone tumor surgery: seamless workflow model and evolution of technique. Clin Orthop Relat Res 468(11):2985–2991
Takao M, Nishii T, Sakai T, Yoshikawa H, Sugano N (2014) Iliosacral screw insertion using CT-3D-fluoroscopy matching navigation. Injury 45(6):988–994
Thomas GW, Johns BD, Kho JY, Anderson DD (2015) The validity and reliability of a hybrid reality simulator for wire navigation in orthopedic surgery. IEEE Trans Hum Mach Sys 45(1):119–125
Wiesel SW, Delahay JN (2011) Essentials of orthopedic surgery. Springer, New York
Wong KC, Kumta SM (2013) Computer-assisted tumor surgery in malignant bone tumors. Clin Orthop Relat Res 471(3):750–761
Wong KC, Kumta SM (2014) Use of computer navigation in orthopedic oncology. Curr Surg Rep 2(4):47
Zheng G, Dong X, Rajamani KT, Zhang X, Styner M, Thoranaghatte RU, Nolte LP, Ballester MAG (2007) Accurate and robust reconstruction of a surface model of the proximal femur from sparse-point data and a dense-point distribution model for surgical navigation. IEEE Trans Biomed Eng 54(12):2109–2122
Zheng G, Nolte LP (2015) Computer-assisted orthopedic surgery: current state and future perspective. Front Surg 2:66
© 2018 Springer Nature Singapore Pte Ltd.
Ewurum, C.H., Guo, Y., Pagnha, S., Feng, Z., Luo, X. (2018). Surgical Navigation in Orthopedics: Workflow and System Review. In: Zheng, G., Tian, W., Zhuang, X. (eds) Intelligent Orthopaedics. Advances in Experimental Medicine and Biology, vol 1093. Springer, Singapore. https://doi.org/10.1007/978-981-13-1396-7_4
Print ISBN: 978-981-13-1395-0
Online ISBN: 978-981-13-1396-7