1 Introduction

Osseointegration is the direct attachment of bone onto a metal implant, allowing an external prosthesis that passes through the skin to be connected to the appendicular skeleton. The concept began with the observation that, in animal models, bone grows directly onto titanium implants without any intervening tissue [1]. Following this groundbreaking research, osseointegrated implants were introduced successfully in the clinical setting as dental prostheses. In recent decades, osseointegration has proved useful for extremity amputee patients.

Osseointegration solves many of the challenges that come with conventional socket-based prostheses, including skin irritation and poor fit [2]. Transdermal prosthetics demonstrate promising early results for patients with various levels of amputation. However, these implants are not without their shortcomings. The percutaneous nature of the metal implant makes them highly susceptible to infection. Additionally, the transdermal interface may be at risk of irritation and breakdown due to constant contact between the implant and the skin. Finally, implants may fail to integrate, or may lose bony ingrowth over time, due to improper loading. Novel technologies hope to capitalize on the transdermal design of these implants to combat these challenges. Sensors placed inside the metal implant can transmit information in real time, providing an opportunity for early detection and prevention of complications. In that context, this paper aims to (1) review the current state of osseointegration and (2) review emerging sensor technologies and their potential to solve the complications of transdermal implants.

2 The elements of osseointegration

Three basic elements comprise osseointegration: the living bone, the metal implant, and the transdermal surface. The metal prosthesis is implanted into the distal end of the residual bone of the amputated limb. The implant then exits the soft tissue and skin to attach to an external prosthesis. The characteristics of these three elements, as well as their interfaces, are crucial for successful osseointegration.

2.1 The living bone

Bone anchorage requires a healthy bone bed with efficient vasculogenesis to allow for the complex migration of inflammatory and progenitor cells [3, 4]. The endosteal injury from surgery activates the local hematopoietic system, which drives the migration of subsets of macrophages, mesenchymal and hematopoietic stem cells, and endosteal and periosteal-resident osteoblasts to the site of injury [5, 6]. Recent studies have observed that osteoblasts express the extrinsic coagulation initiator tissue factor, revealing a previously unrecognized regulatory role of the extravascular coagulation cascade in endosteal stem cell activation [7]. Altogether, these cell populations synchronously mediate deposition of de novo bone tissue at the bone-implant junction. Coagulum forms within an hour of surgery, and granulation tissue within 2 h. Next, extracellular matrix glycoproteins initiate replacement of the granulation tissue with bone-specific proteins, such as osteonectin and fibronectin, at the bone-implant interface [8]. This process, which is mediated by macrophages, stimulates wound vascularization, allowing for mesenchymal stem cell migration and dead cell clearance. In addition to macrophages, bone marrow injury also activates endosteal osteoblasts and their progenitors to migrate to the bone-implant interface [9, 10].

Next, a proteoglycan layer is deposited at the bone-implant interface around 12 weeks after implantation [11]. This layer is separated from the implant surface by collagen and cellular layers composed mainly of type I collagen [12]. Concurrently, hydroxyapatite (HA) crystals are deposited directly onto the implant’s titanium surface. Following proteoglycan deposition, woven bone is observed within the empty spaces at the bone-implant interface [6]. This woven bone is gradually replaced by trabecular bone within weeks. The bone fragments from surgery are also enveloped in new bone, which may enhance peri-implant osteogenesis [5]. Finally, mature lamellar bone replaces the trabecular bone.

Stable implant anchorage is maintained over time by bone remodeling at the endosteal interface [6]. This process starts within weeks of implantation. Despite bone resorption during remodeling, implants are able to maintain stability within the host bone. New bone forms on the surface of the native bone following osseointegration, indicating that osteoconductive factors are present in the gap between the bone and the implant [6]. Bridging this gap depends on the surface coating of the implant [13]. Osteoconductive hydroxyapatite stimulates the ingrowth of bone into the calcium and phosphorus lattice of the coating, leading to tighter apposition between the HA and the bone without intervening tissue. While the strong bone-HA bonds are favorable, they are stronger than the HA-implant adhesion and can lead to delamination of the coating from the implant surface. The implications of these surface-coating interactions for long-term implant stability are unclear and require further investigation [14, 15].

It is imperative to optimize the host factors that affect the bone bed and its vascularity. Chronic medical conditions such as diabetes mellitus, osteopenia, and osteoporosis all negatively affect bone vascularity [16,17,18]. Chemotherapy and radiotherapy have been shown to decrease vascularity as well [19, 20]. While some of these factors may be manageable clinically, others may dictate whether patients are candidates for osseointegration at all.

2.2 The implant

Successful osseointegration depends on the implant material, the surface coating, and the implant design parameters [21, 22]. Parameters such as length, diameter, thread shape, and porosity alter the load-bearing conditions of the implant on the endosteal surface and thus control bone ingrowth. Larger-diameter implants distribute stress more uniformly, which is important in osteoporotic bone [23, 24]. Like diameter, implant length helps reduce stress peaks in cancellous bone [25]. Both effects have been demonstrated clinically [26]. Implant threads can vary by geometry, pitch, depth, and width [27, 28]. For instance, greater thread depth allows for greater ingrowth of trabecular bone and thus more stable osseointegration [29]. Finally, threads can be square, V-shaped, spiral-shaped, or buttress-shaped. Of these, the square thread design has demonstrated the best primary stability under immediate loading conditions. Altogether, these parameters must be optimized to allow for long-term implant stability in the living bone.

Osseointegration implants are commonly made from titanium and its alloys. Titanium has a high specific strength, is corrosion resistant, and is biocompatible with bone [30,31,32,33]. Common titanium alloys contain both aluminum and vanadium. Alloying with these elements does have its drawbacks, as both aluminum and vanadium can cause mitochondrial damage and neurologic disorders [34,35,36]. Strontium- and zirconium-based alloys avoid this problem.

Surface coatings can be used as adjuncts to osseointegration, as surface roughness plays a key role in bony ingrowth [37]. Porous coatings allow bone to grow within an interconnected network of channels [38]. Porous-coated implants undergo faster osseointegration and neovascularization compared to smooth implants [39]. A recent study demonstrated that 20–50% porosity stimulates bone growth [40]. Optimal pore size can also provide superior fixation: Taniguchi et al. demonstrated that a 600 µm pore size gave superior fixation compared with 300 and 900 µm pore sizes in titanium implants at the same level of porosity [41]. There is a trade-off, however: increasing porosity decreases material strength and fatigue resistance [42]. Future efforts should aim to develop new titanium alloys or implants with graded porosity, with highly porous surfaces and less porous cores, that improve osseointegration without compromising implant strength.

2.3 The transdermal interface

The skin-implant interface is where the metal implant exits the skin and attaches to an external prosthesis. This junction presents many clinical challenges for patients and physicians, chief among them infection. Surface bacteria on the external prosthesis can traverse the metal implant and directly inoculate the bone and surrounding soft tissue [43, 44]. In the absence of infection, other complications can occur at the skin penetration site and the skin/abutment interface [45, 46]. These local skin reactions likely account for the second most common complication associated with osseointegrated prosthetic implants [45]. The key to preventing infection in osseointegration procedures is optimizing the soft tissue layer. Several avenues of ongoing research to combat infection include engineering epithelial attachment to the implant and antibacterial implant coatings [47,48,49,50,51].

Epithelial attachment to the implant surface can reduce soft tissue inflammation and breakdown. Implant surfaces can be made porous to reduce the shear stresses experienced by the surface cells [52, 53]. Biomolecular coatings can also be used to activate the adhesion of epithelial cells [54]. Specific extracellular matrix molecules, including fibronectin [55] and E-cadherin [56], have been found to promote keratinocyte adhesion to titanium. The junctional epithelial cells of gingival tissues have also been investigated to identify proteins that allow oral keratinocytes to attach directly to the tooth surface. Laminin 332 was identified as a critical facilitator of tissue attachment to percutaneous teeth [57, 58]. Peptides derived from Laminin 332 have been shown in preclinical models to prevent the infiltration of inflammatory cells into cutaneous wounds [59,60,61]. This initial research shows promise for future surface treatments that could be used clinically.

3 Implants used in clinical practice

3.1 The osseointegrated prosthesis for the rehabilitation of amputees (OPRA)

R. Brånemark established the OPRA implant system and its associated rehabilitation protocol in 1998. The design is based on the dental implants designed by Per-Ingvar Brånemark in the 1960s [62]. The implant consists of three components: the fixture, a threaded, cylindrical implant that integrates into the residual bone; the abutment, the percutaneous component that press fits into the distal end of the fixture, thereby allowing attachment to the external prosthesis; and the abutment screw, which connects the fixture to the abutment (Fig. 1) [63]. In addition to the threads, there are radially placed perforations along the body of the fixture. These design features increase torsional stability and permit implantation in short residua [64]. One drawback of the OPRA device is distal bone resorption due to concentrated stresses around the threads [65]. This phenomenon appears to destabilize the skin seal and can lead to infection.

Fig. 1 a Schematic of the components of the OPRA implant system. b The OPRA fixture; the exterior surface is treated to enhance osseointegration. The lower image shows a close-up of the microstructure following laser treatment

The OPRA device is surgically implanted in a staged fashion [66]. This paradigm stems from the original preclinical work on the subject [62]. In the first stage, the fixture is placed into the medullary canal of the residual bone, and the distal end is augmented with bone graft from either the limb itself or the iliac crest. The overlying soft tissue is then closed. Originally, a period of 6 months allowed for soft tissue rest and bone graft incorporation [46]; some centers now shorten this interval between surgeries to a matter of weeks. In the second stage, the overlying soft tissue is defatted and thinned, and a stoma is created to allow the percutaneous abutment to connect to the fixture (Fig. 2). The soft tissue is secured tightly to the distal surface of the residual bone to prevent shearing.

Fig. 2 From left to right: preoperative radiograph, and post-operative radiographs following Stage 1 and Stage 2 of the OPRA implant system implantation procedure

The success of the OPRA device is due, in part, to its rehabilitation protocol. When the device was first introduced, patients underwent a rapid load-bearing protocol. Many of these early patients experienced implant loosening and required revision surgery [66]. The rehabilitation protocol was revised so that patients gradually load the prosthesis, which prevents micromotion and allows for continued bone remodeling. Revising the rehabilitation protocol increased implant survivorship from 40% to 80% [46, 67]. Since then, several groups have incorporated gradual loading into their rehabilitation protocols [68].

The OPRA device is one of the most widely studied systems in clinical use. It is the only implant currently approved by the Food and Drug Administration (FDA), under a Humanitarian Device Exemption, for the treatment of transfemoral amputees, and it is being studied in both the upper and lower extremity. Several studies demonstrate favorable short- and mid-term survivorship as well as improved quality of life. The implant has undergone several design changes in the past two decades, including the application of a nanoporous surface coating [69], as well as a fail-safe system whereby the abutment or abutment screw fractures under excessive load before causing a fixture or bone fracture.

3.2 The Compress

Not all OI implants share the same design. The Compress, first developed for oncologic limb salvage reconstruction, is an endoprosthetic system that press fits into the residual bone [69, 70]. The Compress’s intramedullary stem is affixed to the bone through a series of transverse pins. A spindle abuts the distal end of the residual bone and produces 600–800 lbs (roughly 2.7–3.6 kN) of force, thus loading the bone and stimulating growth through Wolff’s law. This compressive load is designed to prevent aseptic loosening of the implant [71]. Studies of the device as a distal femoral replacement demonstrate superior survivorship and lower mechanical failure rates compared to other distal femoral replacement implants [72]. The Compress has a porous titanium collar, meant to be placed under a myofascial flap. The interconnecting pores allow not only for soft tissue ingrowth, but also for neovascularization within the implant itself.

The Compress addresses a major clinical challenge of other implants: stress shielding, the removal of physiologic stress from bone by an implant [73, 74]. This process eventually leads to osteopenia and reduced cortical thickness. Stress shielding causes bone resorption, the extent of which correlates directly with the stiffness of the implant [70, 75], and is thought to contribute to the risk of fracture and aseptic loosening [76].

The other major advantage of the Compress is the ease of revision [77]. For instance, the surgical treatment of an infected implant would consist of removing less than 1 cm of bone, clearing the infection, and replacing the implant. This is far simpler and less devastating than revising a stem, and it entails less bone loss. The sequelae of revising well-fixed orthopaedic implants have been well studied in the arthroplasty literature. Such implants may require special instruments, such as powered reamers or trephines, for removal [78]. Despite careful attention to technique, these instruments may still cause metal debris, bone loss, and thermal necrosis [78]. These concerns are avoided altogether by the Compress when fractures occur distal to the anchor plug. Furthermore, there is no need to remove the surrounding structural cortical bone, negating the need for bone graft. Moreover, bone density will have improved from baseline with the Compress [77]. In the setting of chronic infection requiring two-stage revision, the spindle may be retained as long as it is well fixed. This preserves the primary integration at the bone-implant interface, which might not be as robust if the spindle were revised.

The Compress, first used for osseointegration in 2012, has been implanted in eleven amputees. Ten of these patients were transfemoral amputees and one patient was a transhumeral amputee. The system was implanted in either a one- or two-stage fashion [70]. The early results of this implant are promising. There have yet to be any revisions related to infection, though two patients sustained periprosthetic fractures due to falls. Further results are needed to determine the long-term outcomes and survivorship of this implant in osseointegration.

3.3 Osseointegrated prosthetic limb (OPL) and the integral leg prosthesis (ILP)

Two similar implants that rely on press-fit fixation for integration are in clinical use in Europe. The integral leg prosthesis (ILP) was first developed by Aschoff for transfemoral amputees in 1999. The implant consists of a cobalt-chrome “endo” stem that press fits into the intramedullary canal and an “exo” module. The “endo” stem is coated in metal spongiosa to create a deep porous surface for bony ingrowth [63, 69]. While porous coatings are used widely in arthroplasty, extensive coating can cause stress shielding and bone resorption and can make removing the implant during revision surgery more difficult. The ILP’s soft tissue interface has undergone several design revisions, from a textured surface to a polished, non-abrasive surface. While it was thought that the abrasive surface would allow the soft tissue to adhere to the implant, it instead led to skin breakdown. The smooth, polished surface addressed this issue by promoting tissue drainage [79].

The Osseointegrated Prosthetic Limb (OPL) (Permedica, Milan, Italy) is a modified version of the original ILP prosthesis that is used for transfemoral and transtibial amputees [63, 69]. The implant is made of titanium rather than a Co–Cr alloy to better match the elastic modulus of the host bone. The OPL keeps the press-fit design but replaces the spongiosa metal coating with a plasma-sprayed roughened surface to enhance integration. For transfemoral amputations, the OPL can have either an extramedullary or an intramedullary head [80].

Both implants use a transcutaneous “dual cone” adapter to connect the intramedullary implant to the external prosthesis. The adapter is highly polished to minimize soft tissue irritation [63, 69, 81] and has a safety pin that fails under excessive loads, thereby preventing fracture of the surrounding bone [82, 83]. Taken together, these implants have a low rate of superficial infection. Furthermore, 2-year clinical outcomes demonstrate increased cortical thickness around the implant, which may be protective against periprosthetic fractures. In sum, these implants offer an alternative to the OPRA and Compress devices and may be more appropriate in certain patients. However, they are currently not approved for clinical use in the United States, and there are no studies that directly compare the various transdermal orthopaedic implants. Future studies are needed to determine which implants are appropriate for specific amputee populations.

4 Clinical aspects of osseointegration

4.1 Indications

Because of the lifelong risk of infection and other implant-associated complications, osseointegration is offered only to amputees who cannot tolerate, or are extremely frustrated by, traditional prosthetic wear. These difficulties can occur in up to a third of patients with a major limb amputation [84] and should be reported and documented by the clinicians, rehabilitation specialists, and prosthetists. Traditional socket-based prosthetic fit depends on residual limb length. Mismatch between the residuum and the socket can lead to limb pistoning within the socket, which in turn causes skin breakdown and pain. Even some patients without pistoning-related issues may be intolerant of a traditional prosthetic due to generated heat, hirsutism, and lack of osseoproprioception.

Most patients must undergo a trial of traditional prosthetic use before being considered for osseointegration surgery. However, some exceptions exist, including patients with inadequate limb length or musculature to power a traditional prosthetic. Furthermore, some centers are studying the use of transtibial osseointegrated prosthetics as a palliative option for vasculopathic patients, whose need for mobility outweighs the potential complications of infection and soft tissue healing. Early indications are that these patients see an improvement in daily function 1 year after surgery [70, 85]. While osseointegration offers prosthetic options to patients who otherwise would not be suitable for one, it does require adequate bone stock within the residual bone for successful implantation; for instance, the OPRA device requires 12 cm of residual bone [46]. Further investigation may expand the indications for osseointegration.

4.2 Contraindications

Even with narrow indications for osseointegration, there is still a subset of patients who are clearly not suitable for the procedure. Patients with previous infections in the residual limb, including osteomyelitis and deep soft tissue infection, should not be considered for osseointegration. Poor host factors, such as impaired wound healing and diabetes, are relative contraindications to surgery [46], and patients with them have been excluded from prospective trials. Despite these concerns, recent studies suggest satisfactory results in patients with previous infections or peripheral vascular disease, though these results stem from a very limited number of patients. Furthermore, various mental health diagnoses may serve as relative or absolute contraindications to transdermal implantation, such as conversion disorder, substance abuse, and various manifestations of post-traumatic stress. Consultation and clearance by a mental health professional familiar with amputee care is of critical importance [70, 85]. Ultimately, successful osseointegration requires multiple surgeries and prolonged rehabilitation. A multidisciplinary team of specialists is required for each patient, who must be motivated to complete the rehabilitation protocols and remain diligent in monitoring for infection. These patients should be selected on a case-by-case basis after a lengthy discussion with their physician regarding goals and expectations.

4.3 Outcomes

While limb osseointegration is still in its infancy, studies do show favorable short- and mid-term survivorship of the implants. Osseointegration in dentistry has been studied for decades, whereas extremity OI did not begin until the late 1990s. The OPRA device was the first OI implant to be studied for transfemoral amputees. Early recipients did not follow a gradual rehabilitation protocol, resulting in a number of revision and removal procedures [66]. In 1999, the OPRA’s surgical and rehabilitation protocol was revised to slow the progression of weight bearing. In the 51 patients studied prospectively under the revised protocol, 2-year implant survival was 92%, with only one patient requiring removal for infection and three experiencing aseptic loosening [67]. This improvement in survivorship was seen at other centers following adoption of the revised rehabilitation protocol [43, 68, 80, 86].

Other implants, such as the Compress, OPL, and ILP demonstrate favorable survivability [83]. However, most of these results are limited to a few years of follow up and a small number of patients.

Patients generally experience improvement in quality of life as measured through standardized clinical assessments. The Questionnaire for Persons with Transfemoral Amputation (Q-TFA) and the Short Form 36 (SF-36) are most commonly used for follow-up [46, 87]. Objective walking scores, such as the six-minute walk test (6MWT) and the Timed Up and Go (TUG) test, have also been shown to increase significantly following osseointegration in the lower extremity [88]. The improvement in osseoproprioception and tactile feedback may explain some of this functional improvement [89]. Future studies of osseointegration outcomes may incorporate PROMIS scores into their follow-up assessment; these scores may provide more nuanced and thorough evaluations of patients’ functional improvement [90].

5 Overcoming the challenges of osseointegration

Regardless of design or surgical strategy, all OI implants face similar challenges. In particular, transdermal implants are susceptible to infection. Common organisms, such as Staphylococcus aureus and coagulase-negative staphylococci, cause the majority of infections. Most infections are superficial and manifest as pain, erythema, and/or discharge at the skin-implant interface [45]. Superficial infections occur in approximately half of osseointegrated patients and are usually treated successfully with antibiotics without going on to involve the underlying bone [46]. Although they occur less frequently, deep infections pose a greater clinical challenge, as they can lead to implant loosening, multiple surgical revisions, and even complete removal of the implant [43, 91].

There have been numerous strategies to mitigate the sequelae of infection in osseointegrated implants. These include close wound surveillance protocols to decrease the number of unnecessary surgical procedures for infection control [46]. Other strategies involve augments to the implant itself, including antibiotic and silver nanoparticle coatings [92,93,94]. While these efforts show promising pre-clinical results, they have yet to be tested in human subjects, so their true efficacy is unknown. Therefore, alternative strategies for infection prevention and treatment need to be explored.

One such alternative strategy to combat infection is to apply external electrical stimuli to the titanium implant. A recent line of effort investigated the effect cathodic electrical stimulation can have on the formation of biofilms. On conventional implants, gram-positive bacteria can form a glycocalyx biofilm that prevents penetration of antibiotics, making them much more resistant to treatment [95]. Titanium is an active element with a tendency to oxidize. The oxide film that forms on the metal’s surface through passivation acts as a protective barrier against corrosive chemical reactions with the surrounding biological environment [96]. The local electrochemical environment influences this oxide film. An applied cathodic potential promotes reductive dissolution of the oxide film by lowering the valence of the titanium ions, leaving an excess negative charge on the electrode surface. This negatively charged surface prevents the formation of biofilms. Multiple mechanisms can explain this phenomenon, including electrostatic repulsion of the negatively charged surfaces of bacteria [97] and disruption of charge distributions within the extracellular matrix of the biofilm itself [98]. Furthermore, the applied cathodic charge increases the local environment’s pH through the consumption of oxygen, with accompanying free radical generation. The H2O2 that forms around these cathodes has been observed to inhibit bacterial growth. The change in pH also alters the charge of the polymeric sugars and proteins that are important to gram-positive and gram-negative cell walls. Altogether, there are multiple mechanisms by which an applied cathodic charge could inhibit bacterial growth.
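
The pH rise and peroxide generation described above are consistent with the standard oxygen reduction half-reactions at a cathodically polarized surface. The reactions below are textbook electrochemistry included here for illustration; they are not reproduced from the cited studies.

```latex
% Four-electron pathway: consumes dissolved O2 and raises the local pH
\mathrm{O_2} + 2\,\mathrm{H_2O} + 4e^- \rightarrow 4\,\mathrm{OH^-}
% Two-electron pathway: generates the bacteriostatic hydrogen peroxide
\mathrm{O_2} + 2\,\mathrm{H_2O} + 2e^- \rightarrow \mathrm{H_2O_2} + 2\,\mathrm{OH^-}
```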

Ehrensberger et al. [96] investigated whether cathodic voltage-controlled electrical stimulation (CVCES) could inhibit bacteria on titanium implants in both the in vivo and in vitro settings. The in vitro studies demonstrated that CVCES of − 1.8 V for 1 h reduced the number of colony-forming units (CFUs) of methicillin-resistant Staphylococcus aureus (MRSA) on commercially pure titanium, as well as in the surrounding solution, by 92% compared to open-circuit potential conditions. For the in vivo studies, the investigators used an infected-implant rodent model with commercially pure titanium implants inoculated with MRSA. Rodents received vancomycin for 1 week, CVCES treatment, or both. The CVCES treatment was delivered on post-operative day six through subcutaneous electrodes. The initial results demonstrated a 99.8% reduction in CFUs, but bacteria were still detectable on histologic analysis. No deleterious effects from CVCES treatment were seen in the surrounding bone or soft tissue. In subsequent studies, treatment was prolonged to 5 weeks of antibiotics and two sessions of CVCES of − 1.8 V for 1 h [99]. Rodents that received the prolonged antibiotics along with two sessions of CVCES had undetectable levels of MRSA upon final analysis. Similar results were observed when implants inoculated with Acinetobacter baumannii were treated with CVCES for 4–8 h [100].
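
For context, these percentage reductions can be restated as log10 reductions, the convention commonly used in microbiology; the conversion below is ours, not the authors’.

```latex
% A fraction f of CFUs removed corresponds to a log reduction of -log10(1 - f):
-\log_{10}(1 - 0.92) \approx 1.1 \;\text{(in vitro)}, \qquad
-\log_{10}(1 - 0.998) = \log_{10}(500) \approx 2.7 \;\text{(in vivo)}
```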

The utility of CVCES in the prevention and treatment of infections in osseointegrative implants is unknown. Theoretically, CVCES could be administered through the external portion of a transdermal implant to treat an underlying infection alongside antibiotics. The stimulation could also be applied prophylactically as part of routine care for an implanted device. Further research is needed to evaluate this technology in the clinical setting.

5.1 Infection monitoring with electrical capacitance tomography

Embedded sensors within transdermal implants could allow for passive surveillance for infection at the endosteal interface. One such technique takes advantage of the change in local pH caused by a bacterial infection. This change in pH can be measured indirectly via the change in electrical capacitance of a pH-sensitive thin film through electrical capacitance tomography (ECT) [101]. This technique estimates the permittivity distribution of a region through a series of equidistantly spaced electrodes along a circular perimeter. Algorithms then interpret these signals into images that reflect the spatial distribution of the measured capacitance within the space [101]. As such, this technology could indicate both the presence and the location of an underlying infection. Electrical capacitance tomography was originally developed in the 1980s for flow monitoring [100] and has since been investigated for real-time navigation during total hip replacement surgery [101]. ECT requires neither radiation nor contact with the underlying implant. Furthermore, the pH-sensitive layer can be spray coated onto the implant as part of the manufacturing process [101].
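
To make the reconstruction step concrete, the sketch below shows one common linearized ECT approach (Tikhonov-regularized inversion of a sensitivity matrix). This is a generic illustration of the technique, not the algorithm used in the cited studies; the sensitivity matrix and measurements are random placeholders.

```python
import numpy as np

def ect_reconstruct(cap_meas, cap_low, cap_high, sensitivity, reg=1e-2):
    """Linearized ECT reconstruction via Tikhonov-regularized least squares.

    cap_meas     : measured inter-electrode capacitances, shape (n_meas,)
    cap_low/high : calibration frames for low-/high-permittivity fills
    sensitivity  : Jacobian of capacitance w.r.t. pixel permittivity,
                   shape (n_meas, n_pixels), assumed to come from a
                   forward model of the electrode array
    Returns a normalized permittivity image, shape (n_pixels,).
    """
    c = (cap_meas - cap_low) / (cap_high - cap_low)  # normalize measurements
    S = sensitivity
    A = S.T @ S + reg * np.eye(S.shape[1])           # regularized normal matrix
    g = np.linalg.solve(A, S.T @ c)                  # solve for pixel values
    return np.clip(g, 0.0, 1.0)

# Toy usage: 12 electrodes give 12*11/2 = 66 independent electrode pairs.
rng = np.random.default_rng(0)
n_meas, n_pix = 66, 64
S = rng.random((n_meas, n_pix))            # placeholder sensitivity matrix
g_true = np.zeros(n_pix)
g_true[20:24] = 1.0                        # simulated local permittivity change
cap = S @ g_true                           # crude linear forward model
image = ect_reconstruct(cap, np.zeros(n_meas), np.ones(n_meas), S)
print(np.flatnonzero(image > 0.5))         # pixels flagged as a "pH anomaly"
```

In a real system the sensitivity matrix would be computed from the electrode geometry, and the flagged pixels would map to a physical location on the implant surface.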

Gupta et al. tested electrical capacitance tomography based sensors for pH change detection. As model osseointegrative implants, their pre-clinical experiments used aluminum rods fabricated with thin-film spray coatings whose permittivity changes with the surrounding pH. ECT forward and inverse problems were implemented, and the rods were placed into buffered solutions at various pH levels. Once dried, the rods were interrogated with ECT. The results confirmed that permittivity changed with pH and that the ECT algorithm was able to localize these variations [101]. This model does have its drawbacks, as it was tested in a controlled setting rather than in vivo, where a complex environment dictates the local pH. Furthermore, the rods used were made of aluminum rather than titanium, so it is unknown whether the same results would be found using the material most commonly used to manufacture transdermal bone-anchored implants. However, if this technology proves successful in further testing in more complex models, it could pave the way for an external surveillance system that localizes early signs of infection.

Sensing systems could also measure the stresses experienced by transdermal orthopaedic implants during loading. Osseointegration surgical strategies require two stages to allow for integration of the implant into the surrounding bone. The interval between the procedures varies between institutions, from 6 weeks to 6 months. It is not currently known when the implant truly incorporates into the bone and when the patient can begin rehabilitation. Furthermore, patient load bearing following the second procedure follows a slowly progressive protocol that can range from 6 to 12 months. Early protocols that called for rapid loading of the implant led to stress fractures. Current measuring techniques during this progressive load-bearing protocol consist of static measurements using scales or force plates. There have been previous attempts to use an external load-sensor system for osseointegrative implants. Frossard et al. [102] used a portable system that successfully measured triaxial forces and moments in real time from a sensor mounted onto the implant. However, this system required multiple sensors to be mounted to the implant and was therefore impractical.
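
As a simple illustration of the kind of monitoring such a sensor could support, the sketch below checks a measured axial load against a progressive weight-bearing target. The protocol numbers, thresholds, and sensor interface are hypothetical and are not taken from any cited system.

```python
from dataclasses import dataclass

@dataclass
class LoadReading:
    """One sample from a (hypothetical) implant-mounted triaxial force sensor."""
    fx: float  # mediolateral force, N
    fy: float  # anteroposterior force, N
    fz: float  # axial force, N

def axial_target(week: int, body_weight_n: float) -> float:
    """Hypothetical progressive protocol: a 20 kg-equivalent starting load,
    increasing by ~10% of body weight per week up to full weight bearing."""
    start = 20 * 9.81                      # initial static load, N
    step = 0.10 * body_weight_n            # weekly increment, N
    return min(start + week * step, body_weight_n)

def check_loading(r: LoadReading, week: int, body_weight_n: float) -> str:
    """Flag readings that fall outside the week's target loading band."""
    target = axial_target(week, body_weight_n)
    if r.fz > 1.2 * target:
        return "overload: exceeds this week's target by >20%"
    if r.fz < 0.5 * target:
        return "underloading: patient may be offloading the limb"
    return "within protocol range"

# Example: an 80 kg patient in week 4 of rehabilitation
print(check_loading(LoadReading(12.0, 30.0, 520.0), week=4, body_weight_n=80 * 9.81))
```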

Electrical capacitance tomography has recently been studied as a tool for measuring the stresses felt by transdermal implants. Gupta et al. [103] investigated the use of multi-walled carbon nanotube based thin films to measure stresses experienced by model prostheses. The electrical permittivity of these thin-film coatings is sensitive to changes in strain. As in the previous pre-clinical ECT studies, an electrode array captured the changes in permittivity within the circular area, and the ECT algorithms reconstructed the corresponding field changes into an image map of the stresses across the model implant. The study demonstrated that noncontact strain monitoring was not possible on implants without the thin-film coatings. The coatings, however, amplified the changes in the measured permittivity field to the point where they could be detected by the ECT sensing system [103]. The noncontact measuring system could detect both uniaxial loading and cantilever bending moments. These results, while promising, should be taken with caution, as the experiments were carried out in a controlled setting with plastic models. The thin-film coatings, which are necessary for noncontact strain detection, may prevent the host bone from achieving apposition with the implant surface and thus cause integration failure. Furthermore, it is unknown whether titanium would interfere with the measurement of permittivity changes. If this technology proves feasible, rehabilitation and progressive implant loading could be based on real-time integration measurements rather than on time-gated protocols. Future studies should investigate the role of this technology in a more realistic setting.

5.2 Monitoring for fixation: guided wave strategy

Another novel strategy for monitoring and quantifying osseointegration is the use of guided waves. One hurdle for osseointegration sensing systems is that internal devices must be biocompatible. Furthermore, a sensor placed along the body of the implant must not interfere with bony ingrowth onto the implant’s surface. Guided waves eliminate this challenge by placing both the wave actuators and the sensing devices outside of the body, on the implant itself [104]. The strategy uses a series of piezoelectric actuators to generate waves at the tip of the prosthesis and allows the titanium implant to guide the waves to the titanium-bone interface. The waves then reflect back down the implant to a second set of piezoelectric sensing elements. As the implant integrates into the host bone, the interface provides more surface area through which the wave’s initial energy can be transferred away from the titanium [105]. Thus, a change in the measured reflected wave’s energy represents a change in the integration of the implant [106].

Guided wave sensing systems for osseointegration have been simulated in the laboratory setting using cylindrical metal rods. Wang et al. [105] reported using elements made of lead zirconate titanate (PZT) and lead magnesium niobate-lead titanate solid solution (PMN-PT) for both piezoelectric actuation and measurement. Wave motion within a rod consists of three modes: longitudinal, torsional, and flexural. The wave response from the implanted portion of the rod can be decomposed into these three modes. In these experiments, metal rods were placed into high-density polyethylene (HDPE), because its Young’s modulus and density closely match those of human bone. Two sets of simulated implants were used: a tight and a loose implant. The results demonstrated that the measured energy of the reflected waves decreased linearly as the metal rod became more “fixed,” or integrated, into the HDPE block [105]. This confirmed that the measured reflected waves changed as more energy was imparted across the bone-implant interface. In subsequent experiments, the investigators tested the guided wave strategy with titanium rods within Sawbone femurs and found results similar to the previous studies. In the Sawbone simulation, the investigators discovered that a 50% decrease in reflected wave energy correlated with full integration of the implant [107].
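
A minimal sketch of the underlying measurement is shown below, assuming time-windowed voltage traces from the sensing elements. The synthetic signals, window bounds, and interpretation of a 50% energy drop as "full integration" are illustrative only, echoing the Sawbone finding rather than reproducing the authors' processing pipeline.

```python
import numpy as np

def signal_energy(trace: np.ndarray, fs: float, t0: float, t1: float) -> float:
    """Energy of a sensed voltage trace within the time window [t0, t1) seconds."""
    i0, i1 = int(t0 * fs), int(t1 * fs)
    window = trace[i0:i1]
    return float(np.sum(window**2) / fs)

def reflection_ratio(baseline: np.ndarray, current: np.ndarray,
                     fs: float, t0: float, t1: float) -> float:
    """Reflected-wave energy of the current measurement relative to the
    post-implantation baseline: 1.0 = no change, ~0.5 = the 50% drop that
    corresponded to full integration in the Sawbone experiments."""
    e0 = signal_energy(baseline, fs, t0, t1)
    e1 = signal_energy(current, fs, t0, t1)
    return e1 / e0

# Toy usage with a synthetic reflected pulse (placeholder data)
fs = 1e6                                   # 1 MHz sampling rate
t = np.arange(0, 2e-3, 1 / fs)             # 2 ms record
pulse = np.exp(-((t - 1e-3) ** 2) / 1e-9) * np.sin(2 * np.pi * 5e4 * t)
baseline = pulse                           # echo recorded at implantation
current = 0.7 * pulse                      # weaker echo as bone integrates
r = reflection_ratio(baseline, current, fs, t0=0.5e-3, t1=1.5e-3)
print(f"reflected energy ratio: {r:.2f}")  # 0.49 -> near "fully integrated"
```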

Taken together, these wave measurements could serve as surrogates for the changes in bone density and elastic modulus that occur with implant ingrowth. The measurements could begin immediately after implantation and could help guide rehabilitation by providing precise, real-time data on the status of fixation. Further testing is needed, however, to determine whether these measurements remain possible in the complex in vivo environment, where soft tissues may interfere with the propagated waves, or while a patient is ambulating. Furthermore, it is unknown whether these guided waves would, over time, interfere with implant integration.

5.3 Bringing it all together: the e-OPRA and integrated sensing systems

The field of osseointegration continues to see advancements in multiple areas of implant design. One particular area receiving attention is the integration of electronics to enhance the neuromuscular-implant connection [107]. To date, no human trials have tested the safety or effectiveness of osseointegrated implants with integrated sensor technology. The enhanced OPRA (e-OPRA) system allows for bidirectional communication between implanted neuromuscular electrodes and the external prosthesis. This system uses advanced algorithms and neural stimulation to provide sensory feedback to the patient and allow for volitional motor control of the prosthesis. The system is under clinical investigation in Europe, and a clinical trial recently opened in the United States. Additionally, integrated sensors could provide real-time information to clinicians, including implant stability and early signals of infection.

5.4 Tracking progress: the need for an osseointegration patient registry

Despite these promising avenues of research, transdermal osseointegrated amputation surgery is still in its infancy. There is a paucity of reported long-term outcomes for osseointegration patients. Furthermore, multiple centers worldwide are performing osseointegration with various implants, surgical strategies, and rehabilitation protocols, independent of one another. This necessitates the creation of an osseointegration quality registry, which would allow post-marketing surveillance and the tracking of implant-related complications such as early mechanical failures, changes in fail-safe design, and infections. Patient-reported outcomes are also of interest, in an effort to compare the risks and benefits of the various implants.

6 Conclusion

In conclusion, transdermal osseointegrated amputation surgery is an evolving concept in amputee care. The successful anchorage of a percutaneous metal implant requires optimization of the host bone, the implant’s surface, and the skin-implant interface; osseointegration will not occur without synergy among these three components. This treatment has the potential to benefit amputees worldwide who would otherwise not be able to tolerate a traditional prosthesis. Osseointegration must be approached with caution and performed under the supervision of a multidisciplinary team of surgical and rehabilitation specialists. The lifelong clinical challenges of osseointegration, including infection and loss of integration, require constant surveillance. Emerging sensor technologies leverage the percutaneous nature of the metal implant to interrogate its internal environment and could report the real-time status of the endosteal surface. While laboratory studies of these proof-of-concept technologies show promising results, more investigation is needed to demonstrate that such sensors would be both effective and safe in clinical application.