Introduction

With deference to The Band: “In the winter of ’65, we were hungry, just barely alive” (“The Night They Drove Old Dixie Down,” 1969). A scant century later, in 1965, still hungry, having weathered college in Boston, and now commencing med school at Johns Hopkins in Baltimore, I (KRK) was serendipitously invited to join the Wilmer-based lab team of the post-doc who had patiently shepherded my undergraduate thesis. Soon thereafter, my first lifetime surgical exposure occurred when I was summoned to the Wilmer OR by the renowned Professor A. E. Maumenee as he performed a PK (penetrating keratoplasty) on a teenage boy with densely opaque corneas. After completing trephination, the surgeon literally dropped the corneal disc into my waiting fixative and bid me to “go study this with your electron microscope.” Proceeding as directed, the immediate result was the initial ultrastructural description of congenital hereditary endothelial dystrophy (CHED) [1], while the long-term consequence became my fascination with corneal surgery. The Maumenee keratoplasty, by the way, was altogether up to date: a cornea manually trephined from a whole donor eye, secured with 10-0 Ethilon nylon, and aided by a Zeiss operating microscope.

Fast-forward another decade, and I was again instructed by Ed Maumenee to take the Boston cornea fellowship with Dr. Claes H. Dohlman (CHD) and company. Thus the summer of 1976 was especially highlighted by keratoplasty on a patient of Claes whose multiple graft rejections necessitated a tissue-matched cornea from, as it happened, Baltimore, where corneal preservation in tissue culture media had become standard. However, when confronted intraoperatively with the corneo-scleral donor cap but without a compatible trephination block, I rapidly retrieved the required Teflon punch block from my Cornea Service stash. As the patient was already under general anesthesia, the punch block was rapidly sterilized and rinsed, and with the certainty of recent Wilmer experience, I deftly punched the precious donor tissue, only to recoil in horror as the pristine corneal disc immediately assumed the consistency and transparency of a potato chip! The Teflon block, although not perceptibly warm to surgical-glove-insulated hands, was nonetheless sufficiently hot to literally cook the cornea! This was my first encounter with CHD’s somewhat guttural throat clearing, then as now, a uniform indicator of his dismay if not displeasure. Undaunted, co-fellow Steve Foster, also witness to the catastrophe, and I took this crispy cornea to the Eye Research Institute, performed ultrastructure on the tissue and thermal absorption studies on the block. When we proudly offered our manuscript to Claes in anticipation of his coauthorship, his throaty response coupled with “I don’t think so” again conveyed tacit dissent. Of course, “Damn the torpedoes…,” Steve and I published nonetheless [2].

The Ghost of Keratoplasty Past: 1975–2000

The era of the late 1970s and 1980s, when I was privileged to direct the MEEI Cornea Service, offered a Periclean Age. Apart from the overarching inspiration of then Harvard Department Chair and Player-Coach CHD, the wealth of contributing colleagues included Deborah Langston, the Herpes Queen; Dick Thoft, self-styled Ocular Surface King; and Steve Foster, Creator of Corneal Immunology, plus scores of high-energy clinical and research fellows who ultimately flew higher and faster than us all. Keratoplasty, although remaining almost exclusively penetrating (Fig. 13.1), was advancing on many fronts: Eye banking uniformly involved donor tissue preserved in tissue culture medium (Fig. 13.2), extending viability from two days to two weeks, and increasingly networked tissue sharing eliminated interminable, locally limited waiting lists. As refractive surgery progressed out of incisional infancy, the simplistic notion that a clear corneal graft was a success, irrespective of refractive and astigmatic outcomes, was increasingly challenged by the development of topographic instrumentation. Transplantation immunology advanced at the basic level but remained slow in clinical translation, as even large-scale clinical trials of HLA matching seemed of too limited cost/benefit to warrant broad adoption, although advances in topical and systemic immunosuppression extended the survival of high-risk keratoplasty (Fig. 13.3). Importantly, as phacoemulsification improved throughout the 1980s, IOL materials and designs progressed from anterior to posterior chamber location, although the unanticipated tsunami of pseudophakic corneal edema consequent to closed-loop anterior chamber IOLs busied keratoplasters for more than a decade with PK plus IOL exchanges (preferably with PC IOL replacement) [3] (Fig. 13.4).

Fig. 13.1
figure 1

Evolution of the PK. Left: Square Castroviejo-style graft, vintage 1960. Still clear 40 years later! Right: PK sutured with 24-bite 10-0 nylon running suture for accelerated visual recovery and astigmatic adjustment capability

Fig. 13.2
figure 2

Eye Banking, circa 1975. With transition from enucleated whole eye donor preparation to storage of the excised corneo-sclera in tissue culture medium (TC-199 as shown), viability of donor cornea tissue (at 4 degrees Centigrade) was greatly extended from 2 days to 2 weeks, thereby facilitating tissue sharing and elective surgical scheduling as well as eye bank networking, both regionally and internationally

Fig. 13.3
figure 3

Immunosuppression for high-risk keratoplasty. This 29-year-old woman with CHED (left) had undergone 3 penetrating keratoplasties in each eye, all rejected (middle). In 2005, PK no. 4 was performed for each eye while utilizing oral prednisone, cyclosporine, and mycophenolate (right). At 10-year follow-up, both grafts remained clear while utilizing only topical corticosteroid

Fig. 13.4
figure 4

PK and IOL exchange. During the 1980s and 1990s, the use of closed-loop anterior chamber IOLs became a leading cause of corneal and macular edema (left). Penetrating keratoplasty and IOL exchange, here with iris-sutured posterior chamber IOL, was frequently required (middle). Although use of a flexible haptic anterior chamber IOL remained an option, a comparative clinical trial [3] favored the posterior chamber IOL (right)

During this period, contemporaneous developments in three distinct but related areas particularly served to enhance keratoplasty outcomes:

First, anterior segment reconstruction techniques evolved from the strategies of ocular trauma and cataract surgical complications. The penetrating keratoplasty itself greatly facilitated open-sky access for placement of iris- or scleral-fixated PC IOLs. Mechanical and/or viscodissection of irido-corneal synechiae with iridoplasty closure recreated a central pupil and reduced recurrent adhesions and secondary glaucoma [4]. Such anatomically and visually restored eyes enjoyed long-term graft survival [5] (Figs. 13.5, 13.6, 13.7 and 13.8).

Fig. 13.5
figure 5

Anterior segment reconstruction: Rotating autograft. Left: Following corneal laceration, although the central cornea remains clear, the linear stromal scar, irido-corneal adhesion and iris tears, lens remnants, and vitreous prolapse must be managed. Right: A rotating corneal autograft repositioned the scar superiorly beneath the upper lid. Removal of cataract remnants, iris-sutured posterior chamber IOL, synechiolysis and iridoplasty completed anatomical restoration and visual rehabilitation while retaining the patient’s own cornea

Fig. 13.6
figure 6

Anterior segment reconstruction: Synechiolysis. Top left and middle: In the presence of irido-corneal synechiae and iris ruptures, excision of the corneal disc is carefully performed with Vannas scissors to avoid further iris damage and to retain iris tissue segments for reconstruction. Bottom left and right: A dry cellulose surgical sponge is useful to reduce iris synechiae from the cornea and angle, supporting the keratoplasty wound margin anteriorly while depressing the adherent iris posteriorly

Fig. 13.7
figure 7

Anterior segment reconstruction: secondary IOL and iridoplasty. Top left: The posterior chamber IOL is secured by sutures (9-0 or 10-0 polypropylene or Gore-Tex) attached to the IOL haptics and buried beneath partial-thickness scleral flaps. Top right and bottom left: Iridoplasty utilizes multiple interrupted sutures of 10-0 nylon or polypropylene. Bottom right: 3 years postoperatively, the PK remains clear, anterior chamber deep, pupil round and central, IOL stable, and intraocular pressure normal

Fig. 13.8
figure 8

Anterior segment reconstruction: selected cases. Top left: Monocular auto accident survivor with corneal scar, extensive iris rupture, and aphakia. Top right: Elaborated iris remnants recreate a central pupil in conjunction with PK and posterior chamber IOL. Bottom left: A seaman sustained extensive corneal and iris laceration but remarkably no lens injury. Bottom right: PK included vitreous decompression to deepen the anterior chamber, whereupon the 3:00 to 6:00 iris remnant could be released and sutured to restore a central pupil while still retaining the clear lens!

Second, extending Thoft’s conjunctival autograft insights and influenced by the identification in the early 1980s of the limbal basal epithelium as the stem cell source of the corneal epithelium, we initiated limbal autograft transplantation, predominantly for unilaterally chemically burned eyes [6] (Fig. 13.9). As grafted limbal stem cells (LSCs) became “the gifts that keep on giving,” definitive and permanent restoration of the ocular surface, reducing corneal conjunctivalization, inflammation, neovascularization and ulceration, greatly improved keratoplasty prognosis, especially if anterior lamellar keratoplasty could be performed to obviate endothelial rejection risk (Fig. 13.10). Subsequent extensions of this fundamental concept have included limbal allograft transplantation, mini-limbal biopsies expanded in tissue culture on various substrate membranes, simple limbal epithelial transplantation (SLET) (Fig. 13.11), and nasal or oral mucous membrane transplantation [7]. Although the requirement for concomitant immunosuppressive therapy complicates management of limbal allograft variants for bilateral LSC-deficient cases (Fig. 13.12), advances in immunotherapy have improved outcomes in such “desperate times call for desperate measures” situations. The certain-to-be-forthcoming adoption of pluripotent mesenchymal or other stem cells, applied either directly or after culture, should soon render ocular surface restoration routine.

Fig. 13.9
figure 9

Ocular surface reconstruction: limbal autograft for chemical injury. Top left: Following peritomy, the fibrovascular pannus is peeled by superficial keratectomy. Bottom left: Two limbal strips from the donor eye are transferred superiorly and inferiorly. Top right: Following chemical assault, vision is reduced to hand motions as dense pannus with superior symblepharon has developed. Bottom right: Following limbal autograft alone, vision recovers to 20/60 and cosmesis is improved such that keratoplasty is not required

Fig. 13.10
figure 10

Ocular surface reconstruction: limbal autograft and lamellar keratoplasty. Left: In 1986, this 6-year-old boy sustained severe chemical injury OS by alkaline bathroom cleaner. Middle: Following limbal autograft, the cornea is stable without conjunctivalization. Right: 1 year following deep anterior lamellar keratoplasty (DALK), vision is 20/40 as the cornea is clear and avascular. This patient returned for more than 20 years of follow-up with a continuing stable visual and anatomical outcome, attesting to the longevity of transferred limbal stem cells

Fig. 13.11
figure 11

Ocular surface reconstruction: simple limbal epithelial transplantation (SLET). Two ocular chemical injury cases with 1-year postoperative follow-up demonstrate remarkable ocular surface and visual recovery. (Reproduced courtesy of V Sangwan and S Basu)

Fig. 13.12
figure 12

Ocular surface reconstruction: limbal allograft. Top left: A 32-year-old woman with Stevens-Johnson Syndrome and active corneal neovascularization. Bottom left: Following limbal transplant from identical twin sister, corneal surface is restored. Top right: A 64-year-old man following bilateral chemical injury. Bottom right: After limbal graft from brother, followed by PK, cataract + IOL and iridoplasty, corneal clarity improved. Patient lived 30 years thereafter with adequate functional vision

Third, by the mid-1990s, the role of amnion membrane, both as a collagen matrix and especially for its intrinsic epithelio- and neurotrophic growth factors, anti-inflammatory substances and inhibitors of neovascularization, afforded a paradigm shift in the management of persistent epithelial defects, neurotrophic keratopathy and conjunctival replacement, among multiple other ocular surface applications. Various preservation and application methods have been devised by Tseng and others [8], and with respect to keratoplasty, adjunctive amnion applications for limbal transplantation and for neurotrophic corneas have greatly improved these otherwise poor surgical prognosis situations [9] (Figs. 13.13 and 13.14).

Fig. 13.13
figure 13

Amnion membrane and DALK for high-risk keratoplasty. Left: Following herpes zoster, neurotrophic keratopathy with PED, stromal thinning and scarring reduce visual acuity to 20/400. Top right: Following DALK with amnion membrane overlay (remnants present centrally) and lateral mini-tarsorrhaphy. Bottom right: Eventual corneal surface stability improves visual acuity to 20/30 and allows tarsorrhaphy release

Fig. 13.14
figure 14

Amnion membrane and DALK for high-risk keratoplasty. Top left: A 36-year-old male with herpes zoster reducing vision to hand motions due to dense corneal scar. Bottom left: Higher magnification resolves active stromal neovascularization releasing lipid and cholesterol into the stroma. Right: At 6 months after DALK, amnion membrane overlay and mini-tarsorrhaphy, followed by cataract extraction, vision improves to 20/40. Note the residual deep cholesterol deposits, which by 12 years postoperatively had resorbed completely. Neovascularization has never recurred

Back to the Future: The Keratoplasty Revolution of Selective Lamellar Keratoplasty: 2000–Present

For the half century since the initiation of modern corneal transplantation, keratoplasty almost invariably meant penetrating keratoplasty. Although increasingly successful, the common risks and comorbidities of PK, such as graft failure or rejection, glaucoma, cataract, and high or irregular astigmatism, begged for less invasive and less vulnerable techniques to improve vision, speed recovery and reduce complications. Despite even earlier interest in lamellar keratoplasty (LK), dating to von Hippel in 1888, the technical difficulties and imperfect visual outcomes limited its adoption even for keratoconus and stromal scars. However, with the new millennium came the Keratoplasty Revolution, shifting from “open sky” full-thickness corneal surgery to selective tissue layer replacement targeted to disease-specific indications. Thus anterior lamellar keratoplasty (ALK) for stromal replacement and endothelial keratoplasty (EK) for endothelial failure have consequently and continually reduced PK surgery, such that by 2013, in the United States, EK procedures had exceeded PK procedures.

Revival of Anterior Lamellar Keratoplasty: Historically among the first successful keratoplasties, ALK offered solely stromal tissue replacement for scarring and keratoconus without risk of immune-driven endothelial rejection. Although the visual results of early ALK were hampered by interface irregularity, the breakthrough of deep ALK (DALK), predominantly thanks to Anwar’s “Big Bubble” technique, allowed dissection to the level of Descemet membrane and offered major visual advantages to surgeons able to master the technical challenges (Figs. 13.13, 13.14 and 13.15). ALK is especially advantageous in the developing world where donor cornea supply is limited and possibly of lower endothelial quality. Even in developed areas, such as the U.S., for example, 40% of potential donor corneas are discarded, predominantly for unhealthy endothelium [10]. As ALK relies solely on clear stromal matrix but not on viability of donor epithelial or endothelial cells, long-term preservation methods further extend the feasibility and economy of this approach. Gamma-irradiated corneal tissue, for example, the Vision Graft Sterile Cornea (VSC, Tissue Banks International), affords such an alternative to fresh tissue. Although limited to ALK and corneal patch grafts, this tissue can also be utilized to reinforce leaking conjunctival filtering blebs, exposed glaucoma shunt tubes or eroded IOL haptics or suture materials [11, 12]. Such grafts also have a room-temperature shelf life of at least one year, thereby enhancing availability and distribution, and their sterility reduces risk of donor-to-host microbial infection. Additionally, with the advent of femtosecond lasers, the keratoplasty incisions for both PK and ALK can be customized in zigzag or mushroom configurations to facilitate earlier suture removal and reduced astigmatism [13]. The femtolaser can also facilitate determining the optimal lamellar dissection plane for the Big Bubble DALK technique, but its use is hampered by limited access, substantial cost and application only within clear stroma.

Fig. 13.15
figure 15

Deep anterior lamellar keratoplasty: In a young patient with mucopolysaccharidosis Type I-H (Hurler syndrome), DALK greatly improves clarity of stroma densely infiltrated by storage accumulations while avoiding risks of PK

Trends in Endothelial Keratoplasty: Since the early 2000s, EK has progressively become standard therapy for endothelial dysfunction, notably Fuchs dystrophy, pseudophakic corneal edema, and keratoplasty rejection. The evolution of EK is in itself epic, commencing with posterior lamellar keratoplasty (PLK) by Melles, followed by the deep lamellar endothelial keratoplasty (DLEK) of Terry, then culminating with Melles’s Descemet stripping endothelial keratoplasty (DSEK), which he further advanced in 2006 with Descemet membrane endothelial keratoplasty (DMEK) [14, 15] (Fig. 13.16). Although a highly challenging surgical technique, DMEK is equally highly advantageous, offering optimal visual acuity, rapid visual recovery and reduced allograft rejection risk [16] (Fig. 13.17).

Fig. 13.16
figure 16

Evolution of endothelial keratoplasty. In less than two decades, endothelial keratoplasty has advanced from deep lamellar endothelial keratoplasty (DLEK, left) to Descemet stripping automated endothelial keratoplasty (DSAEK, middle) to Descemet membrane endothelial keratoplasty (DMEK, right). (Reproduced courtesy of T John)

Fig. 13.17
figure 17

Descemet membrane endothelial keratoplasty. Top left: Preparation of the Descemet membrane graft is facilitated by trypan blue staining. Center: The Descemet graft is inserted as a scroll into the anterior chamber. Top right: Manipulation and an air bubble position and secure the graft. (Reproduced courtesy of T John)

As technically daunting as EK itself is the preparation of the donor tissues, requiring thin (100 µm or less) stromal or solely Descemet membrane disc carriers of the delicate endothelium. With the adoption by eye banks of the microkeratome for preparation of uniformly thin posterior stromal/endothelial discs for so-called Descemet stripping automated endothelial keratoplasty (DSAEK), and the extension to DMEK tissue preparation, the increasingly routine provision of “tissue ready” DSEK or DMEK grafts, often supplied within single-use insertion instruments, is indicative of the increasingly adept contribution of eye banking to the Keratoplasty Revolution.

Yet other variations on the endothelial regenerative theme are of merit. For patients with moderate Fuchs dystrophy having largely central guttae and a preserved peripheral endothelium, primary descemetorhexis alone without replacement by a donor graft has attracted major interest (see Kocaba & Colby Chap. 8, this volume). This technique, termed descemetorhexis without endothelial keratoplasty (DWEK) or Descemet stripping only (DSO), involves removal of the central 4–5 mm disc of endothelium and Descemet membrane by manual descemetorhexis alone, thereby stimulating peripheral endothelial cells to migrate and cover the central defect (Figs. 13.18 and 13.19). Although postoperative corneal edema requires weeks to months to resolve as the central endothelial cell layer is regeneratively restored, the visual outcomes of successful DSO can be comparable to those of DMEK-treated eyes [17], and if unsuccessful, subsequent EK can still be favorably performed.

Fig. 13.18
figure 18

Descemet stripping only (DSO). Central stripping of Descemet’s membrane without endothelial keratoplasty allows more viable peripheral corneal endothelial cells (left, specular microscopy) to migrate centrally and repopulate the central cornea (right, slit lamp retroillumination), ultimately resulting in deturgescence and visual clarity. (Reproduced courtesy of P. Veldman)

Fig. 13.19
figure 19

Descemet stripping only (DSO). Left: Slit lamp retroillumination 3 years postoperatively discloses restored corneal clarity and the curved margins of the descemetorhexis. Right: Specular microscopy of the central cornea resolves restoration of the endothelial cell mosaic with normalized endothelial cell count and minimal pleomorphism. (Reproduced courtesy of K. Colby)

Yet another exciting variation on the corneal endothelial regenerative theme is the extraordinary advance of the Koizumi and Kinoshita teams in Japan, where the intracameral introduction of cultured human corneal endothelial cells in the presence of Rho-kinase (ROCK) inhibitor agents has clinically achieved remarkable visual recovery (see Kinoshita Chap. 18, this volume); most recently, even without endothelial cell augmentation, DSO plus topical application of a ROCK inhibitor (ripasudil, Glanatec) has proven successful for Fuchs dystrophy [18].

Keratoconus and Ectasia: Keratoconus (KCN) and post-refractive ectasia are corneal degenerations characterized by progressive stromal thinning and steepening plus irregular astigmatism. Typically commencing in puberty, variably progressing and often stabilizing during the third decade, KCN has an incidence of 0.001–0.03% [19]. Post-refractive ectasia, a rare consequence of surgery-induced mechanical destabilization, occurs in 0.04–0.6% of predominantly LASIK cases [20]. Management of both conditions ranges from spectacle or contact lens use for milder cases to keratoplasty for more advanced cases. A multicenter observational study of KCN determined the 8-year incidence of penetrating keratoplasty to be 15% for patients younger than 40 and 8% for those 40 and older [21].

Commencing approximately 20 years ago, intrastromal rings, arcuate segments of rigid PMMA (Intacs and others), were devised for surgical implantation in the peripheral stroma to support the ectatic cornea and reduce irregular astigmatism (Fig. 13.20), and although not capable of stopping ectatic progression, variable improvement of corneal shape and acuity has reduced the need for keratoplasty. More recently, intrastromal rings have been found to be increasingly synergistic in conjunction with corneal cross-linking (CXL) [22].

Fig. 13.20
figure 20

Intrastromal ring segments. To reduce irregular astigmatism of keratoconus and other ectasias, arcuate PMMA (polymethyl methacrylate) segments (usually 2) are surgically implanted within the mid-peripheral stroma. Although progression of the primary disorder is not arrested, corneal flattening (left) improves refraction and contact lens tolerance

Indeed, CXL, the real game changer for ectasias since the late 1990s [23], is a relatively minimalist procedure capable of halting, but not reversing, ectatic progression. CXL involves application of riboflavin (vitamin B2) solution followed by ultraviolet (UV-A) irradiation of the corneal stroma to produce free oxygen radicals that promote formation of new covalent bonds between stromal collagen fibrils. Clinical results of CXL (see Hersh Chap. 16, this volume) include arrested progression of the condition, vision preservation and possible improvement of topographic shape, thereby greatly reducing the need for future keratoplasty [24].

With the 2016 US FDA approval of CXL, a paradigm shift in the treatment of ectatic corneal conditions (and certain infectious keratitis cases) has occurred. Combined with advances in contact lens materials as well as scleral, hybrid and PROSE lens designs (see Jacobs Chap. 24, this volume), keratoconus management has become far less surgically invasive. Moreover, the integration of CXL, intrastromal rings and even topography-guided excimer refractive surgery in selected eyes lacking extreme corneal deformity and/or refractive irregularity can further improve long-term vision [25].

Perspective on Penetrating Keratoplasty in a Lamellar and Keratoprosthesis World

As selective keratoplasty increasingly replaces the “one size fits all” of PK, the menu and algorithm of choices have become increasingly well defined. Thus, for predominantly ocular surface issues, including LSC deficiency, the performance of LSC transplantation variations can in itself be restorative, requiring augmentation with ALK only when stromal scarring or thinning coexists. The application of ALK, especially in developing areas where infectious or traumatic corneal scarring is prevalent and perhaps utilizing preserved donor stroma, is promising for efficiency as well as economy and especially avoidance of immune rejection. Purely endothelial dysfunction or dystrophy will be entirely within the purview of EK, and increasingly DMEK, at least until the further refinement of DSO plus Rho-kinase inhibitor strategies. Thus, only somewhat more specialized circumstances will merit DSAEK or PK. Specific scenarios in which DSAEK is potentially preferable to DMEK include aphakia, pseudophakia with an anterior chamber IOL, and eyes post-filtering procedures (especially tube shunts) or post-vitrectomy. PK remains more than justified in settings of combined stromal (e.g., scarring or ectasia) plus endothelial (e.g., edema) abnormality and especially where more complex anterior segment reconstruction (e.g., IOL exchange, synechiolysis, iridoplasty, vitreolysis) is mandated. Keratoprostheses remain required for extreme ocular surface conditions (e.g., severe dry eye, lid and sensation abnormality) and multiple immune failures, with potentially broader application pending improved biocompatibility. Hence, the “Compleat Keratoplaster” is obligated to maintain an increasingly broad skill set and to devise from her/his Tool Box the most situation-specific strategy.

The Bright Bioengineered Future

As previously observed, the limited supplies and increasing costs of conventional eye banking and tissue preservation doom the ability of human donor corneas to ever meet demand, especially in the developing world. Fortunately, many previously theoretical approaches are now under active development, including xenotransplantation, tissue-engineered corneas, and cell-based therapies as well as other approaches to biologic corneal regeneration.

Xenotransplantation: Offering potential to close the gap between the need for and availability of transplantable corneal tissue, the initial clinical corneal xenotransplantation was performed by Kissam in 1838 [26], over 70 years before Zirm’s historic first corneal allotransplantation [27]. Given the many anatomical and optical similarities between pig and human corneas, use of porcine corneas has been extensively researched [28]. A few recent studies of clinically relevant pig-to-nonhuman primate models report that although porcine corneas transplanted into rhesus monkeys were rejected, immune suppression with steroids prolonged graft survival [29], and acellular porcine corneal stroma (APCS) grafts both in vitro and in rabbit models showed good histocompatibility and low immunogenicity [30]. Approved for clinical trials in China, a human clinical study utilizing APCS for ALK in fungal keratitis cases demonstrated no recurrent infections, nearly complete epithelialization and substantial visual improvement [31]. Recent progress in genetic manipulation enabling pig tissues to resist the primate immune response suggests that, as the remaining immunological barriers are overcome, pig-to-human corneal transplantation may become a wider reality [32].

Tissue Engineering and Regenerative Biology: The priorities for prosthetic corneal alternatives are transparency, tissue strength, and biocompatibility to improve integration. The culmination of CHD’s 50-year focus, the Boston Keratoprosthesis, FDA approved in 1992 and the most successful artificial cornea to date, is nonetheless limited by lack of biointegration of its polymethyl methacrylate core (also see Dagher Chap. 15, this volume).

Fortunately, interest in tissue engineering and regenerative biology rapidly advances the fabricated cornea cause [33]. In principle, the two primary approaches to tissue engineering are cell-based and scaffold-based. In the former, specific cell types, such as the corneal epithelium and endothelium, play the main role in tissue engineering the construct. In the latter, a stromal scaffold affords a substrate mimicking the microenvironment supporting the constituent cell populations. Hence, many research groups now pursue developing biomimetic, cytocompatible and transplantable stromal replacements, utilizing materials ranging from classical collagen gels, films and sponges to less traditional components such as silk, fish scales, gelatin and polymers [34, 35].

An alternative strategy is to create a corneal implant that is refractively suitable and biointegratable but does not attempt to perfectly mimic the cytoarchitecture of the native cornea. Materials that have been investigated for this concept include natural polymers such as gelatin, semisynthetic polymers such as gelatin methacrylate, and synthetic polymers. The most direct way in which 3-D printing could be employed in corneal implants is to print a scaffold upon which cells are seeded (either in vitro or in vivo), the resultant synthetic or semisynthetic construct employing a “core and skirt”-like model (Fig. 13.21). The concept of 3-D bioprinting stem cells or collagen into a corneal scaffold is additionally promising. The ability to regenerate the cornea with autologous cells could dramatically improve outcomes, as the clinical use of cell-based corneal replacements within tissue-engineered matrices offers major advantages including relative resistance to infection, inflammation, and immune rejection, as well as organic integration into the host tissue and formation of well-differentiated epithelia [36]. A UK group recently reported proof of concept for a 3-D-printed corneal construct of alginate and collagen containing human corneal stromal cells that were able to survive and grow [37], suggesting that tissue engineering, with the aid of 3-D printing and host-cultured components, may be possible. Due to its consistent microstructure, limited cellular components and avascularity, the cornea is a near-ideal candidate to become one of the first ink-printed tissues. The ability to integrate corneal imaging data and create a specific shape and curvature based on the patient’s own optical parameters is additionally exciting.

Fig. 13.21
figure 21

Diagrammatic concept for 3-D printing of an artificial cornea. (a) Image acquisition via ultrasound (US) and optical coherence tomography (OCT) for modeling the artificial cornea and/or gathering measurements relating to the surrounding structures. (b) Image segmentation constructs a 3-D model. (c) The model is converted to standard tessellation language (STL) and sliced. (d) 3-D model printing by inkjet or extrusion-based system. (e) The 3-D model is implanted by a cornea surgeon similar to a standard allograft. (Modified from Fig. 2 of Ludwig et al. [44])

Stem Cells: Stem cell-based therapies may eventually supplant conventional corneal transplantation, even accepting the many challenges posed by each unique corneal cellular layer. As such, intensive research has focused on corneal stem cells as a source of regenerative cell-based therapy, based on increased awareness of the specific stem cell types located in each corneal layer [38].

Our knowledge of stem cells and their role in regenerative medicine has expanded greatly over the past four decades, as use of limbal stem cells (LSC) for ocular surface rehabilitation, first as in vivo autografts [6] and subsequently as ex vivo expansion grafts on amnion or collagen membranes [7], has become clinically routine for the restoration of intrinsic and acquired LSC deficiencies ([7]; see Rosenblatt and Djalilian Chap. 20, this volume). Complementary applications for stromal and endothelial stem cells, however, have remained more theoretical. Nevertheless, with identification of corneal stromal stem cells, their application as a cell-based therapy for corneal stromal scars is promising, as, for example, injection of human corneal stromal stem cells into lumican-null mice can restore corneal transparency [39]. Similarly, identification of a higher density of corneal endothelial precursor cells in the corneal periphery holds promise for promoting in vivo endothelial cell proliferation. Apart from these intrinsic ocular sources of stem cells, several alternative sources outside the ocular milieu have also been identified, including derivatives from oral mucosal epithelium, dental pulp and hair follicle. Oral mucosal stem cells, which notably express limbal stem cell markers, have already been clinically applied for ocular surface regeneration with some success [40].

In addition to such defined stem cell sources, mesenchymal stem cells, notable for their pluripotency, express embryonic and mesenchymal stem cell markers and can differentiate into cells of the three embryonic layers [41]. Such cells have been identified from adipose tissue, amniotic membrane, bone marrow and umbilical cord, and in particular, bone marrow-derived cells applied to animal corneas seem capable of inhibiting inflammation and angiogenesis as well as promoting wound healing [42]. Likewise, adipose-derived stem cells have been demonstrated in rabbit corneal stroma to aid in repair and regeneration through differentiation into functional keratocytes with production of cell-specific products [43]. Further work involves the use of adipose-derived stem cells to differentiate into human corneal endothelial cells.

To summarize, advances in corneal stem cell research hold particular promise for use in regenerative medicine and tissue engineering. Although transplantation of LSC has become increasingly routinized, attention now turns to the use of adult and/or pluripotent embryonic stem cells for corneal epithelial restoration. Additionally, recent developments for human stromal stem cells, their immune privilege and their potential to secrete organized collagen lamellae may be applicable in corneal tissue engineering. This concept should also be extendable to stem cells of the corneal endothelium and the conditions required to stimulate their proliferative activity. Unlike the selective keratoplasty principle of today, tomorrow’s cell-based therapies will target specific cellular strategies to solve specific corneal diseases.

Conclusion

In its nearly 120-year history, corneal transplantation may be divided into three epochs.

The first half century from Zirm in 1906 through the mid-1950s established the all-important proof of concept but was hampered by relatively primitive instrumentation and optics for microsurgery, limited capabilities in eye banking and few anti-infective and anti-inflammatory medications. The ensuing five decades through the close of the twentieth century were hallmarked by linear advances in keratoplasty microsurgical technique and technology, consistent eye banking preservation and networking, plus adjunctive surgical strategies for the ocular surface, anterior segment reconstruction and cataract, as well as broad pharmacologic developments, with which multifaceted strategies could be crafted to suit complex clinical challenges (Fig. 13.22). With the new millennium, the Keratoplasty Revolution has launched two decades of exponential growth, as selective keratoplasty and keratoprostheses have opened the door through which tissue bioengineering and stem cell therapies will advance. Cell-based therapies for each corneal layer will target specific disorders, and a single donor cornea might potentially treat many patients, thereby reducing the global burden of corneal blindness.

Fig. 13.22
figure 22

The creative choreography of keratoplasty. Staged application of various pharmacologic and surgical therapies requires a strategy appropriate to the clinical situation. Top left: This 72-year-old woman with rheumatoid arthritis and keratitis sicca presents with a sterile stromal perforation initially stabilized with cyanoacrylate tissue adhesive. Top right: Progressive stromalysis requires a second tissue adhesive application. Bottom left: 6 months later, lamellar keratoplasty affords tectonic stability but stromal scar limits vision. Bottom right: After an additional 6 months, penetrating keratoplasty, synechiolysis + iridoplasty and ECCE with posterior chamber IOL achieve 20/50 visual rehabilitation

In viewing this ever-accelerating growth curve of keratoplasty, we can see clearly now that the Best is yet to Come! What a privilege to observe and operate within this ever-exhilarating and advancing scene.