
The concept of the uremic syndrome caused by blood and tissue accumulation of toxic substances normally excreted in the urine was well established by the middle of the nineteenth century [1, 2]. In the late 1800s, renal insufficiency and the accompanying uremic intoxication were treated only by simple and ineffective measures such as bloodletting, dietary changes, digitalis, infusion of normal saline followed by forced diuresis, purgation, and diaphoresis [1, 3]. The turn of the twentieth century was marked by intense research and growth in scientific knowledge that allowed the birth of clinical dialysis, a lifesaving therapy for patients with renal failure.

The Discovery of Principles of Dialysis: Diffusion and Ultrafiltration. Thomas Graham and Henri Dutrochet

The development of peritoneal dialysis in the early 1900s as a form of renal replacement therapy was made possible by remarkable progress in science and medicine during the eighteenth and nineteenth centuries. In the field of physical chemistry, it was Thomas Graham (1805–1869) who completed a vast body of work that included the discovery of the laws of diffusion of gases (Graham’s law: the rate of diffusion of a gas is inversely proportional to the square root of its molecular weight), the investigation of osmotic force, and the separation of chemical or biological fluids by dialysis [4–7]. His work represents the theoretical foundation upon which clinical dialysis could later develop. Graham was born in Glasgow, Scotland; his father wanted him to study theology and enter the Church of Scotland. He became a student at the University of Glasgow in 1819, where he was drawn to the field of chemistry and attended lectures in the subject against his father’s wishes. His passion and dedication for this science later alienated him from his father. Graham became professor of chemistry at numerous colleges, his lectures being attended by aspirants in both chemistry and medicine, as their training was similar at the time. Between 1846 and 1861, he published an important series of papers in the Philosophical Transactions of the Royal Society: “The motion of gases” in 1846, followed by “The motion of gases part II” 3 years later, “The Bakerian lecture on osmotic force” in 1854, and “Liquid diffusion applied to analysis” in 1861 [4]. These studies led him to the innovative distinction between “crystalloids” and “colloids,” which he defined by their ability to diffuse through a semi-permeable membrane and to crystallize. He introduced the concept of the “semi-permeable” membrane and redefined the term dialysis.
In his experiments, he separated solutions containing sugar or gum arabic from water using sheets of vegetable parchment impregnated with starch, acting as a “dialytic septum.” He noted that sugars could cross the semi-permeable membrane and called them crystalloids; gum arabic, which did not cross the vegetable semi-permeable membrane, he classed among the substances he called colloids. He wrote: “The molecules are moved by force of diffusion… It may perhaps be allowed to me to apply the convenient term dialysis to the method of separation by diffusion through a septum of gelatinous matter” [4]. As mentioned by Gottschalk, prior to this new meaning given to the term dialysis, it was used “to describe dissolution of strength or weakness of the limbs, coming from the Greek, to part asunder” [4].

Graham also suggested that animal tissue could be used as a functioning semi-permeable membrane and showed that the rate of diffusion of different molecules is inversely related to their molecular size. He also demonstrated that urea, which is present in the urine, can be dialyzed through semi-permeable membranes. Because of these brilliant discoveries, which proved that solutes can be “dialyzed,” or separated from a fluid using a semi-permeable membrane, he is considered the “father of modern dialysis.”
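Graham’s law, stated in prose above, can be written compactly in modern notation (this is the standard textbook formulation, not Graham’s original symbols):

```latex
% Graham's law of diffusion: the rate of diffusion r of a gas is
% inversely proportional to the square root of its molar mass M.
\[
  \frac{r_1}{r_2} = \sqrt{\frac{M_2}{M_1}}
\]
% Example: hydrogen (M = 2 g/mol) diffuses four times as fast as
% oxygen (M = 32 g/mol), since sqrt(32/2) = 4.
```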

Before Graham, it was René Henri Joachim Dutrochet (1776–1846) who introduced the term osmosis to describe the movement of water through membranes that hamper the passage of solutes but allow the passage of water down concentration gradients of salts [5]. This is an early description of osmotically induced ultrafiltration. Some authors consider Dutrochet the “grandfather of dialysis” because he discovered the principle that explains osmotic ultrafiltration [5].

The Peritoneal Cavity and the Peritoneal Membrane

Egyptian morticians observed the peritoneal cavity as early as 3000 B.C. and recorded their observations in the Ebers papyrus [8, 9]. They described it as a “definite entity in which the viscera were somehow suspended” [8]. In Roman times it was Galen, the Greek physician, who made thorough descriptions of the peritoneal cavity and peritoneum, which he observed while treating the injuries of gladiators [9].

Extensive knowledge of the peritoneal cavity started to accumulate in the last half of the nineteenth century, as the abdomen was often explored thanks to developments in abdominal surgery (see Table 1.1). In the early 1860s, von Recklinghausen [10, 11] comprehensively described the peritoneal cavity, even proposing that it was lined entirely by mesothelial cells and noting its lymphatic drainage. In 1877, Georg Wegner, from the Surgical Clinic of the University of Berlin, published his “surgical comments on the peritoneal cavity” [12]. His observations were the results of experiments in rabbits in which he injected hypertonic solutions of sugar, salt, or glycerin into the animal’s peritoneal cavity and found that the intraperitoneal fluid volume increased. When hypotonic solutions were injected into the peritoneum, their volume decreased. In 1893, Beck described the peritoneal mesothelium and possible connections of the peritoneal cavity with lymphatics [13], and Kolossow described paths between the mesothelial cells, which he did not believe were connected with the lymphatic system [13]. The famous English physiologist E.H. Starling and his collaborator from Guy’s Hospital in London, A.H. Tubby, specifically studied the transport of fluids and solutes across the peritoneal membrane and published their results at the end of the nineteenth century [14]. They determined that solute exchange occurred primarily between solutions in the peritoneal cavity and blood, lymphatic transport being considered negligible. They reproduced Wegner’s results and also studied the transport of indigo carmine and methylene blue, concluding that, as with water, these dyes can cross the peritoneal membrane in both directions.

Table 1.1 Pioneering animal studies of the peritoneal membrane

Further insight into peritoneal physiology was acquired in the early years of the twentieth century. Among the best-known works of that period are those published by Cunningham, Putnam, and Engel. Cunningham [8, 15] studied the absorption of glucose from the peritoneal cavities of rats in 1920 and extensively reviewed peritoneal structure and function in 1926. At the same time, Putnam described the dog peritoneum “as a dialyzing membrane” and brought more evidence that the peritoneum was a semi-permeable membrane that allowed bidirectional water and solute transport on the basis of the principles of osmosis and diffusion. His experiments were complex, including observations on dwell time, flow rate, fluid removal, and exchange of various solutes [16]. Advanced animal studies were done by Engel, who published his conclusions in 1927 [17]. He showed that animals could not tolerate extensive ultrafiltration and that solute clearance is inversely related to molecular size and directly proportional to the flow rate of the intraperitoneal fluid, the peritoneal surface area, and the blood flow. It was at this time that the first attempts at therapeutic peritoneal dialysis were made in humans.

The Birth of Clinical Dialysis

In 1913–1914, Abel, Rowntree, and Turner developed a device they called the “artificial kidney” or “vivi-diffusion apparatus,” using semi-permeable collodion membranes, specifically designed to substitute for the role of the kidneys in eliminating toxic substances when these organs fail [4, 18]. Although they experimented with this apparatus only in animals, their intention was to develop a method of extracorporeal dialysis that could be used in humans. Their work was terminated in 1914 because of World War I. The first human hemodialysis was performed in 1924 in Germany by Georg Haas, who apparently was unaware of Abel’s work in the United States [6]. Also in Germany, Heinrich Necheles had a great interest in “external” dialysis and was searching for a better dialysis membrane for his dialyzers [19]. His work with goldbeater’s skin, a commercial preparation of visceral peritoneum from calves’ abdomens, must have stimulated Ganter to perform peritoneal dialysis [19].

First Attempts at Peritoneal Dialysis – Georg Ganter (1923)

Georg Ganter, from Würzburg, Germany, is credited with the first publication on the application of peritoneal dialysis to treat uremia. He was aware of the hemodialysis attempts of his contemporaries and was captivated by the idea of using a patient’s own natural membranes for dialysis. He considered that application of external dialysis at the bedside would be complicated by the difficulty of establishing the extracorporeal circuit and by the toxic effects of hirudin, which was used as an anticoagulant [20].

In 1923, Ganter published the results of his investigations in humans and animals in his only paper on the subject, entitled “On the elimination of toxic substances from the blood by dialysis” [21]. He described his 1918 attempt to remove uremic toxins in a young man with glomerulonephritis using pleural lavage. He removed a pleural effusion, replaced the fluid with a single infusion of 750 mL of a sodium chloride solution, and noted clinical improvement. The patient nonetheless died a few days after discharge, probably because Ganter did not recognize that uremic toxins would build up again [7]. He then carried out experiments on rabbits and guinea pigs made uremic by ligation of the ureters and found that intraperitoneal instillation of saline solution improved both the symptoms of uremia and blood urea nitrogen levels. To perform fluid exchanges, he used drainage tubes implanted in the peritoneal cavity and instilled saline solutions in volumes of approximately 50 mL, which were left in the peritoneal cavity for about 3 h. After this time the fluid was drained, with an average volume of 10–30 mL being recovered. The procedure was then repeated up to four times. He found that, after each exchange, there was almost complete equilibration of nonprotein nitrogen in the dialysate with blood concentrations and that some of the instilled fluid was absorbed. He also noted improvement in the animals’ uremic symptoms after peritoneal lavage: their appetite and activity level improved after each exchange. Ganter used this procedure in a woman with acute uremia from bilateral ureteral obstruction due to uterine carcinoma: her condition improved transiently after a single intraperitoneal infusion of 1.5 L of physiologic saline. In another patient, in a coma due to diabetic ketoacidosis, he instilled 3 L of saline intraperitoneally, and the patient’s mental status improved transiently.

His unprecedented clinical experience with intermittent peritoneal dialysis was limited, but he envisioned that this procedure could become a new form of therapy and recognized several aspects of primary importance to its applicability: adequate access is essential to maintain good inflow and outflow; peritoneal infection is the most common complication, and the use of sterile solutions can help prevent it; and a large volume of dialysate, for which he suggested 1–1.5 L per exchange, is necessary to remove the uremic toxins. Dwell time also influences solute clearance, and he considered it necessary that the fluid remain in the peritoneal cavity until equilibrium between blood and dialysate is reached. Additionally, he recommended the use of hypertonic solutions to promote fluid as well as toxin removal.

Unfortunately, Ganter did not continue research in the field of peritoneal dialysis, which evolved very slowly in the following years, probably because most of the attempts were unsuccessful in saving the life of patients.

Early Experience in Peritoneal Dialysis (1923–1950)

In 1950, Odel and his colleagues [22] summarized and analyzed the published experience with peritoneal lavage (dialysis) between 1923 and 1948 and formulated recommendations based on the published results and their own experience. They found only five papers published on this topic from 1923 to 1938 (including Ganter’s paper), but as many as 33 papers between 1946 and 1948. No papers were published during World War II (1939–1945), but the number of fatal renal failure cases caused by trauma in both civilian and military patients brought the problem to the center of attention and stimulated research. They identified 101 reported patients treated by peritoneal lavage, including three patients treated by the authors themselves. Sixty-three of the patients had reversible causes of renal failure, 32 had irreversible renal lesions, and two had an indeterminate diagnosis. Of the 101 patients reported in that 25-year period, only 36 survived: 32 patients with reversible uremia, two of the patients considered to have irreversible renal lesions, and two of those with an indeterminate renal diagnosis. The most common causes of death were pulmonary edema (40%), uremia (33%), and peritonitis (15%).

The peritoneal dialysis technique was applied in very diverse ways: 22 of the reported patients received intermittent treatments, with 1–6 lavages per treatment and exchanges of 15 min to 6 h duration, while 75 of the patients received continuous treatment of 1–21 days duration. In four cases the type of intraperitoneal lavage was unknown. There was also a great variety of solutions used for peritoneal lavage, with 14 different types reported: various concentrations of sodium chloride and dextrose solutions, Ringer’s, Rhoads’, Hartmann’s, two modified Tyrode’s, “A,” “P,” modified “P,” Kolff’s, and two unknown solutions.

Rubber catheters introduced into the peritoneal cavity with the help of trocars, as well as glass or stainless steel tubes with multiple perforations, were used for inflow of the dialysis fluid, while large-bore mushroom-tip catheters or stainless steel sump drains “similar to those perforated suction tubes used in operating rooms” were used for drainage of the peritoneal fluid. Catheter complications were very common and difficult to deal with: these included leakage of fluid, especially around the rigid tubes; bacterial contamination of the tubes; outflow obstruction caused by pocketing of the omentum; visceral perforation caused by the rigid tubes; and intra-abdominal hemorrhage. Other complications were also noted: depletion of plasma proteins, sometimes to critical levels, in addition to derangements of acid–base, electrolyte, and water balance. Odel and his colleagues were convinced that the composition of the fluid used for dialysis was of the greatest importance and that the main cause of experimental and clinical peritoneal dialysis failure was the imbalance of water and electrolytes. They advocated the use of a solution that would not change the normal electrolyte composition of the plasma, would permit maximal diffusion of waste products from the blood, would permit mild dehydration (a moderately hypertonic solution), and would not irritate the peritoneum (a solution with a pH close to that of plasma).

It is worth noting that the early investigators were aware that peritoneal lavage aided in the removal of metabolic waste products, but the concept of adequate dialysis had yet to be developed. Frequently, the duration of dialysis was too short, or, if the dialysis was longer, the amount of dialysis fluid was insufficient to achieve adequate removal of waste products.

Although mortality with peritoneal lavage was high, it offered hope of effective therapy for some patients, especially those with reversible causes of renal failure who were able to recover renal function before the peritoneal dialysis procedure failed.

Of the papers published after World War II, the most important are considered to be those of Howard Frank, a surgical intern, Arnold Seligman, trained in chemistry, and Jacob Fine, their mentor and chief of service at Beth Israel Hospital in Boston. They worked under contract with the Office of Scientific Research and Development (OSRD), the federal agency created by President Franklin D. Roosevelt in 1941 to promote research for military purposes in medicine and weapons technology [23]. Their task was to work on treatments for acute renal failure in trauma patients, and because they wanted to avoid the use of anticoagulants, they identified peritoneal lavage as a promising option. As the literature on the use of natural membranes was limited when they embarked on their project, the team began with very elegant studies of peritoneal irrigation in non-nephrectomized and nephrectomized uremic dogs [24]. They calculated the optimal flow rate and volume of peritoneal irrigation fluid to obtain the maximum urea clearance and prevent uremia, compared blood urea clearance by peritoneal irrigation with clearance through the kidneys, and experimented with irrigation of various parts of the gastrointestinal tract and the pleural cavity, which proved to be ineffective means of urea removal. The irrigation fluid used was Ringer’s solution containing glucose, later changed to a Tyrode’s solution in their search for the right formulation. Their uremic, nephrectomized dogs survived for 3–10 days with peritoneal dialysis, and none of them died of uremia, but rather of peritonitis. Their method involved the use of two catheters introduced into the peritoneal cavity, one for inflow of irrigation fluid and the other for drainage.
Continuous irrigation of the peritoneal cavity was done for 20 h daily for 2 days and 8–12 h daily thereafter, with the outflow rate modified to prevent overdistension of the peritoneal cavity. Encouraged by the results of their experimental work in dogs, in 1945 they decided to try the treatment on a patient who presented to the emergency room at Beth Israel Hospital with acute renal failure from sulfathiazole administration [23, 25]. The treatment was successful, and the patient recovered after 7 days of dialysis using the same technique as described above. This technique could be called “intermittently continuous irrigation,” because the fluid was introduced into the peritoneal cavity by continuous irrigation, but there were periods when the irrigation was stopped and so peritoneal dialysis did not take place. It is interesting that their papers make no reference to the first successful use of continuous peritoneal dialysis, in a patient with urinary tract obstruction, by Wear, Sisk, and Trinkle; they were probably unaware of this achievement [22, 26]. Fine’s group’s success became known immediately and spurred others to use their technique. Motivated by their own accomplishment, they continued work on peritoneal dialysis and tried to perfect their technique. They treated 18 more patients, but only four survived [22]. They found that peritonitis was the greatest risk associated with the procedure and the main reason this method was still considered investigational [27]. They improved the irrigation fluid by decreasing the sodium chloride concentration to 0.74% to reduce the risk of hyperchloremia, adding gelatin, and increasing the glucose concentration to raise the fluid tonicity, which allowed them to control edema. Bicarbonate was used in the irrigation fluid to combat acidosis; the bicarbonate solution was sterilized separately and added to the solution before irrigation was initiated.
Their closed system was bulky and seemed complicated, and the procedure required the constant attendance of a nurse (see Figs. 1.1 and 1.2). The dialysis solution was sterilized and administered from special 20-L Pyrex bottles, which required a large autoclave and were difficult to manipulate. The access was somewhat improved by the introduction of a flexible sump drain that could be used as a two-way system if a separate inflow tube was not available. Most of the patients were treated with the continuous-flow technique, but they also used intermittent peritoneal lavage in some patients, with 0.5–2-L fill volumes, depending on patient tolerability, and 15 min to 3 h dwell time [27]. After finishing his internship, Frank remained at Beth Israel Hospital as a thoracic surgeon and member of the faculty at Harvard Medical School, and Arnold Seligman went to Johns Hopkins University School of Medicine, where he pursued careers in both surgery and chemistry [23].

Fig. 1.1
figure 1_1_978-0-387-78940-8

Schematic representation of closed system used by Frank, Seligman, and Fine (From Annals of Surgery 1948, with permission)

Fig. 1.2
figure 1_2_978-0-387-78940-8

Closed system used by Frank, Seligman, and Fine (From Annals of Surgery, 1948 with permission)

The next major step in the development of peritoneal dialysis was the work of Arthur Grollman, from Southwestern Medical School in Dallas, Texas [28]. It is interesting that, in reality, Grollman did not believe in the value of peritoneal dialysis for the treatment of acute renal failure, which he thought could be managed by conservative measures if they were properly applied [28]. His main interest was actually to find a simple way to prolong the life of nephrectomized dogs, which he used to study the role of the kidneys in hypertension [29]. His procedure involved instillation by gravity of the irrigating fluid into the peritoneal cavity of the dogs, using a needle introduced through the flank [28]. The fluid was left in the abdomen for variable periods of time and then removed using a same-size needle connected by an adapter to a rubber tube. The drainage was followed by refilling. The procedure was carried out twice daily, in the morning and late afternoon, and kept the dogs alive for 30–70 days after bilateral nephrectomy, compared with the previously reported average of 10 days. Although he called this technique intermittent peritoneal lavage, it was actually a continuous type of peritoneal dialysis as we classify it today, because there were no periods of a “dry” abdomen. He called it intermittent because it did not involve continuous instillation of the dialysis fluid; rather, the fluid was left to dwell in the peritoneal cavity for various periods of time. He considered that more frequent exchanges were not necessary to prolong a dog’s life, but could further decrease the levels of urea and other catabolites. His kinetic studies showed that urea reaches equilibrium within 2 h after filling the peritoneal cavity of dogs with 1 L of fluid containing different concentrations of glucose.
He also paid attention to the volume of fluid removed by peritoneal lavage using various glucose concentrations in the dialysis solutions and variable dwell times. In humans, he found an equilibrium time of 2 h for urea, electrolytes, creatinine, and glucose, using 2–3 L instillation volumes of dialysis solution. He described the use of his method in five human patients. The dialysis fluid composition was modified according to patient needs, and “intermittent” exchanges of 2 h duration were done for 16–48 h. For the first time, the access used was a plastic tube, “to which omentum does not attach itself” [28]. The single plastic catheter was kept in place for the entire procedure and was used for both inflow, when it was attached through a needle to the infusion bottle, and outflow, when it was connected to an adapter and rubber tube for drainage. One of his patients survived, two others had some improvement in their clinical condition but died after peritoneal dialysis was stopped, and the other two did not improve at all with peritoneal dialysis. He did not encounter any peritonitis in humans; there was one episode in a dog dialyzed for 70 days, due to a break in aseptic technique. Grollman considered his method superior to the continuous lavage previously described because of its simplicity and possibly greater efficiency in removing waste products. His technique did not require “the complex apparatus, multiple incisions, and constant attention necessary when one utilizes a constant perfusion technique as advocated by previous investigators” [28]. Additionally, he considered that continuous irrigation could create a “channeling” of the fluid between the inflow and outflow sites and thus be less efficient in urea removal because of the decrease in the surface area available for exchange.

The Modern Era of Peritoneal Dialysis

The modern era of peritoneal dialysis started with Morton Maxwell in 1959 [30]. Maxwell began training in renal physiology with Homer Smith at the New York University School of Medicine in 1948. After he joined the staff of the VA Hospital in Los Angeles, California, he purchased a Kolff twin-coil kidney machine and started using it. He found this procedure “formidable” and expensive, with narrow applicability owing to the need for dedicated medical staff with special training, who had to work long hours to prepare the machine, deliver the treatment, and clean up after a 6-h-long session [31]. He turned his attention to peritoneal dialysis, found Grollman’s technique promising in its simplicity, and worked on refining it. One of the obstacles he wanted to eliminate was the laborious extemporaneous preparation of dialysis solutions. Access-related complications were also a major limiting factor in peritoneal dialysis, and he started experimenting with different catheters. Maxwell introduced a semi-rigid nylon catheter with a curved tip and numerous tiny distal perforations, which, like plastic catheters, caused less omental reaction than the rubber and metal tubes. Because it was semi-rigid, it did not have the tendency to kink, as did other plastic catheters developed in the early 1950s; by decreasing the diameter and increasing the number of very small perforations at the distal end, he prevented portions of the omentum from entering the catheter, which resulted in better performance. He convinced the Don Baxter Company of Glendale, California, and Cutter Laboratories of Berkeley, California, to produce a standard dialysis solution in 1-L sterile glass bottles, special Y-type administration tubing, and the new type of catheter [29, 30]. The peritoneal dialysis procedure involved insertion of the catheter into the peritoneal cavity through an incision of the abdominal wall, below the umbilicus, using a 17 French Duke trocar set [31].
The catheter was then attached to the Y-tubing, previously connected to 2 L of warmed dialysis solution (see Fig. 1.3). The paired bottles were hung above bed level and the dialysis fluid was allowed to flow into the peritoneal cavity. The tubing was clamped when the bottles were empty, with some fluid still present in the administration tubing, and the bottles were then lowered onto the floor. After a 1-h dwell time, the clamps were removed and the fluid was permitted to flow out of the peritoneal cavity. When the drainage was complete, a new pair of dialysate bottles was connected to the catheter using new tubing. This “intermittent” procedure was continued for 12–36 h as required by the clinical situation and proved to be “mechanically successful” in 76 instances [31]. Maxwell and his colleagues specifically reported only six cases in their classic paper, with five survivors and one death after transient improvement with dialysis. The patients who recovered had acute renal failure, barbiturate poisoning, intractable edema, hypercalcemia, and acute-on-chronic renal failure due to ureteral blockage.

Fig. 1.3
figure 1_3_978-0-387-78940-8

Maxwell’s paired bottle technique (From JAMA 1959, with permission)

The new dialysis solution contained sodium at a concentration of 140 mEq/L, chloride 101 mEq/L, calcium 4 mEq/L, magnesium 1.5 mEq/L, dextrose 15 g/L, and, for the first time, lactate 45 mEq/L. Lactate replaced bicarbonate in the dialysis solution, eliminating the problem of precipitation of calcium salts. Potassium was excluded from the commercial dialysis solution because most patients with acute renal failure had hyperkalemia; if needed in patients with low serum potassium levels, it could be added to one of the bottles using a hypodermic syringe. The 1-L bottles were much easier to handle than the large carboys introduced by Fine’s group and generally used up to that point. If other substances needed to be added to the dialysis solution (potassium, dextrose, prophylactic antibiotics, heparin), they were added to one of the two bottles used for each exchange, a maneuver that, from their perspective, decreased the risk of peritonitis. In fact, they reported that in their experience peritonitis never occurred, although the risk of contamination remained elevated because the system was disconnected at each exchange.
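As a quick arithmetic sketch (not from the original paper), the cation and anion concentrations quoted above nearly balance, as electroneutrality of the solution requires:

```python
# Composition of Maxwell's 1959 lactate-based dialysis solution,
# using the concentrations quoted in the text (all in mEq/L).
cations = {"sodium": 140.0, "calcium": 4.0, "magnesium": 1.5}
anions = {"chloride": 101.0, "lactate": 45.0}

total_cations = sum(cations.values())  # 145.5 mEq/L
total_anions = sum(anions.values())    # 146.0 mEq/L

# The two sums nearly balance; the small residual reflects
# rounding of the published figures (dextrose is nonionic and
# contributes osmotic tonicity, not charge).
print(f"cations: {total_cations} mEq/L, anions: {total_anions} mEq/L")
```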

Their experience in patients with chronic renal failure was unsatisfactory, but they envisioned that, with further improvement of the technique, peritoneal dialysis could become efficient enough to be applied in shorter sessions of 6–8 h duration. Theoretically, chronic patients could be admitted to the hospital at certain intervals and receive peritoneal dialysis treatment, “in the same manner patients with refractory anemia are given transfusions at the present time” [31]. Although the procedure was still not ready for use in the treatment of chronic uremia, the fact that it had become a simpler nursing procedure and that the dialysis solution was commercially available in 1-L bottles allowed it to be accepted and used more commonly as a treatment for acute renal failure. The new catheter used by Maxwell seemed to have fewer complications than previously used catheters and became widely used.

At the same time, at the U.S. Navy Hospital in Oakland, California, Paul Doolan and his team started research in dialysis, stimulated once more by war casualties from acute renal failure and hyperkalemia in the Korean War (1950–1953) [32]. Their goal was again to find a simple way to dialyze patients on the battlefield or at the bedside, and they found that peritoneal dialysis applied using Grollman’s intermittent-flow technique was most appropriate. In the same year, 1959, they published their experience with intermittent peritoneal lavage in ten patients [33]. They used dialysis solutions prepared in the hospital, with a lower sodium content (128 mEq/L), glucose as the osmotic agent, and bicarbonate 28 mEq/L added as a buffer; to avoid precipitation of calcium salts, they administered calcium parenterally. Potassium was added to the dialysis fluid as required by the clinical situation. Doolan and Murphy developed a polyvinyl chloride catheter with a straight intra-abdominal segment with multiple side holes, transverse ridges, and spiral grooves to avoid kinking and omental obstruction. William Murphy was the president of the Cordis Corporation and manufactured this catheter, but it did not become widely used because it was difficult to insert, sometimes even requiring laparotomy. Nevertheless, Doolan and his group used the catheter to successfully carry out intermittent-flow peritoneal dialysis. The work of Maxwell and Doolan and the introduction of plastic catheters and commercially available “rinsing” solutions contributed to the widespread acceptance of peritoneal dialysis in the early 1960s as a clinically feasible technique.

Long-Term Peritoneal Dialysis

The first chronic renal failure patient treated with long-term peritoneal dialysis was Mae Stewart, a 33-year-old black woman from San Francisco who had complications from a recent childbirth [32, 34]. In late 1959, she was referred to Dr. Richard Ruben at Mt. Zion Hospital in San Francisco for management of her renal failure. Earlier that year, Ruben had worked with Doolan at Oakland Naval Hospital, where he acquired the skills necessary to perform peritoneal dialysis. He started Stewart on peritoneal dialysis with the help of his colleagues, doctors A.E. Lewis and E. Hassid. She improved after the first dialysis session, with a decrease in serum creatinine from 20 to 13 mg/dL, but after a week her condition deteriorated again. They decided to leave the catheter in place in case they might need to use it again. As it turned out, the patient had small, shrunken kidneys and chronic renal failure due to glomerulonephritis, so her uremic symptoms returned after several days. She continued to receive in-hospital, weekly peritoneal dialysis treatments, using the same catheter left in place, so the Murphy–Doolan catheter was the first one used for chronic peritoneal dialysis. She was allowed to go home between treatments, where she was able to continue to take care of her family. Sometimes during treatments, she was disconnected from the closed system after inflow and was allowed to ambulate.

She was kept on intermittent or “periodic” peritoneal dialysis for 7 months, and the catheter was replaced only once, at 3 months after starting the treatment. Intraperitoneal antibiotics were administered occasionally to prevent the occurrence of peritonitis. Later during the treatment, the patient developed pericarditis, refused to continue further treatments, and died. Mae Stewart was the first patient maintained on chronic dialysis; she started treatment in January 1960, several months before Clyde Shields started chronic hemodialysis in Seattle in March 1960. Ruben and his collaborators wrote a report of this case and submitted it to the New England Journal of Medicine, but the manuscript was rejected for publication.

During the early 1960 s, many centers were trying to use “periodic” peritoneal dialysis in patients with end-stage renal failure. The results were disappointing, mainly because of frequent episodes of peritonitis due to access infection or contamination during the repeated maneuvers of changing the bottles [35]. Survival was commonly limited to only a few months. As the nylon catheters were found to be unsatisfactory for long-term use, many attempts were being made in different centers to design a safe and easy method to insert an access device that would permit reliable dialysate flow and limit the infectious and mechanical complications. During that time, significant results in chronic dialysis were accomplished at the University of Washington in Seattle, where Belding Scribner and Wayne Quinton were able to maintain end-stage renal disease patients on chronic hemodialysis. However, the number of patients was much higher than what they could accommodate, and some of the patients were running out of sites for hemodialysis access, requiring other forms of chronic therapy. Scribner, like others at that time, thought that peritoneal dialysis could be a good alternative to hemodialysis and invited Dr. Fred Boen, an Indonesian physician working in Holland, to come to Seattle and work on peritoneal dialysis [36, 37]. Boen had become known for his work on the kinetics of peritoneal dialysis, which was the subject of his M.D. thesis and was later published [38]. In January 1962, Boen and his team in Seattle began a program of long-term peritoneal dialysis, one of the first in the world. Around the same time, John Merrill started doing chronic peritoneal dialysis in Boston, at the Peter Bent Brigham Hospital, Harvard Medical School. Both groups presented their 3-month experience in April 1962 at the American Society for Artificial Internal Organs Meeting in Atlantic City, New Jersey [39–41]. At the same meeting, Dr. J. 
Garrett from the Albany Medical College, New York, mentioned that he had maintained a patient on intermittent dialysis for 9 months [42]. The Seattle group developed the first automatic peritoneal dialysis machine, which was designed to minimize the risk of contamination of the dialysis solution at the time of each exchange and to reduce the need for nursing attendance [40]. They returned to the closed system developed earlier by Frank, Seligman, and Fine and designed a similar but automatic one. The sterile dialysis solution was contained in 20-L carboys, from which it was pumped into an elevated reservoir where the fill volume was preset, usually at 2 L. From there, the fill volume would enter the peritoneal cavity by gravity flow. The inflow, dwell, and outflow times were monitored by the system’s timers, which controlled the opening of the clamps and made the procedure automatic. They used a dwell time of 30 min. The disadvantage of the system was that the 20-L glass bottles were bulky and difficult to handle and, again, required special equipment for preparation and sterilization of the dialysis solution. The advantage was that it eliminated the need for frequent system openings during each exchange by replacing the individual 1-L bottles with large carboys containing the sterile dialysis solution; in this way, they decreased the risk of peritonitis by contamination. Later, they used a 48-L carboy, which made it possible to use a single container and a completely closed system for each dialysis session.

Boen’s group, the same as Merrill’s group in Boston [41], tried to use a permanent, indwelling peritoneal device in order to make frequent access into the peritoneal cavity easier. Their idea was to create an artificial and permanent conduit or channel through the abdominal wall, which would allow easy passage of a catheter into the peritoneal cavity and eliminate the need for repeated paracentesis.

Boen’s access was a modification of a system developed by Garrett [40]. It was initially a hollow Teflon tube, later replaced by silicone rubber, which was surgically implanted in the abdominal wall with one end exiting at the skin and the other in the peritoneal cavity (see Fig. 1.4). The hollow tube had two perpendicular discs, one located just below the peritoneum in the peritoneal cavity and the other in the abdominal wall. This tube allowed the repeated introduction of a catheter into the peritoneal cavity. At the end of the treatment, the catheter was withdrawn from the tube, which was then capped. The cap looked like a button at the skin surface, and Boen’s device was later called “Boen’s button” or the “silastic button.” Others tried to create a subcutaneous access button, which required cannulation through multiple stab wounds in the skin [43]. The overall performance of these buttons turned out to be poor. Merrill reported the use of such a device in five patients who received 2 to 17 dialysis treatments [41]; one of the patients had acute renal failure and did not recover, two patients were dialyzed intermittently for 2 months and the other two for 3 months. One of them was even able to do eight dialysis treatments at home, with the help of her spouse. None of the patients developed clinical peritonitis, but all of them had technical failure of the access device: occlusion of the lumen due to fibrous tissue or omentum, bowel penetration, or disruption of the conduit. Garrett was also not getting good results with his button device, even after further improvements [42].

Fig. 1.4

Schematic representation of “Boen’s button”

Kevin Barry and his team from the Walter Reed Army Institute of Research, Washington, D.C., considered the permanent, artificial intra-abdominal conduit a viable access option and, to make it easier to insert without the need for a surgical procedure, they developed a flexible, polyvinyl cannula implanted with the help of a trocar [44, 45]. The cannula had a balloon at the intraperitoneal end, which kept the device in place after it was expanded by infusion of saline. This device was able to accommodate the standard nylon catheter. They recruited 116 investigators from several countries to participate in trials involving these polyvinyl cannulae [45]. Frequent complications were noted, including fluid leaks, separation of the intraperitoneal balloon, massive bleeding, and bowel perforation. The device never gained popularity. Norman Lasker was one of those who used the Barry pericannula with some success [46, 47], but he later abandoned this method in favor of the Roberts and Weston stylet catheter [47].

Other investigators were trying to find ways to use implanted, indwelling catheters without the need for artificial conduits. Gutch, for instance, from the Medical Service and Dialysis Unit of the V.A. Hospital in Lincoln, Nebraska, experimented with long-term catheters of different materials and found that silicone catheters were the least irritating and caused the least protein loss into the dialysis fluid [48, 49]. He reported the use of such silicone catheters for as long as 17 months, which was a significant achievement in survival of peritoneal dialysis patients; of note, this group preferred to dialyze patients daily, rather than two or three times a week, which was the common dialysis schedule during that time. Insertion of the catheter was done as usual, through a 24 French trocar.

In 1964, Boen and his group became convinced that chronic indwelling conduits or catheters of any material were not practical for long-term peritoneal dialysis because of frequent episodes of peritonitis and adhesion formation causing technical difficulties and poor general condition [50]. Their experience in humans was limited to only two patients, neither of whom did well using Boen’s button; their research in rats had demonstrated that polyethylene, Teflon, or silastic indwelling tubes inevitably produced adhesions and infections. As a result, Boen started using the “repeated puncture technique”: a new puncture and a new nylon catheter were used each time the patient was dialyzed. Using this technique in combination with the closed sterile dialysis system and their automatic cycling machine, their second patient had no peritonitis for more than 8 months, compared with the development of peritonitis after 10 weeks when using Boen’s button. The patient was dialyzed once weekly for 14–22 h and maintained a good quality of life. The trocar used for the repeated puncture was the one described by McDonald [51, 52]. Dr. Harold McDonald was a urologist who became familiar with peritoneal dialysis while training at the Peter Bent Brigham Hospital in Boston with John Merrill’s group in the early 1960 s. There he witnessed an unsuccessful catheter insertion for peritoneal dialysis and became interested in developing a tool that could facilitate catheter placement and reduce pericatheter leakage. He designed a smaller, 14 French trocar with a triface pointed tip. The common catheters used at that time were 11 French in size and were introduced using a 24 French Duke or Ochner paracentesis trocar. The new trocar made a smaller hole in the abdominal wall for catheter insertion, which helped diminish leakage around the catheter.

The Tenckhoff Catheter

In 1963, Henry Tenckhoff, a German physician, accepted a fellowship position at the University of Washington in Seattle, where he replaced Charles Mion, who was returning to France [36]. Tenckhoff had also developed his interest in nephrology and dialysis while working in Boston with John Merrill. He could not nurture his interest in dialysis in Germany and decided to return to the United States to continue working in dialysis. In 1964, Boen’s team started training patients for home intermittent peritoneal dialysis using the repeated puncture technique and the Seattle automatic closed system [36, 53]. The patients were trained in the hospital and then sent home with the dialysis equipment; the dialysis solution was prepared in the hospital, sterilized in the 40-L glass containers, and delivered to the patients’ homes at regular intervals. Dialysis was done weekly, usually on weekends. Tenckhoff had to go to the patient’s home, insert the peritoneal dialysis catheter, and start the dialysis treatment, which was carried out for 20–22 h each session. After this time, the patient, with the help of the spouse, would terminate the treatment, turn off the machine, and remove the catheter. Initially they used the McDonald trocar to insert the catheter, but afterwards they started using the Weston and Roberts stylet catheter, which helped further in reducing the problems of bleeding and leakage. The procedure was simple and allowed long-term survival without peritonitis. By 1965, the Seattle group had treated one patient at home for 1 year, and another patient was treated using the same technique, but in the hospital, for 2 years. Soon it became clear that the home peritoneal dialysis patient needed more than once-weekly treatments, and Tenckhoff now had to go to the patient’s residence twice a week to start dialysis. 
Although the previous experience with permanent, indwelling catheters was not favorable, Tenckhoff recognized that, in order to make home peritoneal dialysis a viable procedure, a safe, permanent access to the peritoneal cavity was crucial [54]. Of the previously designed catheters, he believed that the Palmer–Quinton catheter was, with some improvements, most appropriate for chronic use.

Russell Palmer, a Canadian physician from Vancouver, was one of the first to do hemodialysis in North America, starting in 1946 [55]. In the early 1960 s, he also became interested in peritoneal dialysis and became familiar with the work done in Seattle, including the work of Wayne Quinton in developing the silicone arteriovenous shunt for hemodialysis. He asked Quinton to help him design a permanent peritoneal dialysis access and, after experimenting with different materials, they decided to use silicone rubber. Their final product was an 84-cm-long catheter with a lumen of 2 mm [56]. The intraperitoneal portion was coiled and had numerous perforations extending 23 cm from the tip. At the middle of the length of the catheter, there was a triflange step for placing the catheter in the deep fascia and peritoneum. The catheter was introduced surgically into the peritoneal cavity through a midline incision located about 5 cm below the umbilicus. From this level, the extraperitoneal portion was tunneled under the skin, and the exit site was in the left upper quadrant. This long, tunneled portion was designed to decrease the risk of infection due to migration of bacteria from the skin. The external portion of the catheter was capped between dialysis treatments. Although this design was innovative and allowed peritoneal dialysis treatments for more than a year, peritonitis continued to occur [57].

Tenckhoff took this catheter a step further and designed the access that is even today most commonly used for peritoneal dialysis [58]. The most important improvement was the addition of two Dacron felt cuffs, obviating the need for the triflange step, which was eliminated from the new design [54]. At that point, it was recognized that Dacron felt attached to a catheter improves tissue fixation and permits tissue ingrowth, thereby creating a barrier that reduces the chances of infection. McDonald also created a permanent silicone peritoneal catheter equipped with a Teflon velour skirt in the subcutaneous tissue and a Dacron sleeve in the intramural portion [59]. After extensive animal studies, Tenckhoff and Schechter decided that they would use a silicone catheter of 40 or 75 cm length [54]. Two Dacron felt cuffs were attached to the silastic catheter in two places, dividing the catheter into three portions: the intraperitoneal portion was a straight 20-cm tube with 60 perforations in the area within 15 cm of the tip. Some of the catheters also had a curled intraperitoneal section, similar to the one described by Palmer. One of the Dacron felt cuffs was located in the peritoneal cavity, abutting the parietal peritoneum. The intramural section was also tunneled under the skin, but in an arcuate pattern, and varied in length from 45 to 10 cm. They shortened this segment as they felt that the presence of the two cuffs closed the catheter sinus tract at both ends and thus decreased the risk of bacterial invasion. The second cuff was placed in the subcutaneous tissue just beneath the skin. They also recommended the arcuate tunnel so that the external part of the catheter and the sinus were directed caudally. In 1968, Tenckhoff and Schechter presented their 4-year experience in eight patients: one catheter had been used without complication for as long as 14 months in one of the patients [54]. 
Although the Tenckhoff catheter did not completely eliminate the risk of peritonitis, it was a major breakthrough and became the most important factor in promoting peritoneal dialysis in other centers.

The Growth of and Disappointment with Intermittent Peritoneal Dialysis

The next limiting step in the widespread use of home peritoneal dialysis was the difficulty of providing an adequate supply of sterile peritoneal dialysis fluid to the increasing number of patients using this technique, together with the patients' problems in handling the large and heavy bottles [60]. The Seattle group was still preparing the dialysate in their hospital’s “fluid factory” and shipping the 40-L containers to the patients' homes. Charles Mion in France was using smaller, 10-L plastic containers connected in series for closed-circuit peritoneal dialysis [61]. The next proposal was to design a machine that could make sterile dialysate in the patients' homes, obviating the need for shipping large quantities of dialysate. Harold McDonald from the Department of Surgery – Urology, State University of New York, Downstate Medical Center, Brooklyn, New York, created a system that used tap water and dialysate concentrate, which could be integrated into an automatic peritoneal dialysis machine for hospital or home use [62]. Cold tap water, after going through a purifying and warming system, was mixed with the dialysate concentrate, and the resulting dialysis solution was further sterilized by passing through a 0.22-µm Millipore filter before entering the peritoneal cavity. McDonald presented his system at the American Society for Artificial Internal Organs Meeting in 1969 [62]. At the same meeting, Tenckhoff presented the first system of water purification, which was developed by the Seattle group [60]. The latter experimented with different methods of water or dialysate purification, including bacterial filtration, heat sterilization, and UV-light irradiation, and found that heat sterilization using a pressure boiler tank was the only way to achieve perfect sterilization. 
This system was further improved and allowed production of large quantities of safe, sterile dialysate in the hospital and at home; its disadvantages were its weight and bulk, its high cost, and its requirement for high pressures and temperatures to operate. As a result of progress made in water treatment technology, Tenckhoff and his team were able to design a new, much smaller, and extremely efficient and safe system [63]. This system used reverse osmosis to produce large quantities of sterile, pyrogen-free water from tap water and contributed to the increase in the number of home peritoneal dialysis patients, making the Seattle center one of the largest centers for home intermittent peritoneal dialysis in the 1970 s. In 1973, they reported the experience of 12,000 peritoneal dialysis treatments in 69 patients [61]. By 1977, 161 patients had been treated with dialysis at this center. The other large peritoneal dialysis center in North America at that time was in Toronto, Canada [61]. In Europe, Charles Mion, formerly trained in Seattle, was directing the third most important center in the world, located in Lyon, France [64].

Dimitrios Oreopoulos accepted a position at the Toronto Western Hospital in 1970 to manage a four-bed intermittent peritoneal dialysis program with approximately 16 ambulatory patients [64]. He had acquired knowledge about peritoneal dialysis while training in Belfast, Northern Ireland, where he was using the Deane prosthesis to establish access. At the beginning of his experience in Toronto, he was able to maintain patients on peritoneal dialysis for up to 20 months, and his chronic peritoneal dialysis patient population increased steadily to close to 40 patients in a few years. At the same time, one of Oreopoulos' former colleagues from Belfast, Dr. Stanley Fenton, started working at the Toronto General Hospital. Fenton had trained in Seattle with Scribner and Tenckhoff after leaving Belfast and before coming to Toronto. He had a few Tenckhoff catheters, which he showed to Oreopoulos, who tried them and was so impressed with the results that he abandoned the Deane prosthesis and converted all patients to Tenckhoff catheters. Having this reliable permanent peritoneal access available, Oreopoulos began sending patients home with reverse osmosis systems. During the early 1970 s, the president of American Medical Products visited Toronto and introduced a simpler cycler machine to Oreopoulos, the one designed by Lasker.

Norman Lasker [47, 65] was another pioneer in peritoneal dialysis, who had visited Seattle and studied the automated systems developed by Tenckhoff. He considered them superior to the manual technique and believed that wider application of peritoneal dialysis could be facilitated if simpler machines were available. With the help of the Gottscho Packaging Equipment Company, he designed a simpler “peritoneal dialysis cycler.” Ira Gottscho was a businessman whose daughter had died of kidney disease, and he established a foundation in her memory. Lasker’s peritoneal cycler was simple, efficient, and easy to use: it worked by gravity, eliminating pumps, and used commercially available 2-L bottles of dialysis solution and presterilized disposable tubing and bags. By connecting four bottles, an 8-L reservoir was obtained each time [47, 65].

Oreopoulos was the first to see the value of Lasker’s work and had 40–45 patients use his cycler [64]. Another innovation was also available in Canada: in 1973–1974, Baxter provided dialysis solution (Dianeal) in plastic bags and all patients started using this product, which became available in the United States in 1978.

The high cost of care of dialysis patients was an additional factor that held back the dissemination of peritoneal dialysis. In the United States, new renal care legislation was approved in 1972, and Medicare started to cover medical expenses of end-stage renal disease patients in 1973 [47, 66], making peritoneal dialysis affordable. This modality was available not only in the hospital but also as a home therapy, delivered intermittently, usually three to four times a week for about 10 h per session [67]. As more experience accumulated, it became evident that real long-term success could not be achieved with intermittent peritoneal dialysis. In a 1979 analysis of the outcomes of chronic peritoneal dialysis therapy at the Seattle center, the cumulative technical survival rate was 72% at 1 year, 43% at 2 years, and only 27% at 3 years [68]. One of the leading causes of intermittent peritoneal dialysis failure was inadequate dialysis. Different approaches to increasing the efficiency of intermittent peritoneal dialysis had been investigated in humans and animal models and were summarized by Gutman, also in 1979 [69]: increasing the dialysate flow rate from the standard 4 L/h to 12 L/h allowed only a modest increase in clearance and was expensive, and dwell volumes over 3 L were uncomfortable for the patients. Other modalities explored were the use of tris-hydroxymethyl aminomethane (THAM) to increase the permeability of the peritoneal membrane, the use of vasodilators to increase the effective surface area of the peritoneum, and the use of hypertonic solutions to increase solute removal by solvent drag [69]. None of these methods found practical application, and intermittent peritoneal dialysis remained inferior to hemodialysis in terms of achievable small solute clearance. For this reason, peritoneal dialysis was considered a “second-hand” therapy for chronic renal failure until a new form of peritoneal dialysis was born in Austin, Texas.

Continuous Ambulatory Peritoneal Dialysis (CAPD)

In 1975, a young, otherwise healthy patient entered the chronic hemodialysis program directed by Jack Moncrief at the Austin Diagnostic Clinic in Austin, Texas [67]. Each arteriovenous fistula created in this patient failed and, in the absence of a hemodialysis access, he was advised to move to Dallas, where an intermittent peritoneal dialysis program was available. He refused to relocate, and his doctors faced losing this young father of four children because of the impossibility of providing life-saving dialysis therapy.

One of Moncrief’s collaborators was Robert Popovich [67], a young biomedical engineer formerly trained in Seattle under Belding Scribner and Albert Babb. The case of their unfortunate patient was reviewed during a routine weekly meeting, and the team decided to try peritoneal dialysis in a new form, which would allow complete equilibration of plasma urea with peritoneal fluid and thus maximum urea removal with each dwell. They calculated the minimum volume of dialysis fluid required to remove the urea generated daily on a 1 g/kg protein diet, knowing that dialysate urea equilibrates with plasma urea in 2 h. In a 70-kg man who eats 1 g of protein per kilogram of body weight, daily urea generation will be 7,000 mg per day. They decided that a blood urea nitrogen level of 70 mg/dL was desirable. If the dialysate urea concentration at equilibrium is 70 mg/dL, then 10 L of dialysate are required daily in order to remove the generated urea and maintain a constant plasma urea concentration. The commercial dialysate solutions were available in 2-L glass bottles. Their prescription was for 2-L fill volumes, a dwell time of at least 3 h, and a total of five exchanges per day. This prescription was applied to the patient using a Tenckhoff catheter as access, and it improved and controlled the patient’s chemistries, volume, and clinical status [67]. Moncrief and Popovich called this procedure the “portable/wearable equilibrium peritoneal dialysis technique,” and they submitted the results of its first application as an abstract to the American Society for Artificial Internal Organs in 1976 [67, 70]. The abstract was not accepted for presentation, probably because the name was confusing. In January 1977, Moncrief and Popovich attended the National Institutes of Health Contractors Meeting, where they met Karl Nolph [71], who was practicing nephrology at the University of Missouri in Columbia, Missouri. 
Nolph became interested in the new technique and decided to start a collaboration with the Austin group. During their initial discussions, they agreed that “continuous ambulatory peritoneal dialysis” (CAPD) might be a more appropriate name for the new modality [71]. They published their experience with nine patients treated for 136 patient-weeks in a classic article that established the use of CAPD [72]. The procedure was simple and involved the continuous presence (i.e., all day long, 7 days a week) of dialysis solution in the peritoneal cavity. Manual exchanges were done 4–5 times a day, and the dialysis catheter was capped between the exchanges, allowing participation in daily activities. The dialysis was called “portable” or “internal” and did not require the presence of a machine [72]. Other advantages of CAPD were identified: it allowed continuous, steady-state chemistries after a few weeks because there was constant removal of waste products from the body; the procedure could be done by the patient unaccompanied, at home or “anywhere”; dietary restriction was not necessarily severe (later it was recognized that sodium restriction is actually very important); CAPD was better tolerated from the cardiovascular perspective; and larger-molecule clearance was significantly higher compared with hemodialysis. CAPD did not eliminate one of the most important problems encountered in peritoneal dialysis, namely peritonitis. On the contrary, the risk of peritonitis was higher because of the increased number of connections per day. Their patients had peritonitis, on average, every 10 weeks [72].
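The steady-state mass balance behind the original 10-L, five-exchange prescription can be restated as a few lines of arithmetic. The following sketch is a modern illustration, not part of the original work; the variable names are my own, and the figures are taken from the account above:

```python
# Worked version of the urea mass balance described in the text.
urea_generation_mg_per_day = 7000  # 70-kg patient on a 1 g/kg protein diet
target_bun_mg_per_dl = 70          # desired blood urea nitrogen; dialysate urea
                                   # reaches this plasma level after a ~2-h dwell

# At steady state, daily removal must equal daily generation:
#   dialysate volume (dL) x equilibrium concentration (mg/dL) = generation (mg/day)
required_volume_dl = urea_generation_mg_per_day / target_bun_mg_per_dl
required_volume_l = required_volume_dl / 10  # 100 dL = 10 L per day

# Dialysate came in 2-L bottles, giving the prescription of five exchanges daily.
fill_volume_l = 2
exchanges_per_day = required_volume_l / fill_volume_l

print(f"{required_volume_l:.0f} L/day in {exchanges_per_day:.0f} exchanges")
# prints "10 L/day in 5 exchanges"
```

The same balance explains why lowering the target blood urea would demand proportionally more dialysate: halving the acceptable concentration doubles the required daily volume.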

Two further modifications of the technique contributed to the decrease of peritonitis rates and facilitated worldwide acceptance of CAPD: the first was the use of dialysate in plastic bags, introduced by Oreopoulos in Canada, and the second was the introduction of the innovative Y-set connector system by Buoncristiani in Italy [64, 71, 73].

CAPD with Plastic Bags

In 1976, Jack Rubin [64, 71], one of the fellows trained at Toronto Western Hospital under Dr. Oreopoulos, was accepted by Dr. Nolph for further training and research at the University of Missouri in Columbia, Missouri. He became involved in the emerging CAPD program and was impressed with the new technique. A year later, he returned to Toronto and tried to convince Oreopoulos to adopt this new, continuous procedure that seemed to be better than the typical intermittent peritoneal dialysis. Because of the high peritonitis rates, Oreopoulos was hesitant to introduce it, until one of his patients, who had been on intermittent peritoneal dialysis for about 2 years, had to be admitted with complications related to uremia. She was severely underdialyzed, and Oreopoulos decided to give CAPD a try: she was started on CAPD on September 27, 1977 [64]. Her improvement was so dramatic that he decided to convert all of his almost 40 home intermittent peritoneal dialysis patients to CAPD and was able to do this in only a few weeks [71, 73]. Patients' acceptance was excellent, and the patient population continued to grow at a fast rate at his center [64]. Because, at the time, dialysis solutions in plastic bags were available only in Canada, they adopted a slightly different technique: after filling the peritoneal cavity with 2 L of dialysis solution, the tubing connecting the bag with the dialysis catheter was clamped and the plastic bag was wrapped around the patient, without disconnecting the bag from the catheter. After 6 h, the empty bag was placed on the floor, the tubing was unclamped, and the dialysate was allowed to drain by gravity. When the drainage was complete, the bag was disconnected from the system and a new bag was connected to the permanent catheter to repeat the cycle. They initially used the standard Y-set for acute peritoneal dialysis. 
The unused arm of the Y-tube was closed and tightly wrapped with the bag, making the tubing system bulky and uncomfortable. Oreopoulos tried to improve the tubing by eliminating the redundant part and creating a straight tube with a Luer connector at one end for connection with the catheter and a spike at the other end for connection with the plastic bag. After consulting with his Baxter representative, he realized that such a straight tube was already available from the reverse osmosis machine [64]. They started using this tube and developed the Toronto Western Hospital technique for CAPD, known also as the “spike technique” [73, 74]. With this technique, their rate of peritonitis decreased to one episode every 10.5 patient-months [74]. As a result of this remarkable improvement, and with substantial pressure from the groups in Columbia, Missouri, and Austin, Texas, and also from the National Institutes of Health, the Food and Drug Administration (FDA) finally approved the use of the plastic bags in the United States in October 1978 [64, 71].

The Y-Set and “Flush Before Fill” Technique

In the 1980s, Dr. Umberto Buoncristiani [73] from Perugia, Italy, published remarkable results with an innovative Y-set, which produced a significant drop in peritonitis rates, to one episode every 40 patient-months [73]. Buoncristiani was searching for an original technique in part because his patients were refusing to switch from the intermittent modality to the more efficient CAPD, as they found the "wearable bag" distasteful [75]. He was also concerned about the high rate of peritonitis and was trying to develop a system to decrease the risk of infection. He realized that the "contaminating act" takes place when a connection is made between a new bag and the transfer set, followed by the filling phase, when the infused fluid carries bacteria into the peritoneal cavity [75]. He reintroduced the Y-tubing, connected by one arm to the catheter and by the other two arms to bags, one containing dialysate and the other empty. With this technique, after the connections are made and before draining is started, some fresh dialysate is washed directly into the drainage bag, flushing out any bacteria that might have contaminated the tubing at the time of the connection. This is followed by drainage of the dialysate into the empty bag and then filling of the abdominal cavity with the new solution. After the infusion is finished, the two bags are disconnected and the Y-set is filled with an antiseptic. This technique is known as "flush before fill" or "flush after connect," and the system is known as "the disconnect system" [73, 75]. The results were impressive, but they were not easily accepted in North America. The Italian group therefore carried out a prospective, randomized controlled study comparing the Y-set with the standard spike system [76]. 
Their results, published in 1983, were again remarkable: the peritonitis rate was one episode every 33 patient-months in the Y-set group, compared to one episode every 11.3 patient-months with the standard system [76]. A multicenter, randomized clinical trial was then carried out in Canada [77], and the results confirmed the Italian experience: the Y-connector group had one episode of peritonitis every 21.53 patient-months, compared to one episode every 9.93 patient-months in the standard-system group [77]. The Y-set technique was subsequently accepted worldwide as the standard.

With these changes, the use of CAPD increased considerably all over the world, a trend that continued through the early 1990s.

Automated Peritoneal Dialysis (APD)

The use of machines for peritoneal dialysis was set aside for a while, as the CAPD technique proved to be much simpler and more efficient than intermittent peritoneal dialysis [78]. With long-term use of CAPD, however, new problems were discovered: patients were losing motivation after long periods of manual peritoneal dialysis; adequate dialysis was difficult to attain once residual renal function was lost, especially in large patients, requiring an increase in the total volume of daily dialysis solution; and recurrent peritonitis, especially due to touch contamination, remained a problem and one of the main causes of technique failure. Interest in using machines for peritoneal dialysis was re-established in the early 1980s.

Diaz-Buxo and his collaborators [79] introduced an automated cycler to deliver three exchanges at night, during sleep. In the morning, before disconnection, the machine filled the peritoneal cavity with fresh dialysate, to be drained at night when the patient connected to the machine again. The main goal of this modality was to reduce the number of manually performed connections and thus decrease the risk of touch contamination and peritonitis. The procedure was also a continuous one, as fluid was always present in the peritoneal cavity; it was therefore called "continuous cyclic peritoneal dialysis" (CCPD). The dwell time was supposed to be at least 3 h, allowing complete equilibration, and small-solute clearance was comparable to that of CAPD. In essence, the CCPD schedule was a reversal of the CAPD schedule, with the three shorter dwells performed at night and one long dwell during the day.

Around the same time, Price and Suki [80] described an "automated modification of prolonged-dwell peritoneal dialysis" (PDPD), which was comparable to CAPD in improving blood chemistries and had a lower peritonitis rate than CAPD, similar to the results reported by Diaz-Buxo [79].

The continuous development of simpler cyclers, together with patient preferences, drove an increase in the use of cyclers over the years. The technique became more appealing to physicians after the development of the peritoneal equilibration test (PET) as a tool for defining the peritoneal transport characteristics of individual patients [81]. Cycler-based prescription made it easier to deliver an increased number of short dwells for high transporters and to maintain them on peritoneal dialysis. Later, the CANUSA study [82] showed a strong positive correlation between total small-solute clearance and survival. The National Kidney Foundation Dialysis Outcomes Quality Initiative (NKF-DOQI) guidelines, published in 1997, recommended a target weekly Kt/V of 2.0 for CAPD and 2.1 for CCPD, based on the results of the CANUSA study [83]. The need to achieve these high targets drove an increase in APD utilization, because the use of automated machines allowed easier delivery of higher daily dialysate volumes. A subsequent reanalysis of the CANUSA study [84] showed that the decrease in solute clearance over time was caused by uncompensated loss of residual renal function, suggesting that lower peritoneal clearance targets might be appropriate. This was confirmed by another landmark study, the ADEMEX study, in 2002 [85]. Even so, APD use has increased significantly since its reintroduction, probably because it allows positive changes in the lifestyle of dialysis patients.
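The weekly Kt/V target mentioned above can be illustrated with a simple worked calculation. As a hedged sketch (the numbers and the helper function below are illustrative assumptions, not part of the guidelines text): peritoneal Kt/V for urea is conventionally estimated as the weekly drained dialysate urea clearance divided by the urea distribution volume V, i.e., total body water.

```python
def weekly_peritoneal_ktv(daily_drain_l: float, dp_urea: float, v_l: float) -> float:
    """Estimate weekly peritoneal Kt/V for urea (illustrative sketch).

    daily_drain_l: total drained dialysate volume per day, in liters
    dp_urea:       dialysate-to-plasma urea ratio (approaches 1.0 with long dwells)
    v_l:           urea distribution volume (total body water), in liters
    """
    # Liters of plasma effectively cleared of urea per day
    daily_clearance_l = daily_drain_l * dp_urea
    # Scale to a week and normalize by the distribution volume
    return daily_clearance_l * 7 / v_l

# Hypothetical CAPD patient: four 2-L exchanges draining about 10 L/day
# with D/P urea 0.9, and total body water of 35 L.
print(round(weekly_peritoneal_ktv(10.0, 0.9, 35.0), 2))  # prints 1.8
```

In this hypothetical example the patient falls short of the 2.0 weekly target for CAPD, which shows why larger daily dialysate volumes, more easily delivered by a cycler, were seen as a route to meeting the 1997 targets.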

Peritoneal Dialysis Catheters

The Tenckhoff catheter remains the gold standard for peritoneal dialysis access and is the most widely used [58]. Individual dialysis centers' preferences for dialysis catheters are based on their particular experience and on the availability of other catheters. Various improvements have been tried over the years at many centers, with the purpose of finding the design with the lowest rates of mechanical and infectious complications.

In Toronto, Dr. Oreopoulos, in collaboration with Gabor Zellerman, attached three silicone discs to the intraperitoneal segment in order to prevent obstruction and migration of the catheter [39, 64]. This catheter was further improved, and two variations were described a few years later [86]: the Toronto Western Hospital types 1 and 2. Type 1 was a double-cuff straight catheter with two silicone discs attached to the intraperitoneal segment; type 2 was further equipped with a Dacron disc and a silicone ring at the base of the intraperitoneal cuff, meant to improve the seal at the peritoneal hole and prevent leaks [86].

In 1980, Ash et al. [87] introduced a "column disc catheter," which was later abandoned in favor of the "T-fluted" Ash catheter [88]. In 1983, the Valli catheter was described: its intraperitoneal segment was enclosed and protected from omental obstruction by a silastic balloon with many holes, which was also intended to aid self-positioning [89].

The catheter with a permanently bent intramural segment, known as the "swan neck" catheter, was introduced by Twardowski et al. in 1986 and decreased the risk of external cuff extrusion [90]. Two somewhat similar designs were introduced later: the Cruz (pail-handle) catheter in 1992 and the swan neck catheter with an elongated superficial cuff (Moncrief–Popovich) in 1993 [39]. Twardowski and his collaborators at the University of Missouri, Columbia, Missouri, introduced several more modifications [39]: the swan neck Missouri catheter has a slanted flange and bead attached below the deep cuff, to improve catheter fixation and decrease leaks, and the swan neck presternal catheter, introduced in 1992, has a long tunneled segment with the exit site located in the presternal area, a design intended to decrease the rate of peritonitis.

Conclusion

The development of peritoneal dialysis has been a fascinating intellectual, scientific, and medical journey. Table 1.2 illustrates the most important moments in the history of peritoneal dialysis and the most important scientists who made the development of this life-saving treatment possible and applicable to patients afflicted with severe kidney disease. In recent years, there has been a trend toward decreased peritoneal dialysis utilization, more pronounced in the United States and Canada. APD is becoming the preferred peritoneal dialysis modality, with most new peritoneal dialysis patients opting for cycler therapy. At the end of 2004, there were 1,371,000 dialysis patients worldwide, and 11% of them were on peritoneal dialysis. Thirty percent of the 149,000 peritoneal dialysis patients globally, and 60% of those in the United States, were on APD [91]. The decrease in peritoneal dialysis utilization is thought to be multifactorial, but for those who use it there is no doubt that it remains a very important tool in an integrated renal replacement program. Patient survival on peritoneal dialysis appears to be better than on hemodialysis during the first 2 years of treatment [92], and technique survival at 5 years is around 50–70% [93]. Peritoneal dialysis can no longer be viewed as a second-class treatment; with further technological improvements in cyclers, newer biocompatible peritoneal dialysis solutions, and the capability to deliver adequate dialysis in terms of solute and fluid removal, there is hope that peritoneal dialysis will continue to grow.

Table 1.2 Milestones in the development of clinical peritoneal dialysis