As a renal replacement therapy, dialysis, at best, can approximate only a portion of normal renal function (Table 16.1 ). Despite these deficiencies, however, dialysis has remarkably extended the lives of chronic kidney disease patients, some for decades. Although many patients do very well on dialysis, the survival of patients is markedly reduced compared to age- and race-matched people in the general population [1]. The extent to which ongoing uremia in the form of underdialysis contributes to this overall mortality rate in the dialysis population is unknown. Some have suggested that inadequacy in the prescribed dose of dialysis contributes to these high mortality rates [2, 3]. As a result, more attention has been paid to patient outcome and to optimizing total solute clearance.

Table 16.1 Solute removal by dialysis and by the human kidney

Although “adequate” dialysis is crucial for the wellbeing of any dialysis patient, adequacy of peritoneal dialysis, or of any renal replacement therapy, is difficult to define. The word “adequacy” comes from the Latin word “adequare,” which means “to equalize.” Ideally, this would imply that adequate dialysis returns the patient’s lifestyle and life-expectancy to what they would have been had the patient never developed renal disease. “Optimal” dialysis prescriptions are said to be those beyond which there is no further incremental improvement in patient outcome as the dose is increased, while also imparting minimal negative effects on the patient’s quality of life.

One powerful reason for the lower life-expectancy in dialysis patients is that they have at least one serious chronic medical condition (kidney disease) and therefore are “patients” and not healthy individuals. In addition, these patients tend to have multiple co-morbid diseases, particularly of the cardiovascular system, at initiation of dialysis that can also adversely influence outcome [4]. However, in providing dialysis we have some influence on the patient’s total solute clearance and ultrafiltration. Nephrologists must do their best to provide enough renal replacement therapy to be sure that the amount of dialysis delivered is not the rate-limiting step that determines whether the patient lives or dies. This chapter discusses adequacy issues for peritoneal dialysis (PD) in terms of total solute clearance and ultrafiltration. We will try to emphasize that adequacy of dialysis is much more than total small solute clearance.

Which Yardstick for Adequacy of Dialysis Should We Use?

Many of the known clinical manifestations of uremia, such as decreased appetite, metallic taste, nausea, vomiting, pericarditis, pleuritis, and encephalopathy, are obvious. There is evidence to suggest that underdialysis may be associated with hypertension [5] and lipid abnormalities [6], both of which may increase the risk of atherogenesis, cardiovascular disease and mortality. Uremic neuropathy may not be diagnosed until it is far advanced, at which time it may be irreversible [7]. Because of the insidious onset and potentially fatal or irreversible nature of some manifestations of uremia, nephrologists needed a laboratory parameter that would measure the delivered amount of solute clearance while predicting patient outcome. No single substance has been documented to be the “uremic toxin.” Undoubtedly, the clinical manifestations of the uremic syndrome are the result of the synergistic effect of an entire family of uremic toxins of both small and middle molecular weights. Therefore, because there is no single uremic toxin, we have to rely on surrogate markers for uremia. Currently, solutes such as urea nitrogen, creatinine, and β2-microglobulin are commonly used.

The Case for Small Solute Clearance

It is a common clinical experience that, if PD patients manifest uremic symptoms, they usually improve after increasing the volume or number of exchanges per day [8]. Fig. 16.1 demonstrates the theoretical influence of the number of PD exchanges on the weekly solute clearance for a wide range of molecular weights. Since increasing the number of exchanges per day results in a marked increase in small solute (MW < 500 Da) clearance, but only a minimal or negligible increase in large or middle molecule clearance, it appears that small solute clearance, rather than middle or large molecular weight clearance, accounts for at least part of this improvement, and that small solutes are therefore at least partly responsible for uremic toxicity.

Fig. 16.1

The influence of the number of CAPD exchanges on the weekly solute clearance for a range of solute molecular weights derived from a computerized model of peritoneal transport [9]

Total solute clearance (K renal + K peritoneal = K rp) for PD is commonly estimated in terms of urea clearance, as weekly Kt/V urea, or in terms of creatinine clearance, as weekly creatinine clearance (CCr) normalized to 1.73 m2. These calculations assume that a unit of residual renal clearance is equivalent to a unit of peritoneal clearance, although this is not the case. Residual renal GFR is typically estimated as the mean of the renal urea and creatinine clearances, and is felt to approximate glomerular filtration [10]. The equations for calculating total solute clearance can be found in the Appendix.
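To make these definitions concrete, the sketch below computes weekly Kt/V urea and weekly CCr from 24-h dialysate and urine collections. It is a simplified illustration of the standard definitions rather than the full set of equations in the Appendix; the function names and the numeric example are hypothetical.

```python
# Illustrative sketch (not the Appendix equations) of weekly total small solute
# clearance in PD, computed from 24-h dialysate and urine collections.
# All variable names and example values are hypothetical.

def weekly_ktv_urea(dialysate_urea, drain_volume_L, urine_urea, urine_volume_L,
                    plasma_urea, v_L):
    """Weekly Kt/V urea; concentrations must share the same units, V in liters."""
    peritoneal_L_per_day = (dialysate_urea / plasma_urea) * drain_volume_L
    renal_L_per_day = (urine_urea / plasma_urea) * urine_volume_L
    return 7.0 * (peritoneal_L_per_day + renal_L_per_day) / v_L

def weekly_ccr(dialysate_cr, drain_volume_L, urine_cr, urine_volume_L,
               plasma_cr, bsa_m2, renal_urea_clearance_L_per_day=None):
    """Weekly creatinine clearance normalized to 1.73 m2 of body surface area.

    If a daily renal urea clearance is supplied, the renal component is taken as
    the mean of the renal creatinine and urea clearances (an estimate of GFR).
    """
    peritoneal_L_per_day = (dialysate_cr / plasma_cr) * drain_volume_L
    renal_cr_L_per_day = (urine_cr / plasma_cr) * urine_volume_L
    if renal_urea_clearance_L_per_day is not None:
        renal_L_per_day = (renal_cr_L_per_day + renal_urea_clearance_L_per_day) / 2.0
    else:
        renal_L_per_day = renal_cr_L_per_day
    return 7.0 * (peritoneal_L_per_day + renal_L_per_day) * (1.73 / bsa_m2)

# Hypothetical patient: 10 L/day of drained dialysate, 0.5 L/day of urine, V = 38 L
print(weekly_ktv_urea(dialysate_urea=18, drain_volume_L=10.0,
                      urine_urea=60, urine_volume_L=0.5,
                      plasma_urea=20, v_L=38.0))  # roughly 1.9 per week
```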

The Case for Larger Solutes and “Middle Molecules”

In the early days of dialysis, it was recognized that patients receiving peritoneal dialysis had the same, or even improved, correction of the uremic state compared to patients receiving intermittent hemodialysis (HD). This improvement occurred even though the PD patients had higher levels of serum creatinine and urea compared to those on HD. It was postulated that the PD patients likely had better removal of larger molecular weight uremic toxins, the so-called “middle molecules,” through the more porous peritoneal membrane compared to the cellulosic HD membranes. Furthermore, because removal of middle molecules is time-dependent, peritoneal dialysis done 24 h a day, 7 days a week, would have a major advantage over intermittent therapies. Therefore, PD was recognized by many as working in a way different from HD, with the reduced small solute clearance compensated for by better removal of these middle molecules. Indeed, many working in the HD field thought that outcome in HD might be improved by optimizing middle molecule removal, and the “square meter-hour” hypothesis in HD suggested that this should be done by using larger dialyzers (artificial kidneys) and a longer duration of HD. In other words, HD tried to recapitulate PD with longer time on dialysis and better middle molecule removal.

However, the environment changed with the publication of the National Cooperative Dialysis Study (NCDS) in 1981 [11]. This study of the dose of hemodialysis compared longer and shorter times on HD and a higher and lower time-averaged concentration (TAC) of urea in a 2 × 2 factorial design. There were approximately 40 patients in each of the four groups, and the primary outcome was hospitalization for non-access-related reasons. This important study demonstrated that good outcome (remaining unhospitalized) was associated with a lower TAC urea. It is also important to recognize that the number of hours the patient was on HD just missed statistical significance at p = 0.06, and was stated to be not significant in the abstract. This study renewed emphasis on urea removal as an important factor in the outcome of hemodialysis patients. The data from this study were reworked from the ungainly TAC urea to the new term Kt/V urea in the so-called “mechanistic analysis” of Gotch and Sargent [12]. This very important study suggested that, in patients with an adequate protein intake, the results of the NCDS could be explained by a “step function” in which attainment of a per-session Kt/V urea of 0.9 was associated with good outcome, whereas values less than this were associated with “failure,” i.e., hospitalization. (Subsequent re-analyses by others suggested that the relationship between Kt/V urea and outcome could be modeled as a continuous, rather than a step, function [13].) The publication of the NCDS in a high-impact journal and the development of the easier-to-use Kt/V urea refocused adequacy of dialysis toward small solute removal. Middle molecule removal, and its surrogate, time on dialysis, received much less attention. It is interesting to speculate how concepts of adequacy might have evolved if the p value for time on HD in the NCDS had been 0.05 instead of 0.06.

Small Solute Kinetics and Peritoneal Dialysis After the NCDS and Kt/V

Since urea kinetics appeared to be a good prognosticator of adequacy and outcome in patients on HD, many working in PD assumed that these indices could apply equally well to their patients. It appears that the appreciation that PD worked in a way not well-captured by small solute kinetics fell by the wayside for many as part of the new enthusiasm for urea kinetics.

Many studies, therefore, looked at patient outcomes in terms of the relative risk of death or morbidity and their relationship to total small solute clearance. Their significance and relevance to predicting patient outcome have been reviewed elsewhere [14]. The many outcome studies published during the 1990s differed in methodology and in the number of patients enrolled. All tended to conclude that outcome (relative risk of death) was, in some way, related to total small solute clearance. These studies are briefly reviewed below.

Kt/V Data: Small Solute Clearance and Outcome

Theoretical Constructs

In the original description of CAPD, Popovich and Moncrief predicted that an anephric 70 kg patient (total body water or V = 42 L) would remain in positive nitrogen balance when prescribed five 2-L exchanges/day [15]. Based on theoretical data, others felt a patient would need the equivalent of a weekly Kt/V of 2.0–2.25 [16, 17].

Based on theoretical constructs, if maintenance of positive nitrogen balance were the desired outcome, a target weekly Kt/V of 2.0–2.25 would be necessary.

Univariate Analysis

A series of cohort studies attempted to provide clinical validation of the theoretical data described above. Blake et al. [18], using urea kinetics and an anthropometric method to calculate V, found limited value in predicting patient outcome by total Kt/V (patients with a total Kt/V of < 1.5 had a higher relative risk of death). Lameire et al. [19, 20] reported that the mean weekly total Kt/V in a group of 16 patients (most of whom were anuric) who had been on PD for at least 5 years was > 1.89, while DeAlvaro et al. [21] found that patients with a weekly Kt/V of >2.0 were more likely to survive. Taken together, these studies, using univariate analysis, which did not examine the role of other important variables (such as diabetes, cardiovascular disease, and age), suggested that a total Kt/V of < 1.9 was associated with an increased risk of death.

Multivariate Analysis

Using multivariate analysis of data, Teehan et al. [22] suggested that increased patient age, time on dialysis, lower serum albumin levels, and lower weekly total Kt/V were predictive of decreased patient survival. The 5-year survival for patients with a total Kt/V of > 1.89 was >90%. Maiorca et al. [23] evaluated a group of prevalent PD patients who had been on dialysis for a mean of 35 ± 26 months and followed them for up to 3 years. Patients with a mean weekly total Kt/V during the study period of at least 1.96 had a better overall survival. They did not find an increased survival rate for patients with a mean weekly total Kt/V of >2.03 versus those with a total Kt/V of 1.96–2.03. It is important to note that these prevalent patients had a mean residual renal glomerular filtration rate (GFR) of 1.73 mL/min at enrolment into the study. Indeed, this was one of the first studies to suggest that residual renal clearance had an independent effect on patient survival. Genestier et al. [24] retrospectively evaluated 201 CAPD patients followed for 23.95 ± 21.37 months using baseline values only. They found that baseline weekly total Kt/V must be higher than 1.7 for optimal survival, and their data did not support a decrease in the relative risk with any further increase in Kt/V. None of these studies evaluated the effect of the decline in overall solute clearance over time, and it was uncertain as to what benefit, if any, patients would achieve if the weekly Kt/V was higher than the recommended cutoffs.

A prospective multicenter cohort study of incident patients in several participating centers in Canada and the United States (CANUSA) evaluated the association of total solute clearance (Kt/V, creatinine clearance) and nutrition as time-dependent covariates with patient mortality, technique failure, and hospitalization [25]. Baseline residual renal GFR in this cohort was approximately 3.8 mL/min at enrolment into the study (39 L/week). These data suggested that total solute clearance (K rp) predicted outcome. Every 0.1 unit increase in total weekly Kt/V was associated with a 6% decrease in the relative risk of death; similarly, every 5 L/1.73 m2/week increase in total creatinine clearance (CCr) was associated with a 7% decrease in the relative risk of death. Over the range of the clearances studied there was no evidence of a plateau effect. The predicted 2-year survival associated with a constant Kt/V of 2.1 was 78%. The weekly total CCr that was associated with a 78% 2-year survival was 70 L/1.73 m2.

Results of the CANUSA study, therefore, in its first iteration [25] showed a significant correlation between total Kt/V urea or creatinine clearance and outcome, defined as mortality. This finding suggested that PD patients were not really different from HD patients, and that this study was the PD version of the NCDS, showing once again the importance of small solute kinetics on outcome.

Although the CANUSA study gave the most cogent evidence that survival on PD is related to total solute clearance, it is important to note that the results were based on theoretical constructs and two very important assumptions: 1) total solute clearance remained stable over time, and 2) one unit or mL/min of clearance due to residual renal function is equal to one unit or mL/min of clearance due to PD. In fact, total solute clearance decreased over time as residual renal function decreased with no corresponding increase in the peritoneal component (Fig. 16.2 ).

Fig. 16.2

Total solute clearance over time for residual renal function (RRF) and peritoneal clearance (K p) for both Kt/V and creatinine clearance (CCr) in the CANUSA study [25]

Around the time of the publication of the CANUSA study, however, some attention was beginning to focus on the role of residual renal function (RRF). As mentioned above, Maiorca demonstrated that patients with more residual renal function were more likely to survive, compared to those who had less renal function or were anuric [23]. An important study by Diaz-Buxo, using a large PD patient database, demonstrated that, statistically at least, the dose of PD had no impact on patient survival, whereas the amount of RRF did predict this endpoint [26].

Re-analysis of the CANUSA study, which had been seminal in refocusing adequacy of PD on small solute parameters, demonstrated that the survival benefit of higher Kt/V urea and creatinine clearance lay solely in that contributed by the RRF. The dose of PD, similar to the analysis of Diaz-Buxo, had no effect on survival (RR 1.00 for each increment in peritoneal Kt/V urea). However, for each 5 L/week of residual renal GFR, there was a 12% reduction in mortality (RR 0.88) [27]. (It is important to recognize that a weekly GFR of 5 L corresponds to a GFR of just 0.5 mL/min.) Around the same time, similar analyses of other databases came to the same conclusion: when there is RRF, the dose of PD is not associated with improved survival [20]. Within the span of a few years, studies from the Netherlands [28], Hong Kong [29], and the United States [20] all suggested that residual renal function bore a closer relationship to survival than the dose of peritoneal dialysis as measured by small solute kinetics.
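As a simple worked check of the parenthetical conversion above (basic arithmetic, not taken from the original report, using the fact that a week contains 7 × 24 × 60 = 10,080 minutes):

$$\frac{5\ \text{L/week}}{7 \times 24 \times 60\ \text{min/week}} = \frac{5{,}000\ \text{mL}}{10{,}080\ \text{min}} \approx 0.50\ \text{mL/min}$$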

These findings raise two important questions: why is residual kidney function so important, and why isn’t the dose of PD important?

The Importance of RRF

Firstly, it is important to address the possibility that the association between RRF and survival represents an epiphenomenon. It may be that intrinsically healthier patients keep their RRF longer, and they have better survival because they are healthier, and not because of anything that the kidneys are contributing. Another possibility is that those who lose their RRF are selectively transferred to HD, a form of “informative censoring” [30]. However, since the association between preserved RRF and survival is also noted in those patients on HD [31], the issue of “informative censoring” may not be so important.

While ongoing GFR contributes to small solute clearance, it is likely that the renal contribution to excretion of the larger molecular weight uremic toxins is proportionately more important [32]. In other words, measuring renal excretory function only by Kt/V urea, or even creatinine clearance, vastly underestimates it. For example, it has been appreciated for years that RRF is important in reducing levels of β2-microglobulin in dialysis patients [33–35]. So while the RRF may add to the calculation of Kt/V urea or creatinine clearance, these measures do not capture the more powerful effect that this function has on survival.

Another important aspect of RRF is salt and water excretion by the kidneys and the contribution this makes toward maintenance of the euvolemic state. Several studies have presented indirect evidence that this may be the case. For example, ongoing RRF is closely associated with PD patients remaining normotensive. Put another way, loss of RRF appears to predict the patient on PD becoming hypertensive [36, 37]. Another study that examined risk for the development of left ventricular hypertrophy (LVH) in PD patients found that those with the most preserved RRF had the lowest incidence of LVH, and, conversely, patients who had lost RRF had the highest left ventricular mass index [38]. In the reanalysis of the CANUSA study, urine volume was a stronger predictor of survival than GFR [27]. Whether diuretic-induced increases in urine salt and water excretion have the same protective effect is unknown [39].

Finally, it may be that the intrinsic “anti-inflammatory” effect of functioning renal parenchyma that has been described in the general, nondialysis population also applies to RRF in those on dialysis. Studies in the general population have shown that loss of renal function is associated with cardiovascular death in a way that cannot be fully explained by “conventional” risk factors [40]. This effect is so powerful that a patient with chronic kidney disease has a much greater chance of dying than of reaching dialysis-dependence. Whatever effect leads to greater cardiovascular death at a GFR of 30 mL/min compared to a GFR of 60 mL/min may also be operative at a GFR of 1 mL/min compared to a GFR of 10 mL/min.

Why Isn’t the Dose of PD Important?

This is the corollary question that follows from the observation that, in the presence of RRF, the dose of PD does not predict survival. There are a number of different explanations that may not be mutually exclusive.

Dose of PD Is Important but Is Statistically Eclipsed by the Overwhelming Effect of RRF

This indeed has face validity, and is supported by studies in anuric patients that suggest that, in the absence of renal function, dose of PD as measured by small solutes [41] or ultrafiltration [42] does contribute to survival. (See section on anuric patients, below.)

Increased Dose of PD Is Important but Will Not Statistically Affect Survival in a Study with Relatively Small Numbers of Patients and Short Follow-up

Compared to studies in other subspecialties, such as cardiology, studies in dialysis have relatively small numbers of patients. Further, power calculations often tend to predict a mortality rate higher than what eventuates, leading to an inability to find statistical significance. The ADEMEX study was a randomized, controlled study of dose of dialysis in incident and prevalent patients in Mexico [43]. Close to 1,000 patients in total were assigned to either the control group (2 L four times a day) (n = 484) or the “intervention” group (n = 481), where the dose of PD was increased to the Dialysis Outcomes Quality Initiative (DOQI) target at the time of the study of a weekly peritoneal creatinine clearance of 60 L. The two cohorts were well matched in all aspects, and approximately 55% of patients in both groups were functionally anephric. During this study, the control group continued on the 2 L four times a day regimen. In the intervention group, 64% of the patients received four exchanges of 2.5 L, and the rest four exchanges of 3 L. Subsequently, 22% of the intervention group was assigned a fifth daily exchange via a night exchange device to try to get the peritoneal clearance to target. Despite all these machinations, only 59% of the patients reached the target peritoneal creatinine clearance of > 60 L/week. The use of the larger fill volumes and the night exchange device also resulted in greater daily ultrafiltration in the intervention group.

There were 157 deaths in the control group and 159 in the intervention group. The primary analysis demonstrated that the relative risk (RR) of death in the intervention group was 1.00 compared to the control group. In other words, there was no difference between the two groups, despite the clear separation achieved in small solute removal. When patients were stratified according to a number of other predefined characteristics, including age, body size, anuria, serum albumin, diabetes, and nPNA, there was still no difference in outcome between the two dosing groups. Cox multivariate analysis revealed that age, the presence of diabetes, serum albumin, and residual renal function (but not dose of PD) were all predictors of survival.

One question about the negative results of this study is whether a difference could have been seen using more patients or longer follow-up. Certainly the survival curves do not show evidence of divergence that might have become important with longer follow-up. The intervention group did have fewer deaths attributed to congestive heart failure (5.7% versus 13.4% in the control group), but the intervention group also had greater daily ultrafiltration, so this may be a volume effect, not a result of more small solute clearance. Also, the deaths were more often attributed to “uremia/hyperkalemia/acidosis” in the control group (12.2%) compared to the deaths in the intervention group (5.1%). However, it is important to remember that the physicians were not blinded to the dose of dialysis their patients received, and may have been more inclined to assign the cause of death associated with underdialysis to the patients they knew to be in the control group.

A second randomized, controlled trial of dose of dialysis and outcome was performed in Hong Kong [44]. Patients were randomized into three different dose groups of PD. There were just over 100 patients in each group. Despite randomization, there were some differences in baseline characteristics that bordered on statistical significance. The three assigned total Kt/V ranges were 1.5–1.7; 1.7–2.0; and >2.0. As with the ADEMEX study, there was a statistical difference in dose of dialysis received over the course of the study. The 2-year survival was 87.3, 86.1, and 81.5% (the last being the highest-dose group), values not statistically different from each other. Indeed, the only secondary outcome difference was that the group receiving the lowest peritoneal dose was more likely to receive erythropoietin treatment after 1 year of treatment. Having said that, however, this group also had the highest serum hemoglobin at the last follow-up of 31 months (10.0 versus 8.8 versus 8.9) [44].

Greater Dose of PD May Improve Well-Being or Quality of Life but Not Affect Survival

It may be difficult to prove a difference in survival by modification of dose of dialysis over a 2-year period. Patients with serious co-morbidity will be prone to succumb to their illness regardless of the dose of dialysis, and those with little co-morbidity may tolerate underdialysis without dying, or be censored from follow-up because of supervening renal transplantation. Interestingly, in the HEMO study, which also examined the effect of dose of hemodialysis on mortality [45], the effect of low versus high flux was seen only in the subgroup of patients who were on dialysis longer than the median time of 3.7 years. One explanation may be that in the group remaining on HD for 3.7 years or longer, those with serious co-morbidity have died, and those with little co-morbidity may have been transplanted. In other words, it is possible that the dose of dialysis (in this case high versus low flux) is important only in the subset of patients with “moderate” co-morbidity. This remains conjectural.

We have witnessed poorly adherent patients with little co-morbidity who are obviously clinically underdialyzed and may have complications related to that (erythropoietin resistance, hypertension, hyperparathyroidism) and may look and function poorly, but who do not necessarily die within 2 years. Therefore, 2-year mortality may not be the best measure when examining for the effect of solute clearance on outcome. Also, an increase in the dose of dialysis may have beneficial effects other than prolonging survival. One small study showed that increased dose of PD in patients with complaints construed as uremic resulted in resolution of many of the symptoms. The benefit may have been the result of a placebo effect, as the study wasn’t blinded, but it does suggest the possibility that more dialysis can make the patient feel better, despite not affecting the “hard” outcome of survival [8]. Neither a prospective multicenter study in the Netherlands [28] nor data from the ADEMEX study [46] were able to show an association between dose of peritoneal dialysis and benefit in quality of life. In these studies, however, there was a correlation between the amount of residual renal function and quality of life.

Dose of PD and Outcome in Patients without RRF

It is clear from the review of the studies cited above that the presence of RRF has an overwhelming influence on mortality, and this may be part of the reason why the dose of PD is not significant. In patients without RRF, however, dose of PD should be statistically and clinically more important to outcome. If one were to imagine a hypothetical study in anuric patients receiving just one exchange a day versus the usual full CAPD or APD prescription, it would be much more likely that there would be a difference in endpoints of morbidity and mortality.

Studies of PD in patients without renal function support the assumption that there is an effect of PD on outcome with respect to small solute clearance parameters (in many but not all studies) and ultrafiltration. However, in the studies that demonstrate this association of dose of PD and survival in anuric patients, it is not a continuous function. In other words, dose is important to outcome to a point, after which higher doses are not associated with further improvement in survival.

Many studies have suggested that small solute dosing targets, especially creatinine clearance, are difficult to obtain in the absence of renal function. (Whether the inability to reach these targets will affect patient outcome is a separate issue, discussed later.) A cross-sectional study of 147 PD patients receiving the standard four exchanges of 2 L found that only a minority could reach even urea targets [47]. In a subsequent study by the same group, increasing the dialysis volume was only modestly effective in enabling the patients to reach target [48]. Forty percent of patients were unable to reach Kt/V targets in a UK study, even with individualization of the dialysis prescription [49]. A small point-prevalence study, however, of just under 50 anuric patients in a large Canadian PD program demonstrated that adequate Kt/V and creatinine clearance targets could be obtained with increased prescription of either CAPD or APD, especially in larger male patients. Once again, outcome measures were not examined [50]. A retrospective analysis of data on 122 functionally anuric CAPD and APD patients found that the majority of patients were able to reach the then-DOQI target for Kt/V urea, but only approximately 35% could obtain the target for creatinine clearance. Given that the renal contribution to the excretion of creatinine is more substantial than the renal urea excretion, it is perhaps not surprising that creatinine targets were harder to reach in this cohort without kidney function. Follow-up of the patients showed that 27 patients died, 30 were transferred to hemodialysis (mainly for peritonitis, not inadequate dialysis), and nine were transplanted. The rest remained on PD [51]. No relationship was found between small solute clearance parameters and technique survival. However, there was an overall patient survival advantage for those with a weekly Kt/V urea greater than 1.85 (RR for mortality 0.54, p = 0.10). The peritoneal dose effect was statistically significant when adjusted for age, sex, the presence of diabetes, and multiple other co-morbidities.

Szeto et al. examined the effect of dose of PD and outcome in a single centre in Hong Kong [41]. One hundred and forty CAPD patients were followed prospectively for a median period of 20 months once they had become anuric. During that time there were 46 deaths, and actuarial patient survival at 24 months was approximately 70%. By multivariate analysis, independent predictors of survival included the presence of diabetes, duration of dialysis before anuria, serum albumin concentration, and the dose of PD, measured either by Kt/V urea or creatinine clearance. The authors concluded that peritoneal clearance does have an effect on clinical outcome when there is no renal clearance [41].

The NECOSAD database in the Netherlands recently reported on the outcome of 130 patients who became “anuric” (urine volume < 200 mL/day, a generous definition of anuria) over the course of the study [52]. The statistical approach can affect the outcome and is rather complex. Most of the patients were on CAPD, and not cycler dialysis. When the small solute parameters were analyzed as continuous variables, or divided into quintiles, there was no association with survival. However, as in the Toronto study described above [51], when a predefined cutoff for Kt/V urea or creatinine clearance was used, it was found that values below a Kt/V urea of 1.5 or a creatinine clearance of 40 L/week were associated with an increased risk of mortality. These values were lower than those in the Toronto analysis. Two-year survival was 67% in this cohort, compared with 84% in the whole NECOSAD group. Other predictors of survival included older age, more co-morbidity, longer duration of dialysis before “anuria,” and low serum albumin. These predictors were similar to those found in the Hong Kong study. The NECOSAD analysis also suggested that ultrafiltration volume was associated with better outcome, although again the statistical analysis for this is rather complicated [52]. Furthermore, given that the reanalysis of the CANUSA data showed that every 250 mL of urine output was associated with a 36% reduction in mortality [27], using a 24-h urine output of less than 200 mL to define anuria may have confounded the analysis by including patients who still derived benefit from some residual renal function.

The European APD Outcome Study (EAPOS) prospectively examined outcome in anuric cycler patients in multiple centers [42]. Patients received a dosing regimen of APD to try to reach predetermined targets of a total weekly creatinine clearance of 60 L and a daily ultrafiltration of 750 mL. The cohort included 177 patients, 31 of whom died over the course of the study. In multivariate analysis, predictors of survival included age, co-morbidity, and malnutrition as assessed by Subjective Global Assessment. Increased ultrafiltration at baseline was associated with better survival, but dose of dialysis by small solute parameters did not predict survival. Interestingly, time-dependent analysis of residual renal function in this almost-anuric population came close to reaching statistical significance, again pointing to the powerful effect of even the smallest amount of renal function. Unlike the other studies cited, time on dialysis before entry into this study did not influence survival. The authors suggested that there is a minimum level of small solute clearance necessary to prevent uremic complications, but once that is reached, perhaps other risks such as volume overload or vascular disease become more important [42].

Indeed, the conclusion from the EAPOS is congruent with the other studies on anuric patients. A unifying hypothesis could be that the dose of PD in anuric patients is important to a point; higher doses may not have an impact on survival. However, this hypothesis does not directly address the following question: if residual renal function is so strongly predictive of survival in patients on PD and eclipses PD prescription, what should we do with the PD patient once the residual renal function is gone? Demonstrating, as these studies have, that there is a plateau effect of PD dose and outcome doesn’t really address the possibility that this is a high-risk population who might be better served by a change to hemodialysis, with its attendant higher weekly small solute clearance, that might compensate for the loss of residual renal function. (Having said that, however, there is a small body of literature that suggests that residual renal function is important in patients in hemodialysis also [31], and that dose of hemodialysis may not affect outcome when there is significant RRF [53].) This is totally conjectural, however, and could only be addressed by a study that randomized anuric PD patients to either continuing on PD or changing to hemodialysis. This study is unlikely to be done, and there are likely important quality-of-life issues associated with a change of dialysis modality, especially if it entails moving from a home-based therapy to in-center hemodialysis.

Therefore, the anuric PD patient needs to be monitored closely, with particular attention paid to achieving a minimum weekly small solute clearance (Kt/V urea 1.5? 1.7? 1.85?) and maintaining sufficient ultrafiltration and dietary adherence to keep the patient euvolemic. Furthermore, since removal of middle molecular weight uremic toxins is time-dependent, anuric PD patients should receive dialysis 24 h a day, if at all possible, to maximize removal of these toxins.

Current Recommendations for Removal of Small Solutes in Patients on PD

The International Society of Peritoneal Dialysis (ISPD) commissioned a working group with representation from North America, Asia, Australia, and Europe to formulate recommendations for the delivery of adequate peritoneal dialysis, taking into account the more recent studies cited in this chapter. Their recommendations were published in 2006 [54]. The recommendations focus on targets for the removal of both solutes and fluid. With respect to small solute removal, it is recommended that the total (renal and peritoneal) weekly Kt/V urea should be greater than 1.7. In the face of residual renal function, the renal and peritoneal Kt/V urea can be added together, although the contributions of renal and peritoneal clearances are likely very different from each other, and simply adding the two is an oversimplification. While it was felt that a separate recommendation for creatinine clearance was not necessary, it was recognized that in patients on a cycler a weekly minimum creatinine clearance of 45 L/1.73 m2 should be reached. (The reason for adding a creatinine clearance target for patients on a cycler stems from the temptation to provide rapid cycles that could increase the removal of urea but compromise removal of more slowly transported, larger uremic toxins.) Peritoneal dialysis should be carried out 24 h a day, except in special circumstances, because of the time-dependence of the transport of larger molecular weight toxins. Although a daily ultrafiltration target was not specified, it was emphasized that maintenance of euvolemia is an important part of adequate dialysis, and attention must be given to urine and ultrafiltration volume [54]. Within the confines of financial and adherence realities and limitations, the dose of peritoneal dialysis should be increased as a trial in any patient who is not doing well because of underdialysis or for uncertain reasons.

The National Kidney Foundation in the United States published similar recommendations in 2006 [55]. Furthermore, both the peritoneal and hemodialysis guidelines had sections on the preservation of residual kidney function. Specifically, it was recommended to try to avoid nephrotoxins and other renal insults in the dialysis patient with significant RRF (defined as a urine output of > 100 mL/day). Two small but carefully done studies have suggested that the use of angiotensin converting enzyme (ACE) inhibitors [56] or angiotensin receptor blocker (ARB) agents [57] may help to preserve RRF in patients on peritoneal dialysis. Therefore, the 2006 KDOQI iteration also recommends the preferential use of these agents for the treatment of hypertension in PD patients with RRF, and that “consideration” be given to the use of these agents in the PD patient without hypertension, in order to preserve the RRF [58].

The Australian guidelines put forward by CARI (Caring for Australians with Renal Impairment), published on the Internet in 2005, recommend that the weekly total Kt/V urea be ≥ 1.6, and that the creatinine clearance be no less than 60 L/week in high and high-average transporters and 50 L/week in low and low-average transporters (www.cari.org.au). The Renal Association in the United Kingdom has similar recommendations (1.7 and 50 L/week) and also recommends that, if ultrafiltration does not exceed 750 mL/day in an anuric patient, consideration be given to a change in therapy (www.renal.org). The European Best Practice Guidelines published in 2005 have the same target Kt/V urea [59]. A minimum creatinine clearance of 45 L/week/1.73 m2 was recommended for those patients on a cycler, and there is an ultrafiltration goal of 1 L/day for those patients who are anuric [59]. While it is important to emphasize the maintenance of euvolemia, especially in patients without RRF, recommendations from groups other than those from Europe/United Kingdom eschewed an arbitrary ultrafiltration goal because of the variability of salt and water intake among patients.

Peritoneal Membrane Transport Characteristics

The first step in tailoring an individual patient’s peritoneal dialysis prescription is to know that patient’s peritoneal membrane transport characteristics. Unlike HD, where the physician has a wide menu of dialyzers to choose from for each individual patient, peritoneal dialysis patients are “born” with their membrane. A change in peritoneal dialysis dose can usually be accomplished only by changing dwell time, dwell volume, or the number of exchanges per day. At present there is no clinically proven way to alter membrane transport.

Solute clearance by PD is related to the dialysate to plasma ratio (D/P) of the solute multiplied by the drain volume (DV). Therefore, patients with small drain volumes tend to have lower clearances. It is important to point out that rapid transporters of creatinine/urea also tend to be rapid absorbers of dialysate glucose. Therefore, in these patients, although the D/P ratios of urea and creatinine at 4 h or longer dwells tend to be close to unity, their drain volumes tend to be small due to reabsorption of fluid once the glucose gradient is dissipated (Fig. 16.3). With dwell times associated with standard CAPD, rapid transporters may have drain volumes that are actually less than the instilled volume. For these patients, short dwell times are needed to reduce or minimize fluid reabsorption and optimize clearances. In patients who are slow transporters, peak ultrafiltration occurs later during the dwell and net ultrafiltration can be obtained even after prolonged dwells. In these patients the D/P ratio increases almost linearly during the dwell. It is not until 8–10 h that the D/P ratio reaches unity. For these patients dwell time is the crucial determinant of overall clearance, and they will do best with continuous therapies such as standard CAPD or CCPD, which utilize 24 h/day for dialysis and maximize dwell time/exchange. If a patient has a large body surface area he/she may need higher doses (i.e., large volumes of these therapies). Therefore, although a patient may have rapid transperitoneal solute flux, the amount of solute ultimately removed may be compromised by poor ultrafiltration and low drain volumes. Conversely, a slow transporter may have solute removal not that different from the rapid transporter because of good ultrafiltration and large drain volumes. There are other ways to classify a patient’s peritoneal membrane transport, but in the usual clinical setting the peritoneal equilibration test (PET) is the most practical.

Fig. 16.3

Idealized curves of creatinine and water transport during an exchange with 2 L of 2.5% glucose dialysis solutions in patients with extremely low and high transport characteristics [60]
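The trade-off between D/P ratio and drain volume described above can be illustrated with a minimal numeric sketch; the D/P ratios and drain volumes used here are hypothetical values for a long dwell, not measured data.

```python
# Minimal sketch of peritoneal solute clearance per exchange as D/P x drain volume.
# The D/P ratios and drain volumes below are hypothetical values for a long dwell,
# not measured data.

def clearance_per_exchange_L(d_to_p, drain_volume_L):
    """Liters of plasma cleared of the solute during one exchange."""
    return d_to_p * drain_volume_L

# Rapid transporter: D/P creatinine near unity, but fluid reabsorption after the
# glucose gradient dissipates leaves a small drain volume.
rapid = clearance_per_exchange_L(d_to_p=0.95, drain_volume_L=1.9)  # ~1.8 L

# Slow transporter: lower D/P, but sustained ultrafiltration gives a larger drain volume.
slow = clearance_per_exchange_L(d_to_p=0.70, drain_volume_L=2.6)   # ~1.8 L

print(rapid, slow)  # similar clearance per exchange despite very different transport
```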

Mass transfer area coefficients (MTAC) [61–64] are more precise and more succinctly define transport. These define transport independent of ultrafiltration (convection-related solute removal), and hence are not influenced by dwell volume or glucose concentration. The practical use of MTAC for modeling a PD prescription requires additional laboratory measurements and computer models, but once these are obtained MTAC can be used in the clinical setting [64]. The standard peritoneal permeability analysis is another test to follow transport and ultrafiltration characteristics [65–67]. The major difference in this test is that it is better able to determine sodium sieving and can better differentiate the causes of ultrafiltration failure.

Influence of Body Size on Solute Clearance

It is intuitive that removing the same amount of solute from a 55 kg elderly female would represent relatively more clearance than the same amount of solute removed from an 80 kg muscular male. These patients have different metabolic rates. Therefore, the absolute daily clearance must be normalized for differences in body size (by V or volume of distribution for Kt/V and by BSA or body surface area for CCr). This modeling also predicts that, if one were to increase the instilled volume/exchange from 2.0 to 2.5 L or greater, under similar conditions total solute clearance should increase, and the instilled volumes needed to achieve small solute clearances can be predicted [68, 69]. However, using larger volumes of dialysate may be complicated by higher intra-abdominal pressure, which may be uncomfortable for some patients and put them at risk for pressure-related complications. Furthermore, larger dwell volumes are not very effective in increasing removal of middle molecular weight uremic solutes [70].

In a review of 806 PD patients, the median BSA was 1.85 m2 (not 1.73 m2), whereas the 25th percentile was 1.71 m2 and the 75th percentile was 2.0 m2 [69]. Despite this finding that most PD patients in North America are larger than the “standard” BSA of 1.73 m2, based on clearances predicted by kinetic modeling, if one were able to individualize therapy (increased instilled volumes, daytime exchange for CCPD), the recommended acceptable target total solute clearances should be achievable for most patients on PD. Those patients who are slow transporters and those who have large body surface areas (> 1.8 m2) would need close observation once anuric.

Special Considerations

Rapid Transporters

As mentioned above, individual peritoneal membrane transport characteristics are important in determining total solute clearance and ultrafiltration rates in PD patients. Rapid transporters tend to optimize both solute clearance and ultrafiltration with a short dwell time (approximately 2 h) and therefore are likely to do well with shorter dwell therapies such as APD, with or without one or two daytime dwells. (The long daytime dwell, however, poses a challenge for the maintenance of euvolemia in the rapid transporter, discussed below.) Despite this relative ease in the ability to achieve recommended total solute clearance goals, these patients have been shown to have an increased relative risk of death and a decreased technique survival [71]. In the CANUSA Study, patients with a 4-h D/P creatinine of > 0.65 (rapid transporters) were compared to those with a 4-h D/P creatinine of < 0.65 (slow transporters). The 2-year probability for technique survival was 79% among slow transporters compared to 71% for rapid transporters. The probability of 2-year patient survival was 82% among slow transporters versus 72% for rapid transporters, with a relative risk of death of 2.18 for rapid versus slow transporters.

Other investigators have made similar observations [72]. Nolph et al. [73] noted that rapid transporters (D/P > 0.81) had increased incidence of malnutrition. Heaf [74] noted increased morbidity in rapid transporters. These findings have not been confirmed in other studies [42, 43, 75].

The reason(s) for this increased relative risk in rapid transporters on peritoneal dialysis are unclear [76]. There may be a tendency towards malnutrition because of the increased peritoneal protein losses in rapid transporters. Many have shown that as the D/P ratio at 4 h increases, there tends to be increased protein loss. As dialysate protein losses increase, serum albumin levels decrease, and serum albumin correlates inversely with peritoneal membrane transport [77–79]. However, it has been noted that rapid transporters are hypoalbuminemic even before they start on PD and before peritoneal albumin loss can be implicated [80]. This would suggest that both the hypoalbuminemia and the rapid transport status are associated with some other factor, such as systemic inflammation, discussed below. Alternatively, the low albumin levels may be secondary to overt or subtle volume overload. The long dwells typical of CAPD cause problems with ultrafiltration in these patients. This volume overload may lead to increased blood pressure and/or increased left ventricular hypertrophy with its associated increased risks of death. Furthermore, to optimize ultrafiltration, these patients will probably increase the percentage glucose in their fluids, leading to better ultrafiltration, but increased glucose absorption. This increased glucose absorption may [81, 82] or may not [83] inhibit appetite. Another explanation is that rapid transport status is a reflection of chronic inflammation, and these patients have reduced survival as a result of the systemic inflammation, rather than because of any downstream effects at the level of the peritoneal membrane [84]. It has been suggested that rapid transporters be changed to nocturnal intermittent peritoneal dialysis (NIPD), with or without a short daytime dwell. This change maintains total solute clearance, while decreasing the need for hypertonic glucose exchanges, thereby decreasing the relative amount of glucose absorption [85]. There are no data, however, to evaluate outcome for these patients who change to NIPD, although there may be an increase in nutritional parameters. There may be a small but insignificant decrease in protein losses when changing from CAPD to NIPD [78, 85]. Conversely, patients on APD may lose more protein each day than those on CAPD [86].

The problem with changing to NIPD is that the prolonged daytime period without a dialysate dwell will dramatically reduce the time-dependent transport of middle molecular weight uremic toxins and of toxins that transport slowly, such as phosphorus. Therefore NIPD should be considered only when there is significant residual renal function (GFR > 5 mL/min); otherwise there is a sizable risk of inadequate dialysis. (Other ways to deal with the long dwell of APD in rapid transporters are discussed below.)

Acid–Base Metabolism

An essential component of providing “optimal” dialysis is correction of metabolic acidosis. Chronic acidosis has a detrimental effect on protein, carbohydrate, and bone metabolism. In PD patients, body base balance is self-regulated by feedback between plasma bicarbonate levels and bicarbonate gain/loss [84, 87, 88]. Dialysis must provide sufficient replenishment of buffers to compensate for the daily acid load. Lactate (concentration 35–40 mmol/L) is the standard buffer in currently available PD solutions, although recently there has been more experience with bicarbonate-based solutions. Lactate is converted to pyruvate and oxidized or used in gluconeogenesis, with the consumption of H+ and the generation of bicarbonate [89]. Some CAPD patients remain acidotic. With lactate-containing buffer solutions, buffer balance is governed by the relative amounts of H+ generation, bicarbonate loss, lactate absorption, and lactate metabolism [90–94].

Ultrafiltration volume can affect bicarbonate loss [87]. Acid-base status is also related to peritoneal membrane transport type [95]. The D/P creatinine ratio from the PET is positively correlated with lactate gain, dialysate base gain, and arterial bicarbonate concentration. Rapid transporters gain more lactate during the typical dwell and tend to have higher arterial bicarbonate levels. In 44 patients with metabolic alkalosis the mean D/P creatinine was 0.66 ± 0.12, whereas in those with metabolic acidosis the mean D/P creatinine was 0.59 ± 0.09 (p < 0.005). Most CAPD patients have stable mean plasma bicarbonate levels of about 25.6 mmol/L using a dialysate lactate of 35 mmol/L. Increasing dialysate lactate results in a higher serum bicarbonate level [96]. Lowrie and Lew noted no correlation between serum albumin levels and bicarbonate concentration in hemodialysis patients [97], and this was confirmed by Bergstrom [98]. Control of acidosis is important to prevent protein catabolism [99–101].

A recent survey of serum bicarbonate levels in PD patients showed that only consumption of the phosphate binder sevelamer hydrochloride was associated with a decline in serum bicarbonate concentration [102].

Other aspects of acid-base metabolism are discussed in Chapter 20, on noninfectious complications.

Normalization

Severely malnourished patients tend to have a lower ratio of total CCr to total Kt/V urea. This is because malnutrition causes a relatively greater decrease in V than in BSA (Table 16.2). Therefore, when normalizing Kt or CCr, the Kt/V is increased proportionally more than the CCr. Total body water (V) can be estimated as a fixed percentage of body weight or, more accurately, by using anthropometric formulas based on sex, age, height, and weight, such as the Watson [103] or Hume [104] formulas in adults (Table 16.3). These equations provide unrealistic estimates for V in patients whose weights are markedly different from normal body weight (NBW). BSA is usually calculated using the formula by Dubois and Dubois [105]. Jones [106] noted that when actual body weight (ABW) was used for normalization there was no difference in total solute clearance (Kt/V or CCr) between well-nourished and malnourished patients. However, when calculated V and BSA were determined using desired body weight (DBW), there was a statistically significant difference between the groups for both weekly Kt/V (1.68 ± 0.46 versus 1.40 ± 0.41, p < 0.05) and for creatinine clearance in L/1.73 m2/week (52.5 ± 10.3 versus 41.6 ± 19.0, p < 0.01). These data suggest that, in malnourished, underweight individuals, if ABW is used to calculate V, the resultant V will be much smaller than that found if the larger DBW is used. In these instances, when Kt is divided by V, the resultant Kt/V may look inappropriately at target, despite progressive anorexia and weight loss in the patient [107, 108].

Table 16.2 Adequacy calculations: what weight should be used?
Table 16.3 Equations for normalization (calculating V for Kt/V or BSA for creatinine clearance)
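The following sketch illustrates commonly published forms of the Watson equations for V and the Dubois and Dubois equation for BSA; the coefficients should be verified against Table 16.3 or the original references [103, 105] before any clinical use, and the example patient is hypothetical.

```python
# Sketch of the anthropometric normalization equations referred to in Table 16.3.
# Coefficients follow commonly cited forms of the Watson (total body water) and
# Dubois and Dubois (body surface area) equations; verify against Table 16.3 or
# the original references [103, 105]. The example patient below is hypothetical.

def watson_v(sex, age_yr, height_cm, weight_kg):
    """Total body water V (liters) by the Watson equations."""
    if sex == "male":
        return 2.447 - 0.09516 * age_yr + 0.1074 * height_cm + 0.3362 * weight_kg
    return -2.097 + 0.1069 * height_cm + 0.2466 * weight_kg

def dubois_bsa(height_cm, weight_kg):
    """Body surface area (m2) by the Dubois and Dubois equation."""
    return 0.007184 * (weight_kg ** 0.425) * (height_cm ** 0.725)

v = watson_v("male", age_yr=60, height_cm=170, weight_kg=70)   # about 38 L
bsa = dubois_bsa(height_cm=170, weight_kg=70)                  # about 1.81 m2
print(round(v, 1), round(bsa, 2))
```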

For well-nourished patients with a stable weight that is not markedly different from desired or ideal weight, the patient’s actual body weight can be used in these equations for determinations of V and BSA. However, when markedly different from ideal, the choice of which weight to use in clearance calculations becomes important. One option is to use the patient’s “desired body weight” during PD training and always use that weight in adequacy calculations, to avoid this problem if the weight changes significantly. DBW is expressed in kilograms, and is the midpoint of the range of body weights associated with the greatest longevity for normal individuals of the same age range, sex, and skeletal frame size as the individual in question. These body weights are published in the Metropolitan Life Insurance Company actuarial tables.

How to Monitor Dialysis Dose

The most accurate way to measure dialysis dose is to measure the total amount of the solute in question cleared from the body during a specific time interval. In practice, for CAPD, this means that 24-h collections of dialysate and urine should be obtained. Total solute clearance is calculated as described in the Appendix. An alternative would be to estimate the daily clearance either mathematically or with the use of computer-assisted kinetic modeling programs. It is important to remember that these estimations are truly approximations and should be used with this limitation in mind. Although there is a correlation with the actual clearance measured from 24-h collections, there is a high degree of discordance [109, 110]. Therefore, the “gold standard” for measurement of dialysis dose is to obtain 24-h collections of both urine and dialysate to document the delivered dose of dialysis. It has been suggested that these studies should be obtained quarterly and within 1 month of any prescription change [55]. However, if the patient has substantial residual kidney function, it may not be clinically necessary to repeat these tests so often, as long as the kidney function is being monitored monthly or bimonthly. Despite these recommendations, data from the United States suggest that adequacy studies are not done in many PD patients [68, 111, 112].

PET data are obtained in order to characterize the patient’s peritoneal membrane transport characteristics (see above), not to determine clearance. The two tests are complementary to each other and are routinely used together for developing a patient’s dialysis prescription and for problem solving.

Several studies have documented that an individual patient’s peritoneal membrane tends to be stable over time [19, 25]. However, in some patients it may change. Therefore, peritoneal transport should be monitored to optimize clearance and ultrafiltration. PET is the most practical way to do this, and it is recommended that it be obtained if there is a suspicion of a significant change in membrane properties, such as new-onset difficulty with ultrafiltration. It has been shown that, over time, if there tends to be any change in transport characteristics, the D/P ratios are likely to increase slightly, associated with a small decrease in ultrafiltration. Alternatively, one can estimate D/P values for PET from D/P values on 24-h dialysate collections (Dialysis Adequacy and Transport Test, or DATT) when followed sequentially in an individual patient [113, 114]. These values tend to be slightly higher than the D/P for creatinine on the standard PET. The DATT is not recommended for work-up of ultrafiltration failure, but if the patient’s dialysis prescription (i.e., instilled volume/exchange and percentage glucose) has not changed, sequential DATT measurements do predict peritoneal membrane transport. Twenty-four-hour collections can also be used to calculate lean body mass (LBM) and creatinine generation rates, and to estimate protein intake.

Noncompliance

Noncompliance with a medical regimen is not uncommon and can adversely affect patient outcome. The degree of noncompliance with the dialysis prescription itself is easily documented in in-center HD populations [115], although its impact on outcome has been difficult to define. Certainly there is an element of noncompliance with a home therapy such as PD. Until recently there have been no simple ways to evaluate noncompliance in these populations [116]. It has been suggested that a value above unity for the ratio of measured to predicted creatinine production may indicate recent periods of noncompliance [117, 118]. These authors speculated that, if the patient had been noncompliant prior to collecting 24 h of dialysate and urine for adequacy studies, but was compliant on the day of collection, a “washout” effect would occur. In such a case the measured creatinine production would be higher than predicted from standard equations, resulting in an elevated ratio of measured to predicted creatinine production. In other words, if a patient has been chronically underdialyzed and then increases the dialysis just before testing, there will be a disproportionate amount of creatinine lost in the dialysate (similar to urinary creatinine excretion during recovery from acute renal failure). However, others have shown that the index is not a good indicator of compliance and that higher-than-expected creatinine removal may simply reflect good muscle mass and creatinine generation [67, 119]. Nevertheless, noncompliance with the prescription is a real issue, as documented by patient questionnaires [120] and by examination of patients' home inventories [121]. Interestingly, noncompliance may be more of an issue in the United States than in other countries, such as Canada.
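As a rough illustration of the ratio described above, the sketch below compares creatinine removal measured from a 24-h collection (which approximates generation at steady state) with a predicted generation rate. The prediction equation used here (a Cockcroft–Gault-style daily excretion estimate) and all numbers are assumptions chosen for illustration, not the equations used in the cited studies, and, as noted, the index itself is disputed [67, 119].

```python
# Illustrative only: measured-to-predicted creatinine production ratio as a
# crude noncompliance screen. The prediction formula and numbers are assumed.

def measured_creatinine_mg_day(dialysate_l, d_creat_mg_dl, urine_l, u_creat_mg_dl):
    """At steady state, 24-h creatinine removal approximates daily generation."""
    return (dialysate_l * d_creat_mg_dl + urine_l * u_creat_mg_dl) * 10.0  # 10 dL per L

def predicted_creatinine_mg_day(sex, age_yr, weight_kg):
    """Assumed Cockcroft-Gault-style estimate of daily creatinine generation."""
    per_kg = (28.0 - 0.2 * age_yr) if sex == "male" else (23.8 - 0.175 * age_yr)
    return per_kg * weight_kg

measured = measured_creatinine_mg_day(10.0, 7.5, 0.4, 45.0)   # ~930 mg/day
predicted = predicted_creatinine_mg_day("male", 60, 74)       # ~1,184 mg/day
ratio = measured / predicted
print(round(ratio, 2), "possible recent noncompliance" if ratio > 1.0 else "no flag")
```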

At present there is no definitive test, short of asking patients and looking at home inventories, to determine patient compliance. A patient who has, say, a sudden doubling of the serum creatinine in the absence of any change in RRF has likely stopped performing exchanges. More importantly, one should discuss compliance with the patient and have a heightened awareness of the problem. PD prescriptions should be designed with the patient's lifestyle needs and abilities in mind. Five manual exchanges a day is not realistic for most patients. Automated therapies may help, but it is important to find out how much time the patient usually spends in bed and what time they have to get up in the morning. For example, 9 h overnight is not a good fit for many people who have to get to work or get their children off to school. A more patient-friendly prescription might entail 7 or 8 h overnight, compensated by one exchange during the daytime, for example at lunchtime, after work, or after dinner. Education about the prescription and the importance of compliance with it should be emphasized.

Prescription Dialysis

Timing for Initiation of Dialysis

The NKF-DOQI Guidelines [55] and others [122] have highlighted the need to treat patients throughout the stages of chronic kidney disease (CKD) as a continuum. Data are emerging that suggest that the patient’s clinical status, especially nutritional status and left ventricular functional status, at the onset of dialysis is a major predictor of eventual morbidity and mortality.

The traditional absolute indications for initiation of dialysis, such as pericarditis, encephalopathy, refractory hyperkalemia, nausea, vomiting, and volume overload, raise little controversy. Weight loss and signs of malnutrition are more subtle, “relative” indications for initiation. Interestingly, in contrast to stage 5 CKD, where minimal target values for weekly total solute clearance have been established, minimal values for total solute clearance by residual renal function alone prior to initiation of dialysis have not been established. This seems paradoxical. In fact, in the CANUSA Study the mean GFR at initiation was approximately 3.8 mL/min [25]. Recent data demonstrate that RRF at the start of dialysis has been increasing over time, and that the patients with the greatest co-morbidity have the most RRF at the start of dialysis. Presumably this is because they are unable to compensate for the decline in kidney function, whereas patients with little or no co-morbidity can tolerate a very low glomerular filtration rate. Since there are compelling data for PD, and less so for HD, suggesting that RRF is a predictor of long-term outcomes, it might follow that the degree of renal function at the start of dialysis could play a role in outcome. However, study of this question is confounded by the association of more RRF at the start of dialysis with more co-morbidity, as noted above, and by the fact that more predialysis care and monitoring could be associated with an earlier start onto dialysis, with more renal function. Furthermore, if the observation by Berlanga is correct, an earlier start onto PD might be associated with prolonged persistence of RRF and a survival advantage [123]. Finally, early referral can be associated with “lead time bias,” wherein longer survival on dialysis may simply be a function of an earlier start onto the therapy and not an overall prolongation of life expectancy [124]. This guideline does not diminish the importance of quality-of-life issues, blood pressure control, treatment of anemia, and consideration of protein restriction and use of angiotensin-converting enzyme (ACE) inhibitors to slow progression of disease. It still suggests individualization of the timing of therapy initiation. The recommendations are based on the following indirect evidence.

Outcome Data and the Confounding Effect of Early Start Dialysis, Co-morbidity, RRF, and Predialysis Education

Some databases suggest that outcomes are better for patients who start dialysis “early” than for those who start “late.” Bonomini et al. [125] showed that 5-year survival in 34 patients who started dialysis when their residual renal CCr was > 10 mL/min was 100%, compared to 85% in 158 patients starting dialysis with a CCr < 5 mL/min. Tattersall et al. [126] showed that the level of renal function at the initiation of dialysis was an independent predictor of patient outcome. Indeed, many studies have suggested that those referred late (< 1 month prior to initiation of dialysis) had higher hospitalization rates and costs [127] and higher morbidity and mortality rates on dialysis [128–133].

Taken in aggregate these studies suggest, at least in part, that outcome on dialysis is related to the level of residual renal function at the initiation of dialysis. One explanation for these observations may be the influence of residual renal solute clearance on the patient’s nutritional status, and the ongoing, poorly understood anti-inflammatory effect of persistent kidney function.

Initial Prescription

When a patient presents with advanced CKD and PD is initiated, there are two alternatives for writing the initial prescription, based on the patient’s residual renal function and symptoms. For patients with minimal residual renal function (the definition of which is not evidence-based, but is taken to be < 100 mL/day), the initial prescription should consist of a “full dose” of PD in order to meet minimal total solute clearance goals. If the patient has a significant amount of residual renal function, “incremental” dosing of PD can be initiated.

In both instances the initial prescription is based on the patient’s body size (BSA) and amount of residual renal function (both potentially known variables at initiation of dialysis). At initiation, peritoneal transport is not known. Initial prescriptions are based on the assumption that the patient’s peritoneal transport is average. Once an individual patient’s transport type is known, his/her prescription can be more appropriately tailored. During training, transport type can be predicted from drain volume during a timed (4-h) dwell with 2.5% glucose [134].

Because peritoneal transport can change over the first 1–4 weeks of therapy, it is recommended that the baseline PET be delayed until the patient has been on PD for 4 weeks.

“Incremental” Dialysis

One can use empirical prescriptions based on kinetic modeling for implementing PD using an “incremental” approach. These are also based on BSA and residual renal clearance. The 2006 KDOQI guidelines suggest that the total (peritoneal and renal) Kt/V urea be no less than 1.7. Incremental dialysis requires close monitoring and proactive adjustment of dialysis prescriptions and 24-h collection of dialysate and urine to document clearances. Residual renal clearance should be obtained every 1–2 months so that the peritoneal component can be modified if indicated to make sure total weekly Kt/V is 1.7 or higher. The argument for incremental dialysis and the suggestions that this approach could be better/more practical for PD are reviewed elsewhere [135]. Certainly, there are advantages to prescribing “incremental” dosing: the patient can gradually adapt to the constraints of performing the dialysis without being overwhelmed by “full dose” prescription; very often the use of smaller doses of PD will bring about remarkable improvement in symptoms without a major change in small solute parameters (again pointing to the benefits of PD that are not explainable by small solute kinetics); and, finally, the intriguing possibility that, in some patients, PD may be nephroprotective and itself slow the decline in residual GFR [123].
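A minimal sketch of the incremental-dosing arithmetic is shown below: given a measured residual renal weekly Kt/V, it reports how much peritoneal Kt/V is still needed to keep the total at or above 1.7. The volume of distribution and the sample collection values are assumptions for illustration; as the text notes, the check must be repeated every 1–2 months because the renal term shrinks as residual function is lost.

```python
# Sketch of incremental dosing against the 2006 KDOQI floor (total weekly Kt/V >= 1.7).
TARGET_WEEKLY_KTV = 1.7

def renal_weekly_ktv(urine_l_day, u_urea, p_urea, v_l):
    """Residual renal contribution from a 24-h urine collection (urea values in the same units)."""
    return 7.0 * urine_l_day * (u_urea / p_urea) / v_l

def peritoneal_ktv_needed(renal_ktv):
    """Peritoneal component required to keep the total at or above the target."""
    return max(0.0, TARGET_WEEKLY_KTV - renal_ktv)

v = 38.0                                     # L; e.g., from the Watson formula (assumed)
renal = renal_weekly_ktv(1.2, 300, 55, v)    # ~1.2 with substantial residual function
print("renal Kt/V", round(renal, 2),
      "| peritoneal Kt/V needed", round(peritoneal_ktv_needed(renal), 2))
```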

Pitfalls in Prescribing PD

There are some common pitfalls in prescribing the peritoneal component of total solute clearance. The following is a brief summary of some of these, and should be considered whenever a patient appears underdialyzed. Because PD is a home therapy, noncompliance with the prescription must always be considered as a reason for underdialysis [120]. Certainly, patients may be compliant with their prescription when bringing in their 24-h collections; however, if the patient is noncompliant at home, he/she may be underdialyzed on a daily basis. Home visits to monitor the amount of supplies on hand, and keeping track of monthly orders, may help sort this out [121, 136, 193].

Some issues to consider in patients on standard CAPD are: 1) inappropriate dwell times (a rapid transporter would do better with short dwells); 2) failure to increase dialysis dose to compensate for loss of residual renal function; 3) inappropriate instilled volume (patient may only infuse 2 L of a 2.5-L bag) [137]; 4) multiple rapid exchanges and one very long dwell (patient may do three exchanges between 9 a.m. and 5 p.m., and a long dwell from 5 p.m. to 9 a.m., limiting overall clearances) [138, 195]; finally, 5) inappropriate selection of dialysate glucose for long dwells, which may not maximize ultrafiltration.

In general, when the goal is to increase total solute clearance, it is best to increase the dwell volume rather than the number of exchanges [139, 140] (Fig. 16.4 ). It is important to emphasize that an increased fill volume will increase small, but not large, solute removal [70, 140]. Increasing the number of exchanges decreases the dwell time per exchange, making the therapy less effective for the average patient.

Fig. 16.4
figure 16_4_978-0-387-78940-8

Increasing the prescription from 4 × 2.0 L to 5 × 2.0 L increased the clearance by only 10%. Increasing the prescription from 4 × 2.0 L to 4 × 2.5 L increased clearance by 21%. This demonstrates the need, in CAPD, for a 2.5 L fill volume [141]

Other problems are specific to patients on cycler therapy. The drain time may be inappropriately long (> 20 min), thus increasing the time the patient must be connected to the cycler and perhaps limiting the number of exchanges a patient will tolerate. Inappropriately short dwell times may also be prescribed, making the therapy less effective for the average patient, in whom length of dwell is crucial. Failure to augment the total dialysis dose with a daytime dwell (“wet” day versus “dry” day) can also result in underdialysis. Cycler patients typically “cycle” for 9 h per night. Therefore, the daytime dwell is long (14–15 h). During this long dwell (longer than the “long” night-time dwell for typical CAPD patients), diffusion stops and reabsorption often begins, compromising small solute clearance. Use of a midday exchange is an effective way to optimize both clearance and ultrafiltration in these patients (Fig. 16.5 ). Alternative osmotic agents such as icodextrin, which maintain ultrafiltration during long dwells, are also helpful [142]; they optimize both clearance and ultrafiltration without hypertonic glucose [143, 144]. Finally, poor selection of dialysate glucose may not allow maximization of ultrafiltration, resulting in less total clearance.

Fig. 16.5
figure 16_5_978-0-387-78940-8

A 10 L wet day (CCPD) delivers 8% more clearance than a 20 L dry day (NIPD). A 12.5 L prescription with a wet day and a daytime exchange improved clearance by 57% over the 20 L dry day prescription. This demonstrates the need, in APD, for a 2.5 L fill volume, a wet day, and a daytime exchange [141]

When changing from standard CAPD (long dwells) to cycler therapy (short dwells), it is important to remember the difference in transport rates between urea and larger solutes and the effect that this change will have on the patient’s overall clearance. These differences and their relevance for CAPD, CCPD, NIPD, and other modifications are reviewed by Twardowski [145]. Transport of urea into the dialysate tends to occur faster than for other solutes. Therefore, if total solute clearance targets are measured using urea kinetics, keeping Kt/V constant going from long to short dwells may decrease clearance of larger molecular weight uremic toxins [146]. There is also the risk of sodium sieving with more rapid exchanges, which will be discussed in the section on fluid management.

Knowledge of the individual patient’s peritoneal transport characteristics and familiarity with the differences in dialysis needs for rapid versus slow transporters is imperative to avoid problems or confusion.

Adjusting Dialysis Dose

Minimal established dialysis doses are reviewed above. When determining an individual patient’s prescription, one should aim for a total solute clearance that is above this minimum, while also allowing for other indices of adequate dialysis such as quality-of-life issues, blood pressure control, and dietary protein intake. If during routine monitoring or clinical evaluation of the patient the delivered dose of dialysis needs to be altered, this can easily be done if the patient’s present transport characteristics (PET) are known. For instance, in a patient who is a slow transporter, and in whom clearance is therefore critically dependent on dwell time, changing from standard CAPD (infused volume 8 L) to a form of cycler therapy using 2-h dwells, where the infused volume may be as high as 10–14 L, may not always result in an overall increase in clearance because of the rapidity of the cycles sometimes prescribed in APD.

Once one is familiar with these relationships, adjusting the dialysis prescription requires knowing the D/P ratios at the anticipated dwell time and the patient’s drain volume for that dwell time. Altering the dwell time changes the D/P ratio and may change the drain volume; altering the instilled volume changes the total drain volume and therefore the clearance. In general, increasing the instilled volume without changing the dwell time will increase solute clearance.
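The bookkeeping implied here can be made concrete: the clearance contributed by one exchange is approximately the D/P ratio at the chosen dwell time multiplied by the drain volume. The sketch below sums these over a day and normalizes a weekly peritoneal creatinine clearance to 1.73 m² using the Du Bois BSA formula; the PET-derived D/P values, drain volumes, and patient dimensions are hypothetical.

```python
# Hypothetical example: weekly peritoneal creatinine clearance from D/P and drain volumes.

def du_bois_bsa(weight_kg, height_cm):
    """Body surface area (m^2); one common choice for normalizing clearance."""
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

def weekly_peritoneal_ccr_per_173(dp_creatinine, drain_volumes_l, bsa_m2):
    """Sum per-exchange clearances (D/P x drain volume), scale to a week and to 1.73 m^2."""
    daily_l = sum(dp * vol for dp, vol in zip(dp_creatinine, drain_volumes_l))
    return 7.0 * daily_l * 1.73 / bsa_m2

# Four exchanges: three daytime dwells plus one long overnight dwell (higher D/P).
dp_values = [0.65, 0.65, 0.65, 0.80]
drains_l = [2.2, 2.2, 2.2, 2.4]           # instilled 2.0 L each; drains include UF
bsa = du_bois_bsa(74, 172)                # about 1.87 m^2
print(round(weekly_peritoneal_ccr_per_173(dp_values, drains_l, bsa), 1), "L/week/1.73 m^2")
```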

Another means to tailor a dialysis prescription would be with the use of computer-assisted kinetic modeling programs [92, 147] that allow for ease in adjusting a patient’s dialysis prescription. Baseline PET data, drain volumes, and patient weights are needed for input data. Use of these programs usually allows one to set targets for solute clearance, glucose absorption, and anticipated dietary protein intake. These computer simulations then produce a menu of prescriptions that should achieve these targets, and the one that would best suit the patient’s lifestyle and meets guidelines can be chosen.

Tidal peritoneal dialysis is a form of automated dialysis in which, after an initial dialysate fill, only a portion of the dialysate is drained from the peritoneum and replaced with fresh dialysate after each cycle [93, 94, 148]. This leaves some percentage of the dialysate in constant contact with the peritoneal membrane, which allows ongoing solute removal. A typical tidal prescription requires 23–28 L of instilled volume, and preliminary studies suggest that tidal dialysis may be approximately 20% more efficient than nightly PD at dialysate flow rates of about 3.5 L/h [93]. Another advantage of tidal dialysis is that the residual volume may alleviate “fill” or “drain” discomfort, and may also be helpful if incomplete draining is causing the cycler to alarm frequently overnight [94].
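The volume and flow arithmetic of a tidal prescription can be sketched as follows; the fill volume, tidal fraction, cycle count, and session length are hypothetical values chosen only to show how a 23–28 L prescription and a roughly 3 L/h flow rate arise.

```python
# Rough tidal-PD volume bookkeeping; every number here is a hypothetical example.

def tidal_volumes(initial_fill_l, tidal_fraction, cycles):
    """Return (total fresh dialysate used, reserve volume left in the peritoneum)."""
    tidal_l = initial_fill_l * tidal_fraction    # drained and replaced each cycle
    reserve_l = initial_fill_l - tidal_l         # stays in contact with the membrane
    total_l = initial_fill_l + cycles * tidal_l  # fresh fluid consumed over the session
    return total_l, reserve_l

total_l, reserve_l = tidal_volumes(3.0, 0.5, cycles=14)   # 3 L fill, 50% tidal, 14 cycles
session_h = 8.0
print(total_l, "L used;", reserve_l, "L reserve; ~", round(total_l / session_h, 1), "L/h")
```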

Clinical Assessment of Adequacy

As mentioned at the beginning of this chapter, the clinical assessment of adequate dialysis does not consist of any one laboratory measurement. It includes the absence of signs or symptoms of uremia, as well as the patient’s sense of “well-being,” control of blood pressure and anemia, and other biochemical parameters [149–151]. The minimal target dialysis dose should be delivered and an adequate PNA achieved. If the clinical judgment is that the patient is manifesting signs of uremia despite apparently adequate laboratory measurements of “dialysis dose,” it would be prudent to increase the patient’s dialysis dose if no other cause for the symptoms is found. Certainly, a patient could be very compliant with the prescription during the period of dose monitoring but, because this is a home therapy, may not be compliant during the rest of the treatment interval. Furthermore, there may be a subset of patients who require more than the recommended dose of dialysis for optimal well-being. Clinical judgment is important, since “one size fits all” should not apply to the care of the patient.

Another explanation would be the effect weight loss has on the calculation of V and the effect that normalizing Kt in a malnourished patient has on the Kt/V calculation (described in the section on discrepancy, above). Furthermore, if the dose of dialysis appears inadequate despite an adequate clinical assessment, these patients should be monitored very closely, and, once again, a trial of increased dialysis is recommended. In the end, however, patients usually enter dialysis therapy with a large burden of co-morbid disease that may not be ameliorated by more dialysis, so clinical judgment must be used not to push the patient into a burdensome regimen that may have no effect on outcome.

Volume Homeostasis

Patients with end-stage renal disease (ESRD) have a high prevalence of co-morbidities such as coronary artery disease, left ventricular hypertrophy, hypertension, and left and right heart failure, all of which increase the relative risk of cardiovascular death [152]. Hypertension itself is an independent risk factor for these co-morbid conditions and, in patients with chronic kidney disease, is in part related to plasma volume overload [153]. In patients on dialysis, although the data are less clear, plasma volume overload is thought to be a major factor in the development of hypertension [Konings, 2002]. One of the many functions of the normal kidney is to maintain volume homeostasis. Therefore, if one is discussing adequacy of any renal replacement therapy, one should consider the optimization of blood pressure and plasma volume as part of the “adequacy” of that therapy.

The Importance of Volume Status

Observational studies of PD patients suggest that volume status (inferred from UF volume or the presence of hypertension) is related to patient survival (Table 16.4 ). Four studies found an association between fluid removal and mortality, while four studies found an association between fluid removal and BP control. Historically, textbook chapters and national guidelines on “adequacy” have not focused on volume control in PD because there are few high-level studies (such as prospective randomized trials) evaluating the effect of achieving a given daily ultrafiltration volume on a patient's blood pressure control, volume status, or relative risk of death. However, recent circumstantial evidence from observational studies has warranted formulation of national guidelines on maintenance of euvolemia and blood pressure control. This evidence is reviewed here and includes the following: data suggesting that low transporters (who tend to have better PD ultrafiltration volumes), as opposed to high transporters, have a lower relative risk of death; data suggesting that PD patients tend to be volume expanded; and data suggesting that greater volumes of daily fluid removal (PD, renal, or combined) are associated with better survival. Reasons for these observations will be examined. It is important to note that, with respect to why greater daily fluid removal is associated with a decreased relative risk of death, it is not known whether it is the volume of fluid removed per se (PD, renal, or combined) that reduces mortality, or whether the fluid volume is a surrogate for something else (sodium removal, middle molecule removal, salt and water intake, etc.). For instance, healthier patients may eat and drink more and hence need a higher UF volume.

Table 16.4 Effect of fluid removal on selected clinical outcomes

Low Transporters Tend to UF Better and May Have a Lower Relative Risk of Death than High Transporters

Peritoneal membrane transport characteristics can affect the potential drain volume of any PD dwell. With standard glucose (dextrose)-containing dialysis solutions, the osmotic stimulus for transcapillary ultrafiltration dissipates during the dwell as the glucose is absorbed. If the dwell time is long enough, not only will transcapillary ultrafiltration stop, but lymphatic absorption will predominate, absorbing fluid and therefore decreasing potential drain volume. Peritoneal membrane solute transport type influences the rate of the glucose absorption and hence the expected drain volume per dwell. Rapid transporters (patients with higher D/P creatinine ratios) will be more prone to poor ultrafiltration and potential fluid retention as a result of rapid reabsorption of glucose. They also may be at a higher cardiovascular risk due to the greater systemic exposure to glucose than low transporters. As a result, in these patients, if the PD prescription is not managed correctly, they may become volume overloaded and theoretically may develop subsequent complications such as LVH and hypertension that may result in an increased risk of death.

Early observational studies suggested that higher membrane transport predicted a poorer outcome on CAPD [71, 158]. This was not confirmed in subsequent studies [71, 159]. A meta-analysis of studies published between 1987 and the end of December 2005 that correlated membrane transport and relative risk of death was conducted. Twenty studies were identified, 19 of which were pooled to generate a summary mortality relative risk of 1.15 for every 0.1 increase in the D/P creatinine (95% confidence interval 1.07 to 1.23; p < 0.001) [58] (Fig. 16.6 ). It was concluded that higher peritoneal membrane transport status is associated with a higher mortality risk (increased mortality risk of 21.9, 45.7, and 77.3% in low-average, high-average, and high transporters, respectively) as compared to low transport status. On the other hand, studies that enrolled a greater proportion of patients on cycler therapy demonstrated a lower mortality risk for a given increase in D/P creatinine compared to studies that mainly enrolled CAPD patients. This is consistent with the reported observational finding that, among patients all given the same CAPD (but not APD) prescription (3 × 1.36% glucose daytime dwells and 1 × 3.86% glucose overnight), those with high transport status were more likely to be hypertensive (100% versus 0%) and to have LVH (100% versus 33%) than the low transporters [156] (Fig. 16.7). In these patients, when the prescription was altered to increase ultrafiltration, blood pressure improved (discussed later). Once these observations were noted, more attention was paid to individualizing the PD prescription for patients with high peritoneal membrane transport. Guidelines were developed that recommended limiting glucose exposure during long dwells, matching dwell time to transport type, and considering alternative osmotic agents that have a more favorable ultrafiltration profile during the long dwell, independent of transport type (Fig. 16.8). With such maneuvers, it has been shown that patients with high membrane transport can be managed on PD without the increased mortality risk demonstrated in historical studies [160] (Fig. 16.9). As with hemodialysis or any therapy, treatment should be individualized to achieve the expected results. In many parts of the world most patients starting PD now do so on cycler therapy, so there may be little risk attributable to baseline transport type if cyclers are considered and the prescription is individualized with the goal of normalizing BP and achieving euvolemia.

Fig. 16.6
figure 16_6_978-0-387-78940-8

Meta-analysis of the relationship between peritoneal membrane transport type and relative risk of death. Relative risk (RR) estimates and 95% confidence intervals (CI) for the ratio of the creatinine concentration in the dialysate to that in the plasma (D/P creatinine; per 0.1 increment) and mortality in peritoneal dialysis (PD) patients. From [58]

Fig. 16.7
figure 16_7_978-0-387-78940-8

Correlation between 24 h ambulatory BP reading and daily ultrafiltration in CAPD. From [156]

Fig. 16.8
figure 16_8_978-0-387-78940-8

Diagrammatic representation of ultrafiltration profiles related to transport type and osmotic agent. Theoretical constructs representing change in ultrafiltration profile with high and low transporters with glucose or icodextrin dwells. From [160]

Fig. 16.9
figure 16_9_978-0-387-78940-8

Correlation between transport type and survival in two PD cohorts commencing therapy during different years. Survival on PD (patients censored at transplant or transfer to HD) according to transport category at start of treatment, comparing two cohorts commencing therapy between (a) 1990–1997, n = 320 and (b) 1998–2005, n = 300. Low (solid line), low average (dotted line), high average (dashed line), and high (solid bold line). In the first cohort, transport category was significantly (p = 0.0009) associated with survival, whereas in the more recent cohort transport type was not associated with survival. From [160]

Peritoneal Dialysis Patients Tend to Be Volume Overloaded

Multiple observational studies have measured volume status in PD patients and compared it to that of patients on hemodialysis (HD). Using bioimpedance analysis (BIA) to determine intracellular and extracellular fluid spaces, it was noted that both total body water (TBW) and extracellular fluid volume (Vecf) were actually greater in PD patients than both pre- and postdialysis values in chronic hemodialysis patients and baseline values in healthy controls [160] (Fig. 16.10 ). Furthermore, the ratio Vecf/TBW correlated positively with systolic BP and negatively with serum albumin levels. These observations are somewhat counterintuitive in that PD patients are able to adjust their ultrafiltration (UF) on a daily basis because of the continuous nature of the therapy; in theory, they should therefore be better able to attain a true “dry weight.” Interestingly, this observation is not an isolated one. Others have shown that CAPD patients have a higher mean pulmonary artery pressure than HD patients. In one study, pulmonary artery pressures measured pretransplant were a mean of 22 ± 7 mm Hg in 56 CAPD patients and 16.3 ± 7.2 mm Hg in 296 HD patients (p < 0.01) [162].

Fig. 16.10
figure 16_10_978-0-387-78940-8

Volume status in PD and HD (both pre- and post-dialysis). Ratio of Vecf/TBW in hemodialysis patients (measured pre- and post-dialysis) in PD patients and in healthy subjects. Measurements were performed after 20 minutes in the supine position using multifrequency bioimpedance. Where Vecf = volume of extracellular fluid and TBW = total body water. From [161]

In another study it was shown that CAPD patients (n = 28) had a considerable decrease in “dry weight” (66.6 ± 2.3 kg on PD versus 62.4 ± 2.4 kg on HD, p < 0.05) after transfer to hemodialysis [163]. This was associated with no change in systolic BP but a significant improvement in diastolic BP. These observations suggest that PD patients tend to be chronically plasma volume overloaded. Interestingly, in hemodialysis patients it has been noted that, with chronic expansion of the extracellular space, BP increases slowly over weeks to months [164]. Short-term expansion of the extracellular space tended not to have a major effect on BP, and slow reduction in dry weight was associated with a gradual decrease in blood pressure [165]. This suggests that acute volume expansion per se may not be the only mechanism causing hypertension in ESRD patients. The increase in peripheral vascular resistance associated with chronic volume expansion may play a key role, and close attention must therefore be paid to dry weight and blood pressure in every PD patient. Possible indirect support for these observations comes from data suggesting that, historically, blood pressure control in PD patients has been suboptimal [68, 166] (Fig. 16.11 ). Furthermore, it has been noted that long-term PD patients were more likely to have LVH than long-term HD patients [167]. In that study 51 CAPD patients were compared to a group of 201 HD patients. Left ventricular hypertrophy was more severe in CAPD patients than in HD patients (p < 0.0001), and CAPD patients were more likely to need antihypertensive medications (65 versus 38%, p < 0.001), despite being younger.

Fig. 16.11
figure 16_11_978-0-387-78940-8

Blood pressure control in PD patients. Blood pressure control in PD patients in Italy and the United States according to Joint National Committee (JNC 6) recommendations. Modified from [68] and from [166]

The Amount of Fluid Removal in Peritoneal Dialysis Patients (Peritoneal Ultrafiltration, Residual Renal Volume, or Both) Is Predictive of the Relative Risk of Death

The European Automated Peritoneal Dialysis Outcomes Study (EAPOS) evaluated the efficacy of automated peritoneal dialysis (APD) in anuric patients. Despite similar baseline co-morbidities, patients who were below the predefined UF target (750 mL/day) at the start of the study proved difficult to bring above the target and had a significantly higher 2-year mortality risk [42].

Peritoneal membrane transport and small solute clearance did not predict survival. An important caveat is that, although EAPOS was an interventional study with a predefined minimal UF volume, there was no randomization scheme, so any attempt to draw a cause-and-effect relationship between UF volume and risk must be interpreted with caution. Therefore, a secondary analysis was conducted to see what other factors, if any, could be found in that database to explain the link between risk of death and UF volume (e.g., poor intake due to poor health; subtle volume overload with resultant hypertension; inability to increase UF because of co-morbidity such as hypotension; and other unmeasured differences between groups) [168].

This analysis was unable to clearly define a link with nutritional status. It did not find an association between UF volume, mortality, and blood pressure (BP) control; in fact, there was no relationship between BP control and survival. There were no data on other parameters such as inflammatory state. What was observed is that patients with decreased UF at baseline had a significantly lower UF capacity but similar peritoneal transport, along with a relatively lower use of hypertonic glucose.

Does a Targeted Increase in Peritoneal Ultrafiltration Improve an Individual's Relative Risk of Death or Any Other Surrogates for an Improvement in Death Risk?

Incident patients on PD need to be treated in a way that prevents volume overload and development of LVH and maintains control of BP. Can blood pressure or LVH be improved by an increase in ultrafiltration or fluid removal? In a report of CAPD patients who were originally all treated with the same prescription (3 × 1.36% glucose during the daytime and 1 × 3.86% glucose overnight) and had a residual renal volume of < 200 mL/day, an increase in peritoneal UF from a mean of 1,086 ± 256 mL/day to a mean of 1,493 ± 223 mL/day was associated with an improvement in systolic and diastolic BP (systolic BP 145 ± 13 versus 128 ± 5 mm Hg, p < 0.001) [156] (Table 16.5 ). In an open-label randomized trial that evaluated the effect of icodextrin dialysis fluid on extracellular water and left ventricular mass, an increase in ultrafiltration volume (744 ± 767 mL versus 1,670 ± 1,038 mL) was associated with a decrease in extracellular water and an improvement in left ventricular mass [167].

Table 16.5 Results of ambulatory blood pressure monitoring parameters before and after changes in CAPD regimens

Another study showed that strict volume control (via sodium restriction and increased peritoneal ultrafiltration) could normalize BP in 37 of 47 patients on PD who previously needed antihypertensive therapy (47 of 78 at baseline) [170] (Fig. 16.12 ). In these patients there was also a decrease in the cardiothoracic index. However, it should be noted that this intervention was associated with a decrease in urine volume (Fig. 16.13 ).

Fig. 16.12
figure 16_12_978-0-387-78940-8

Increased peritoneal ultrafiltration is associated with an improvement in LVH. Panel A: Change in extracellular water in the icodextrin-treated and control groups. Box indicates the 25th and 75th percentiles (thick line is median value) and capped bars the minimum and maximum values. Difference between icodextrin and controls, p = 0.013. Panel B: Change in left ventricular mass in the icodextrin-treated and control groups. Box indicates the 25th and 75th percentiles (thick line is median value) and capped bars the minimum and maximum values. Difference between icodextrin and controls, p = 0.050. From [169]

Fig. 16.13
figure 16_13_978-0-387-78940-8

Schematic outline of treatment plan for HTN in CAPD. From [170]

In summary, targeted optimization of fluid removal (in the cited studies mainly by PD ultrafiltration, although decreased sodium intake and increased urine volume may also play a role) has been shown to result in an improvement in surrogate clinical parameters associated with a lower relative risk of death (normotension, reduced LVH). This would suggest that optimization of volume status should be an important component of “adequacy” of peritoneal dialysis.

What is the Target Blood Pressure?

Hypertension is a common finding in PD patients, with reported prevalence in PD populations of between 29 and 88% [166, 171]. Furthermore, uncontrolled hypertension due to volume overload is thought to contribute to the higher left ventricular mass seen in CAPD patients [112]. The relationship between BP control and risk of death in ESRD patients is controversial. Some observational studies suggest a “reverse epidemiology,” in which, in contrast to what is noted in the general population, the lower the systolic BP, the higher the relative risk of death [172, 173].

Others have shown a U-shaped curve, wherein patients with both the lowest and the highest postdialysis systolic BP have an increased relative risk of death [174]. This observation could reflect the fact that the ESRD population is a “selected” cohort in which patients with the lowest BP may have co-morbidities (congestive heart failure, malnutrition) that contribute to this increased risk; alternatively, the finding may be related to the hemodialysis procedure itself, in which case PD patients would not be expected to show this reverse epidemiologic pattern. In a prospective observational study of 125 PD patients in whom the mean BP was 131.2 ± 17.4/83.4 ± 9.8 mm Hg, BP was found to be an independent predictor of mortality. Hypertensive patients (not defined) had a significantly worse 3-year survival than normotensive patients (60.5 versus 92.1%, respectively, p = 0.0001). Similarly, total (p = 0.001) and cardiovascular (p < 0.01) hospitalizations were significantly higher in hypertensive patients [37].

This study did not define a BP target. To examine the association between BP control and various clinical outcomes, 1,053 randomly selected PD patients from the USRDS DMMS wave 2 study were evaluated [175]. Using a Cox model, adjusting for covariates, and with a mean follow-up of only 23 months, these authors found that the two lowest systolic BP categories (systolic BP < 100 and 101–110 mm Hg) were associated with increased all-cause and cardiovascular (CV) mortality; a similar association was not found for diastolic BP. The study could not evaluate the development of co-morbidities during follow-up (such as new-onset congestive heart failure), nor could it adjust for differences in confounding factors between groups or over time. Higher systolic BP was associated with shorter hospitalizations. The authors concluded that, based on their data, aggressive treatment of hypertension should be undertaken with caution in the PD population, and again no BP target was established.

In the EAPOS study, there was no significant relationship between BP and survival when BP was examined as a continuous variable (RR = 0.99/mm Hg, p = 0.188). When evaluated by tertiles, there was no significant difference by category (p = 0.23), although there was a tendency toward earlier death in the lowest BP tertile [168] (Fig. 16.14 ). As a result of these studies, a “target” BP for PD patients cannot be recommended. In fact, there have been no prospective randomized trials to identify what BP should be targeted to improve an individual patient’s outcome, as there have been in the general population. Furthermore, there is little evidence to show which antihypertensive agent would be “best” for BP control in the PD patient. However, because there are data to support the use of an ACE inhibitor or ARB for CV protection, preservation of residual renal function [56, 57, 177], or stabilization of peritoneal membrane transport characteristics, it is recommended that one consider giving preference to these agents. The NKF-DOQI guidelines did not give a target BP, but recommended that one try to control BP with less aggressive approaches than those used in the general population and that one consider use of an ACE inhibitor or an ARB. They further recommended that “one should optimize volume status because of the detrimental effect of volume overload on CHF, LVH and BP control,” stating: “Each facility should implement a program that monitors and reviews peritoneal dialysate drain volumes, residual renal function and BP control on a monthly basis, and that some of the therapies one should consider to optimize extracellular water and blood volume include but are not limited to restricting dietary sodium intake, use of diuretics in patients with residual renal function and optimization of peritoneal ultrafiltration volume and sodium removal” [55].

Fig. 16.14
figure 16_14_978-0-387-78940-8

Patient survival related to BP. Patient survival according to baseline systolic blood pressure (BP) divided into tertiles: BP < 126 mm Hg (solid light line); 126–150 mm Hg (solid heavy line); and > 150 mm Hg (broken line). There is no significant difference in survival (log rank test, p = 0.23), although the lowest BP group appears to have a disproportionate early death rate. From [176]

What Daily Ultrafiltration Volume Should be Targeted?

Although there are no prospective randomized trials that have examined the relationship between peritoneal UF volume and risk reduction for hypertension, myocardial infarction, or stroke, there is a consensus, based on the cardiovascular literature, that normalization of volume status and optimization of BP are desirable. As mentioned above, there is indirect evidence that blood pressure in ESRD patients is in part related to volume status. Therefore, it is reasonable to assume that a hypertensive patient may be volume expanded. When there is no evidence of another secondary cause of hypertension, one should advise dietary salt and water restriction and slowly increase the daily UF volume (monitoring intake and output) so that dry weight gradually decreases, with a resultant improvement in the patient's blood pressure.

While it is acknowledged that, in health, the kidneys play a key role in maintaining euvolemia, the practice of targeting a “minimal” UF volume (or, if the patient has significant residual renal function, a minimal total net fluid removal) as a “yardstick” of adequacy should be approached with caution. It may be prudent to aim for 1 L of UF per day, but this is appropriate only if the patient drinks at least 1 L of fluid a day; similarly, if the patient drinks 2 L a day, a peritoneal UF significantly less than that would be inadequate. Although the studies reviewed above identified UF volumes below which a cohort had an increased relative risk, it is important to remember that these were not randomized studies and that patients with higher UF volumes may have “needed” them because they felt better and ate more.
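A simplified daily fluid balance, as sketched below, makes the point: a fixed UF target only has meaning relative to the patient's intake, urine output, and insensible losses. All values, including the crude constant for insensible losses, are assumptions for illustration.

```python
# Simplified daily fluid balance (liters); positive values mean net fluid gain.

def daily_fluid_balance_l(oral_intake_l, urine_l, peritoneal_uf_l, insensible_l=0.5):
    """Crude balance; metabolic water production and stool losses are ignored."""
    return oral_intake_l - (urine_l + peritoneal_uf_l + insensible_l)

print(daily_fluid_balance_l(1.0, 0.2, 1.0))   # -0.7 L/day: same UF target, volume depleting
print(daily_fluid_balance_l(2.0, 0.2, 1.0))   # +0.3 L/day: same UF target, slowly overloading
```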

At this point a specific drain volume cannot be recommended; it must be individualized for each patient based on intake and residual renal volume. One should try to normalize BP by adjusting PD ultrafiltration volumes so that a “dry weight” is achieved that allows normalization of BP with minimal or no medications used exclusively for blood pressure control. Certainly, one may want or need to use an ACE inhibitor, ARB, or beta blocker for other clinical indications such as preservation of residual renal function, reduction of CV risk, treatment of angina, or rate control. In these situations, the UF volume will need to be adjusted so that the patient does not become symptomatic from low BP or hypovolemia.

How Is Optimal Volume Homeostasis Achieved in PD Patients?

Optimization of volume status in PD involves a group effort by the patient, the home dialysis nurses, the dietician, and the physician. It involves attention to sodium and water intake as well as sodium and water removal (by the kidneys and by peritoneal dialysis). Each facility should develop a program that reviews the patient’s dietary sodium and water intake, BP readings, residual renal urine volume, and PD effluent volume on a monthly basis. The caregivers must be aware of the patient’s peritoneal membrane transport type and their social constraints and desires. The NKF-DOQI 2006 publication, guideline #4 – Maintenance of Euvolemia – states that: “Some of the therapies one should consider to optimize extracellular water and blood volume include, but are not limited to: restricting dietary sodium and water intake, use of diuretics in patients with residual kidney function, and optimization of peritoneal ultrafiltration volume and sodium removal.” [55] These therapies should be used in concert to individualize and optimize a patient’s total body water and BP status.

What About Residual Kidney Volume?

Maximizing and maintaining residual kidney urine volume is an important way to augment daily fluid removal. As noted above, prospective randomized trials and observational studies have confirmed that there is a strong survival advantage for patients who have residual kidney function. The mechanism for this robust association with a reduction in mortality risk is unknown. It may be that residual kidney volume is a surrogate for other unrecognized systemic kidney functions, or it may be related to salt, solutes, or the urine volume itself.

An observational study [177] and two prospective randomized trials, one using an ACE inhibitor [56] and the other an ARB [57], have shown that use of these medications is associated with preservation of residual kidney function, although the patient numbers were small. In addition, although not well documented in these studies, residual renal volume was also maintained. Diuretics may be used to increase urine volume: a prospective randomized trial has shown that urinary sodium and water removal can be augmented and better maintained with the use of high-dose loop diuretics [39].

These studies suggest that it is important to monitor residual kidney function in patients on peritoneal dialysis. If antihypertensive medications are needed, preference should be given to ACE inhibitors or angiotensin receptor blockers. Nephrotoxic agents should be avoided when clinically possible. Finally, when attempting to optimize volume status, it would also be wise to consider the use of loop diuretics to augment urine volume.

How Do You Optimize Peritoneal Ultrafiltration?

To optimize a patient’s UF, one must be aware of the overall treatment goals, the peritoneal membrane transport characteristics, and the patient’s quality of life, schedule, and treatment requests. The PD prescription can be onerous for the patient, and social factors and burnout are common causes of treatment failure. Therefore, prescriptions should always be reviewed to be sure they are consistent with the patient’s personal and social needs. The implications of more frequent exchanges, longer overnight time connected to the cycler, or the need for a midday exchange must be considered and discussed with the patient. If the prescription is decided without patient input and is contrary to the patient’s social desires or constraints, it is likely to fail, and treatment goals – small solute removal, UF volume, and optimization of BP – will not be achieved.

When appropriate, dietary sodium and fluid intake should be restricted. The patient’s record of effluent volumes/dialysis solution used (% glucose/dextrose, icodextrin, or amino acids) should be reviewed. Special attention should be directed toward the “long” dwells of CAPD (overnight) and cycler therapies (daytime). To minimize any untoward side effects from glucose, the minimal percentage of glucose/dextrose solutions should be used when possible. At times this may mean changing the prescription to alter the potential UF profile of short dwells and long dwells to minimize glucose absorption while maximizing UF. Drain volumes should be optimized for the long dwells of CAPD (night) and cycler (daytime) and when possible net UF should not be negative (i.e., no net fluid absorption) during these dwells. These principles are outlined in the 2006 NKF-DOQI guidelines [55].

In the previous chapters on the physiology of peritoneal dialysis (Chapter XX) and the principles of ultrafiltration (Chapter XX), the basic mechanisms by which the various osmotic agents stimulate transcapillary ultrafiltration, their theoretical ultrafiltration profiles, and the role of competing forces such as lymphatic absorption of fluid are explained in detail. It is important to remember that osmotic forces (crystalloid for glucose and amino acids; colloid for icodextrin) are used to induce transcapillary ultrafiltration. The magnitude of these forces can change during the dwell as the osmotic agent is absorbed, and the rapidity of absorption varies from patient to patient based on transport type and lymphatic absorption rates. Rapid transporters absorb glucose faster than low transporters; hence, the osmotic gradient for transcapillary ultrafiltration with any glucose dwell dissipates more quickly in rapid transporters. One needs to be aware of transport type and dwell time per exchange in order to optimize ultrafiltration volumes. If a macromolecule is used as the osmotic agent (polyglucose/icodextrin), although it is only very slowly absorbed by diffusion (hence there is minimal difference in the rate of disappearance of the colloid osmotic gradient between transport types), there can be differences in absorption based on differences in the lymphatic absorption rates of icodextrin and intraperitoneal fluid. In some patients, the actual UF will be less than expected if the rate of lymphatic absorption is increased. These issues are most important during the long dwell, especially once residual kidney function is lost.

A special consideration during hypertonic glucose dwells is the dissociation between sodium and water removal (sodium sieving). During transcapillary ultrafiltration, solutes (in this case sodium) can be removed at concentrations similar to those of plasma water (convective solute removal) if there is no resistance to their movement across the endothelial barrier. In order to optimize salt and water removal, we need to know which pathways the water and solutes move through: is the sodium concentration in the ultrafiltrate the same as in the extracellular fluid, or is some water being removed without solute? As mentioned in Chapter XX, there are three sets of pores through which solutes and water move as they cross the endothelial barrier [63]. Large pores, through which macromolecules, water, and small solutes readily pass, make up only about 3% of the total pore area; small pores, through which small solutes readily pass, make up about 95%; and ultrasmall pores, or aquaporin water channels, represent about 2% of the pore surface area. The aquaporins are water-only channels, and it is here that, as water moves from the capillaries to the dialysate, sodium is left behind or “sieved” [178]. Water that moves across these aquaporins is the equivalent of “free water.”

Early in a glucose-containing dwell, about 50% of the UF volume is generated via aquaporins and is therefore sodium-free; the other 50% is generated via the small pores and is therefore sodium replete. As a result, early in the dwell the dialysate sodium concentration falls, but with longer dwell times sodium moves down its concentration gradient and the D/P ratio of sodium eventually approaches unity (D/P sodium about 1.0). The more hypertonic the solution, the more “free water” is removed early in the dwell. With icodextrin, because the transcapillary pressure difference across the aquaporin pores is so small, very little of the UF volume is “free water.” Consequently, if one were to perform multiple short overnight dwells (more than four exchanges over 8 h) with hypertonic solutions, the ultrafiltrate would be mainly free water and the serum sodium would increase slightly. The UF volume would be misleading, as it would represent removal of mostly sodium-free water, and BP might not be controlled, whereas changing to APD with fewer overnight dwells followed by a daytime icodextrin dwell would likely improve BP because of better sodium removal. Attention to these details will help optimize volume status and BP in PD patients.
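A toy calculation can illustrate why multiple short hypertonic dwells remove water more effectively than sodium: if roughly half of the early ultrafiltrate crosses aquaporins and carries no sodium, convective sodium removal per liter of UF is far below the plasma sodium concentration, whereas longer dwells (where diffusion, ignored here, contributes) close the gap. The aquaporin fractions, UF volume, and plasma sodium below are assumptions, not measured values.

```python
# Toy sodium-sieving illustration; diffusion of sodium during the dwell is ignored.

def convective_na_removal_mmol(uf_volume_l, aquaporin_fraction, plasma_na_mmol_l=140.0):
    """Sodium removed convectively: only the small-pore fraction of UF carries sodium."""
    small_pore_uf_l = uf_volume_l * (1.0 - aquaporin_fraction)
    return small_pore_uf_l * plasma_na_mmol_l

short_hypertonic = convective_na_removal_mmol(0.8, aquaporin_fraction=0.5)  # ~56 mmol
long_dwell       = convective_na_removal_mmol(0.8, aquaporin_fraction=0.1)  # ~101 mmol
print(round(short_hypertonic), "vs", round(long_dwell), "mmol per 0.8 L of UF")
```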

Low Sodium Solutions

Lowering the dialysate sodium concentration allows more diffusive removal of sodium, independent of convective sodium removal. Although such solutions are not currently available in the United States, many investigators have evaluated them. This was first evaluated by Ahearn and Nolph, who found that 7% dextrose dialysis solutions with a sodium concentration of 100–130 mmol/L, compared to 140 mmol/L, resulted in increased sodium removal per exchange [179]. Others have found similar results. In one crossover study of nine anuric patients on APD, a dialysate sodium of 132 mmol/L was compared with one of 126 mmol/L, and the authors demonstrated that total sodium removal could be increased without any change in UF volume: on the lower-sodium solution, daily sodium removal increased to 94 mmol/L of UF, compared to 32 mmol/L on the standard dialysate sodium. Four patients experienced an improvement in BP, and three were able to discontinue their BP medications. There was no significant change in UF or body weight during the study [180]. In another study, the effects of ultralow-sodium dialysis solutions (Na 102 mmol/L) were compared to baseline (when at least one exchange was 3.86% glucose/4.25% dextrose and one exchange was icodextrin) in five anuric PD patients [181]. Sodium removal was better (80 ± 14 versus 56 ± 15.9 mEq/day, p < 0.05) and BP control improved (mean BP 98 ± 5.5 versus 109 ± 7.6 mm Hg, p = 0.046) with the ultralow-sodium solutions. Clearly these solutions may work, but current data are limited and more studies are needed. For a further review of low-sodium dialysate solutions, see [182].

What About Electrolytes and Other Issues?

As mentioned in the introduction to this chapter, “adequacy of dialysis” involves more than just small solute clearance. It theoretically involves replacing “all” of the functions of the native kidneys. Some of these cannot be replaced by dialysis, and others require medications. The kidneys are known to be important in bone and mineral metabolism. As such, peritoneal dialysis needs to be prescribed in a way that does not limit treatment of bone and mineral metabolism, acid–base balance, anemia, and electrolyte disorders. These issues are discussed in other chapters of this book.

It is recognized that the calcium concentration of dialysis fluids can influence the ability to treat the bone and mineral complications of ESRD optimally. Dialysis solutions need to be individualized as part of this treatment program. In the past, it was common to use relatively high dialysate calcium concentrations (3.5 mEq/L) to prevent hypocalcemia (this was in an era when aluminum-containing binders and calcium restriction were commonly used to prevent hyperphosphatemia). Once the toxicity of aluminum was recognized, nephrologists began to use calcium-containing binders, which was often associated with hypercalcemia and metastatic calcification. As a result, the approach to bone and mineral metabolism has changed, and one of these changes has been to lower the dialysate calcium concentration. Most PD fluids now contain 2.5 mEq/L of calcium. These are more physiologic than the 3.5 mEq/L solutions and are less likely to result in positive calcium balance. However, if one is using multiple hypertonic dwells and not using calcium-containing binders, there is a small risk of negative calcium balance [183]. This is especially true at lower serum calcium levels (Fig. 16.15 ). Therefore, care must be taken to ensure that patients receive the appropriate amount of calcium supplementation and that serum calcium and PTH levels are monitored as indicated.

Fig. 16.15
figure 16_15_978-0-387-78940-8

Calcium mass transfer with dialysate fluids containing 1.25 and 1.75 mmol/L calcium. Panels A and B: mean calcium (Ca2+) mass transfer values (CMT, mmol) plotted against serum calcium levels (SiCa) with 1.25 and 1.75 mmol/L calcium in 1.5 g/dL dextrose (A) and 4.25 g/dL dextrose (B) peritoneal dialysate solutions. From [182]
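The direction of peritoneal calcium transfer can be estimated with simple mass balance: calcium instilled with the dialysate minus calcium returned in the effluent. The sketch below uses hypothetical effluent concentrations and drain volumes to show how a low-calcium (1.25 mmol/L) solution with hypertonic ultrafiltration can produce net calcium loss, consistent with the point made above.

```python
# Simple calcium mass-balance sketch (positive = net gain by the patient).
# Dialysate calcium of 1.25 and 1.75 mmol/L correspond to 2.5 and 3.5 mEq/L.

def ca_mass_transfer_mmol(instilled_l, dialysate_ca_mmol_l, drained_l, effluent_ca_mmol_l):
    return instilled_l * dialysate_ca_mmol_l - drained_l * effluent_ca_mmol_l

# Hypothetical hypertonic dwell: 2 L instilled, 2.8 L drained.
print(round(ca_mass_transfer_mmol(2.0, 1.25, 2.8, 1.10), 2))  # negative: net calcium loss
print(round(ca_mass_transfer_mmol(2.0, 1.75, 2.8, 1.25), 2))  # ~0: neutral to slightly positive
```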

Summary

When considering adequacy of dialysis, one must monitor small solute clearance, volume status, and blood pressure, in addition to making a clinical assessment of how the dialysis prescription is affecting the treatment of anemia, bone and mineral metabolism, and other co-morbid diseases. Periods of inadequate dialysis can produce subtle symptoms of uremia that are insidious in onset and may not be reversible; these can influence outcome negatively. To prevent such symptoms and provide dialysis that is as close to optimal as possible, it is important to monitor dialysis dose and ultrafiltration volumes so that changes in the prescription can be made proactively rather than reactively. When tailoring PD prescriptions, the prescribed dialysis dose should be targeted to at least achieve the minimal recommended doses while optimizing replacement of as many other aspects of normal renal function as possible.