Abstract
The decade of the 1990s saw an improvement in cadaveric renal graft function and a dramatic reduction in the acute rejection (AR) rate. The purpose of this study was to determine whether the reduction in rejection rate was the primary cause of the observed improvement in graft function and whether this improved long-term graft survival. All adult patients who received a cadaver renal transplant between 1991 and 2000 and had graft survival of at least 6 mo and complete data for creatinine at 6 mo, HLA mismatch, delayed graft function, and acute rejection (AR) were identified in the United Network for Organ Sharing database. A total of 40,164 cases that met the inclusion criteria were identified. The mean Modification of Diet in Renal Disease GFR at 6 mo improved from 49.94 ml/min per 1.73 m2 in 1991 to 54.59 ml/min per 1.73 m2 in 2000 (P < 0.001). The improvement in GFR was not gradual but occurred over a 4-yr period between 1994 and 1997, coinciding with the introduction of the new immunosuppressive agents mycophenolate mofetil and tacrolimus into maintenance immunosuppression regimens. The improvement was seen in all subgroups of patients, even patients without clinical AR or delayed graft function. The magnitude of improvement in patients without clinical AR was similar to that seen in patients with AR. The drop in clinical AR rate accounted for a minority of the improvement in graft function in the 1990s. Other factors, such as reduced drug toxicity and improved control of subclinical rejection, seem to account for the majority of the improvement. This improvement in graft function at 6 mo did not translate into improved long-term graft survival, however.
A dramatic drop in the incidence of acute rejection (AR) during the 1990s coincided with the introduction of the first new maintenance immunosuppressants since cyclosporine. During the same time period, an improvement in deceased donor renal allograft function was observed (1). The quality of graft function is an important determinant of long-term graft survival; however, the improvement in graft survival during this decade resulting from better immunologic control of AR has been less than projected (1–3). The purpose of this study was to determine whether the drop in the AR rate was the major cause of the improvement in graft function and whether this improvement in graft function led to better long-term graft survival.
Materials and Methods
All adult patients (age >17 yr) who received a deceased donor renal transplant between 1991 and 2000 and whose grafts survived for at least 6 mo were identified in the Organ Procurement and Transplant Network database obtained from the Standard Transplant Analysis and Research File, based on OPTN data as of August 27, 2002. Transplants with incomplete data for creatinine at 6 mo, HLA mismatch, dialysis in the first week posttransplantation, or AR in the first 6 mo were excluded from the analysis. GFR was estimated using the abbreviated Modification of Diet in Renal Disease (MDRD) equation shown below (4):
MDRD GFR (ml/min per 1.73 m2) = 186.3 × (serum creatinine)^−1.154 × (recipient age)^−0.203 × 0.742 (if recipient female) × 1.21 (if recipient black)
GFR was calculated using the recipient's creatinine, age, gender, and race. To understand the time course of the improvement in GFR seen during the 1990s, the mean GFR was determined for each transplant year. Delayed graft function (DGF) was defined as the need for dialysis in the first postoperative week. AR was defined as treatment of AR in the first 6 mo. The incidence of DGF, AR, mean HLA mismatch, and mean donor age were determined for each transplant year. Data were also obtained regarding the discharge maintenance immunosuppression, type of induction therapy, weight, gender, race, age of donors and recipients, cold ischemia time, cause of ESRD, donor hypertension and diabetes, retransplantation, peak panel reactive antibodies, and type of transplant, either en bloc or solitary kidney. To determine the impact of these variables on graft function, we performed multivariate linear regression analysis, and only variables that reached P < 0.05 were included in the model. Because the effect of donor age on GFR appeared parabolic in nature, donor age was transformed to donor age squared and included in the model. Adjusted data in the analysis were based on the results of the multivariate linear regression modeling.
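As an illustration, the abbreviated MDRD equation given above can be implemented directly; the sketch below uses hypothetical variable names of our own choosing (the study's actual database fields are not reproduced here), with serum creatinine in mg/dl and age in years.

```python
def mdrd_gfr(creatinine_mg_dl, age_yr, female, black):
    """Abbreviated MDRD estimate of GFR in ml/min per 1.73 m2,
    following the equation in the text: 186.3 x Cr^-1.154 x age^-0.203,
    with multipliers of 0.742 for female and 1.21 for black recipients."""
    gfr = 186.3 * (creatinine_mg_dl ** -1.154) * (age_yr ** -0.203)
    if female:
        gfr *= 0.742
    if black:
        gfr *= 1.21
    return gfr

# Example: a 45-yr-old male, nonblack recipient with creatinine 1.5 mg/dl
# yields an estimated GFR of roughly 54 ml/min per 1.73 m2.
```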
Statistical Analyses
To determine whether the improvement in graft function noted in the 1990s improved long-term graft survival, we performed Kaplan-Meier analyses as well as multivariate Cox models, adjusted for potential confounding factors. Covariates in the Cox models were assessed for adherence to the proportional hazards assumption, and univariate comparisons were tested using the log-rank test. SPSS software version 11.0 was used for statistical analysis.
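The Kaplan-Meier (product-limit) estimator underlying the survival plots can be sketched in a few lines. This is a minimal illustrative implementation, not the SPSS procedure actually used in the study; follow-up times and event indicators here are arbitrary examples.

```python
def kaplan_meier(times, events):
    """Product-limit estimate of the survival curve.
    times: follow-up time for each graft; events: 1 = graft loss,
    0 = censored (e.g., alive with function at last follow-up).
    Returns (event_times, survival_probabilities)."""
    paired = sorted(zip(times, events))
    n_at_risk = len(paired)
    surv = 1.0
    out_t, out_s = [], []
    i = 0
    while i < len(paired):
        t = paired[i][0]
        d = c = 0  # events and censorings at time t
        while i < len(paired) and paired[i][0] == t:
            if paired[i][1]:
                d += 1
            else:
                c += 1
            i += 1
        if d:
            # survival drops by the factor (1 - d / n_at_risk)
            surv *= 1.0 - d / n_at_risk
            out_t.append(t)
            out_s.append(surv)
        n_at_risk -= d + c
    return out_t, out_s
```

Censored observations leave the risk set without stepping the curve down, which is why the two eras can show similar curves despite differing follow-up.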
Results
A total of 64,446 adult patients who received a deceased donor transplant between 1991 and 2000 and had graft survival of at least 6 mo were identified. A total of 24,282 of these were missing data for one or more of the primary variables of interest (creatinine at 6 mo = 13,049; AR in first 6 mo = 14,632; dialysis in the first week = 638; and HLA mismatch = 421) and were excluded. The final study cohort contained 40,164 cases, or 62.3% of reported cases in the database.
Table 1 shows the mean MDRD GFR at 6 mo and 1 yr posttransplantation. The improvement in GFR occurred over a period of 4 yr, between 1994 and 1997. Before and after this period, the mean GFR was stable. Mean donor age increased during the 1990s (see Table 2), and this may have dampened the increase in GFR observed during this period. To determine the impact of the donor age change, we adjusted GFR to reflect the 1991 donor age composition (Figure 1). After adjustment for donor age, the improvement in mean GFR during the 1990s was approximately 6.9 ml/min per 1.73 m2, or 14%. The improvement in mean GFR between these two eras, before 1994 and after 1997, was seen in all subgroups of patients according to various recipient and donor characteristics (Table 3).
Actual and donor age-adjusted mean GFR at 6 mo by transplant year. Donor age-adjusted GFR are adjusted to the mean donor age in 1991.
Mean MDRD GFR at 6 months and 1 year posttransplant based on year of transplantation
Trends in recipients, donors, transplants, and immunosuppression during the decade of the 1990sa
Comparison of the mean MDRD GFR between two transplant eras, 1991 to 1993 and 1998 to 2000, and recipient and donor characteristicsa
Table 2 shows the characteristics of recipients and donors, including the discharge maintenance immunosuppression and induction therapy, by transplant year. The largest changes among the recipients were the mean age (increase of 4.6 yr), mean duration of dialysis before transplantation (increase of 4.8 mo), and mean recipient weight (increase of 5.4 kg). Among the donors, the mean age increased during the decade by 4.5 yr, as did the percentage of donors with hypertension (0 to 10%) or diabetes (0 to 3.5%). The percentage of patients who received an en bloc transplant and the mean weight of the donors increased, whereas the percentage of male donors decreased slightly during the decade. The mean HLA mismatch remained relatively constant, as did the incidence of DGF. The mean cold ischemia time gradually declined during the decade by 5.2 h.
The largest change after transplantation was in the AR rate in the first 6 mo, which declined from 43.5% in 1991 to 15.5% in 2000. This decline in the rejection rate was not linear: the rejection rate declined between 1991 and 1994, then increased, peaking in 1996, and declined again thereafter. To understand this peak in the rejection rate, observed at a time when the mean graft function was improving, we determined the difference in GFR between patients with and without AR for each transplant year (Figure 2). During the years 1995, 1996, and 1997, the impact of AR on graft function declined dramatically, and in 1997 the difference in GFR between patients with and without AR was only 1.3 ml/min per 1.73 m2. Whether this apparent lessening of the impact of AR on graft function is related to differences in the criteria used to diagnose AR, an increase in false-positive reporting of AR, or more aggressive surveillance and treatment of rejection is unclear. The improvement in GFR that occurred during this apparent peak in the rejection rate coincided with the reduction in the functional consequence of rejection.
The relationship between acute rejection (AR) rate and difference in mean GFR at 6 mo. The decrement in GFR as a result of AR was calculated as the difference between the mean GFR in patients without rejection and that in patients with a history of rejection in the first 6 mo. No other adjustments for differences between the two groups, such as donor age or delayed graft function, were made.
Induction and maintenance immunosuppressive regimens underwent marked changes during the past decade (see Table 2). The use of cyclosporine gradually declined, whereas the use of tacrolimus gradually increased, and in the year 2000, equal numbers of patients were receiving the two calcineurin inhibitors. Among patients who were on various cyclosporine formulations, Neoral accounted for 89.2% and Sandimmune accounted for 10.2%. The remaining 0.6% received other generic forms of cyclosporine. Mycophenolate mofetil supplanted azathioprine, with the use of azathioprine declining from 87.8% in 1991 to 7.6% in 2000, whereas mycophenolate mofetil use increased to 79.7% in 2000 from <1% of transplants in 1991. The increase in mycophenolate mofetil use occurred rapidly during the 4-yr period between 1994 and 1998. Sirolimus use was 16.3% in 2000 but was insignificant before 2000. Steroid use remained relatively constant during the decade.
Table 4 shows the combinations of maintenance immunosuppression used during the 1990s. Early in the decade, the combination of cyclosporine and azathioprine was dominant. In the mid-1990s, the combination of cyclosporine and mycophenolate mofetil became the dominant combination, and by the year 2000, tacrolimus and cyclosporine with mycophenolate mofetil were evenly divided as the dominant combinations.
Trends in maintenance immunosuppression combinationsa
To determine which factors were predictive of graft function at 6 mo, we performed step-wise linear regression analysis. Among the 29 variables tested in the model, 21 were found to have predictive value for graft function and are listed in Table 5. Eight variables were found to be insignificant and are listed at the bottom of Table 5. Even with the large number of variables in the analysis, the final model accounted for only 19% of the variance in GFR (R2 = 0.193). AR was associated with a mean loss of GFR of 6.52 ml/min per 1.73 m2, and DGF was associated with a mean loss of GFR of 3.81 ml/min per 1.73 m2. Even after adjusting for AR, mycophenolate mofetil and tacrolimus treatment were associated with improvement in graft function of 3.66 ml/min per 1.73 m2 and 1.56 ml/min per 1.73 m2, respectively. Cyclosporine treatment was associated with a 1.37-ml/min per 1.73 m2 reduction in GFR. After adjusting for AR, induction agents OKT3, anti-IL-2 receptor, and polyclonal antibodies had only a small influence on GFR at 6 mo.
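The donor-age-squared transform described above can be illustrated with an ordinary least squares sketch. The data below are entirely synthetic and the generating coefficients are hypothetical round numbers loosely echoing the reported effect sizes; they are not the paper's fitted values, and the stepwise selection step is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Synthetic covariates: donor age in years, plus binary AR and DGF flags.
donor_age = rng.uniform(10, 70, n)
acute_rejection = rng.integers(0, 2, n)
dgf = rng.integers(0, 2, n)

# Simulated 6-mo GFR with a parabolic donor-age effect (hence the
# squared term) and illustrative penalties for AR and DGF, plus noise.
gfr = (60 - 0.005 * (donor_age - 30) ** 2
       - 6.5 * acute_rejection - 3.8 * dgf
       + rng.normal(0, 8, n))

# Design matrix: intercept, donor age, donor age squared, AR, DGF.
X = np.column_stack([np.ones(n), donor_age, donor_age ** 2,
                     acute_rejection, dgf])
beta, *_ = np.linalg.lstsq(X, gfr, rcond=None)
# beta[3] and beta[4] recover negative GFR effects for AR and DGF.
```

Including both the linear and squared age terms lets the fit capture the nonmonotonic (parabolic) relationship between donor age and graft function that a single linear term would miss.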
Results of linear regression analysis for GFR at 6 monthsa
Because GFR improved in all subgroups of patients, the population was divided into two cohorts on the basis of history of rejection in the first 6 mo and compared between the two eras, 1991 to 1993 and 1998 to 2000. Figure 3 shows the unadjusted and adjusted (donor age composition and DGF rate of 1991 to 1993) improvement in GFR between the two eras on the basis of presence or absence of AR. After adjusting for the differences in donor age composition and DGF rates between the two eras, there was a similar increase in mean GFR for both groups. This global upward shift in GFR seen in all patients accounted for the majority of the improvement in GFR between the two eras. The drop in the rejection rate, with its consequent improvement in renal function, accounted for a minority of the improvement.
Difference in mean Modification of Diet in Renal Disease GFR between two eras (1991 to 1993 and 1998 to 2000) on the basis of the presence or the absence of AR in the first 6 mo. Adjusted differences are adjusted for donor age and delayed graft function rate in the 1991 to 1993 era using results of multivariate linear regression.
The majority of patient exclusions from the initial analysis were due to missing data regarding AR and creatinine at 6 mo. Table 6 shows the transplant characteristics by era and exclusion or inclusion in the study. Although there were minor differences in some parameters, the most powerful demographic predictor of graft function, donor age, was similar between the patients who were included and excluded in the two eras. Table 7 shows the GFR and AR rate between the two eras on the basis of exclusion or inclusion in the study. Among the excluded patients for whom data were available, the AR rate in each era was identical to that seen in the study patients. The mean GFR in the excluded patients for whom data were available was similar in the early era (1991 to 1993), whereas it was statistically significantly lower in the excluded group in the latter era (1998 to 2000). Despite the lower value, it still showed a significant improvement from the earlier era (P < 0.001). The characteristics of the included and excluded cases were very similar in the two eras, and when data regarding GFR at 6 mo and AR were available, the excluded cases behaved very similarly to the included cases. It is therefore highly unlikely that the exclusion criteria significantly biased the population or altered the primary finding that graft function improved between the two eras.
Comparison of transplant demographics of patients included and excluded in the two erasa
Comparison of mean GFR and acute rejection rate in the first 6 months among patients who were excluded and for whom data existed compared with the study patients by eraa
Figure 4 shows the Kaplan-Meier survivals for death-censored and overall graft survival between the two eras. Despite the improvement in graft function at 6 mo, both graft and death-censored graft survival were similar. Because of the significant differences in the donor and recipient demographics between the two eras, multivariate Cox proportional hazards modeling was performed, which, after adjusting for 15 covariates including donor age, recipient age, and duration of dialysis before transplantation, showed a small graft survival benefit for the latter-era transplants (Table 8).
Kaplan-Meier plots of graft and death-censored graft survival between the two eras, 1991 to 1993 and 1998 to 2000.
Cox multivariate analysis for graft survival between the two eras, 1991 to 1993 and 1998 to 2000a
Discussion
The improvement in graft function observed in the 1990s occurred during a discrete period of time between 1994 and 1997. Before and after these years, the mean GFR remained stable. This period of improvement coincides with the rapid substitution of mycophenolate mofetil for azathioprine in calcineurin inhibitor-based immunosuppression protocols. The multivariate linear regression analysis revealed that not only mycophenolate mofetil but also use of tacrolimus was associated with improved GFR. Although the reduction in the rejection rate had a positive impact on graft function during the 1990s, it cannot explain the majority of the improvement noted in GFR between the two eras, because patients both with and without rejection improved almost equally between the two eras. To what extent immunologic versus nonimmunologic factors resulted in this improvement is unclear.
Subclinical rejection, which may be an important determinant of graft function, is a possible immunologic cause for the improvement in graft function. If subclinical rejection is pathophysiologically similar to clinical AR, then the decreasing clinical rejection rate should be accompanied by a reduction in the subclinical rejection rate. Indeed, surveillance biopsy studies of patients on cyclosporine and azathioprine maintenance immunosuppression have shown rates of subclinical rejection as high as 60%, whereas a study of patients on tacrolimus and mycophenolate mofetil reported rates of subclinical rejection of <5% (5–8).
Several nonimmunologic factors may also be contributory. With the adoption of mycophenolate mofetil, dosing of calcineurin inhibitors may have decreased, thus reducing the renal toxicity. Also, tacrolimus may have a better therapeutic window than cyclosporine, and as a result, similar immunosuppression can be attained with less renal toxicity. A paired kidney study of deceased donor kidneys with discordant use of calcineurin inhibitors showed better graft function at all time periods examined but no improvement in graft survival in the patients who received tacrolimus when compared with Neoral (9).
Although there is little evidence to support this hypothesis, another possible explanation for the improvement in renal function is that better donor management, preservation techniques, and donor quality improved functional outcomes. The DGF rate did not change significantly in the 1990s despite an improvement in average cold ischemia time, and the average donor age and the percentages of donors with diabetes and hypertension all increased, suggesting that the quality of donors is worsening. For a change of this magnitude in GFR to occur as a result of better donor management and organ preservation, one would expect an equally dramatic change in the rate of DGF, which was not seen.
A fourth possibility is that a systematic downward shift in creatinine measures occurred in the mid-1990s in the laboratories that supplied creatinine data to the United Network for Organ Sharing. The creatinine measures in the United Network for Organ Sharing data set come from a multitude of laboratories across the United States, among which there is no standardization. Therefore, small variations in estimated GFR are present as a result of differences in the measurement of creatinine between laboratories (10). The method for measuring creatinine has not changed significantly in the past decade, and although it is under discussion, to date there has been no attempt to standardize creatinine measures between laboratories. It seems highly unlikely that a downward drift in the creatinine measures could explain such a discrete improvement in renal function unless there were a major change in method or a nationwide effort to standardize creatinine measures among competing laboratories.
The spike seen in the rejection rate and the decrease in the GFR reduction between rejectors and nonrejectors in the mid-1990s suggest that milder episodes of rejection or misdiagnosis of rejection occurred more frequently during this period. Indeed, the improvement in graft function in this period coincides with a decreased effect of AR on graft function. If more aggressive surveillance was occurring during this period, then it suggests that such an approach may also improve graft function.
Perhaps the most concerning finding of this analysis is that the impressive improvement in renal function noted in the mid-1990s did not translate into better long-term graft and death-censored graft survival between the two eras. The improvement in graft outcome in the 1990s is confined almost exclusively to the first 6 mo. Among the patients whose grafts survived the first 6 mo, the rate of graft loss did not change between the two eras. Clearly, better immunosuppression has resulted in better short-term outcomes related to graft survival and function. The larger issue is the long-term impact of increasing immunosuppression for this short-term gain in renal function and graft survival. Whether more intense immunosuppression leads to an increase in deaths and graft loss as a result of infection and malignancy, offsetting the benefit of improved short-term graft function, is not clear and needs further study. Also, the increase in the intensity of immunosuppression achieved in the past decade clearly leaves some, and possibly a majority, of patients “overimmunosuppressed.” Whether immunologic risk-stratified protocols for immunosuppression can achieve similar short- and long-term graft survival needs to be explored. Finally, the profound demographic shift in the donor and recipient characteristics seen in the 1990s may be offsetting any improvements in graft survival (i.e., recipients and donors are older and the exposure to dialysis among recipients is increasing, all of which worsen graft survival), as suggested by the multivariate analysis.
Given the reality of declining quality of the deceased donor kidney pool, further improvements in graft function and long-term graft survival in the recipient population are probably going to require a new approach to immunosuppression. For the first time since calcineurin inhibitors became available, new calcineurin inhibitor-free regimens are being explored. Because withdrawal or avoidance of calcineurin inhibitors has been associated with an 8 to 29% increase in GFR, calcineurin-free regimens have the promise of significantly improving graft survival (11–16). So far in early trials with early withdrawal or complete avoidance of calcineurin inhibitors, AR rates similar to those seen with calcineurin inhibitors have been achieved (13–16). However, if AR rates are significantly higher with calcineurin-free regimens, then the detrimental effect of AR could negate the benefit of reduced renal toxicity with this approach to immunosuppression. The improvement in graft function associated with calcineurin avoidance may be especially important for patients who receive older donor kidneys, for which the best achievable GFR may be only 50 ml/min per 1.73 m2.
Deceased donor renal transplantation has made remarkable advances in short-term outcomes during the past decade primarily as a result of improved immunologic control of AR. The rapidly growing waiting list with its pressure to use more expanded criteria donors and the expansion of the recipient pool to increasingly higher risk and older recipients, however, may limit improvements in the future. How we judge outcomes must be understood in the context of the quality of kidneys used and the comorbidity of the recipients. We are currently swimming against a demographic tide in which maintaining one’s current position may actually represent progress, albeit difficult to see.
Acknowledgments
Statistical analyses of the data were performed by D.S.K.
Footnotes
Published online ahead of print. Publication date available at www.jasn.org.
- © 2005 American Society of Nephrology