Project description: Socioeconomically deprived individuals with renal disease are less likely to receive a live-donor kidney transplant than less-deprived individuals. This qualitative study aimed to identify reasons for the observed socioeconomic disparity in live-donor kidney transplantation. Face-to-face, in-depth, semistructured interviews were conducted at a UK tertiary renal referral hospital and transplant centre. Purposive sampling was used to select deceased-donor transplant recipients from areas of high socioeconomic deprivation (SED) (19 participants), followed by a low-SED comparison group (13 participants), aiming for maximum diversity in age, gender, ethnicity, primary renal disease and previous renal replacement therapy. Participants were interviewed following their routine transplant clinic review. Interviews were digitally audio-recorded and transcribed verbatim. Transcripts were coded using NVivo software and analysed using the constant comparison method described in Grounded Theory. Themes common and distinct to each socioeconomic group emerged; six themes appeared to distinguish between individuals from areas of high and low SED. Four themes were distinct to participants from areas of high SED: (1) passivity, (2) disempowerment, (3) lack of social support and (4) short-term focus. Two themes were distinct to the low-SED group: (1) financial concerns and (2) location of donor. Several of the themes emerging from the high-SED interviews relate to an individual's lack of confidence and skill in managing their health and healthcare, in keeping with low levels of patient activation. Inadequate empowerment of socioeconomically deprived individuals by healthcare practitioners was also described. Financial concerns did not emerge as a barrier in interviews with the high-SED group. Interventions aiming to redress the observed socioeconomic inequity should be targeted at both patients and clinical teams to increase empowerment and ensure shared decision-making.
Project description: Introduction: Much of the higher risk for end-stage kidney disease (ESKD) in African American individuals relates to ancestry-specific variation in the apolipoprotein L1 gene (APOL1). Relative to kidneys from European American deceased donors, kidneys from African American deceased donors have shorter allograft survival, and African American living kidney donors more often develop ESKD. The National Institutes of Health (NIH)-sponsored APOL1 Long-term Kidney Transplantation Outcomes Network (APOLLO) is prospectively assessing kidney allograft survival from donors with recent African ancestry based on donor and recipient APOL1 genotypes. Methods: APOLLO will evaluate outcomes from 2614 deceased kidney donor-recipient pairs, as well as additional living kidney donor-recipient pairs and unpaired deceased-donor kidneys. Results: The United Network for Organ Sharing (UNOS), Association of Organ Procurement Organizations, American Society of Transplantation, American Society for Histocompatibility and Immunogenetics, and nearly all U.S. kidney transplant programs, organ procurement organizations (OPOs), and histocompatibility laboratories are participating in this observational study. APOLLO employs a central institutional review board (cIRB) and maintains voluntary partnerships with OPOs and histocompatibility laboratories. A Community Advisory Council composed of African American individuals with a personal or family history of kidney disease has advised the NIH Project Office and Steering Committee since inception. UNOS is providing data for outcome analyses. Conclusion: This article describes unique aspects of the protocol, design, and performance of APOLLO. Results will guide use of APOL1 genotypic data to improve the assessment of quality in deceased-donor kidneys and could increase the number of kidneys transplanted, reduce rates of discard, and improve the safety of living kidney donation.
Project description: Two apolipoprotein L1 gene (APOL1) renal-risk variants in donors and African American (AA) recipient race are associated with worse allograft survival in deceased-donor kidney transplantation (DDKT) from AA donors. To detect other factors impacting allograft survival from deceased AA kidney donors, APOL1 renal-risk variants were genotyped in additional AA kidney donors. The APOL1 genotypes were linked to outcomes in 478 newly analyzed DDKTs in the Scientific Registry of Transplant Recipients. Multivariate analyses accounting for recipient age, sex, race, panel-reactive antibody level, HLA match, cold ischemia time, donor age, and expanded criteria donation were performed. These 478 transplantations and 675 DDKTs from a prior report were jointly analyzed. Fully adjusted analyses limited to the new 478 DDKTs replicated shorter renal allograft survival in recipients of APOL1 2-renal-risk-variant kidneys (hazard ratio [HR], 2.00; P = 0.03). Combined analysis of 1153 DDKTs from AA donors revealed that donor APOL1 high-risk genotype (HR, 2.05; P = 3 × 10), older donor age (HR, 1.18; P = 0.05), and younger recipient age (HR, 0.70; P = 0.001) adversely impacted allograft survival. Although prolonged allograft survival was seen in many recipients of APOL1 2-renal-risk-variant kidneys, follow-up serum creatinine concentrations were higher than those in recipients of 0/1 APOL1 renal-risk-variant kidneys. A competing risk analysis revealed that APOL1 impacted renal allograft survival, but not recipient survival. Interactions between donor age and APOL1 genotype on renal allograft survival were nonsignificant. Shorter renal allograft survival is reproducibly observed after DDKT from APOL1 2-renal-risk-variant donors. Younger recipient age and older donor age have independent adverse effects on renal allograft survival.
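The adjusted analyses described above are multivariable Cox proportional hazards models of renal allograft survival. As a rough illustration of that approach, here is a minimal sketch using Python's lifelines package; the file name and columns (apol1_high_risk, graft_loss, and so on) are hypothetical stand-ins for the registry variables listed above, not the study's actual code.

```python
# Minimal sketch: multivariable Cox model of allograft survival,
# in the spirit of the adjusted analyses described above.
# All file/column names are hypothetical placeholders.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("ddkt_cohort.csv")  # one row per donor-recipient pair

columns = [
    "graft_survival_days",  # time from transplant to graft loss or censoring
    "graft_loss",           # 1 = graft loss observed, 0 = censored
    "apol1_high_risk",      # donor carries two APOL1 renal-risk variants
    "donor_age",
    "recipient_age",
    "pra_level",            # panel-reactive antibody level
    "hla_mismatches",
    "cold_ischemia_hours",
]

cph = CoxPHFitter()
cph.fit(df[columns], duration_col="graft_survival_days", event_col="graft_loss")
cph.print_summary()  # exp(coef) column gives hazard ratios with 95% CIs
```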
Project description: Importance: In the US, live donor (LD) kidney transplant rates have decreased in pediatric recipients. Pediatric patients with kidney failure will likely need more than 1 kidney transplant during their lifetime, but the optimal sequence of transplants (ie, deceased donor [DD] followed by LD or vice versa) is not known. Objective: To determine whether pediatric recipients should first receive a DD allograft followed by an LD allograft (DD-LD sequence) or an LD allograft followed by a DD allograft (LD-DD sequence). Design, setting, and participants: This decision analytical model examined US pediatric patients with kidney failure included in the US Renal Data System 2019 Report who were waiting for a kidney transplant, received a transplant, or experienced graft failure. Interventions: Kidney transplant sequences of LD-DD vs DD-LD. Main outcomes and measures: Difference in projected life-years between the 2 sequence options. Results: Among patients included in the analysis, the LD-DD sequence provided more net life-years in those 5 years of age (1.82 [95% CI, 0.87-2.77]) and 20 years of age (2.23 [95% CI, 1.31-3.15]) compared with the DD-LD sequence. The net outcomes in patients 10 years of age (0.36 [95% CI, -0.51 to 1.23] additional life-years) and 15 years of age (0.64 [95% CI, -0.15 to 1.39] additional life-years) were not significantly different. However, for those aged 10 years, an LD-DD sequence was favored if eligibility for a second transplant was low (2.09 [95% CI, 1.20-2.98] additional life-years) or if the LD was no longer available (2.32 [95% CI, 1.52-3.12] additional life-years). For those aged 15 years, the LD-DD sequence was favored if eligibility for a second transplant was low (1.84 [95% CI, 0.96-2.72] additional life-years) or if the LD was no longer available (2.49 [95% CI, 1.77-3.27] additional life-years). Access to multiple DD transplants did not compensate for missing the LD opportunity. Conclusions and relevance: These findings suggest that the decreased use of LD kidney transplants in pediatric recipients during the past 2 decades should be scrutinized. Given the uncertainty of future recipient eligibility for retransplant and the future availability of an LD transplant, the LD-DD sequence is likely the better option. This strategy of an LD transplant first would not only benefit pediatric recipients but also allow DD kidneys to be used by others who do not have an LD option.
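Decision analytical models of this kind compare strategies by chaining survival estimates and transition probabilities through successive states (first graft, graft failure, second transplant). The toy calculation below illustrates only the arithmetic behind the paper's intuition; every number in it is an invented assumption, not a US Renal Data System estimate.

```python
# Toy expected-life-years comparison of LD-DD vs DD-LD transplant sequences.
# All probabilities and graft survival times are illustrative assumptions.

def expected_life_years(first_graft, second_graft,
                        p_eligible_second, p_donor_available=1.0):
    """First graft's years plus the second graft's years, discounted by the
    chance the recipient is still eligible and the donor is still available."""
    return first_graft + p_eligible_second * p_donor_available * second_graft

LD_YEARS, DD_YEARS = 15.0, 10.0   # assumed mean graft survival (illustrative)
P_ELIGIBLE_SECOND = 0.8           # chance of remaining transplant-eligible
P_LD_AVAILABLE_LATER = 0.6        # chance the live donor is available years later

ld_dd = expected_life_years(LD_YEARS, DD_YEARS, P_ELIGIBLE_SECOND)
dd_ld = expected_life_years(DD_YEARS, LD_YEARS, P_ELIGIBLE_SECOND,
                            p_donor_available=P_LD_AVAILABLE_LATER)
print(f"LD-DD: {ld_dd:.1f} life-years; DD-LD: {dd_ld:.1f} life-years")
# Under these assumptions LD-DD wins (23.0 vs 17.2): deferring the live donor
# risks losing that donor entirely by the time a second graft is needed.
```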
Project description: Children receive priority in the allocation of deceased donor kidneys for transplantation in the United States, but because allocation begins locally, geographic differences in population and organ supply may produce variation in pediatric access to transplantation. We assembled a cohort of 3764 individual listings for pediatric kidney transplantation in 2005-2010. For each donor service area, we assigned a category of short (<180 days), medium (181-270 days), or long (>270 days) median waiting time and calculated the ratio of pediatric-quality kidneys to pediatric candidates and the percentage of these kidneys locally diverted to adults. We used multivariable Cox regression analyses to examine the association between donor service area characteristics and time to deceased donor kidney transplantation. The Kaplan-Meier estimate of median waiting time to transplantation was 284 days (95% confidence interval, 263 to 300 days) and varied from 14 to 1313 days across donor service areas. Overall, 29% of pediatric-quality kidneys were locally diverted to adults. Compared with areas with short waiting times, areas with long waiting times had a lower ratio of pediatric-quality kidneys to candidates (3.1 versus 5.9; P<0.001) and more diversions to adults (31% versus 27%; P<0.001). In multivariable regression, a lower kidney-to-candidate ratio remained associated with longer waiting time (hazard ratio, 0.56 for areas with <2:1 versus reference areas with ≥5:1 kidneys/candidates; P<0.01). Large geographic variation in waiting time for pediatric deceased donor kidney transplantation exists and is highly associated with local supply and demand factors. Future organ allocation policy should address this geographic inequity.
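The median waiting time reported here is a Kaplan-Meier estimate, which treats children still waiting at the end of follow-up as censored rather than dropping them. A minimal sketch of that estimator, and of the per-area medians behind the geographic comparison, using lifelines (all file and column names are hypothetical placeholders):

```python
# Kaplan-Meier median waiting time to deceased donor kidney transplant.
# Candidates not yet transplanted are treated as censored observations.
# File/column names are hypothetical placeholders.
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("pediatric_waitlist.csv")

kmf = KaplanMeierFitter()
kmf.fit(durations=df["days_waiting"],
        event_observed=df["transplanted"])  # 1 = transplanted, 0 = censored
print("Overall median waiting time (days):", kmf.median_survival_time_)

# Per-donor-service-area medians expose the geographic variation noted above.
for dsa, grp in df.groupby("donor_service_area"):
    k = KaplanMeierFitter().fit(grp["days_waiting"], grp["transplanted"])
    print(dsa, k.median_survival_time_)
```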
Project description: Introduction: Kidney transplantation (KT) remains the treatment of choice for end-stage kidney disease (ESKD), but access to transplantation is limited by a disparity between the supply of and demand for suitable organs. This organ shortfall has resulted in the use of a wider range of donor kidneys and, in parallel, a reexamination of potential alternative renal replacement therapies. Previous studies comparing Canadian intensive home hemodialysis (IHHD) with deceased donor (DD) KT in the United States reported similar survival, suggesting IHHD might be a plausible alternative. Methods: Using data from the Scientific Registry of Transplant Recipients and an experienced US-based IHHD program in Lynchburg, VA, we retrospectively compared mortality outcomes of a cohort of IHHD patients with those of transplant recipients within the same geographic region between October 1997 and June 2014. Results: We identified 3073 transplant recipients and 116 IHHD patients. Living donor KT (n = 1212) had the highest survival, with a 47% reduction in risk of death compared with IHHD (hazard ratio [HR]: 0.53; 95% confidence interval [CI]: 0.34-0.83). Survival of IHHD patients did not differ statistically from that of DD transplant recipients (n = 1834) in adjusted analyses (HR: 0.96; 95% CI: 0.62-1.48) or when exclusively compared with marginal (Kidney Donor Profile Index >85%) transplant recipients (HR: 1.35; 95% CI: 0.84-2.16). Conclusion: Our study showed comparable overall survival between IHHD and DD KT. For appropriate patients, IHHD could serve as a bridging therapy to transplant and a tenable long-term renal replacement therapy.
Project description: Deceased donor kidneys with acute kidney injury (AKI) are often discarded due to fear of poor outcomes. We performed a multicenter study to determine associations of AKI (the rise from admission to terminal serum creatinine, graded by AKI Network stage) with kidney discard, delayed graft function (DGF) and 6-month estimated glomerular filtration rate (eGFR). In 1632 donors, kidney discard risk increased for AKI stages 1, 2 and 3 (compared with no AKI), with adjusted relative risks of 1.28 (1.08-1.52), 1.82 (1.45-2.30) and 2.74 (2.0-3.75), respectively. The adjusted relative risk for DGF also increased by donor AKI stage: 1.27 (1.09-1.49), 1.70 (1.37-2.12) and 2.25 (1.74-2.91), respectively. Six-month eGFR, however, was similar across AKI categories but was lower for recipients with DGF (48 [interquartile range: 31-61] vs. 58 [45-75] mL/min/1.73 m² for no DGF, p < 0.001). There was a significant favorable interaction between donor AKI and DGF, such that 6-month eGFR was progressively better for DGF kidneys with increasing donor AKI (46 [29-60], 49 [32-64], 52 [36-59] and 58 [39-71] mL/min/1.73 m² for no AKI and stages 1, 2 and 3, respectively; interaction p = 0.05). Donor AKI is associated with kidney discard and DGF, but given acceptable 6-month allograft function, clinicians should consider cautious expansion into this donor pool.
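Adjusted relative risks for binary outcomes such as discard or DGF are often estimated with a modified Poisson regression, that is, a Poisson GLM with robust (sandwich) standard errors. The sketch below shows that general technique with statsmodels; the variable names are hypothetical, and the published study may have used a different implementation.

```python
# Modified Poisson regression: adjusted relative risk of DGF by donor AKI stage.
# Variable names are hypothetical placeholders for the cohort described above.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("donor_recipient_pairs.csv")

model = smf.glm(
    "dgf ~ C(donor_aki_stage) + donor_age + recipient_age + cold_ischemia_hours",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="HC1")  # robust errors make Poisson RRs valid for binary outcomes

# Exponentiated coefficients are adjusted relative risks vs. the no-AKI reference.
print(np.exp(model.params))
print(np.exp(model.conf_int()))
```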
Project description: Background: In kidney transplantation, nonimmunologic donor-recipient (D-R) pairing is generally not given the same consideration as immunologic matching. The aim of this study was to determine how nonimmunologic D-R pairing relates to independent donor and recipient factors, and to immunologic HLA match, for predicting graft loss. Methods: Seven D-R pairings (race, sex, age, weight, height, cytomegalovirus serostatus, and HLA match) were assessed for their association with the composite outcome of death or kidney graft loss using a Cox regression-based forward stepwise selection model. The best model for predicting graft loss (including nonimmunologic D-R pairings, independent D-R factors, and/or HLA match status) was determined using the Akaike Information Criterion. Results: In total, 23,262 (29.9%) people in the derivation data set and 9,892 (29.7%) in the validation data set developed the composite outcome of death or graft loss. A model that included both independent and D-R pairing variables best predicted graft loss. The c-indices for the derivation and validation models were 0.626 and 0.629, respectively. Size mismatch (MM) between donor and recipient (>30 kg [D < R] and >15 cm [D < R]) was associated with poor patient and graft survival even with 0 HLA MM, and conversely, an optimal D-R size pairing mitigated the risk of graft loss seen with 6 HLA MM. Conclusions: D-R pairing is valuable in predicting patient and graft outcomes after kidney transplant. D-R size matching could offset the benefit and harm seen with 0 and 6 HLA MM, respectively. This is a novel finding.
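Forward stepwise selection under the Akaike Information Criterion adds, at each round, whichever candidate variable most lowers the model's AIC and stops when no addition improves it. Here is a minimal sketch of that loop around a lifelines Cox model; the candidate pairing variables are hypothetical placeholders, and the study's own implementation is not shown in the abstract.

```python
# Forward stepwise Cox model selection by AIC.
# Candidate donor-recipient pairing variables are hypothetical placeholders.
import pandas as pd
from lifelines import CoxPHFitter

def cox_aic(df, covariates):
    """Fit a Cox model on the given covariates and return its AIC (-2 logL + 2k)."""
    cph = CoxPHFitter().fit(
        df[["time_to_event", "event"] + covariates],
        duration_col="time_to_event", event_col="event",
    )
    return -2.0 * cph.log_likelihood_ + 2.0 * len(covariates)

def forward_select(df, candidates):
    selected, best_aic = [], float("inf")
    candidates = list(candidates)
    while candidates:
        scores = {c: cox_aic(df, selected + [c]) for c in candidates}
        best = min(scores, key=scores.get)
        if scores[best] >= best_aic:   # no candidate improves the model; stop
            break
        best_aic = scores[best]
        selected.append(candidates.pop(candidates.index(best)))
    return selected, best_aic

df = pd.read_csv("dr_pairings.csv")
pairings = ["race_mm", "sex_mm", "age_diff", "weight_diff",
            "height_diff", "cmv_mm", "hla_mm"]
print(forward_select(df, pairings))
```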
Project description: Deceased donor kidneys with AKI are often discarded for fear of poor transplant outcomes. Donor biomarkers that predict post-transplant renal recovery could improve organ selection and reduce discard. We tested whether higher levels of donor urinary YKL-40, a repair-phase protein, associate with improved recipient outcomes in a prospective cohort study involving deceased kidney donors from five organ procurement organizations. We measured urinary YKL-40 concentration in 1301 donors (111 had AKI, defined as doubling of serum creatinine) and ascertained outcomes in the corresponding 2435 recipients, 756 of whom experienced delayed graft function (DGF). Donors with AKI had higher urinary YKL-40 concentrations (P<0.001) and more acute tubular necrosis on procurement biopsies (P=0.05). In fully adjusted analyses, elevated donor urinary YKL-40 concentration associated with reduced risk of DGF in both recipients of AKI donor kidneys (adjusted relative risk, 0.51 [95% confidence interval (95% CI), 0.32 to 0.80] for highest versus lowest YKL-40 tertile) and recipients of non-AKI donor kidneys (adjusted relative risk, 0.79 [95% CI, 0.65 to 0.97]). Furthermore, in the event of DGF, elevated donor urinary YKL-40 concentration associated with higher 6-month eGFR (6.75 [95% CI, 1.49 to 12.02] mL/min per 1.73 m²) and lower risk of graft failure (adjusted hazard ratio, 0.50 [95% CI, 0.27 to 0.94]). These findings suggest that YKL-40 is produced in response to tubular injury and is independently associated with recovery from AKI and DGF. If ultimately validated as a prognostic biomarker, urinary YKL-40 should be considered in determining the suitability of donor kidneys for transplant.
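The tertile comparison here amounts to cutting the donor biomarker distribution into thirds and contrasting the top third with the bottom. A brief sketch of that construction and of the graft-failure hazard model, with pandas and lifelines (column names hypothetical; this is not the study's code):

```python
# Donor urinary YKL-40 tertiles and graft failure: a minimal sketch.
# File/column names are hypothetical placeholders.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("donor_biomarkers.csv")

# Cut donor YKL-40 into tertiles: 0 = lowest, 1 = middle, 2 = highest.
df["ykl40_tertile"] = pd.qcut(df["donor_urine_ykl40"], q=3, labels=False)

# Indicator columns for the middle and highest tertile (lowest = reference).
dummies = pd.get_dummies(df["ykl40_tertile"], prefix="tertile",
                         drop_first=True).astype(int)
model_df = pd.concat(
    [df[["graft_survival_days", "graft_failure", "donor_age", "kdpi"]], dummies],
    axis=1,
)

cph = CoxPHFitter().fit(model_df, duration_col="graft_survival_days",
                        event_col="graft_failure")
cph.print_summary()  # exp(coef) for tertile_2 ~ HR, highest vs lowest tertile
```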
Project description: Dramatic improvements have been seen in short-term kidney allograft survival over recent decades with the introduction of more potent immunosuppressant medications and regimens. Unfortunately, improvements in long-term graft survival have lagged behind. The genomics revolution is providing new insights into the potential impact of kidney donor genotypes on long-term graft survival. Variants in the donor apolipoprotein L1 (APOL1) gene, the caveolin 1 (CAV1) gene, and the multidrug resistance 1 gene encoding P-glycoprotein (ABCB1) are all associated with graft survival after kidney transplantation. Although the precise mechanisms whereby these donor gene variants confer risk for graft loss have yet to be determined, these findings provide novel opportunities for modifying interactive environmental factors and optimizing kidney allocation, with the ultimate goal of improving long-term graft survival rates.