Project description:Living-donor lobar lung transplantation (LDLLT) has become an important life-saving option for patients with severe respiratory disorders since it was developed by a group at the University of Southern California in 1993 and introduced in Japan in 1998 to address the severe shortage of brain-dead donor organs. Although LDLLT candidates have essentially been limited to critically ill patients requiring hospitalization, long-term steroid use, and/or mechanical respiratory support before transplantation, LDLLT has provided good post-transplant outcomes, comparable to those of brain-dead donor lung transplantation in both the early and late phases. At Kyoto University, the 5- and 10-year survival rates after LDLLT were reported to be 79.0% and 64.6%, respectively. LDLLT should be performed under appropriate circumstances, considering the inherent risk to the living donor. In our transplant program, all living donors returned to their previous social lives without any major complications, and living-donor surgery was associated with a morbidity rate of <15%. Both functional and anatomical size matching were performed preoperatively between the living-donor lobar grafts and recipients; precise preoperative size matching can provide favorable pulmonary function and exercise capacity after LDLLT. Various transplant procedures have recently been developed in LDLLT to deal with graft size mismatching in recipients, with favorable post-transplant outcomes: native upper lobe-sparing and/or right-to-left inverted transplantation have been performed for undersized grafts, while single-lobe transplantation has been employed with or without contralateral pneumonectomy and/or delayed chest closure for oversized grafts.
Project description:Background: Donor-to-recipient lung size matching at lung transplantation (LTx) can be estimated by the predicted total lung capacity (pTLC) ratio (donor pTLC/recipient pTLC). We aimed to determine whether the pTLC ratio is associated with the risk of primary graft dysfunction (PGD) after bilateral LTx (BLT). Methods: We calculated the pTLC ratio for 812 adult BLTs from the Lung Transplant Outcomes Group between March 2002 and December 2010. Patients were stratified by pTLC ratio >1.0 ("oversized") and pTLC ratio ≤1.0 ("undersized"). PGD was defined as any ISHLT Grade 3 PGD (PGD3) within 72 hours of reperfusion. We analyzed the association between risk factors and PGD using multivariable conditional logistic regression. As transplant diagnoses can influence size-matching decisions and also modulate the risk for PGD, we performed pre-specified analyses assessing the impact of lung size mismatch within diagnostic categories. Results: In univariate analyses, oversizing was associated with 39% lower odds of PGD3 (OR 0.61, 95% CI 0.45-0.85, p = 0.003). In a multivariate model accounting for center effects and known PGD risks, oversizing remained independently associated with decreased odds of PGD3 (OR 0.58, 95% CI 0.38-0.88, p = 0.01). The risk-adjusted point estimate was similar for the non-COPD diagnosis groups (OR 0.52, 95% CI 0.32-0.86, p = 0.01); however, no association was detected within the COPD group (OR 0.72, 95% CI 0.29-1.78, p = 0.5). Conclusion: Oversized allografts are associated with a decreased risk of PGD3 after BLT; this effect appears most apparent in non-COPD patients.
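The stratification described above reduces to a simple ratio cutoff. A minimal sketch in Python (the >1.0 threshold is from the study; the function names and example pTLC values are illustrative):

```python
def ptlc_ratio(donor_ptlc: float, recipient_ptlc: float) -> float:
    """Donor-to-recipient predicted total lung capacity (pTLC) ratio."""
    return donor_ptlc / recipient_ptlc


def size_category(ratio: float) -> str:
    """Stratification used in the study: >1.0 'oversized', <=1.0 'undersized'."""
    return "oversized" if ratio > 1.0 else "undersized"


# Illustrative values (liters): a donor pTLC of 6.5 L vs a recipient pTLC of 6.0 L
print(size_category(ptlc_ratio(6.5, 6.0)))  # oversized
print(size_category(ptlc_ratio(5.5, 6.0)))  # undersized
```

Note that a ratio of exactly 1.0 falls into the "undersized" group, matching the study's ≤1.0 definition.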
Project description:Donor-derived cell-free DNA (dd-cf-DNA) has been shown to be an informative biomarker of rejection after lung transplantation (LT) from deceased donors. However, in living-donor lobar LT, because small grafts from blood relatives are implanted with short ischemic times, the detection of dd-cf-DNA might be challenging. Our study was aimed at examining the role of dd-cf-DNA measurement in the diagnosis of primary graft dysfunction and acute rejection early after living-donor lobar LT. Immediately after LT, a marked increase in plasma dd-cf-DNA levels was noted, with the levels subsequently reaching a plateau with the resolution of primary graft dysfunction. Increased plasma levels of dd-cf-DNA were significantly correlated with decreased oxygenation immediately (p = 0.022) and at 72 hours (p = 0.046) after LT. Significantly higher plasma dd-cf-DNA levels were observed in patients with acute rejection (median, 12.0%) than in those with infection (median, 4.2%) (p = 0.028) or in a stable condition (median, 1.1%) (p = 0.001). Thus, measurement of plasma dd-cf-DNA levels might be useful to monitor the severity of primary graft dysfunction, and plasma dd-cf-DNA could be a potential biomarker for the diagnosis of acute rejection after LT.
Project description:Background and objectives: Small donor and/or kidney size relative to recipient size is associated with a higher risk of kidney allograft failure. Donor and recipient ages are associated with graft survival and may modulate the relationship between size mismatch and the latter. The aim of this study was to determine whether the association between donor-recipient size mismatch and graft survival differs by donor and recipient age. Design, setting, participants, & measurements: We performed a retrospective cohort study of first adult deceased donor kidney transplantations performed between 2000 and 2018 recorded in the Scientific Registry of Transplant Recipients. We used multivariable Cox proportional hazards models to assess the association between donor-recipient body surface area ratio and death-censored graft survival, defined as return to dialysis or retransplantation. We considered interactions between the donor-recipient body surface area ratio and each of recipient and donor age. Results: Among the 136,321 kidney transplant recipients included in this study, 23,614 (17%) experienced death-censored graft loss over a median follow-up of 4.3 years (interquartile range, 1.9-7.8 years). The three-way donor-recipient body surface area ratio by donor age by recipient age interaction was statistically significant (P=0.04). The magnitude of the association between severe size mismatch (donor-recipient body surface area ratio <0.80 versus ≥1.00) and death-censored graft survival was stronger with older donor and recipient age. In all recipient age categories except the youngest (18-30 years), 5- and 10-year graft survival rates were similar or better with a size-mismatched donor aged <40 years than with a non-size-mismatched donor aged 40 years or older. Conclusions: The association of donor-recipient size mismatch with long-term graft survival is modulated by recipient and donor age; donor age was more strongly associated with graft survival than size mismatch. Size-mismatched kidneys yield excellent graft survival when the donor is young.
Project description:Living-donor lobar lung transplantation (LDLLT) was first performed in the USA and was introduced in Japan in 1998 as an alternative to brain-dead donor lung transplantation (BDLT). Although the LDLLT procedure was employed for rapidly deteriorating patients who were hospitalized and mechanically ventilated at the time of transplantation, LDLLT demonstrated better or comparable post-transplant outcomes in comparison to BDLT. Less injured lobar grafts and a significantly shorter graft ischemic time possibly contributed to the significantly lower incidence of severe primary graft dysfunction (PGD) after LDLLT in comparison to BDLT. In standard LDLLT, patients obtain lobar grafts from two different donors, and thus most patients develop chronic lung allograft dysfunction (CLAD) in only one of the two lung grafts. This indicates that the unaffected contralateral lung graft can preserve lung function after the unilateral development of CLAD. In our transplant program, the incidence of CLAD per donor in LDLLT (14.4%) was also significantly lower in comparison to BDLT (24.7%). The 1-, 5- and 10-year survival rates after LDLLT were 90.9%, 75.5% and 57.2%, respectively, equivalent to those after BDLT (92.9%, 73.4% and 62.2%). The inherent surgical risk to the living donors should always be considered. In our experience, living-donor surgery was associated with a complication rate of 12.7%, and importantly, all living donors ultimately returned to their previous social lives. Precise functional and anatomical size matching between the donor lobar graft and recipient can provide favorable pulmonary function after LDLLT. We recently established multimodal surgical approaches, such as native upper lobe-sparing, right-to-left horizontally rotated, segmental, and single-lobe transplantation, in order to resolve the issue of size mismatch between the donor lobar graft and the recipient.
Project description:Background: MicroRNAs (miRNAs) involved in the pathogenesis of pulmonary fibrosis have been shown to be associated with the development of chronic lung allograft dysfunction (CLAD) after lung transplantation (LT). We investigated the role of circulating miRNAs in the diagnosis of CLAD after bilateral LT, including cadaveric LT (CLT) and living-donor lobar LT (LDLLT). Methods: The subjects of this retrospective study were 37 recipients of bilateral CLT (n = 23) or LDLLT (n = 14), divided into a non-CLAD group (n = 24) and a CLAD group (n = 13). The plasma miRNA levels of the two groups were compared, and correlations between miRNA levels and percent baseline forced expiratory volume in 1 s (FEV1), forced vital capacity (FVC), and total lung capacity (TLC) values were calculated from one year before to one year after the diagnosis of CLAD. Results: The plasma levels of both miR-21 and miR-155 at the time of the diagnosis of CLAD were significantly higher in the CLAD group than in the non-CLAD group (miR-21, P = 0.0013; miR-155, P = 0.042). The miR-21 levels were significantly correlated with the percent baseline FEV1, FVC, and TLC values one year before and at the time of diagnosis of CLAD (P < 0.05). A receiver operating characteristic curve analysis of the performance of miR-21 levels in the diagnosis of CLAD yielded an area under the curve of 0.89. Conclusion: Circulating miR-21 appears to be of potential value in diagnosing CLAD after bilateral LT.
Project description:Background: Liver transplantation (LT) outcomes are influenced by donor-recipient size mismatch. This study re-evaluated the impact of graft size discrepancies on survival outcomes. Methods: Data from 53,389 adult LT recipients from the United Network for Organ Sharing database (2013-2022) were reviewed. The study population was divided by the body surface area index (BSAi), defined as the ratio of donor body surface area (BSA) to recipient BSA, into small-for-size (BSAi < 0.78), normal-for-size (BSAi 0.78-1.24), and large-for-size (BSAi > 1.24) grafts in deceased donor LT (SFSD, NFSD, and LFSD, respectively). Multivariate Cox regression and Kaplan-Meier survival analyses were conducted. Results: The frequency of size mismatch in deceased donor LT increased over the past 10 years. SFSD had significantly worse 90-day graft survival (P < 0.01), and LFSD had inferior 1-year graft survival among 90-day survivors (P = 0.01). SFSD was hazardous within 90 days post-LT because of vascular complications. Beyond 1 year, graft size did not affect graft survival. The LFSD risk within the first year was mitigated with lower model for end-stage liver disease (MELD) 3.0 scores (<35) or shorter cold ischemia time (<8 h). Conclusions: The negative impact of donor-recipient size mismatch on survival outcomes is confined to the first year post-LT. SFSD is associated with a slight decrease in 90-day survival rates. LFSD grafts should be utilized more frequently by minimizing cold ischemia time to <8 h, particularly in patients with MELD 3.0 scores below 35. These findings could improve donor-recipient matching and enhance LT outcomes.
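The BSAi grouping above can be sketched in a few lines. The cutoffs (<0.78, >1.24) are from the study; the Du Bois formula shown for estimating BSA from height and weight is a common choice included only for illustration, and the function names are hypothetical:

```python
def bsa_du_bois(height_cm: float, weight_kg: float) -> float:
    """Estimate body surface area (m^2) with the Du Bois formula.

    Illustrative choice only; the registry analysis works from the
    recorded donor and recipient BSA values themselves.
    """
    return 0.007184 * height_cm ** 0.725 * weight_kg ** 0.425


def graft_size_group(donor_bsa: float, recipient_bsa: float) -> str:
    """Classify by BSAi = donor BSA / recipient BSA using the study cutoffs:
    <0.78 small-for-size (SFSD), >1.24 large-for-size (LFSD), else NFSD."""
    bsai = donor_bsa / recipient_bsa
    if bsai < 0.78:
        return "SFSD"
    if bsai > 1.24:
        return "LFSD"
    return "NFSD"


# Illustrative pairing: a 160 cm / 55 kg donor for a 185 cm / 95 kg recipient
donor = bsa_du_bois(160, 55)
recipient = bsa_du_bois(185, 95)
print(graft_size_group(donor, recipient))
```

The boundary values 0.78 and 1.24 themselves fall into the normal-for-size group, matching the study's 0.78-1.24 NFSD range.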
Project description:This study evaluated the combined effect of recipient-to-donor weight and sex mismatch after deceased-donor renal transplantation in a German transplant cohort, as well as the evolution of the recipient-to-donor weight difference over a 13-year observation period. The association of absolute weight and sex difference with graft failure was explored in an outpatient cohort of deceased-donor transplant recipients who underwent kidney transplantation between 2000 and 2012. Graft failure was defined as repeated need for dialysis or death with a functioning graft. Recipient and donor sex pairings were classified as sex concordant (male donor-male recipient or female donor-female recipient; MDMR/FDFR) or discordant (male donor-female recipient or female donor-male recipient; MDFR/FDMR). These classes were further stratified into four groups according to a recipient-to-donor weight mismatch of ≥10 kg (recipient > donor) or <10 kg. Multivariable Cox proportional hazards models were applied to evaluate the time to graft loss, adjusting for donor, immunologic, surgical, organizational, and recipient predictors. Sex-concordant transplant pairings with <10 kg weight difference served as the reference group. Among 826 transplant recipients, 154 (18.6%) developed graft failure. Median graft survival time was 3.9 years; first quartile (0.2-1.2), second quartile (1.2-2.9), third quartile (2.9-5.8), and fourth quartile (5.8-12.4). After multivariable adjustment, the highest relative hazard for graft failure was observed for sex-discordant transplant pairings with a ≥10 kg weight difference between recipient and donor (compared to the reference group of MDMR/FDFR with weight difference <10 kg: MDMR/FDFR with weight difference ≥10 kg, hazard ratio 1.86, 95% confidence interval 1.07-3.32, p = 0.029; MDFR/FDMR with weight difference <10 kg, hazard ratio 1.14, 95% confidence interval 0.78-1.68, p = 0.507; and MDFR/FDMR with weight difference ≥10 kg, hazard ratio 2.00, 95% confidence interval 1.15-3.48, p = 0.014).
A recipient-to-donor weight mismatch of ≥10 kg was associated with an increased risk of graft loss or recipient death with a functioning graft. Concurrent sex discordance seemed to enhance this effect as indicated by an increase in the hazard ratio. We detected no significant tendency for increasing recipient-to-donor weight differences from 2000 to 2012.
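The four-group classification used above (sex concordance crossed with a 10 kg recipient-minus-donor weight cutoff) reduces to two boolean checks. A minimal sketch, with illustrative function and label names:

```python
def mismatch_group(recipient_sex: str, donor_sex: str,
                   recipient_wt_kg: float, donor_wt_kg: float) -> str:
    """Classify a donor-recipient pairing as sex-concordant or -discordant,
    crossed with a recipient-minus-donor weight difference of >=10 kg vs <10 kg
    (group labels are illustrative, cutoffs are from the study)."""
    sex = "concordant" if recipient_sex == donor_sex else "discordant"
    weight = ">=10kg" if (recipient_wt_kg - donor_wt_kg) >= 10 else "<10kg"
    return f"{sex}/{weight}"


# Illustrative pairings
print(mismatch_group("M", "M", 90, 75))  # concordant/>=10kg
print(mismatch_group("F", "M", 70, 72))  # discordant/<10kg
```

The "discordant/>=10kg" group is the one carrying the highest adjusted hazard in the study (hazard ratio 2.00).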
Project description:The introduction of living donor liver transplantation (LDLT) has been one of the most remarkable steps in the field of liver transplantation (LT). First introduced for children in 1989, its adoption for adults followed only 10 years later. As the demand for LT continues to increase, LDLT provides life-saving therapy for many patients who would otherwise die awaiting a cadaveric organ. In recent years, LDLT has been shown to be a clinically safe addition to deceased donor liver transplantation (DDLT) and has significantly extended the scarce donor pool. As long as the donor shortage persists, LDLT will play an important role in the future of LT.
Project description:Background: This study assesses the impact of human leukocyte antigen (HLA)-DR mismatch and donor estimated glomerular filtration rate (eGFR) on outcomes of living donor kidney transplantation (LDKT), which are especially relevant given the availability of multiple donors and paired kidney exchanges. Methods: Using data from the Scientific Registry of Transplant Recipients (SRTR), we retrospectively analyzed graft survival in adult LDKT recipients transplanted between January 2013 and September 2022. Recipients with 0 HLA-DR mismatches were compared to those with 1-2 HLA-DR mismatches. Cox models assessed the association between donor eGFR and graft and patient survival, stratifying by (a) HLA-DR mismatches and (b) HLA-DR mismatches and recipient age. Results: Among 44,080 recipients, 7,195 had 0 HLA-DR mismatches and 36,885 had 1-2 HLA-DR mismatches. The recipients' mean age was 49.1 years in the 0 HLA-DR mismatch group and 50.4 years in the 1-2 HLA-DR mismatch group. The donors' mean age was 43.1 and 43.8 years, with a mean eGFR of 101.0 and 99.9 ml/min, respectively. Higher donor eGFR was associated with better graft survival. Stratified analyses showed that higher donor eGFR reduced the risk of graft loss in cases with HLA-DR mismatch (p < 0.001) but not in cases without HLA-DR mismatch (p = 0.81). This effect was significant for recipients aged 18-39 and over 60. Similar results were observed for patient survival. Conclusions: Higher donor eGFR was associated with lower risks of graft loss and patient death in the HLA-DR mismatch group but not in the 0 HLA-DR mismatch group. These results emphasize the importance of considering both HLA-DR matching and donor kidney function, particularly for younger recipients, to avoid sensitization for future transplants.