Trajectories and Predictors of Allograft Dysfunction after Renal Transplantation in Children

2016
Vol 45 (1)
pp. 63-68
Author(s):  
Vandréa Carla de Souza ◽  
Muriel Rabilloud ◽  
Pierre Cochat ◽  
Mario Bernardes Wagner ◽  
Clotilde Druck Garcia ◽  
...  

Background: Survival rates among pediatric renal transplant recipients are rising, but ensuring optimal renal function throughout these children's growing years remains important. The number of functioning nephrons and the graft's ability to adapt to the increasing demand of body growth appear to be the most important determinants of long-term allograft function. This study examined the long-term change in the glomerular filtration rate in a pediatric kidney transplant cohort and the importance of recipient and donor ages in predicting transplant outcome. Methods: Data on 67 renal transplant children who underwent 278 inulin-clearance measurements between 2000 and 2010 were examined. A longitudinal latent class model was used to identify renal function trajectories and classify the children. Results: The model identified 3 trajectories of renal allograft function after pediatric kidney transplantation: ‘low and decreasing’, ‘moderate and stable’, and ‘high and sharply decreasing’. The probability of belonging to the ‘low and decreasing’ trajectory, i.e. the poorer outcome, was lower in recipients of grafts from living versus deceased donors (adjusted OR (aOR) 0.02; p = 0.03). This probability increased with recipient age (aOR 1.20 per year of recipient age; p = 0.07) and with donor-recipient age difference (aOR 1.13 per additional year; p = 0.07). Conclusion: This study suggests that donation from living donors and from younger donors are favorable factors for long-term allograft function.

2020
Vol 35 (Supplement_3)
Author(s):  
Hatem Kaies Ibrahim Elsayed Ali ◽  
Ahmed Daoud ◽  
Karim Soliman ◽  
Mahmoud Mohamed ◽  
Asam Murtaza

Abstract Background and Aims A high donor-recipient age gap among deceased-donor renal transplant recipients leads to worse outcomes; however, the impact of this gap among live-donor renal transplants is unclear. The aim of this study was to assess the effect of the donor-recipient age gap on graft survival and acute rejection rates among renal transplants in the tacrolimus era. Method A total of 14,390 live-donor renal transplant patients who received a single organ transplant, had no previous renal transplant, were discharged on tacrolimus-based immunosuppression and were registered in the Organ Procurement and Transplantation Network from January 2000 to June 2017 were included in the study. Donor-recipient age difference was divided into 5 groups: group A (difference <−10 years, n = 4,375), group B (difference −10 to 10 years, n = 7,229), group C (difference 10 to 20 years, n = 861), group D (difference 20 to 29 years, n = 1,406) and group E (difference ≥30 years, n = 519). Poisson regression analysis was used to assess the effect of the age gap on acute rejection rates; Kaplan-Meier survival curves and Cox hazard regression analysis were used to assess its effect on graft survival. Results Regarding graft survival, the groups with age differences ≥30 years and 20 to 29 years showed a significantly higher risk of graft loss than the group with an age difference <−10 years (HR 4.6 and 3.8, respectively). The groups with age differences of 10 to 20 years and −10 to 10 years showed no significant difference in graft survival compared with the same reference group (HR 1.03 and 0.95, respectively). Groups B, C, D and E were not associated with an increased risk of acute rejection episodes compared with group A (IRR = 1.001, 1.001, 1.022 and 1.027, respectively). Conclusion A donor-recipient age difference of up to 20 years yields renal transplant outcomes similar to those of kidneys from younger donors; such donors should therefore not be excluded from paired kidney donation programs. A donor-recipient age difference above 20 years is associated with worse graft survival.
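The graft-survival comparison above rests on the Kaplan-Meier (product-limit) estimator. As a minimal sketch of how that estimate is computed (toy follow-up data for illustration, not the registry cohort):

```python
def kaplan_meier(times, events):
    """Product-limit estimate of the survival function.

    times  -- follow-up time for each subject
    events -- 1 if graft loss observed, 0 if censored
    Returns a list of (time, S(t)) pairs at each event time.
    """
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    s = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = removed = 0
        # Pool all subjects sharing this follow-up time
        while i < len(pairs) and pairs[i][0] == t:
            deaths += pairs[i][1]
            removed += 1
            i += 1
        if deaths:                    # the curve only drops at event times
            s *= 1.0 - deaths / n_at_risk
            curve.append((t, s))
        n_at_risk -= removed          # censored subjects leave the risk set
    return curve

# Toy data: 5 grafts, losses at t = 1, 2, 3; censoring at t = 2 and 4
print(kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0]))
# -> survival drops to 0.8, then ~0.6, then ~0.3
```

In practice this is done with a survival-analysis package; the between-group comparison then comes from a log-rank test or, as in the study, a Cox regression.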


2013
Vol 27 (6)
pp. 838-843
Author(s):  
Ioannis D. Kostakis ◽  
Demetrios N. Moris ◽  
Alexandros Barlas ◽  
Ioannis Bokos ◽  
Maria Darema ◽  
...  

1991
Vol 4 (2)
pp. 88-91
Author(s):  
Peter Donnelly ◽  
Peter Veitch ◽  
Peter Bell ◽  
Robin Henderson ◽  
Paul Oman ◽  
...  

2020
Vol 35 (3)
pp. 512-519
Author(s):  
Manuela Yepes-Calderón ◽  
Camilo G Sotomayor ◽  
Rijk O B Gans ◽  
Stefan P Berger ◽  
Henri G D Leuvenink ◽  
...  

Abstract Background In renal transplant recipients (RTRs), cardiovascular mortality is the most common cause of long-term renal graft loss. Oxidative stress (OS) has been associated with cardiovascular disease and is known to be enhanced in RTRs. We aimed to prospectively investigate whether the concentration of the OS biomarker malondialdehyde (MDA) is associated with long-term risk of cardiovascular mortality in a large cohort of RTRs. Methods The plasma MDA concentration was measured using the thiobarbituric acid reaction assay in 604 extensively phenotyped RTRs with a functioning allograft for ≥1 year. The association between MDA and cardiovascular mortality was assessed using Cox proportional hazard regression analyses in the overall cohort and within subgroups according to significant effect modifiers. Results Median circulating MDA concentration at baseline was 5.38 [interquartile range (IQR) 4.31–6.45] μmol/L. During a follow-up period of 6.4 (IQR 5.6–6.8) years, 110 (18%) RTRs died, with 40% of deaths due to cardiovascular causes. MDA concentration was significantly associated with the risk for cardiovascular mortality {hazard ratio [HR] 1.31 [95% confidence interval (CI) 1.03–1.67] per 1-SD increment}, independent of adjustment for potential confounders, including renal function, immunosuppressive therapy, smoking status and blood pressure. The association between MDA concentration and the risk for cardiovascular mortality was stronger in RTRs with relatively lower plasma ascorbic acid concentrations [≤42.5 µmol/L; HR 1.79 (95% CI 1.30–2.48) per 1-SD increment] or relatively lower estimated glomerular filtration rates [≤45 mL/min/1.73 m2; HR 2.09 (95% CI 1.45–3.00) per 1-SD increment]. Conclusions Circulating MDA concentration is independently associated with long-term risk for cardiovascular mortality, particularly in RTRs with relatively lower ascorbic acid concentrations or renal function. 
Further studies are warranted to elucidate whether OS-targeted interventions could decrease cardiovascular mortality in RTRs.
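The hazard ratios above are expressed per 1-SD increment in MDA. Under the Cox model's proportional-hazards assumption the ratio compounds multiplicatively, so a k-SD difference corresponds to HR raised to the power k. A quick illustration using the point estimates reported above:

```python
# Hazard ratios per 1-SD MDA increment, as reported in the abstract
hr_overall = 1.31    # full cohort
hr_low_egfr = 2.09   # subgroup with eGFR <= 45 mL/min/1.73 m2

def hr_for_k_sd(hr_per_sd, k):
    """Under proportional hazards, the HR scales multiplicatively with exposure."""
    return hr_per_sd ** k

for k in (1, 2, 3):
    print(f"{k} SD: overall {hr_for_k_sd(hr_overall, k):.2f}, "
          f"low-eGFR {hr_for_k_sd(hr_low_egfr, k):.2f}")
```

For example, a 2-SD higher MDA corresponds to roughly a 1.7-fold hazard in the full cohort and over a 4-fold hazard in the low-eGFR subgroup, which is why the subgroup effect modification matters clinically.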


Author(s):  
V. E. Syutkin ◽  
A. A. Salienko ◽  
S. V. Zhuravel ◽  
M. S. Novruzbekov

Objective: to compare changes in estimated glomerular filtration rate (eGFR) in liver recipients with initially normal and impaired eGFR within the first year after immunosuppression conversion. Materials and methods. The study enrolled 215 recipients of deceased-donor livers transplanted between February 2009 and February 2020 who received everolimus (EVR) with dose reduction or complete withdrawal of calcineurin inhibitors (immunosuppression conversion, ISxC) for varying periods of time. eGFR was calculated using the MDRD-4 formula immediately before ISxC, then 3, 6 and 12 months after orthotopic liver transplantation (LTx); a deviation of up to one month from the corresponding time point was considered acceptable. Results. At the time of ISxC, 32 (15%) of the 215 recipients had normal renal function. Chronic kidney disease (CKD) developed in 60% of the recipients with normal eGFR by the end of the first year following ISxC; the fall in eGFR was particularly pronounced in older recipients. In the group with a baseline eGFR of 60-89 mL/min/1.73 m2, eGFR normalized in 62% of cases within 12 months; 28% of cases had no change in renal function. In the subgroup with a pronounced decrease in eGFR at the time of ISxC, increased eGFR was observed as early as 1 month after ISxC, with the maximum recorded after 3-6 months. The mean increase in eGFR relative to baseline by month 3 after ISxC was higher when ISxC was done in the first 2 months after LTx (19.7 ± 15.7 mL/min/1.73 m2) than when it was done in the long-term period after LTx (10.1 ± 8.7 mL/min/1.73 m2, p < 0.05). Conclusion. Changes in eGFR in liver recipients receiving EVR plus a low-dose calcineurin inhibitor (CNI) depend on baseline eGFR and are multidirectional. ISxC in the early post-LTx period led to a more pronounced improvement in eGFR; maximal changes in eGFR were observed 3-6 months after ISxC.
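The MDRD-4 estimate used above is a four-variable formula based on serum creatinine, age, sex and race. A minimal sketch, assuming the IDMS-traceable coefficient of 175 (the original formula uses 186, so a given lab's variant should be verified) and creatinine in mg/dL:

```python
def egfr_mdrd4(scr_mg_dl, age_years, female=False, black=False):
    """Four-variable MDRD eGFR in mL/min/1.73 m^2.

    Assumes the IDMS-traceable coefficient 175 and serum
    creatinine measured in mg/dL.
    """
    egfr = 175.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742   # sex correction factor
    if black:
        egfr *= 1.212   # race correction factor
    return egfr

# Example: 50-year-old male, serum creatinine 1.0 mg/dL
print(round(egfr_mdrd4(1.0, 50), 1))  # roughly 79 mL/min/1.73 m^2
```

Note that MDRD-derived eGFR is normalized to 1.73 m² of body surface area, which is one reason the abstract reports all values in mL/min/1.73 m².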

