Pre-Transplant Plasma Potassium as a Potential Risk Factor for the Need of Early Hyperkalaemia Treatment after Kidney Transplantation: A Cohort Study

Nephron ◽  
2020 ◽  
Vol 145 (1) ◽  
pp. 63-70
Author(s):  
Bram C.S. de Vries ◽  
Stefan P. Berger ◽  
Stephan J.L. Bakker ◽  
Martin H. de Borst ◽  
Margriet F.C. de Jong

<b><i>Introduction:</i></b> Plasma potassium (K<sup>+</sup>) abnormalities are common among patients with chronic kidney disease and are associated with higher rates of death, major adverse cardiac events, and hospitalization in this population. Currently, no guidelines exist on how to handle pre-transplant plasma K<sup>+</sup> in renal transplant recipients (RTR). <b><i>Objective:</i></b> The aim of this study is to examine the relation between pre-transplant plasma K<sup>+</sup> and interventions to resolve hyperkalaemia within 48 h after kidney transplantation. <b><i>Methods:</i></b> In a single-centre cohort study, we addressed the association between the last available plasma K<sup>+</sup> level before transplantation and the post-transplant need for dialysis or use of K<sup>+</sup>-lowering medication to resolve hyperkalaemia within 48 h after renal transplantation using multivariate logistic regression analysis. <b><i>Results:</i></b> 151 RTR were included, of whom 51 (33.8%) patients received one or more K<sup>+</sup> interventions within 48 h after transplantation. Multivariate regression analysis revealed that a higher pre-transplant plasma K<sup>+</sup> was associated with an increased risk of post-transplant intervention (odds ratio 2.2 [95% CI: 1.1–4.4]), independent of donor type (deceased or living) and use of K<sup>+</sup>-lowering medication within 24 h prior to transplantation. <b><i>Conclusions:</i></b> This study indicates that a higher pre-transplant plasma K<sup>+</sup> is associated with a higher risk of interventions necessary to resolve hyperkalaemia within 48 h after renal transplantation. Further research is recommended to determine a cutoff level for pre-transplant plasma K<sup>+</sup> that can be used in practice.
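An odds ratio like the one reported above is obtained by exponentiating a logistic-regression coefficient and its Wald confidence bounds. The sketch below illustrates that conversion; the coefficient and standard error are illustrative values chosen to reproduce the reported OR of 2.2 (95% CI: 1.1–4.4), not the study's actual estimates, and the function name is hypothetical.

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient (log-odds) and its
    standard error into an odds ratio with a Wald 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative only: a coefficient of ln(2.2) per unit of pre-transplant
# K+, with an SE chosen so the CI matches the abstract's figures.
beta, se = math.log(2.2), 0.354
or_, lo, hi = odds_ratio_ci(beta, se)
print(f"OR {or_:.1f} (95% CI: {lo:.1f}-{hi:.1f})")  # OR 2.2 (95% CI: 1.1-4.4)
```

The same arithmetic underlies every OR/CI pair quoted in the abstracts on this page.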

2019 ◽  
Vol 35 (3) ◽  
pp. 526-533 ◽  
Author(s):  
Nadia El Hangouche ◽  
Javier Gomez ◽  
Addis Asfaw ◽  
Jayakumar Sreenivasan ◽  
Tauseef Akhtar ◽  
...  

Abstract Background Mitral annular calcification (MAC) is associated with increased risk of major adverse cardiac events. We hypothesized that MAC, identified on a pretransplant transthoracic echocardiography (TTE), is predictive of cardiac events following renal transplantation (RT). Methods In a retrospective cohort of consecutive RT recipients, pretransplant MAC presence and severity were determined on TTE performed within 1 year prior to transplant. MAC severity was quantified based on the circumferential MAC extension relative to the mitral valve annulus. Post-transplant cardiac risk was assessed using the sum of risk factors (range: 0–8) set forth by the American Heart Association/American College of Cardiology Foundation consensus statement on the assessment of RT candidates. Subjects underwent pretransplant stress single-photon emission computed tomography myocardial perfusion imaging and were followed for the post-transplant composite outcome of cardiac death or myocardial infarction (CD/MI). Results Among 336 subjects (60.5% men; mean age 52 ± 12 years), MAC was present in 78 (23%) patients. During a mean follow-up of 3.1 ± 1.9 years, a total of 70 events were observed. Patients with MAC had a higher event rate compared with those without MAC (34.6% versus 17.8%, log-rank P = 0.001). There was a stepwise increase in CD/MI risk with increasing MAC severity (P for trend = 0.002). MAC-associated risk remained significant after adjusting for sex, duration of dialysis, sum of risk factors, ejection fraction and perfusion abnormality burden, providing an incremental prognostic value to these parameters (Δχ2 = 4.63; P = 0.031). Conclusion Among RT recipients, the burden of pretransplant MAC is an independent predictor of post-transplant risk of CD/MI. MAC should be considered in the preoperative assessment of RT candidates.


2020 ◽  
Vol 10 (1) ◽  
Author(s):  
Vatsa Dave ◽  
Kevan R. Polkinghorne ◽  
Khai Gene Leong ◽  
John Kanellis ◽  
William R. Mulley

Abstract The evidence supporting an initial mycophenolate mofetil (MMF) dose of 2 g daily in tacrolimus-treated renal transplant recipients is limited. In a non-contemporaneous single-centre cohort study we compared the incidence of leukopaenia, rejection and graft dysfunction in patients initiated on MMF 1.5 g and 2 g daily. Baseline characteristics and tacrolimus trough levels were similar by MMF group. MMF doses became equivalent between groups by 12 months post-transplant, driven by dose reductions in the 2 g group. Leukopaenia occurred in 42.4% of patients by 12 months post-transplant. MMF 2 g was associated with a 1.80-fold increased risk of leukopaenia compared to 1.5 g. Rejection occurred in 44.8% of patients by 12 months post-transplantation. MMF 2 g was associated with half the risk of rejection relative to MMF 1.5 g. Over the first 7 years post-transplantation there was no difference in renal function between groups. Additionally, the development of leukopaenia or rejection did not result in reduced renal function at 7 years post-transplant. Leukopaenia was not associated with an increased incidence of serious infections or rejection. This study demonstrates that the initial MMF dose has implications for the incidence of leukopaenia and rejection. Since neither dose produced superior long-term graft function, clinical equipoise remains regarding the optimal initial mycophenolate dose in tacrolimus-treated renal transplant recipients.


2021 ◽  
Vol 80 (2) ◽  
pp. 673-681
Author(s):  
Jin Wang ◽  
Xiaojuan Guo ◽  
Wenhui Lu ◽  
Jie Liu ◽  
Hong Zhang ◽  
...  

Background: Vascular factors and mitochondrial dysfunction contribute to the pathogenesis of Alzheimer’s disease (AD). DL-3-n-butylphthalide (NBP) protects mitochondria and improves microcirculation. Objective: The aim was to investigate the effect of donepezil combined with NBP therapy in patients with mild-to-moderate AD. Methods: In this prospective cohort study, 92 patients with mild-to-moderate AD were classified into the donepezil alone group (n = 43) or the donepezil combined with NBP group (n = 49) for 48 weeks. All patients were evaluated with the Alzheimer’s Disease Assessment Scale-Cognitive subscale (ADAS-cog), Clinician’s Interview-Based Impression of Change plus caregiver input (CIBIC-plus), Alzheimer’s Disease Cooperative Study-Activities of Daily Living (ADCS-ADL), and Neuropsychiatric Inventory (NPI) every 12 weeks. All patients were monitored for adverse events (AEs). Efficacy was analyzed using multivariate logistic regression analysis. Results: The multivariate logistic regression analysis showed that the changes in ADAS-cog score (OR = 2.778, 95% CI: [1.087, 7.100], p = 0.033) and ADCS-ADL score (OR = 2.733, 95% CI: [1.002, 7.459], p = 0.049) differed significantly between the donepezil alone group and the donepezil combined with NBP group, while the changes in NPI (OR = 1.145, 95% CI: [0.463, 2.829], p = 0.769), MMSE (OR = 1.563, 95% CI: [0.615, 3.971], p = 0.348) and CIBIC-plus (OR = 2.593, 95% CI: [0.696, 9.685], p = 0.156) did not. The occurrence of AEs was similar in the two groups. Conclusion: Over the 48-week treatment period, patients with mild-to-moderate AD in the donepezil combined with NBP group had slower cognitive decline and better-preserved activities of daily living. These findings indicate that the multi-target therapeutic effect of NBP may offer a new choice for AD treatment.


Kidney360 ◽  
2020 ◽  
Vol 1 (7) ◽  
pp. 705-711
Author(s):  
Deirdre Sawinski

Individuals with HIV are at increased risk for ESKD. Kidney transplantation is the best treatment for ESKD in the HIV+ population. Despite reduced access to transplantation, patients who are HIV+ have excellent outcomes and clearly benefit from receiving a transplant. Common post-transplant complications and management concerns, including the optimal antiretroviral regimen, immunosuppression protocols, infectious prophylaxis, hepatitis C coinfection, metabolic complications, and malignancy, are all discussed.


2021 ◽  
Author(s):  
Felix Poppelaars ◽  
Mariana Gaya da Costa ◽  
Siawosh K. Eskandari ◽  
Jeffrey Damman ◽  
Marc A. Seelen

Rejection after kidney transplantation remains an important cause of allograft failure that markedly impacts morbidity. Cytokines are a major player in rejection, and we therefore explored the impact of interleukin-6 (IL6) and IL-6 receptor (IL6R) gene polymorphisms on the occurrence of rejection after renal transplantation. We performed an observational cohort study analyzing both donor and recipient DNA in 1,271 renal transplant pairs from the University Medical Center Groningen in The Netherlands and associated single nucleotide polymorphisms (SNPs) with biopsy-proven rejection after kidney transplantation. The C-allele of the IL6R SNP (Asp358Ala: rs2228145 A>C, formerly rs8192284) in donor kidneys conferred a reduced risk of rejection following renal transplantation (HR 0.78 per C-allele; 95%-CI 0.67–0.90; P=0.001). On the other hand, the C-allele of the IL6 SNP (at position −174 in the promoter; rs1800795 G>C) in donor kidneys was associated with an increased risk of rejection for male organ donors (HR per C-allele 1.31; 95%-CI 1.08–1.58; P=0.0006), but not female organ donors (P=0.33). In contrast, neither the IL6 nor the IL6R SNP in the recipient showed an association with renal transplant rejection. In conclusion, donor IL6 and IL6R genotypes, but not recipient genotypes, represent an independent prognostic marker for biopsy-proven renal allograft rejection.


2020 ◽  
Vol 31 (1) ◽  
pp. 1-15
Author(s):  
Onesmo A. Kisanga ◽  
Francis F. Furia ◽  
Paschal J. Ruggajo ◽  
Eden E. Maro

Background: Renal replacement therapy (RRT), which includes dialysis and kidney transplantation, is the treatment of choice for patients with end-stage renal failure (ESRF). Most sub-Saharan African countries have not developed renal transplantation services and rely on referring patients overseas. This study was carried out to describe renal transplantation experience in Tanzania. Methods: Forty-four renal transplant recipients were recruited in this study. A standardized questionnaire and the Swahili version of the Short Form-36 (SF-36) were used to collect socio-demographic information, clinical data, laboratory test results and health-related quality-of-life information. Results: Ages of transplant recipients ranged from 21 to 66 years, with a mean age of 45.9 ± 10.5 years. The leading cause of end-stage renal failure among participants was hypertension, 58.8% (25/44), followed by glomerulonephritis, 15.9% (7/44). Twenty-eight (63.6%) of the transplantations were paid for by the government. Most of the donors (97.7%) were living, of whom 26 (59.1%) were siblings and 11 (25%) were second-degree relatives (cousins and nephews). The most common complication noted following transplantation was diabetes mellitus, in 9 (20.5%) patients, and 3 (6.8%) had chronic rejection. Mental health was the domain with the highest mean score (75.6 ± 14.3) and role physical had the lowest mean score (44 ± 45.6). Conclusions: Hypertension was the leading cause of ESRF in this study. Most of the donors were siblings, and the costs of transplantation were largely covered by the government. There is a need for a concerted effort to establish local kidney transplantation services in Tanzania. Keywords: renal transplantation, quality of life in transplantation, Tanzania.


Nutrients ◽  
2020 ◽  
Vol 12 (3) ◽  
pp. 878
Author(s):  
Jae Yeun Lee ◽  
Joon Mo Kim ◽  
Kyoung Yong Lee ◽  
Bokyung Kim ◽  
Mi Yeon Lee ◽  
...  

To investigate the association between nutrient intake and primary open angle glaucoma (POAG) in Koreans, a population-based, cross-sectional survey, the Korean National Health and Nutrition Examination Survey, was analyzed. Glaucoma diagnosis was based on criteria established by the International Society of Geographic and Epidemiologic Ophthalmology. Multivariate regression analysis was used to assess the correlation between dietary intake and the prevalence of POAG in all enrolled subjects. In the low body mass index (BMI) group (BMI <18.5), females with POAG had significantly lower intakes of energy, protein, fat, carbohydrate, ash, calcium, phosphorus, sodium, potassium, vitamin A, β-carotene, thiamin, riboflavin, and vitamin C than their non-glaucoma counterparts, based on a multivariate logistic regression analysis (all p < 0.05). In females with a medium BMI (18.5 ≤ BMI < 23), POAG showed a significant association with lower intakes of food, energy, protein, calcium, phosphorus, potassium, thiamin and niacin (all p < 0.05). Lower protein and thiamine intake in medium-BMI males was related to POAG. Low dietary intake of several nutrients showed an association with glaucoma in low-BMI female subjects. An insufficient intake of certain nutrients may be associated with an increased risk of glaucoma in Koreans. Further large-scale cohort studies are needed to determine how specific nutrients alter the risk of glaucoma.


2020 ◽  
Vol 35 (Supplement_3) ◽  
Author(s):  
Hatem Kaies Ibrahim Elsayed Ali ◽  
Ahmed Daoud ◽  
Karim Soliman

Abstract Background and Aims Survival after renal transplantation has improved significantly over the past 20 years. In contrast, surprisingly little is known about functional status post-transplant. A fundamental understanding of the functional status of patients surviving renal transplantation is of primary importance to clinicians and families alike, who often pursue renal transplantation with a principal hope that functional status will improve. The aim of our study is to assess the effect of renal transplantation on the functional status of patients in the tacrolimus era. Method Using data from the United States Organ Procurement and Transplantation Network, all renal transplant patients maintained on tacrolimus-based immunotherapy who had a functional assessment at the time of transplant and five years post-transplant were retrospectively reviewed. Data including age, sex, ethnicity, functional status, diabetes, body mass index, cold ischemia time, number of previous transplants, panel reactive antibodies, donor type, donor age, HLA mismatches, number of acute rejection episodes, induction therapies, and maintenance immunotherapy on discharge were collected. Functional status was defined according to Karnofsky score measurements. Descriptive analysis was used to assess the effect of renal transplantation on functional status. The outcome measured was functional status five years post-transplant. Multiple logistic regression analysis was used to assess factors affecting functional status post-transplant. Results 19,704 patients were included in the study. Among patients with mild impairment at the time of transplant, only 13.55% showed worsening of functional status. Among patients with moderate impairment at the time of transplant, 65.5% showed improvement of functional status while only 3.92% showed worsening of functional status.
Among patients with severe impairment at the time of transplant, 88.56% showed improvement in functional status (64.57% improved to mild impairment and 23.99% improved to moderate impairment). Multiple logistic regression analysis showed that a steroid withdrawal protocol was associated with improvement in functional status (OR = 1.24, P = 0.007, 95% CI 1.06–1.45), while maintenance dialysis before transplantation was associated with abnormal functional status post-transplant (OR = 0.73, P = 0.003, 95% CI 0.59–0.89). Conclusion This study revealed that renal transplantation is associated with substantial improvement in the functional status of patients. Steroid withdrawal protocols are associated with significant improvement in functional status, while maintenance dialysis before transplantation is associated with worse outcomes. This study recommends using steroid withdrawal protocols and performing transplantation in the pre-dialysis state. Functional status at the time of transplant should not be a hindrance to performing transplantation.


2020 ◽  
Vol 2020 ◽  
pp. 1-6 ◽  
Author(s):  
Jong Youn Moon ◽  
Jesang Lee ◽  
Yoon Hyung Park ◽  
Eun-Cheol Park ◽  
Si Hyung Lee

Purpose. To determine the incidence of keratoconus and its possible association with common systemic diseases using a nationwide cohort. Methods. This retrospective nationwide cohort study included 1,025,340 subjects from the Korean National Health Insurance Service-National Sample Cohort database from 2004 to 2013. Incidence rates of keratoconus were estimated. After 1:5 matching using propensity scores, associations between keratoconus and certain systemic comorbidities were determined using multivariate logistic regression analysis. Results. The incidence of keratoconus during the study period was 15.1 cases per 100,000 person-years. Adjusted logistic regression analysis after propensity score matching revealed significant associations between keratoconus and allergic rhinitis (odds ratio (OR): 1.86; 95% confidence interval (CI): 1.63–2.13; p<0.001), asthma (OR: 1.20; 95% CI: 1.06–1.36; p<0.001), atopic dermatitis (OR: 1.33; 95% CI: 1.13–1.56; p<0.001), and diabetes mellitus (DM) (OR: 1.35; 95% CI: 1.15–1.58; p<0.001). Conclusion. Estimates of the incidence of keratoconus may help in the planning of eye-care policies, and the results of this study determined the associations between allergic diseases and keratoconus. Conflicting results regarding the association between keratoconus and DM should be further evaluated.
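The 1:5 propensity-score matching described above pairs each exposed subject with the five unexposed subjects whose estimated propensity scores are closest. The abstract does not specify the matching algorithm, so the sketch below shows one common variant (greedy nearest-neighbour matching without replacement); the function name and the toy scores are hypothetical.

```python
# Hypothetical sketch of greedy 1:k nearest-neighbour matching on
# propensity scores, without replacement. The study's actual matching
# procedure is not described in the abstract.

def match_1_to_k(case_scores, control_scores, k=5):
    """For each case (in order), pick the k controls with the closest
    propensity scores; each control can be used at most once."""
    available = dict(enumerate(control_scores))  # control index -> score
    matches = {}
    for i, ps in enumerate(case_scores):
        nearest = sorted(available, key=lambda j: abs(available[j] - ps))[:k]
        matches[i] = nearest
        for j in nearest:
            del available[j]  # without replacement
    return matches

# Toy example: two cases, eleven candidate controls.
cases = [0.30, 0.70]
controls = [0.28, 0.31, 0.29, 0.33, 0.27, 0.69, 0.71, 0.68, 0.72, 0.66, 0.50]
print(match_1_to_k(cases, controls))
```

Greedy matching is order-dependent; optimal (e.g. Hungarian-algorithm) matching avoids that at higher cost, and which variant a study used affects reproducibility.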


2020 ◽  
Vol 35 (Supplement_3) ◽  
Author(s):  
Rianne M. Douwes ◽  
Joanna Sophia Jacoline Vinke ◽  
António W Gomes-Neto ◽  
Hans Blokzijl ◽  
Stefan P Berger ◽  
...  

Abstract Background and Aims Use of proton-pump inhibitors (PPIs) is common practice in renal transplant recipients (RTRs). Emerging data suggest several adverse effects of PPI use, including development of iron deficiency (ID). Although the latter has been shown for PPIs as a class, specific analyses of the ID risk associated with different types of PPIs have not been performed. Method We used data from the TransplantLines Biobank and Cohort study, an ongoing prospective cohort study among all types of solid organ transplant recipients. For the current study, we used data from stable RTRs with a functional graft for more than 1 year post-transplantation (n=795). We excluded RTRs who used any form of iron supplementation (n=54) or erythropoiesis-stimulating agents (n=24), resulting in 728 RTRs eligible for analyses. Use of PPIs was subdivided into the different types of PPIs, i.e. omeprazole, esomeprazole, pantoprazole, and rabeprazole. ID was defined as TSAT <20% and ferritin <300 µg/L. Logistic regression analysis was used to assess the associations between PPIs and ID. Results We included 728 RTRs (age 56±13 years, 61% males), with a mean eGFR of 53±18 ml/min/1.73m2, a median [interquartile range] ferritin level of 96 (44–191) µg/L and a mean TSAT of 24±10%. PPIs were used by 504 (69%) of the included RTRs, of which 398 (79%), 55 (11%), 49 (10%), and 2 (0.4%) respectively used omeprazole, pantoprazole, esomeprazole, and rabeprazole. Use of PPIs was strongly associated with ID (OR, 2.20; 95%CI 1.48–3.28; P<0.001), independent of adjustment for age, sex, BMI, eGFR, hs-CRP, smoking, alcohol use, use of calcineurin inhibitors, prednisolone, antiplatelet drugs, and antihypertensives.
When subdividing the PPIs into the different types, both omeprazole (OR, 1.98; 95%CI 1.39–2.83; P<0.001) and esomeprazole (OR, 2.11; 95%CI 1.09–4.07; P=0.03) were independently associated with iron deficiency, whereas pantoprazole was not (OR, 0.89; 95%CI 0.47–1.70; P=0.73). Conclusion Omeprazole and esomeprazole, but not pantoprazole, are associated with an increased risk of ID. Our results are in line with previous reports that pantoprazole has the lowest potency and the least increase in intragastric pH, thereby possibly interfering less with the reduction of ferric to ferrous iron, and subsequently with iron absorption. Future studies are warranted to confirm our present findings.
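The ID outcome in this study is a simple conjunction of two laboratory thresholds (TSAT < 20% and ferritin < 300 µg/L), which makes it easy to apply programmatically. A minimal sketch, with thresholds taken from the abstract; the function name is hypothetical:

```python
# The study's ID definition: TSAT < 20% AND ferritin < 300 µg/L.
# Both criteria must be met; either alone is insufficient.

def iron_deficient(tsat_pct: float, ferritin_ug_l: float) -> bool:
    """Return True if both ID criteria from the study are met."""
    return tsat_pct < 20 and ferritin_ug_l < 300

# A recipient at the cohort's mean TSAT (24%) is not ID regardless of
# ferritin; one with TSAT 15% and the median ferritin (96 µg/L) is.
print(iron_deficient(24, 96))   # False
print(iron_deficient(15, 96))   # True
```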

