Effects of Urate-Lowering Therapy on Risk of Hyperlipidemia in Gout by a Population-Based Cohort Study and on In Vitro Hepatic Lipogenesis-Related Gene Expression

2020 ◽  
Vol 2020 ◽  
pp. 1-13
Author(s):  
Yi-Jen Fang ◽  
Tien-Yuan Wu ◽  
Cheng-Li Lin ◽  
Chih-Yang Su ◽  
Jia-Rong Li ◽  
...  

Patients with gout are at higher risk of cardiovascular disease, which is associated with hyperlipidemia. Management of gout in Taiwan is poor, and the association between urate-lowering therapy (ULT) in gout patients and hyperlipidemia is unclear. We conducted a retrospective cohort study using data from the Longitudinal Health Insurance Database (LHID) of Taiwan on new-onset gout patients and a comparison cohort without gout. A Cox proportional hazards model was used to analyze differences in the risk of hyperlipidemia between patients with and without gout after considering related comorbidities. We also examined the effects of ULT medications on the hepatic expression of lipogenesis-related genes. After adjusting for potential confounders, the case group (44,413 patients) was found to have a higher risk of hyperlipidemia than the control cohort (177,652 patients) (adjusted hazard ratio (aHR) = 2.55). Gout patients without antigout treatment had a significantly higher risk of hyperlipidemia than the control cohort (aHR = 3.10). Among gout patients receiving ULT, all except those receiving probenecid (aHR = 0.80) had a significantly lower risk of hyperlipidemia than gout patients without ULT (all aHR < 0.90). Using real-time polymerase chain reaction, we found that most of the antigout drugs decreased the expression of hepatic genes related to lipogenesis in differentiated HepaRG cells. These data indicate that these antigout drugs reduce hyperlipidemia in gout patients, partly via reduced expression of lipogenesis-related genes, leading to improved blood lipid profiles. We provide evidence of the strong association between gout and hyperlipidemia and highlight the need for appropriate treatment guidelines.

Nutrients ◽  
2020 ◽  
Vol 13 (1) ◽  
pp. 39
Author(s):  
Pierre Ménager ◽  
Olivier Brière ◽  
Jennifer Gautier ◽  
Jérémie Riou ◽  
Guillaume Sacco ◽  
...  

Background. Vitamin K concentrations are inversely associated with the clinical severity of COVID-19. The objective of this cohort study was to determine whether the regular use of vitamin K antagonist (VKA) prior to COVID-19 was associated with short-term mortality in frail older adults hospitalized for COVID-19. Methods. Eighty-two patients consecutively hospitalized for COVID-19 in a geriatric acute care unit were included. The association of the regular use of VKA prior to COVID-19 with survival after 7 days of COVID-19 was examined using a propensity-score-weighted Cox proportional-hazards model accounting for age, sex, severe undernutrition, diabetes mellitus, hypertension, prior myocardial infarction, congestive heart failure, prior stroke and/or transient ischemic attack, CHA2DS2-VASc score, HAS-BLED score, and eGFR. Results. Among 82 patients (mean ± SD age 88.8 ± 4.5 years; 48% women), 73 survived COVID-19 at day 7 while 9 died. There was no between-group difference at baseline, despite a trend for more frequent use of VKA in those who did not survive on day 7 (33.3% versus 8.2%, p = 0.056). While considering “using no VKA” as the reference (hazard ratio (HR) = 1), the HR for 7-day mortality in those regularly using VKA was 5.68 [95% CI: 1.17; 27.53]. Consistently, COVID-19 patients using VKA on a regular basis had shorter survival times than the others (p = 0.031). Conclusions. Regular use of VKA was associated with increased mortality at day 7 in hospitalized frail elderly patients with COVID-19.
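The propensity-score weighting described above is typically implemented with inverse-probability-of-treatment weights (IPTW). A minimal sketch, assuming the propensity scores have already been estimated (e.g., from a logistic model on the listed covariates); the function name and toy values are illustrative, not from the study:

```python
def iptw_weights(treated, propensity, stabilized=True):
    """Inverse-probability-of-treatment weights. `propensity` holds each
    subject's estimated probability of treatment; stabilized weights are
    multiplied by the marginal treatment prevalence to reduce variance."""
    prev = sum(treated) / len(treated)
    weights = []
    for t, p in zip(treated, propensity):
        w = 1.0 / p if t else 1.0 / (1.0 - p)
        if stabilized:
            w *= prev if t else 1.0 - prev
        weights.append(w)
    return weights

# A treated subject with propensity 0.25 counts four times in the
# unstabilized pseudo-population; an untreated one counts 1/(1-0.25).
w = iptw_weights([1, 0], [0.25, 0.25], stabilized=False)
```

These weights are then passed to the Cox model as case weights, which is what makes the weighted hazard ratio an average treatment effect under the usual no-unmeasured-confounding assumptions.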


2021 ◽  
Author(s):  
Miguel I. Paredes ◽  
Stephanie Lunn ◽  
Michael Famulare ◽  
Lauren A. Frisbie ◽  
Ian Painter ◽  
...  

Background: The COVID-19 pandemic is now dominated by variant lineages; the resulting impact on disease severity remains unclear. Using a retrospective cohort study, we assessed the risk of hospitalization following infection with nine variants of concern or interest (VOC/VOI). Methods: Our study includes individuals with a positive SARS-CoV-2 RT-PCR in the Washington Disease Reporting System and with available viral genome data, from December 1, 2020 to July 30, 2021. The main analysis was restricted to cases with specimens collected through sentinel surveillance. Using a Cox proportional hazards model with mixed effects, we estimated hazard ratios (HR) for the risk of hospitalization following infection with a VOC/VOI, adjusting for age, sex, and vaccination status. Findings: Of the 27,814 cases, 23,170 (83.3%) were sequenced through sentinel surveillance, of which 726 (3.1%) were hospitalized due to COVID-19. Higher hospitalization risk was found for infections with Gamma (HR 3.17, 95% CI 2.15–4.67), Beta (HR 2.97, 95% CI 1.65–5.35), Delta (HR 2.30, 95% CI 1.69–3.15), and Alpha (HR 1.59, 95% CI 1.26–1.99) compared to infections with an ancestral lineage. Following VOC infection, unvaccinated patients showed a similarly elevated hospitalization risk, while vaccinated patients showed no significant difference in risk, both compared with unvaccinated, ancestral-lineage cases. Interpretation: Infection with a VOC results in a higher hospitalization risk, with vaccination attenuating that risk. Our findings support promoting hospital preparedness, vaccination, and robust genomic surveillance.


2019 ◽  
Vol 161 (6) ◽  
pp. 978-985 ◽  
Author(s):  
Derek Hsu ◽  
Falgun H. Chokshi ◽  
Patricia A. Hudgins ◽  
Suprateek Kundu ◽  
Jonathan J. Beitler ◽  
...  

Objective The Neck Imaging Reporting and Data System (NI-RADS) is a standardized numerical reporting template for surveillance of head and neck squamous cell carcinoma (HNSCC). Our aim was to analyze the accuracy of NI-RADS on the first posttreatment fluorodeoxyglucose positron emission tomography/contrast-enhanced computed tomography (PET/CECT). Study Design Retrospective cohort study. Setting Academic tertiary hospital. Subjects and Methods Patients with HNSCC with a 12-week posttreatment PET/CECT interpreted using the NI-RADS template and 9 months of clinical and radiologic follow-up starting from treatment completion between June 2014 and July 2016 were included. Treatment failure was defined as positive tumor confirmed by biopsy or Response Evaluation Criteria in Solid Tumors criteria. Cox proportional hazards models were performed. Results This study comprised 199 patients followed for a median of 15.5 months after treatment completion (25% quartile, 11.8 months; 75% quartile, 20.2 months). The rates of treatment failure increased with each incremental increase in NI-RADS category from 1 to 3 (4.3%, 9.1%, and 42.1%, respectively). A Cox proportional hazards model demonstrated a strong association between NI-RADS categories and treatment failure at both primary and neck sites (hazard ratio [HR], 2.60 and 5.22, respectively; P < .001). In the smaller subgroup analysis by treatment type, the association between increasing NI-RADS category at the primary site and treatment failure in surgically treated patients did not reach statistical significance (HR, 0.88; P = .82). Conclusion Increasing NI-RADS category at the baseline posttreatment PET/CECT is strongly associated with increased risk of treatment failure in patients with HNSCC.


2019 ◽  
Vol 37 (7_suppl) ◽  
pp. 117-117
Author(s):  
Jiakun Li ◽  
Yaochuan Guo ◽  
Shi Qiu ◽  
Mingjing He ◽  
Kun Jin ◽  
...  

Background: To evaluate the association between tertiary Gleason pattern (TGP) 5 and biochemical recurrence (BCR) in patients with prostate cancer (PCa) of Gleason score (GS) 7 after radical prostatectomy (RP). Methods: This retrospective study collected 387 patients who received RP and were diagnosed with GS 7 (3+4 or 4+3) at the West China Hospital from January 2009 to December 2017. Regardless of the primary Gleason pattern, patients were divided into 2 groups: TGP5 absence and TGP5 presence. Furthermore, we used the primary Gleason pattern to divide patients into 4 groups: GS 3+4, GS 3+4/TGP 5, GS 4+3, and GS 4+3/TGP 5. A Cox proportional-hazards model was used to evaluate the association between TGP5 status and BCR after adjusting for confounding factors, with follow-up time as the underlying time scale. All analyses were conducted with the statistical software packages R and EmpowerStats; tests were two-sided, and P values less than 0.05 were considered statistically significant. Results: In the Cox proportional-hazards model, regardless of the primary Gleason pattern, the risk of BCR was statistically significantly higher for patients with TGP5 presence (10.3%) than for those with TGP5 absence (89.7%) (P = 0.02, HR = 2.24, 95% CI: 1.12-4.49). Among patients with primary Gleason pattern 4, the risk of BCR was statistically significantly higher for Gleason 4+3/TGP5 than for Gleason 4+3 (P = 0.02, HR = 2.56, 95% CI: 1.16-5.67). There was a marked trend toward a higher risk of BCR in patients with Gleason 3+4/TGP 5 than in patients with Gleason 3+4, although the difference was not statistically significant (P = 0.58, HR = 1.82, 95% CI: 0.22-14.96). Conclusions: TGP5 in patients with GS 7 had a strong association with the risk of BCR and was an independent predictor of BCR. This result was more pronounced in patients with GS 7 (4+3) in our study. Further research with a larger sample size is needed to confirm these findings.
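The four-group split described in the Methods can be sketched as a small labeling helper; the function name and label strings are illustrative, not the authors' code:

```python
def gleason_group(primary, secondary, tertiary=None):
    """Label a Gleason score 7 specimen (3+4 or 4+3) by the presence of
    tertiary pattern 5, matching the four groups used in the abstract."""
    if {primary, secondary} != {3, 4}:
        raise ValueError("grouping defined only for Gleason 7 (3+4 or 4+3)")
    label = f"GS {primary}+{secondary}"
    # append the tertiary-pattern suffix only when TGP 5 is present
    return label + "/TGP 5" if tertiary == 5 else label
```

Categorical labels like these would then enter the Cox model as dummy variables, with GS 3+4 (no TGP 5) as the natural reference group.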


2009 ◽  
Vol 12 (5) ◽  
pp. 609-613 ◽  
Author(s):  
Truong-Minh Pham ◽  
Yoshihisa Fujino ◽  
Tatsuhiko Kubo ◽  
Reiko Ide ◽  
Noritaka Tokui ◽  
...  

Abstract Objective: We investigated the relationship between fish intake and the risk of death from prostate cancer. Design: Data were derived from a prospective cohort study in Japan. Fish consumption obtained from a baseline questionnaire was classified into two categories, 'low intake' and 'high intake'. The Cox proportional hazards model was used to estimate hazard ratios (HR) and 95% confidence intervals. Subjects: Data for 5589 men aged 30-79 years were analysed. Results: A total of twenty-one prostate cancer deaths were observed during 75,072 person-years of follow-up. The mean age at baseline of these twenty-one subjects was 67.7 years, ranging from 47 to 79 years. Results showed a consistent inverse association between prostate cancer death and high versus low fish intake. The multivariate model, adjusted for potential confounding factors and some other food items, showed an HR of 0.12 (95% CI 0.05, 0.32) for the high fish intake group. Conclusions: These results support the hypothesis that a high intake of fish may decrease the risk of prostate cancer death. Given the paucity of studies examining the association between prostate cancer and fish consumption, particularly in Asian populations, these findings require confirmation in additional cohort studies.
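The crude mortality rate implied by the Results (twenty-one deaths over 75,072 person-years) can be computed directly; this helper is illustrative, not part of the study's analysis:

```python
def rate_per_100k_person_years(deaths, person_years):
    """Crude event rate per 100,000 person-years of follow-up."""
    return deaths / person_years * 100_000

# The abstract's 21 prostate cancer deaths over 75,072 person-years
# work out to roughly 28 deaths per 100,000 person-years.
crude_rate = rate_per_100k_person_years(21, 75_072)
```

Person-years denominators like this are what the Cox model partitions across the follow-up time scale when estimating the hazard ratio.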


Author(s):  
Ying-Chuan Wang ◽  
Chung-Ching Wang ◽  
Ya-Hsin Yao ◽  
Wei-Te Wu

Purpose: This cohort study evaluated the effectiveness of noninvasive heart rate variability (HRV) analysis to assess the risk of cardiovascular disease over a period of 8 years. Methods: Personal and working characteristics were collected before biochemistry examinations and 5 min HRV tests from the Taiwan Bus Driver Cohort Study (TBDCS) in 2005. This study eventually identified 161 drivers with cardiovascular disease (CVD) and 627 without between 2005 and 2012. Estimation of the hazard ratio was analyzed by using the Cox proportional-hazards model. Results: Subjects with CVD had an overall lower standard deviation of NN intervals (SDNN) than their counterparts did. The SDNN index had a strong association with CVD, even after adjusting for risk factors. Using a median split for SDNN, the hazard ratio of CVD was 1.83 (95% CI = 1.10–3.04) in Model 1 and 1.87 (95% CI = 1.11–3.13) in Model 2. Furthermore, the low-frequency (LF) index was associated with a risk of CVD in the continuous approach. For hypertensive disease, the SDNN index was associated with increased risks in both the continuous and dichotomized approaches. When the root-mean-square of the successive differences (RMSSDs), high frequency (HF), and LF were continuous variables, significant associations with hypertensive disease were observed. Conclusions: This cohort study suggests that SDNN and LF levels are useful for predicting 8 year CVD risk, especially for hypertensive disease. Further research is required to determine preventive measures for modifying HRV dysfunction, as well as to investigate whether these interventions could decrease CVD risk among professional drivers.
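The time-domain HRV indices above, SDNN and RMSSD, are standard summaries of the NN-interval (normal beat-to-beat) series. A minimal sketch, assuming intervals in milliseconds and using the population standard deviation; this is not the study's implementation:

```python
import math

def sdnn(nn):
    """Standard deviation of NN intervals (ms), population formula."""
    m = sum(nn) / len(nn)
    return math.sqrt(sum((x - m) ** 2 for x in nn) / len(nn))

def rmssd(nn):
    """Root mean square of successive NN-interval differences (ms)."""
    diffs = [b - a for a, b in zip(nn, nn[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# A short illustrative NN series from a 5 min recording segment.
nn = [800, 810, 790, 805]
s, r = sdnn(nn), rmssd(nn)
```

A median split of SDNN, as in the abstract, would then dichotomize subjects into low- and high-SDNN groups before entering the Cox model.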


2021 ◽  
Vol 10 (18) ◽  
pp. 4091
Author(s):  
Björn Weiss ◽  
David Hilfrich ◽  
Gerald Vorderwülbecke ◽  
Maria Heinrich ◽  
Julius J. Grunow ◽  
...  

The benzodiazepine midazolam is one of the most frequently used sedatives in intensive care medicine, but it has an unfavorable pharmacokinetic profile when continuously applied. As a consequence, patients are often sedated for longer and more deeply than intended. Due to its distinct pharmacological features, including cytochrome P450-independent metabolism, intravenous lormetazepam might be clinically advantageous compared to midazolam. In this retrospective cohort study, we compared patients who received either intravenous lormetazepam or midazolam with respect to their survival and sedation characteristics. The cohort included 3314 mechanically ventilated, critically ill patients who received one of the two drugs in a tertiary medical center in Germany between 2006 and 2018. A Cox proportional hazards model with mortality as outcome and APACHE II, age, gender, and admission mode as covariates revealed a hazard ratio of 1.75 [95% CI 1.46–2.09; p < 0.001] for in-hospital mortality associated with the use of midazolam. After additionally adjusting for sedation intensity, the HR became 1.04 [95% CI 0.83–1.31; p = 0.97]. Thus, we concluded that excessive sedation occurs more frequently in critically ill patients treated with midazolam than in patients treated with lormetazepam. These findings require further investigation in prospective trials to assess if lormetazepam, due to its ability to maintain light sedation, might be favorable over other benzodiazepines for sedation in the ICU.


2021 ◽  
Vol 15 (7) ◽  
pp. e0009635
Author(s):  
Selma Regina Penha Silva Cerqueira ◽  
Patrícia Duarte Deps ◽  
Débora Vilela Cunha ◽  
Natanael Victor Furtunato Bezerra ◽  
Daniel Holanda Barroso ◽  
...  

Background Protective effects of Bacillus Calmette–Guérin (BCG) vaccination and clofazimine and dapsone treatment against severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection have been reported. Patients at risk for leprosy represent an interesting model for assessing the effects of these therapies on the occurrence and severity of coronavirus disease 2019 (COVID-19). We assessed the influence of leprosy-related variables in the occurrence and severity of COVID-19. Methodology/Principal findings We performed a 14-month prospective real-world cohort study in which the main risk factor was 2 previous vaccinations with BCG and the main outcome was COVID-19 detection by reverse transcription polymerase chain reaction (RT-PCR). A Cox proportional hazards model was used. Among the 406 included patients, 113 were diagnosed with leprosy. During follow-up, 69 (16.99%) patients contracted COVID-19. Survival analysis showed that leprosy was associated with COVID-19 (p<0.001), but multivariate analysis showed that only COVID-19-positive household contacts (hazard ratio (HR) = 8.04; 95% CI = 4.93–13.11) and diabetes mellitus (HR = 2.06; 95% CI = 1.04–4.06) were significant risk factors for COVID-19. Conclusions/Significance Leprosy patients are vulnerable to COVID-19 because they have more frequent contact with SARS-CoV-2-infected patients, possibly due to social and economic limitations. Our model showed that the use of corticosteroids, thalidomide, pentoxifylline, clofazimine, or dapsone or BCG vaccination did not affect the occurrence or severity of COVID-19.


Circulation ◽  
2014 ◽  
Vol 129 (suppl_1) ◽  
Author(s):  
Michikazu Nakai ◽  
Makoto Watanabe ◽  
Kunihiro Nishimura ◽  
Misa Takegami ◽  
Yoshihiro Kokubo ◽  
...  

Objective: The positive relation between body mass index (BMI) and risk of incident hypertension (HT) has been reported mainly in Western subjects with high BMI; however, there are few reports in Asian populations, whose BMI is relatively lower. This study investigated the relation of BMI with risk of incident HT in a population-based prospective cohort study in Japan, the Suita study. Methods: Participants who had no HT at baseline (1,591 men and 1,973 women) aged 30-84 years were included in this study. BMI categories were defined as follows: underweight (BMI<18.5), normal (18.5≤BMI<25.0), and overweight (BMI≥25.0). The Cox proportional hazards model was used to estimate hazard ratios (HRs) of BMI categories for incident HT by sex. HRs were adjusted for age, cigarette smoking, and alcohol drinking. The HRs according to quartiles of BMI were also estimated, using the lowest quartile of BMI as a reference. Results: During a median follow-up of 7.2 years, 1,325 participants (640 men and 685 women) developed HT. The HR (95% CI) per 1 kg/m2 increment of BMI for HT in men and women was 1.08 (1.05-1.11) and 1.10 (1.07-1.12), respectively. When we set normal BMI as a reference, the HR of overweight BMI in men and women was 1.37 (1.13-1.67) and 1.45 (1.18-1.77), whereas the HR of underweight BMI in men and women was 0.63 (0.45-0.90) and 0.60 (0.45-0.80), respectively. In addition, compared to the lowest quartile, the HR of the highest quartile of BMI in men and women was 1.67 (1.33-2.10, trend p<0.001) and 2.10 (1.67-2.64, trend p<0.001), respectively. Conclusion: In this study, we showed that higher BMI was associated with increased risk of hypertension in both Japanese men and women.
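The BMI cut-points given in the Methods can be expressed as a small categorization helper; this sketch mirrors the abstract's definitions and is not the authors' code:

```python
def bmi_category(bmi):
    """Categorize BMI using the Suita study cut-points from the
    abstract: underweight (<18.5), normal (18.5-24.9), overweight (>=25.0)."""
    if bmi < 18.5:
        return "underweight"
    if bmi < 25.0:
        return "normal"
    return "overweight"
```

With "normal" as the reference level, the other two categories enter the Cox model as indicator variables, yielding the category-specific HRs reported above.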


2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Kidu Gidey ◽  
Legese Chelkeba ◽  
Tadesse Dukessa Gemechu ◽  
Fekede Bekele Daba

Abstract Epilepsy is a chronic neurological disease with a variable therapeutic response. To design effective treatment strategies for epilepsy, it is important to understand treatment responses and predictive factors. However, limited data are available in Africa, including Ethiopia. The aim of this study was therefore to assess treatment response and identify prognostic predictors among patients with epilepsy at Jimma University Medical Center, Ethiopia. We conducted a retrospective cohort study of 404 newly diagnosed adult epilepsy patients receiving antiepileptic treatment between May 2010 and May 2015. Demographic, clinical, and outcome data were collected for all patients with a minimum follow-up of two years. A Cox proportional hazards model was used to identify predictors of poor seizure remission. Overall, 261 (64.6%) of the patients achieved seizure remission for at least one year. A high number of pre-treatment seizures (adjusted hazard ratio (AHR) = 0.64, 95% CI: 0.49–0.83) and poor adherence (AHR = 0.57, 95% CI: 0.44–0.75) were significant predictors of poor seizure remission. In conclusion, our study showed that only about two-thirds of patients achieved seizure remission. A high number of pre-treatment seizures and non-adherence to antiepileptic medications were predictors of poor seizure remission. Patients with these characteristics should be given special attention.

