Bleeding during critical illness: A prospective cohort study using a new measurement tool

2007 ◽  
Vol 30 (2) ◽  
pp. 93 ◽  
Author(s):  
Donald M. Arnold ◽  
Laura Donahoe ◽  
France J. Clarke ◽  
Andrea J. Tkaczyk ◽  
Diane Heels-Ansdell ◽  
...  

Purpose: To estimate the incidence, severity, duration and consequences of bleeding during critical illness, and to test the performance characteristics of a new bleeding assessment tool. Methods: Clinical bleeding assessments were performed prospectively on 100 consecutive patients admitted to a medical-surgical intensive care unit (ICU) using a novel bleeding measurement tool called HEmorrhage MEasurement (HEME). Bleeding assessments were done daily in duplicate and independently by blinded, trained assessors. Inter-rater agreement and construct validity of the HEME tool were calculated using the φ coefficient. Risk factors for major bleeding were identified using a multivariable Cox proportional hazards model. Results: Overall, 90% of patients experienced a total of 480 bleeds, of which 94.8% were minor and 5.2% were major. Inter-rater reliability of the HEME tool was excellent (φ = 0.98, 95% CI: 0.96 to 0.99). A decrease in platelet count and a prolongation of partial thromboplastin time were independent risk factors for major bleeding, whereas renal failure and prophylactic anticoagulation were not. Patients with major bleeding received more blood transfusions and had longer ICU stays than patients with minor or no bleeding. Conclusions: Bleeding, although primarily minor, occurred in the majority of ICU patients. One in five patients experienced a major bleed, which was associated with abnormal coagulation tests but not with prophylactic anticoagulants. These baseline bleeding rates can inform the design of future critical care trials that use bleeding as an outcome, and HEME is a useful tool for measuring bleeding in critically ill patients.
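Inter-rater agreement between the duplicate daily assessments was quantified with the φ coefficient. For two blinded raters each making a binary bleed/no-bleed call, φ reduces to a simple function of the 2×2 agreement table; a minimal sketch (illustrative counts, not the study's data):

```python
from math import sqrt

def phi_coefficient(table):
    """Phi coefficient for a 2x2 agreement table.
    table = [[a, b], [c, d]] where rows are rater 1 (bleed / no bleed)
    and columns are rater 2 (bleed / no bleed)."""
    (a, b), (c, d) = table
    denom = sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return (a * d - b * c) / denom if denom else 0.0

# perfect agreement between raters gives phi = 1.0
print(phi_coefficient([[30, 0], [0, 70]]))  # -> 1.0
```

With a handful of disagreements (say, 5 in each off-diagonal cell), φ drops to about 0.76, which is why near-perfect values such as the reported 0.98 indicate excellent reliability.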

2021 ◽  
Vol 8 ◽  
Author(s):  
Xuejin Gao ◽  
Li Zhang ◽  
Siwen Wang ◽  
Yaqin Xiao ◽  
Deshuai Song ◽  
...  

Background: Patients with short bowel syndrome (SBS) are at high risk of cholestasis or cholelithiasis. This study aimed to determine the incidence, risk factors, and clinical consequences of cholelithiasis in adults with SBS over an extended period. Methods: All eligible adults diagnosed with SBS and admitted to a tertiary hospital center between January 2010 and December 2019 were retrospectively identified from the hospital records database. Kaplan–Meier analysis was used to estimate the cumulative incidence of cholelithiasis over the 10-year period. To assess risk factors for cholelithiasis, we used a multivariate Cox proportional hazards model with estimation of hazard ratios (HR) and 95% confidence intervals (95% CI). Results: This study enrolled 345 eligible patients with SBS. Kaplan–Meier analysis revealed that 72 patients (20.9%) developed cholelithiasis during the 10-year observation period. Multivariate analysis using the Cox proportional hazards model revealed that a remnant jejunum (HR = 2.163; 95% CI: 1.156–4.047, p = 0.016) and parenteral nutrition dependence (HR = 1.783; 95% CI: 1.077–2.952, p = 0.025) were independent risk factors for cholelithiasis in adults with SBS. Twenty-eight patients in the cholelithiasis group developed symptoms and/or complications. Proportions of acute cholecystitis or cholangitis and of acute pancreatitis were significantly higher in the cholelithiasis group than in the non-cholelithiasis group (31.9% vs. 7.7%, p < 0.01; and 6.9% vs. 1.1%, p = 0.003, respectively). Conclusion: Because of the adverse clinical consequences of cholelithiasis, adult patients with SBS should be closely monitored, and preventive interventions should be considered. Clinical Trial Registration: www.ClinicalTrials.gov, identifier: NCT04867538.
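The cumulative incidence reported above comes from a Kaplan–Meier estimate, which multiplies conditional survival probabilities at each observed event time. A minimal pure-Python sketch on toy follow-up data (not the study's; a real analysis would use a package such as R's survival or Python's lifelines):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: follow-up time per patient; events: 1 = cholelithiasis, 0 = censored.
    Returns a list of (t, S(t)) pairs at each distinct event time."""
    data = sorted(zip(times, events))
    n = len(data)
    s, curve, i = 1.0, [], 0
    while i < n:
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)        # events at time t
        at_risk = n - i                                # still under observation
        if d:
            s *= 1 - d / at_risk
            curve.append((t, s))
        i += sum(1 for tt, _ in data if tt == t)       # drop all subjects at t
    return curve

curve = kaplan_meier([2, 3, 3, 5, 8, 10], [1, 1, 0, 1, 0, 0])
# S(5) = 5/6 * 4/5 * 2/3 = 4/9, so cumulative incidence = 1 - 4/9 ≈ 0.56
```

Cumulative incidence at any horizon is then 1 − S(t), which is how a "20.9% developed cholelithiasis over 10 years" figure is read off the curve.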


Author(s):  
Erwin Chiquete ◽  
Jesus Alegre-Díaz ◽  
Ana Ochoa-Guzmán ◽  
Liz Nicole Toapanta-Yanchapaxi ◽  
Carlos González-Carballo ◽  
...  

Introduction: Patients with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection may develop coronavirus disease 2019 (COVID-19). Risk factors associated with death vary among countries with different ethnic backgrounds. We aimed to describe the factors associated with death in Mexicans with confirmed COVID-19. Material and methods: We analysed the Mexican Ministry of Health's official database on people tested for SARS-CoV-2 infection by real-time reverse transcriptase-polymerase chain reaction (rtRT-PCR) of nasopharyngeal fluids. Bivariate analyses were performed to select characteristics potentially associated with death, which were then entered into a Cox proportional hazards model. Results: As of May 18, 2020, a total of 177,133 persons (90,586 men and 86,551 women) in Mexico had received rtRT-PCR testing for SARS-CoV-2. There were 5332 deaths among the 51,633 rtRT-PCR-confirmed cases (10.33%, 95% CI: 10.07–10.59%). The median time (interquartile range, IQR) from symptom onset to death was 9 days (5–13 days), and from hospital admission to death 4 days (2–8 days). Analysis by age group revealed that a significant risk of death began gradually at the age of 40 years. Independent risk factors for death were obesity, hypertension, male sex, indigenous ethnicity, diabetes, chronic kidney disease, immunosuppression, chronic obstructive pulmonary disease, age > 40 years, and the need for invasive mechanical ventilation (IMV). Only 1959 (3.8%) cases received IMV, of whom 1893 were admitted to the intensive care unit (96.6% of those who received IMV). Conclusions: In Mexico, highly prevalent chronic diseases are risk factors for death among persons with COVID-19. Indigenous ethnicity is a poorly studied factor that needs more investigation.
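The hazard ratios reported in studies like this come from maximizing the Cox partial likelihood; the hazard ratio for a covariate is exp(β). A toy Newton-Raphson fit for a single binary covariate (e.g. male sex), with Breslow's handling of ties — an illustrative sketch only; production analyses use established packages such as R's survival or lifelines:

```python
from math import exp

def cox_hr_binary(times, events, x, iters=30):
    """Hazard ratio exp(beta) for one binary covariate, by Newton-Raphson
    on the Cox partial log-likelihood (Breslow ties)."""
    beta = 0.0
    idx = sorted(range(len(times)), key=lambda i: times[i])
    for _ in range(iters):
        score = info = 0.0
        for i in idx:
            if not events[i]:
                continue
            risk = [j for j in idx if times[j] >= times[i]]   # risk set at t_i
            w = [exp(beta * x[j]) for j in risk]              # partial-likelihood weights
            sw = sum(w)
            xbar = sum(wj * x[j] for wj, j in zip(w, risk)) / sw
            x2bar = sum(wj * x[j] * x[j] for wj, j in zip(w, risk)) / sw
            score += x[i] - xbar            # gradient contribution at this event
            info += x2bar - xbar * xbar     # observed information contribution
        if info <= 0:
            break
        beta += score / info                # Newton step
    return exp(beta)

# subjects with x = 1 tend to die earlier, so the fitted HR exceeds 1
hr = cox_hr_binary([1, 2, 3, 4, 5, 6, 7, 8],
                   [1, 1, 1, 1, 1, 1, 0, 0],
                   [1, 1, 1, 0, 1, 0, 0, 0])
```

Relabeling the binary covariate (swapping 0 and 1) negates β, so the fitted hazard ratio inverts — a useful sanity check on any implementation.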


Circulation ◽  
2014 ◽  
Vol 130 (suppl_2) ◽  
Author(s):  
Raffaele De Caterina ◽  
Ulrika Andersson ◽  
John H Alexander ◽  
M.Cecilia Bahit ◽  
Patrick J Commerford ◽  
...  

Background: A history of bleeding is important in decisions about anticoagulation. We analyzed outcomes in relation to history of bleeding and randomized treatment in patients with atrial fibrillation (AF) in the ARISTOTLE trial. Methods: The on-treatment safety population included 18,140 patients receiving ≥1 dose of study drug: apixaban 5 mg bd (2.5 mg bd if 2 of the following: age >80 yrs; body weight <60 kg; or creatinine >133 μmol/L) or warfarin targeting an INR of 2.0-3.0 (median TTR 66%), for a median of 1.8 yrs. Adjudicated outcomes in relation to randomization and history of bleeding were analyzed using a Cox proportional hazards model. Efficacy endpoints were analyzed in the intention-to-treat population. Results: A history of bleeding was reported in 3033 patients (16.7%), who were more often male (68% vs 64%, p <0.0005), with a higher prevalence of prior stroke/TIA/systemic embolism (23% vs 19%, p <0.0001) and diabetes (27% vs 24%, p=0.0010); higher CHADS2 score (CHADS2 >3: 35% vs 29%), age (mean [SD] 71 [9] vs 69 [10] yrs, p <0.0001) and body weight (86 [21] vs 84 [21] kg, p <0.0001); and lower creatinine clearance (77 [33] vs 80 [33] mL/min, p=0.0007) and mean systolic blood pressure (131 [17] vs 132 [16] mm Hg, p=0.0027). Calcium channel blockers, statins, non-steroidal anti-inflammatory drugs and proton pump inhibitors were used more often in patients with vs without a history of bleeding. Major bleeding was the only outcome event occurring more frequently in patients with vs without a history of bleeding: HR 1.7 (95% CI 1.4-2.3) with apixaban and 1.5 (1.2-1.0) with warfarin. For primary efficacy and safety outcomes in relation to randomization, see Table. Conclusions: In patients with AF, a history of bleeding was associated with several risk factors for stroke and bleeding and, accordingly, a higher bleeding risk during anticoagulation. The benefits of apixaban vs warfarin with respect to stroke, mortality and major bleeding were, however, consistent irrespective of bleeding history.
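The apixaban dose-reduction rule quoted above (2.5 mg bd when at least 2 of the 3 criteria are met) is easy to misread as requiring all three; encoded explicitly it is a simple count — a sketch for illustration only, not dosing guidance:

```python
def apixaban_dose_mg(age_yrs, weight_kg, creatinine_umol_l):
    """ARISTOTLE dose-reduction rule as described in the abstract:
    2.5 mg bd if at least 2 of {age >80 yrs, weight <60 kg,
    creatinine >133 umol/L}; otherwise 5 mg bd. Illustrative only."""
    criteria = [age_yrs > 80, weight_kg < 60, creatinine_umol_l > 133]
    return 2.5 if sum(criteria) >= 2 else 5.0

print(apixaban_dose_mg(82, 55, 100))  # two criteria met -> 2.5
print(apixaban_dose_mg(70, 80, 90))   # none met -> 5.0
```

Note that a patient meeting only one criterion (e.g. age 85 but normal weight and creatinine) stays on the 5 mg bd dose under this rule.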


Circulation ◽  
2017 ◽  
Vol 135 (suppl_1) ◽  
Author(s):  
Gloria A Aguayo ◽  
Anna Schritz ◽  
Anne-Françoise Donneau ◽  
Michel T Vaillant ◽  
Saverio Stranges ◽  
...  

Introduction: Frailty is a state of vulnerability in elderly people linked to higher mortality risk. Cardiovascular disease (CVD) is highly prevalent in aged populations and associated with frailty; frailty state could therefore predict higher risk of CVD. Many frailty scores (FS) have been developed, but none is considered the gold standard. We aimed to compare the predictive and discriminative ability of an extensive list of FS with regard to incident CVD in a sample of the general elderly population in England. We assessed the hypothesis that some FS would have better predictive ability than others, depending on their characteristics. Methods: We performed a prospective analysis of the association between 35 FS, measured in participants free of CVD at baseline (wave 2 of the English Longitudinal Study of Ageing, 2004-2005), and incident CVD assessed until February 2012. The sample consisted of 4,177 participants (43.0% men). Hazard ratios (HR) and 95% confidence intervals (95% CI) were calculated for each FS using a Cox proportional hazards model, adjusted for demographic, lifestyle and comorbidity variables. FS were analyzed on a continuous scale and using their original cutoffs. The added predictive ability of FS beyond a basic model consisting of sex and age was studied using Harrell's C statistic (the higher the better). Results: The median follow-up was 5.8 years, the incidence rate of CVD events was 301.2/10,000 person-years, and CVD accounted for 28% of all deaths. The mean age was 70.5 (SD: ±7.8) years. In fully adjusted models with demographics, lifestyles and comorbidity, HRs ranged from 1.0 (0.7; 1.6) to 12.7 (5.5; 29.3). Using cutoffs, HRs ranged from 0.7 (0.2; 1.9) to 1.8 (1.3; 2.5). Adjusted for sex and age, the delta Harrell's C statistic ranged from -0.8 (-3.4; 1.8) to 3.0 (-0.4; 6.4).
The best CVD predictive ability was found for the Frailty Index with 70 variables and the Comprehensive Geriatric Assessment screening FS, for the continuous and cutoff analyses respectively. In conclusion, there is high variability in the association between different published FS and incident CVD. FS had better predictive ability when used as continuous variables. Although most of the analyzed FS have good predictive ability with regard to incident CVD, they do not significantly improve on the discriminative capacity of a basic model. Our results will help guide clinicians, researchers and public health practitioners in choosing the most informative frailty assessment tool.
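Harrell's C statistic used above measures discrimination: across all usable pairs, the proportion in which the subject who experienced the event earlier also carried the higher predicted risk (0.5 = chance, 1.0 = perfect ranking). A minimal sketch on toy data:

```python
def harrells_c(times, events, risks):
    """Concordance index: pair (i, j) is usable when subject i has an
    observed event strictly before j's follow-up time; it is concordant
    when i also has the higher predicted risk; risk ties count half."""
    conc = ties = usable = 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if events[i] and times[i] < times[j]:
                usable += 1
                if risks[i] > risks[j]:
                    conc += 1
                elif risks[i] == risks[j]:
                    ties += 1
    return (conc + 0.5 * ties) / usable

# risks perfectly ordered against event times -> C = 1.0
print(harrells_c([1, 2, 3, 4], [1, 1, 1, 1], [4, 3, 2, 1]))  # -> 1.0
```

The "delta Harrell's C" reported in the abstract is the change in this index when a frailty score is added to the sex-and-age basic model.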


2019 ◽  
Vol 104 (1) ◽  
pp. 81-86 ◽  
Author(s):  
Sung Uk Baek ◽  
Ahnul Ha ◽  
Dai Woo Kim ◽  
Jin Wook Jeoung ◽  
Ki Ho Park ◽  
...  

Background/Aims: To investigate the risk factors for disease progression of normal-tension glaucoma (NTG) with pretreatment intraocular pressure (IOP) in the low teens. Methods: One hundred and two (102) eyes of 102 patients with NTG with pretreatment IOP ≤12 mm Hg who had been followed up for more than 60 months were retrospectively enrolled. Patients were divided into progressor and non-progressor groups according to visual field (VF) progression as correlated with change of optic disc or retinal nerve fibre layer defect. Baseline demographic and clinical characteristics, including diurnal IOP and 24-hour blood pressure (BP), were compared between the two groups. The Cox proportional hazards model was used to identify the risk factors for disease progression. Results: Thirty-six patients (35.3%) were classified as progressors and 66 (64.7%) as non-progressors. Between the two groups, no significant differences were found in follow-up period (8.7±3.4 vs 7.7±3.2 years; p=0.138), baseline VF mean deviation (−4.50±5.65 vs −3.56±4.30 dB; p=0.348) or pretreatment IOP (11.34±1.21 vs 11.17±1.06 mm Hg; p=0.121). The multivariate Cox proportional hazards model indicated that greater diurnal IOP at baseline (HR=1.609; p=0.004), greater fluctuation of diastolic BP (DBP; HR=1.058; p=0.002) and presence of optic disc haemorrhage (DH) during follow-up (HR=3.664; p=0.001) were risk factors for glaucoma progression. Conclusion: Among low-teens NTG eyes, 35.3% showed glaucoma progression over an average 8.7 years of follow-up. Fluctuation of DBP and diurnal IOP, as well as DH, were significantly associated with a greater probability of disease progression.


2019 ◽  
Vol 53 (10) ◽  
pp. 1020-1025
Author(s):  
Margaret R. Jorgenson ◽  
Jillian L. Descourouez ◽  
Dou-Yan Yang ◽  
Glen E. Leverson ◽  
Christopher M. Saddler ◽  
...  

Background: Modifiable risk factors associated with Clostridioides difficile infection (CDI) in renal transplant (RTX) recipients have not been clearly established, and peri-transplant risk has not been described. Objective: To evaluate the epidemiology, risk factors and outcomes of CDI occurring in the first 90 days after RTX (CDI-90). Methods: Observational cohort study/survival analysis of adult RTX recipients from 1/1/2012-12/31/2015. The primary outcome was CDI-90 incidence/risk factors. The secondary outcome was evaluation of post-90-day transplant outcomes. Results: 982 patients met inclusion criteria; 46 with CDI-90 and 936 without (comparator). CDI incidence in the total population was 4.7% at 90 days, 6.3% at 1 year, and 6.4% at 3 years. Incidence of CDI-90 was 5%; time to diagnosis was 19.4±25 days (median 7). Risk factors for CDI-90 were alemtuzumab induction (hazard ratio [HR] 1.5, 95% CI 1.1-2.0, p = 0.005) and age at transplant (HR 1.007/year, 95% CI 1.002-1.012, p = 0.007). However, risk factors for CDI at any time were different: donation-after-circulatory-death (DCD) donor (HR 2.5, 95% CI 1.3-4.9, p = 0.008) and female gender (HR 1.6, 95% CI 1.0-2.7, p = 0.049). On Kaplan-Meier analysis, CDI-90 appeared to affect patient/graft survival; however, when analyzed in a multivariable stepwise Cox proportional hazards model, only age was significantly associated with survival (p = 0.002). Conclusion and Relevance: Incidence of CDI-90 is low, mostly occurring in the first postoperative month. Risk factors vary temporally with time from transplant: in the early postoperative period, induction agent and age at transplant are significant, but not thereafter. Associations between CDI and negative graft outcomes appear to be largely driven by age. Future studies validating these risk factors, as well as targeted prophylaxis strategies and their effect on long-term graft outcomes and the host microbiome, are needed.


2019 ◽  
Vol 2019 ◽  
pp. 1-11 ◽  
Author(s):  
Zhang Haiyu ◽  
Pei Xiaofeng ◽  
Mo Xiangqiong ◽  
Qiu Junlan ◽  
Zheng Xiaobin ◽  
...  

Purpose. The morbidity of esophageal adenocarcinoma (EAC) has significantly increased in Western countries. We aimed to identify trends in incidence and survival of patients with EAC over the recent 30 years and to analyze potential risk factors, including race, sex, age, and socioeconomic status (SES). Methods. All data were collected from the Surveillance, Epidemiology, and End Results (SEER) database. Kaplan–Meier analysis and the Cox proportional hazards model were used to compare differences in survival between variables, including sex, race, age, and SES, and to evaluate the association of these factors with prognosis. Results. A total of 16,474 patients with EAC were identified from 1984 to 2013 in the United States. Overall incidence increased over each decade, from 1.8 to 3.1 to 3.9 per 100,000. Overall survival gradually improved (p<0.0001), which was evident in male patients (hazard ratio (HR) = 1.111; 95% confidence interval (CI): 1.07-1.15); however, the 5-year survival rate remained low (20.1%). The Cox proportional hazards model identified old age, black ethnicity, and medium/high poverty as adverse prognostic factors for EAC (HR = 1.018, 95% CI: 1.017-1.019; HR = 1.240, 95% CI: 1.151-1.336; HR = 1.000, 95% CI: 1.000-1.000, respectively). Conclusions. The incidence of EAC in the United States increased over time. A survival advantage was observed for white patients and patients in the low-poverty group. Sex was an independent prognostic factor for EAC, but this finding has to be confirmed by further research.


2015 ◽  
Vol 35 (2) ◽  
pp. 199-205 ◽  
Author(s):  
Fan Zhang ◽  
Hong Liu ◽  
Xiaoli Gong ◽  
Fuyou Liu ◽  
Youming Peng ◽  
...  

Objective: The intent of this study was to evaluate the clinical outcomes and the risk factors affecting mortality of continuous ambulatory peritoneal dialysis (CAPD) patients in a single peritoneal dialysis (PD) center over a period of 10 years. Patients and methods: We retrospectively analyzed patients on PD from June 2001 to June 2011. Clinical and biochemical data were collected from the medical records. Clinical variables included gender, age at the start of PD, smoking status, body mass index (BMI), cause of end-stage renal disease (ESRD), presence of diabetes mellitus, and blood pressure. Biochemical variables included hemoglobin, urine volume, residual renal function (RRF), serum albumin, blood urea nitrogen (BUN), creatinine, total cholesterol, and triglycerides; comorbidities and outcomes were also recorded. Survival curves were constructed by the Kaplan-Meier method. Univariate and multivariate analyses to identify mortality risk factors were performed using the Cox proportional hazards regression model. Results: A total of 421 patients were enrolled, 269 of whom were male (63.9%). The mean age at the start of PD was 57.9 ± 14.8 years. Chronic glomerulonephritis was the most common cause of ESRD (39.4%). Kaplan-Meier estimates of patient survival were 92.5%, 80.2%, 74.4%, and 55.7% at 1, 3, 5, and 10 years, respectively. In the Cox proportional hazards model analysis, patient survival was associated with age (hazard ratio [HR]: 1.641 [1.027 – 2.622], p = 0.038), cardiovascular disease (HR: 1.731 [1.08 – 2.774], p = 0.023), and hypertriglyceridemia (HR: 1.782 [1.11 – 2.858], p = 0.017). Kaplan-Meier estimates of technique survival were 86.7%, 68.8%, 55.7%, and 37.4% at 1, 3, 5, and 10 years, respectively.
In the Cox proportional hazards model analysis, age (HR: 1.672 [1.176 – 2.377], p = 0.004) and hypertriglyceridemia (HR: 1.511 [1.050 – 2.174], p = 0.026) predicted technique failure. Conclusion: The PD patients in our center exhibited comparable or even superior patient and technique survival rates compared with reports from other centers in China and other countries.


Author(s):  
Jiwei Bai ◽  
Mingxuan Li ◽  
Jianxin Shi ◽  
Liwei Jing ◽  
Yixuan Zhai ◽  
...  

Objective: Skull base chordoma (SBC) is rare and one of the most challenging diseases to treat. We aimed to assess the optimal timing of adjuvant radiation therapy (RT) and to evaluate the factors that influence resection and long-term outcomes. Methods: In total, 284 patients who underwent 382 surgeries were enrolled in this retrospective study. Postsurgically, 64 patients underwent RT before recurrence (pre-recurrence RT), and 47 patients underwent RT after recurrence. At the first attempt to achieve gross-total resection (GTR), defined as resection of the entire tumor, 268 patients were treated with an endoscopic midline approach and 16 patients with microscopic lateral approaches. Factors associated with the success of GTR were identified using χ2 and logistic regression analyses. Risk factors associated with chordoma-specific survival (CSS) and progression-free survival (PFS) were evaluated with the Cox proportional hazards model. Results: In total, 74.6% of tumors were marginally resected [GTR (40.1%), near-total resection (34.5%)]. History of surgery, large tumor volume, and tumor location in the lower clivus were associated with a lower GTR rate. The mean follow-up period was 43.9 months. At the last follow-up, 181 (63.7%) patients were alive. RT history, histologic subtype (dedifferentiated and sarcomatoid), non-GTR, no postsurgical RT, and the presence of metastasis were associated with poorer CSS. Patients with pre-recurrence RT had the longest PFS and CSS, while patients without postsurgical RT had the worst outcomes. Conclusion: GTR is the goal of initial surgical treatment, and pre-recurrence RT may improve outcomes regardless of GTR status.
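Association of a candidate factor with achieving GTR can be tested with χ2, as in this study. For a single 2×2 table the Pearson statistic compares observed counts with those expected under independence — a sketch with hypothetical counts, not the study's data:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table,
    e.g. rows = prior surgery yes/no, columns = GTR achieved yes/no."""
    (a, b), (c, d) = table
    n = a + b + c + d
    rows, cols = [a + b, c + d], [a + c, b + d]
    stat = 0.0
    for r, obs_row in enumerate([(a, b), (c, d)]):
        for s, obs in enumerate(obs_row):
            expected = rows[r] * cols[s] / n   # expected count under independence
            stat += (obs - expected) ** 2 / expected
    return stat

# identical row proportions mean no association, so the statistic is 0
print(chi_square_2x2([[10, 10], [10, 10]]))  # -> 0.0
```

The statistic is compared against the χ2 distribution with 1 degree of freedom (e.g. > 3.84 for p < 0.05); in practice one would call a library routine such as scipy.stats.chi2_contingency, which also applies continuity corrections.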


2019 ◽  
Author(s):  
Moa P. Lee ◽  
Robert J. Glynn ◽  
Sebastian Schneeweiss ◽  
Kueiyu Joshua Lin ◽  
Elisabetta Patorno ◽  
...  

Background: The differential impact of various demographic characteristics and comorbid conditions on the development of heart failure (HF) with preserved (HFpEF) versus reduced ejection fraction (HFrEF) is not well studied among the elderly. Methods and Results: Using Medicare claims data linked to electronic health records, we conducted an observational cohort study of individuals ≥65 years of age without HF. A Cox proportional hazards model accounting for the competing risks of HFrEF and HFpEF incidence was constructed. A gradient boosted model (GBM) assessed the relative influence (RI) of each predictor in the development of HFrEF and HFpEF. Among 138,388 included individuals, 9,701 developed HF (IR = 20.9 per 1,000 person-years). Males were more likely to develop HFrEF than HFpEF (HR = 2.07, 95% CI: 1.81-2.37 vs. 1.11, 95% CI: 1.02-1.20, P for heterogeneity < 0.01). Atrial fibrillation and pulmonary hypertension had stronger associations with the risk of HFpEF (HR = 2.02, 95% CI: 1.80-2.26 and 1.66, 95% CI: 1.23-2.22), while cardiomyopathy and myocardial infarction were more strongly associated with HFrEF (HR = 4.37, 95% CI: 3.21-5.97 and 1.94, 95% CI: 1.23-3.07). Age was the strongest predictor across all HF subtypes, with RI from the GBM >35%. Atrial fibrillation was the most influential comorbidity for the development of HFpEF (RI = 8.4%), while cardiomyopathy was most influential for HFrEF (RI = 20.7%). Conclusions: These findings of heterogeneous relationships between several important risk factors and heart failure types underline potential differences in the etiology of HFpEF and HFrEF.
Key Questions. What is already known about this subject? Previous epidemiologic studies describe the differences in risk factors involved in developing heart failure with preserved (HFpEF) and reduced ejection fraction (HFrEF); however, there has been no large study in an elderly population. What does this study add? This study provides further insight into the heterogeneous impact of various clinical characteristics on the risk of developing HFpEF and HFrEF in a population of elderly individuals. Employing an advanced machine learning technique allows assessment of the relative importance of each risk factor in the development of HFpEF and HFrEF. How might this impact on clinical practice? Our findings provide further insight into the potential differences in the etiology of HFpEF and HFrEF, which are critical for prioritizing populations for close monitoring and targeting prevention efforts.
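Accounting for the competing risk of the other HF subtype, as the Cox model described above does, starts with how the data are coded: each subtype's model gets its own event indicator, and a competing event simply censors the subject at its time. A minimal sketch of that data preparation (field names are hypothetical, not from the study):

```python
def cause_specific_rows(records):
    """records: iterable of (follow_up_time, event_type) where event_type
    is 'pEF', 'rEF', or None (censored without HF). Returns one row per
    subject with a separate event indicator per subtype; for the HFpEF
    model an HFrEF event censors the subject at that time, and vice versa."""
    return [{"time": t,
             "event_pEF": int(etype == "pEF"),
             "event_rEF": int(etype == "rEF")}
            for t, etype in records]

rows = cause_specific_rows([(3.2, "pEF"), (5.0, "rEF"), (7.1, None)])
```

Each cause-specific Cox model is then fit on ("time", "event_pEF") or ("time", "event_rEF"), which is what allows hazard ratios for the same covariate to be compared across the two subtypes.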

