Gait speed and adverse outcomes following hospitalised exacerbation of COPD

2021 ◽  
pp. 2004047
Author(s):  
Jessica A. Walsh ◽  
Ruth E. Barker ◽  
Samantha S.C. Kon ◽  
Sarah E. Jones ◽  
Winston Banya ◽  
...  

Four-metre gait speed (4MGS) is a simple physical performance measure and surrogate marker of frailty that is associated with adverse outcomes in older adults. We aimed to assess the ability of 4MGS to predict prognosis in patients hospitalised with acute exacerbations of COPD (AECOPD). 213 participants hospitalised with AECOPD (52% male; mean age 72 years; mean FEV1 35% predicted) were enrolled. 4MGS and baseline demographics were recorded at hospital discharge. All-cause readmission and mortality were collected for 1 year after discharge, and multivariable Cox proportional hazards regressions were performed. Kaplan-Meier and competing-risk analyses were conducted comparing time to all-cause readmission and mortality between 4MGS quartiles. 111 participants (52%) were readmitted and 35 (16%) died during the follow-up period. 4MGS was associated with all-cause readmission, with an adjusted subdistribution hazard ratio of 0.868 (95% CI 0.797–0.945; p=0.001) per 0.1 m·s−1 increase in gait speed, and with all-cause mortality, with an adjusted subdistribution hazard ratio of 0.747 (95% CI 0.622–0.898; p=0.002) per 0.1 m·s−1 increase in gait speed. Readmission and mortality models incorporating 4MGS had higher discrimination than age or FEV1 % predicted alone, with areas under the receiver operating characteristic curves of 0.73 and 0.80 respectively. Kaplan-Meier and competing-risk curves demonstrated that those in slower gait speed quartiles had shorter time to readmission and death (log-rank both p<0.001). 4MGS provides a simple means of identifying at-risk patients with COPD at hospital discharge, giving valuable information for planning post-discharge care and support.
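The quartile comparison above rests on Kaplan-Meier survival estimation. As a hedged illustration (not the authors' code; data layout and names are mine), the product-limit estimator can be sketched in a few lines of pure Python:

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.
    times: follow-up time per subject; events: 1 = event, 0 = censored.
    Returns [(event_time, survival_probability), ...]."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = c = 0                      # events / censorings at time t
        while i < len(data) and data[i][0] == t:
            if data[i][1] == 1:
                d += 1
            else:
                c += 1
            i += 1
        if d > 0:
            surv *= 1 - d / n_at_risk  # multiply by conditional survival at t
            curve.append((t, surv))
        n_at_risk -= d + c             # drop events and censorings from risk set
    return curve
```

Running one curve per gait-speed quartile and plotting the step functions reproduces the kind of comparison the abstract describes; censored subjects leave the risk set without forcing the curve down.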

2019 ◽  
Vol 37 (15_suppl) ◽  
pp. e18250-e18250
Author(s):  
Jifang Zhou ◽  
Karen Sweiss ◽  
Pritesh Rajni Patel ◽  
Edith Nutescu ◽  
Naomi Ko ◽  
...  

e18250 Background: Adjuvant intravenous bisphosphonates (IV BP) reduce the risk of skeletal-related events (SRE) in patients with multiple myeloma (MM). We examined the effects of bisphosphonate utilization patterns (adherence, cumulative dose and frequency) on risk of SRE. Methods: Patients aged 65 years or older and diagnosed with first primary MM between 2001 and 2011 were identified using the Surveillance, Epidemiology and End Results (SEER)-Medicare linked database. Patients receiving at least one dose of IV BP after MM diagnosis were identified, and 5-year SRE-free survival was estimated using the Kaplan-Meier method, stratified by demographic groups and compared with the log-rank test. Cox proportional hazards models were fit to determine the association between IV BP utilization patterns and SRE after propensity score matching. We investigated the outcome of multiple recurrent SRE using the approach of Andersen-Gill, and estimated subdistribution hazard ratios (SHR) and 95% confidence intervals for risk of first SRE, accounting for death as a competing risk. Results: The final cohort included 9176 MM patients with a median age of 76 years. The adjusted 5-year competing-risk SRE model showed a 48% reduction in risk of SRE (95% CI 0.49–0.55) with use of IV BP. In multivariable analyses taking competing risks into account, greater adherence to IV BP, higher cumulative IV BP dose and more frequent administration were all associated with a statistically significant reduction in SRE risk (see Table). Conclusions: Use of IV BP in patients with MM was associated with a significant reduction in SRE risk over the 5-year period after MM diagnosis. The effectiveness of IV BP therapy increased with greater cumulative dose, greater adherence, and more frequent administration. [Table: see text]
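The analyses above treat death as a competing risk. One building block is the cumulative incidence function, which, unlike 1 minus Kaplan-Meier, does not overstate event probability when subjects can first experience a competing event. A minimal Aalen-Johansen-style sketch (my own, purely illustrative, not the study's code):

```python
def cumulative_incidence(times, causes, cause_of_interest):
    """Aalen-Johansen cumulative incidence for one event type,
    treating the other event types as competing risks.
    causes: 0 = censored, otherwise an integer event type."""
    data = sorted(zip(times, causes))
    n_at_risk = len(data)
    overall_surv, cif, curve, i = 1.0, 0.0, [], 0
    while i < len(data):
        t = data[i][0]
        d_interest = d_any = censored = 0
        while i < len(data) and data[i][0] == t:
            cause = data[i][1]
            if cause == 0:
                censored += 1
            else:
                d_any += 1
                if cause == cause_of_interest:
                    d_interest += 1
            i += 1
        if d_interest > 0:
            # increment uses overall event-free survival just before t
            cif += overall_surv * d_interest / n_at_risk
            curve.append((t, cif))
        overall_surv *= 1 - d_any / n_at_risk
        n_at_risk -= d_any + censored
    return curve
```

With cause 1 = SRE and cause 2 = death, the curves for the two causes sum to at most 1, which is the property that makes this estimator appropriate here. (Fitting the Fine-Gray subdistribution model itself requires specialised software, e.g. the R cmprsk package.)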


Author(s):  
Phyo Kyaw Myint ◽  
Mary Joan Macleod ◽  
Allan B. Clark ◽  
Toby O. Smith ◽  
Joao H. Bettencourt-Silva ◽  
...  

Background: Whilst lack of concentration is a known symptom of anaemia, its association with post-stroke dementia is unclear. Methods: We used data from a UK regional stroke register. To be eligible, patients must have survived to discharge and have had anaemia status assessed by WHO criteria. Dementia status and other prevalent co-morbidities were assessed using ICD-10 codes. Patients were followed until May 2015 (mean follow-up 3.7 years; total person-years = 27,769). Hazard ratios for incident dementia were calculated using Cox proportional hazards models controlling for potential confounders. A Fine and Gray model was additionally constructed using mortality as the competing risk. Results: A total of 7,454 stroke patients were included, with a mean (SD) age of 75.9 (12.3) years (50.2% men). Those with anaemia were older and had a higher disability and co-morbidity burden prior to stroke. We observed a large amount of variation in dementia incidence rates over time, and the hazard ratio increased every year. The significant association between anaemia and dementia incidence was lost after controlling for pre-stroke modified Rankin score (HR 1.17, 95% CI 0.97–1.40). Every 20 g/L increase in Hb was associated with a significant reduction in the risk of dementia after adjustment for age, sex, stroke factors and disability, but significance was lost after adjustment for vascular risk factors. Competing-risk analyses showed similar results. Conclusion: Whilst we found no evidence of anaemia as a risk factor for post-stroke dementia, the findings may be limited by potential under-recognition of post-stroke dementia.
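The "per 20 g/L increase in Hb" effect is a rescaled Cox coefficient: hazard ratios are multiplicative in the covariate, so the HR for a k-unit change equals the per-unit HR raised to the power k, i.e. exp(k·β). A one-line sketch of that identity (function name and example values are mine):

```python
def rescale_hr(hr_per_unit, k):
    """Cox hazard ratios are multiplicative in the covariate:
    the HR for a k-unit change is exp(k * beta) = (per-unit HR) ** k."""
    return hr_per_unit ** k

# e.g. a hypothetical HR of 0.99 per 1 g/L of Hb corresponds to
# roughly 0.82 per 20 g/L.
```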


2020 ◽  
Vol 16 (3) ◽  
pp. 4461-4473 ◽  
Author(s):  
Dong Wang ◽  
Kun Liu ◽  
Yingchi Yang ◽  
Tingting Wang ◽  
Quan Rao ◽  
...  

Currently, the prognostic effects of leukemia inhibitory factor (LIF) and the LIF receptor (LIFR) in pancreatic adenocarcinoma (PAAD) are not clear. In the present study, we utilized large datasets from four public databases to investigate the expression of LIF and LIFR and their clinical significance in PAAD. Eight cohorts containing 1278 cases of PAAD were identified, and the analyses suggested that LIF was expressed at high levels and LIFR at low levels in PAAD tissues compared with adjacent or normal tissues. Kaplan–Meier curves and univariate and multivariate Cox proportional hazards regression analyses indicated that high LIF expression was associated with shorter overall survival (adjusted hazard ratio = 1.641, 95% CI 1.399–1.925, p < 0.001), whereas high LIFR expression was associated with longer overall survival (adjusted hazard ratio = 0.653, 95% CI 0.517–0.826, p < 0.001).
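Adjusted hazard ratios and confidence intervals like those above are exponentiated Cox coefficients with Wald limits. A minimal sketch of the back-transformation (assuming a fitted coefficient beta and its standard error se; names are mine):

```python
import math

def hazard_ratio_ci(beta, se, z=1.96):
    """Point estimate and Wald 95% CI for a Cox regression coefficient.
    Returns (HR, lower, upper), all on the hazard-ratio scale."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))
```

Because the interval is symmetric on the log scale, the reported CIs (e.g. 1.399–1.925 around 1.641) are asymmetric around the point estimate, as expected.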


2013 ◽  
Vol 31 (15_suppl) ◽  
pp. 9102-9102
Author(s):  
Paul T Kang ◽  
Amy Louise Baker ◽  
James Warneke ◽  
Clara N Curiel ◽  
Joanne M. Jeter ◽  
...  

9102 Background: Lesion bleeding (BL) is a catalyst symptom leading to melanoma (MEL) diagnosis and is associated with adverse outcomes. The pathophysiologic significance of BL has not been elucidated. We hypothesize that BL reflects pathologic ulceration (UL). Methods: This retrospective cohort study was conducted using data from 850 patients seen 2005-2009 at our center. Eligible patients reported the pre-diagnosis BL status of their primary lesion; determination of its UL status was also required. Demographic, clinical, pathologic and outcomes data were abstracted from records. Χ2 and independent t-tests were used for univariate comparisons. Predictors of UL were analyzed with multiple logistic regression. Survival indices were analyzed by Kaplan-Meier, log-rank, and Cox proportional hazards methods. Results: 190 patients with 193 MEL lesions were eligible. Median follow-up was 5.3 years. 67 lesions bled prior to diagnosis; 68 demonstrated UL. BL and UL were associated with each other and with Breslow depth in univariate analyses; UL was also associated with age at diagnosis. Neither was associated with gender, lesion site, use of BL-associated drugs or Clark's level. A logistic model was developed using BL, gender, lesion site, use of BL-associated drugs, and age at diagnosis. Only BL and age at diagnosis were associated with UL probability (OR = 10.6, p<0.001 and OR = 1.04, p=0.006, respectively). BL was associated with worsened median relapse-free survival (RFS) and overall survival (OS) (median 1.16 vs 1.96 years, p=0.001, and 3.06 vs 3.41 years, p=0.001), as was ulceration. When BL and UL status were considered together, only UL predicted outcomes. A Cox model of clinical factors (BL, gender, age at diagnosis, lesion site) confirmed the association of BL with RFS and OS (HR 1.82, p=0.006 and HR 2.36, p=0.001). Addition of UL to the model abrogated BL's predictive value. Conclusions: BL of primary MEL lesions is strongly associated with pathologic UL. When clinical parameters are considered, BL significantly predicts RFS and OS, although this value is lost once UL status is known. Our data are consistent with BL reflecting the ulceration status of a primary lesion. When UL status is unknown, BL may be able to serve as its surrogate marker.
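The study estimated ORs with multiple logistic regression; as a simplified, unadjusted stand-in for the same quantity, an odds ratio with a Wald interval can be computed directly from a 2x2 table (function and cell names are mine, and the figures are illustrative, not the study's data):

```python
import math

def odds_ratio(a, b, c, d):
    """Unadjusted OR and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return (or_,
            or_ * math.exp(-1.96 * se_log_or),
            or_ * math.exp(1.96 * se_log_or))
```

An adjusted OR such as the 10.6 reported for BL comes from a fitted logistic model, but its interpretation (odds multiplier for the exposed group) is the same.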


2019 ◽  
Vol 53 (2) ◽  
pp. 1801186 ◽  
Author(s):  
Claire M. Nolan ◽  
Matthew Maddocks ◽  
Toby M. Maher ◽  
Winston Banya ◽  
Suhani Patel ◽  
...  

The 4-m gait speed (4MGS), a simple physical performance measure and surrogate marker of frailty, consistently predicts adverse prognosis in older adults. We hypothesised that 4MGS could predict all-cause mortality and nonelective hospitalisation in patients with idiopathic pulmonary fibrosis (IPF). 4MGS and lung function were measured at baseline in 130 outpatients newly diagnosed with IPF. Survival status and nonelective hospital admissions were recorded over 1 year. We assessed the predictive value of 4MGS (as a continuous variable and as a binary variable: slow versus preserved 4MGS) by calculating hazard ratios using Cox proportional hazards regression, adjusting for potential confounding variables. Receiver operating characteristic curves assessed discrimination between the multivariable regression models and established prognostic indices. Continuous 4MGS and slow 4MGS were independent predictors of all-cause mortality (4MGS: HR 0.03, 95% CI 0.01–0.31; p=0.004; slow 4MGS: HR 2.63, 95% CI 1.01–6.87; p=0.049) and hospitalisation (4MGS: HR 0.02, 95% CI 0.01–0.14; p<0.001; slow 4MGS: HR 2.76, 95% CI 1.16–6.58; p=0.02). Multivariable models incorporating 4MGS or slow 4MGS had better discrimination for predicting mortality than either the gender, age and lung physiology index or the Composite Physiologic Index. In patients with IPF, 4MGS is an independent predictor of all-cause mortality and nonelective hospitalisation.
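The discrimination comparison above uses the area under the receiver operating characteristic curve, which equals the probability that a randomly chosen case is ranked above a randomly chosen non-case (ties counting one half). A brute-force sketch of that rank interpretation (mine; fine for small samples, O(n·m) in general):

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney interpretation:
    the fraction of (positive, negative) pairs the model orders correctly,
    with ties counted as half-correct."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

An AUC of 0.5 means no discrimination; the models described above would feed each patient's predicted risk in as the score.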


Author(s):  
Stina Ek ◽  
Anna C. Meyer ◽  
Margareta Hedström ◽  
Karin Modig

Abstract Background: The Charlson Comorbidity Index (CCI) has been suggested to be associated with mortality in hip fracture patients to the same extent as more expensive and time-consuming tools. However, even the CCI might be too time-consuming in a clinical setting. Aim: To investigate whether the American Society of Anaesthesiologists score (ASA score), a simple grading from the anaesthesiologist's examination, is comparable with the CCI in its association with 1-year mortality after a hip fracture. Methods: The study population comprised patients aged 60 years or older registered in the Swedish Hip Fracture Registry with a first-time hip fracture between 1997 and 2017 (N = 165,596). The outcome was 1-year mortality, and the exposures were ASA score and CCI. The association between comorbidity and mortality was described with Kaplan–Meier curves and analyzed with Cox proportional hazards models. Results: The Kaplan–Meier curves showed a stepwise increase in mortality with increasing values of both ASA and CCI. The hazard ratios (HRs) for the highest ASA scores (4–5) were 3.8 (95% confidence interval 3.5–4.2) for women and 3.2 (2.8–3.6) for men in the fully adjusted models. Adjusted HRs for the highest CCI (4+) were 3.6 (3.3–3.9) for women and 2.5 (2.3–2.7) for men. The reference was the lowest score value for both tools. The correlation between the tools was moderate. Conclusions: Both ASA and CCI show a similar stepwise association with 1-year mortality in hip fracture patients, despite measuring different factors and capturing different individuals at risk. Since the ASA score is already accessible to health care staff, it might be preferable as an aid in prioritizing vulnerable hip fracture patients at risk of adverse outcomes.


Circulation ◽  
2020 ◽  
Vol 142 (Suppl_3) ◽  
Author(s):  
Morgan Harloff ◽  
Laura Piechura ◽  
Farhang Yazdchi ◽  
Mohamed Keshk ◽  
Hunbo Shim ◽  
...  

Introduction: Prolonged cardiopulmonary resuscitation (CPR) duration remains a source of apprehension with regard to the acceptance of donor hearts for orthotopic heart transplantation (OHT). Unfortunately, many of these organs are declined due to concern for adverse outcomes after OHT, further straining an already limited donor pool. Nevertheless, donor hearts with a history of prolonged CPR may represent an opportunity to expand the donor pool for patients with end-stage heart failure on the waiting list for OHT. Therefore, we sought to examine the duration of donor CPR and its impact on recipient survival after OHT. Methods: The United Network for Organ Sharing (UNOS) database was retrospectively queried to identify all adult patients who underwent first-time OHT between 2000 and 2019 from a donor who had experienced cardiac arrest with a quantified downtime duration. The population was divided into five groups with a granular focus on longer downtimes: donors with CPR < 30 minutes, 30-39 minutes, 40-49 minutes, 50-59 minutes, and ≥ 60 minutes. The primary outcome of interest was post-transplant survival. Kaplan-Meier analysis was used to compare recipient survival between groups after OHT. Results: In total, 7,470 patients were identified during the study period. Overall survival by Kaplan-Meier analysis was not statistically different among the five groups (p=0.69) (Figure 1). In a Cox proportional-hazards model, duration of CPR was found to have no influence on survival (HR 1.00, p=0.56). Significant predictors of mortality included donor age (HR 1.01, p=0.013), donor smoking history (HR 1.11, p<0.005), and recipient diabetes (HR 1.27, p<0.0001). Conclusions: These findings suggest that, for hearts determined appropriate for transplant, the duration of CPR performed on the donor heart does not significantly impact survival after OHT. Therefore, donor hearts with a prolonged downtime should be fully evaluated for OHT to maximize the donor pool.
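The between-group survival comparison reported here (p=0.69) is a log-rank test. A self-contained two-group sketch (not the authors' code; variable names are mine, and the multi-group extension follows the same observed-minus-expected logic):

```python
def log_rank(times1, events1, times2, events2):
    """Two-group log-rank chi-square statistic (1 degree of freedom).
    events: 1 = event observed, 0 = censored."""
    event_times = sorted(
        {t for t, e in zip(times1, events1) if e == 1}
        | {t for t, e in zip(times2, events2) if e == 1}
    )
    obs = exp = var = 0.0
    for t in event_times:
        n1 = sum(1 for x in times1 if x >= t)   # at risk, group 1
        n2 = sum(1 for x in times2 if x >= t)   # at risk, group 2
        d1 = sum(1 for x, e in zip(times1, events1) if x == t and e == 1)
        d2 = sum(1 for x, e in zip(times2, events2) if x == t and e == 1)
        n, d = n1 + n2, d1 + d2
        obs += d1                               # observed events in group 1
        exp += d * n1 / n                       # expected under equal hazards
        if n > 1:
            var += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    return (obs - exp) ** 2 / var
```

A statistic near 0 (as with identical curves) gives a large p-value, consistent with the non-significant difference across CPR-duration groups above.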


2019 ◽  
Vol 26 (3) ◽  
Author(s):  
B. P. Levy ◽  
J. E. Signorovitch ◽  
H. Yang ◽  
O. Patterson-Lomba ◽  
C. Q. Xiang ◽  
...  

Background: Commonly used first-line (1L) chemotherapies for patients with advanced squamous-cell lung cancer (scc) include gemcitabine–platinum (gp), nab-paclitaxel–carboplatin (nabpc), and sb-paclitaxel–carboplatin (sbpc) regimens. However, no head-to-head trials have compared those treatments. In the present study, we compared the efficacy of 1L gp, nabpc, and sbpc in patients with scc and in patients with scc who subsequently received second-line (2L) immunotherapy. Methods: Medical records of patients who initiated the 1L treatments of interest between June 2014 and October 2015 were reviewed by 132 participating physicians. Kaplan–Meier curves were used to evaluate overall survival (os), progression-free survival (pfs), and treatment discontinuation (td), and Cox proportional hazards regression was used to compare the results between the cohorts. Results: Medical records of 458 patients with scc receiving gp (n = 139), nabpc (n = 159), or sbpc (n = 160) as 1L therapy were reviewed. Median os was longer with nabpc (23.9 months) than with gp (16.9 months; adjusted hazard ratio vs. nabpc: 1.55; p < 0.05) or sbpc (18.3 months; adjusted hazard ratio: 1.42; p = 0.10). No differences were observed in pfs (median pfs: 8.8, 8.0, and 7.6 months for gp, nabpc, and sbpc respectively; log-rank p = 0.76) or in td (median td: 5.5, 5.7, and 4.6 months respectively; p = 0.65). For patients who subsequently received 2L immunotherapy, no differences in os were observed (median os: 27.3, 25.0, and 23.0 months respectively; p = 0.59). Conclusions: In a nationwide sample of scc patients, longer median os was associated with 1L nabpc than with gp or sbpc. Median os for all 1L agents considered was similar in the subgroup of patients who sequenced to 2L immunotherapy.
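The median os figures quoted above are read off Kaplan-Meier curves as the first time the survival estimate falls to 0.5 or below. A tiny sketch of that lookup (assuming a curve given as (time, survival) pairs with survival non-increasing; names are mine):

```python
def median_survival(km_curve):
    """First time at which a Kaplan-Meier survival estimate drops
    to 0.5 or below; None if the median is never reached."""
    for t, s in km_curve:
        if s <= 0.5:
            return t
    return None  # more than half the cohort still event-free
```

Returning None mirrors the common "median not reached" situation in well-performing treatment arms.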


Author(s):  
Alejandro Márquez-Salinas ◽  
Carlos A Fermín-Martínez ◽  
Neftalí Eduardo Antonio-Villa ◽  
Arsenio Vargas-Vázquez ◽  
Enrique C. Guerra ◽  
...  

Abstract Background: Chronological age (CA) is a predictor of adverse COVID-19 outcomes; however, CA alone does not capture individual responses to SARS-CoV-2 infection. Here, we evaluated the influence of the aging metrics PhenoAge and PhenoAgeAccel in predicting adverse COVID-19 outcomes. Furthermore, we sought to model adaptive metabolic and inflammatory responses to severe SARS-CoV-2 infection using individual PhenoAge components. Methods: In this retrospective cohort study, we assessed cases admitted to a COVID-19 reference center in Mexico City. PhenoAge and PhenoAgeAccel were estimated using laboratory values at admission. Cox proportional hazards models were fitted to estimate risk for COVID-19 lethality and adverse outcomes (ICU admission, intubation, or death). To explore reproducible patterns that model adaptive responses to SARS-CoV-2 infection, we performed k-means clustering using PhenoAge components. Results: We included 1068 subjects, of whom 222 presented with critical illness and 218 died. PhenoAge was a better predictor of adverse outcomes and lethality than CA and SpO2, and its predictive capacity was sustained for all age groups. Patients with responses associated with PhenoAgeAccel > 0 had a higher risk of death and critical illness compared to those with lower values (log-rank p < 0.001). Using unsupervised clustering, we identified four adaptive responses to SARS-CoV-2 infection: 1) inflammaging associated with CA, 2) metabolic dysfunction associated with cardio-metabolic comorbidities, 3) an unfavorable hematological response, and 4) a response associated with favorable outcomes. Conclusions: Adaptive responses related to accelerated aging metrics are linked to adverse COVID-19 outcomes and have unique and distinguishable features. PhenoAge is a better predictor of adverse outcomes than CA.
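The four response patterns were found with k-means clustering of PhenoAge components. A plain Lloyd's-algorithm sketch (pure Python, my own simplification; the study's feature set, preprocessing and choice of k are not reproduced here):

```python
import random

def k_means(points, k, iters=50, seed=0):
    """Lloyd's algorithm: alternate assigning points to the nearest
    center and moving each center to its cluster mean.
    points: list of equal-length numeric tuples."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)            # initialise from the data
    clusters = []
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centers[j])),
            )
            clusters[nearest].append(p)
        centers = [
            tuple(sum(col) / len(c) for col in zip(*c)) if c else centers[j]
            for j, c in enumerate(clusters)
        ]
    return centers, clusters
```

In practice one would standardise the laboratory components first and pick k with a criterion such as silhouette score; this sketch only shows the mechanism.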


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
I.D Poveda Pinedo ◽  
I Marco Clement ◽  
O Gonzalez ◽  
I Ponz ◽  
A.M Iniesta ◽  
...  

Abstract Background: Parameters such as peak VO2, VE/VCO2 slope and OUES have previously been described as prognostic in heart failure (HF). The aim of this study was to identify further prognostic factors from cardiopulmonary exercise testing (CPET) in HF patients. Methods: A retrospective analysis of HF patients who underwent CPET from January to November 2019 in a single centre was performed. The PETCO2 gradient was defined as the difference between final PETCO2 and baseline PETCO2. HF events were defined as decompensated HF requiring hospital admission or IV diuretics, or decompensated HF resulting in death. Results: A total of 64 HF patients were assessed by CPET; HF events occurred in 8 (12.5%) patients. Baseline characteristics are shown in table 1. Patients having HF events had a negative PETCO2 gradient, while patients without events showed a positive PETCO2 gradient (−1.5 [IQR −4.8, 2.3] vs 3 [IQR 1, 5] mmHg; p=0.004). A multivariate Cox proportional hazards regression analysis revealed that the PETCO2 gradient was an independent predictor of HF events (HR 0.74, 95% CI 0.61–0.89; p=0.002). Kaplan-Meier curves showed a significantly higher incidence of HF events in patients with negative gradients, p=0.002 (figure 1). Conclusion: The PETCO2 gradient was demonstrated to be a prognostic CPET parameter in HF patients in our study. Patients with negative gradients had worse outcomes, with more HF events. Funding Acknowledgement: Type of funding source: None
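The exposure here is simple arithmetic: the PETCO2 gradient is final minus baseline PETCO2 in mmHg, with negative values flagging risk in this cohort. A hedged sketch (function names are mine; the zero threshold is how the abstract's groups are split, not a validated clinical cut-off):

```python
def petco2_gradient(baseline_petco2, final_petco2):
    """PETCO2 gradient as defined in the study: final minus baseline (mmHg)."""
    return final_petco2 - baseline_petco2

def flag_negative_gradient(baseline_petco2, final_petco2):
    """True when the gradient is negative, the pattern associated with
    HF events in this cohort (illustrative grouping only)."""
    return petco2_gradient(baseline_petco2, final_petco2) < 0
```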

