Radial Approach Expertise and Clinical Outcomes of Percutaneous Coronary Interventions Performed Using Femoral Approach

2019, Vol 8 (9), pp. 1484
Author(s): Tomasz Tokarek, Artur Dziewierz, Krzysztof Plens, Tomasz Rakowski, Michał Zabojszcz, et al.

We sought to evaluate the impact of experience and proficiency with the radial approach (RA) on clinical outcomes of percutaneous coronary interventions (PCI) performed via the femoral approach (FA) in a “real-world” national registry. A total of 539 invasive cardiologists performing PCIs in 151 invasive cardiology centers in Poland between 2014 and 2017 were included. The proficiency threshold was set at >300 PCIs per individual operator over four consecutive years. Most operators (449; 65.4%) performed >75% of all PCIs via RA, 143 (20.8%) used RA in 50–75% of cases, 62 (9.0%) in 25–50%, and only 33 (4.8%) in <25% of all PCIs. Operators with the highest RA proficiency had an increased risk of periprocedural death, stroke, and access-site bleeding complications during angiography via FA. Similarly, a higher rate of periprocedural mortality during PCI via FA was observed in the most experienced radial operators compared with the other groups. This detrimental effect of FA utilization by the most experienced radial operators was observed in both stable angina and acute coronary syndromes. Higher experience with and utilization of RA might be linked to worse outcomes of PCIs performed via the femoral artery in both stable and acute settings.

Circulation, 2014, Vol 130 (suppl_2)
Author(s): Tomonori Akasaka, Seiji Hokimoto, Noriaki Tabata, Kenji Sakamoto, Kenichi Tsujita, et al.

Background: The 2011 ACCF/AHA/SCAI PCI guideline recommends that PCI be performed at hospitals with onsite cardiac surgery. However, recent data suggest there is no significant difference in clinical outcomes following primary or elective PCI between hospitals with and without onsite cardiac surgery, and PCI centers without onsite cardiac surgery account for more than half of all PCI centers in Japan. We examined the impact of onsite cardiac surgery availability on clinical outcomes following PCI for ACS. Methods: From August 2008 to March 2011, subjects (n=2288) were enrolled from the Kumamoto Intervention Conference Study (KICS), a multicenter registry enrolling consecutive patients undergoing PCI at 15 centers in Japan. Patients were assigned to two groups: treated in hospitals with (n=1954) or without (n=334) onsite cardiac surgery. Clinical events were followed up for 12 months. The primary endpoint was a composite of in-hospital death, cardiovascular death, myocardial infarction, and stroke; we also monitored non-cardiovascular death, bleeding complications, revascularization, and emergent CABG. Results: There was no significant difference in the primary endpoint between hospitals with and without onsite cardiac surgery (9.6% vs. 9.5%; P=0.737), nor when the components of the primary endpoint were considered separately. Among the other events, only revascularization was more frequent in hospitals with onsite cardiac surgery (22.1% vs. 12.9%; P<0.001). Kaplan-Meier analysis for the primary endpoint showed no significant difference between the two groups (log-rank P=0.943), and in a Cox proportional hazards model, absence of onsite cardiac surgery was not a predictor of the primary endpoint (HR 0.969, 95% CI 0.704-1.333; P=0.845). We performed propensity score matching to correct for the disparate patient numbers between the two groups, and again found no significant difference in the primary endpoint (6.9% vs. 8.0%; P=0.544). Conclusions: There is no significant difference in clinical outcomes following PCI for ACS between hospitals with and without onsite cardiac surgery backup in Japan.
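The headline comparison above (9.6% vs. 9.5% primary-endpoint rate between the two hospital groups) can be sketched with a simple two-proportion z-test. A minimal pure-Python version follows; the event counts are back-calculated from the reported rates (9.6% of 1954 ≈ 188, 9.5% of 334 ≈ 32) and are an approximation for illustration, not the registry's raw data or its exact test.

```python
from math import sqrt, erf

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)  # pooled proportion under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Approximate counts reconstructed from the reported percentages:
z, p = two_proportion_z_test(188, 1954, 32, 334)
print(round(z, 3), round(p, 3))
```

With these reconstructed counts the z-statistic is near zero and the p-value near 1, consistent with the paper's "no significant difference" finding.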


2018, Vol 30 (2), pp. 106-112
Author(s): Elizabeth Nagel, Michael J Blackowicz, Foday Sahr, Olamide D Jarrett

The impact of the 2014–2016 Ebola epidemic in West Africa on human immunodeficiency virus (HIV) treatment in Sierra Leone is unknown, especially for groups with higher HIV prevalence such as the military. Using a retrospective study design, clinical outcomes were evaluated before and during the epidemic for 264 HIV-infected soldiers of the Republic of Sierra Leone Armed Forces (RSLAF) and their dependents receiving HIV treatment at the primary RSLAF HIV clinic. Medical records were abstracted for baseline clinical data and clinic attendance. Estimated risks of loss to follow-up (LTFU), default, and number of days without antiretroviral therapy (DWA) were calculated using repeated-measures generalized estimating equations adjusted for age and gender. Due to missing data, 262 patients were included in the final analyses. The risk of LTFU was higher throughout the Ebola epidemic than at the pre-Ebola baseline, with the largest increase occurring at the peak of the epidemic (relative risk: 3.22, 95% CI: 2.22–4.67). The risk of default and DWA increased during the epidemic for soldiers but not for their dependents. The risks of LTFU, default, and DWA stabilized once the epidemic was largely resolved but remained elevated relative to the pre-Ebola baseline. Our findings demonstrate the negative and potentially lasting impact of the Ebola epidemic on HIV care in Sierra Leone and highlight the need to develop strategies that minimize disruptions in HIV care during future disease outbreaks.
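The relative risk reported above (RR 3.22, 95% CI 2.22–4.67) came from adjusted GEE models, but the underlying statistic is simple. Below is a minimal crude (unadjusted) relative-risk calculation with a Wald-type confidence interval on the log scale; the 2×2 counts are hypothetical and chosen only for illustration, not the study's data.

```python
from math import log, sqrt, exp

def relative_risk(events_exposed, n_exposed, events_control, n_control, z=1.96):
    """Crude relative risk with a Wald-type 95% CI computed on the log scale."""
    rr = (events_exposed / n_exposed) / (events_control / n_control)
    # Standard error of log(RR) for independent binomial samples
    se = sqrt(1/events_exposed - 1/n_exposed + 1/events_control - 1/n_control)
    lo, hi = exp(log(rr) - z * se), exp(log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts (LTFU during vs. before the epidemic) for illustration only:
rr, lo, hi = relative_risk(48, 262, 15, 262)
print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

An adjusted GEE estimate, as used in the study, additionally accounts for covariates and repeated measures, so it will generally differ from this crude calculation.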


Author(s): Jonathan R Enriquez, James A de Lemos, Ramin Farzaneh-Far, Anand Rohatgi, S. A Peng, et al.

Background: Previous reports conflict regarding outcomes, treatments, and processes of care after acute myocardial infarction (MI) for patients with chronic lung disease (CLD). Methods: Using the NCDR ACTION Registry®-GWTG™ (AR-G), demographics, clinical characteristics, treatments, processes of care, and in-hospital adverse events after NSTEMI and STEMI were compared between patients with (n=22,624; 14.2%) and without (n=136,266; 85.8%) CLD. CLD was defined by a history of COPD, chronic bronchitis, or emphysema. Multivariable adjustment using published AR-G in-hospital mortality and major bleeding risk-adjustment models was performed to quantify the impact of CLD on treatments and outcomes. Results: CLD was present in 10.1% of STEMI patients and 17.0% of NSTEMI patients. In both STEMI and NSTEMI, CLD patients were older, more likely to be female, and had more comorbidities, including diabetes, renal disease, prior MI, and heart failure, compared with those without CLD. Although CLD patients were more likely to be on cardiovascular medications on admission, by discharge slightly fewer CLD patients had received the composite core measures (aspirin, beta-blockers, ACE inhibitors, and statins) (table). In NSTEMI, CLD was also associated with less use of invasive procedures and with an increased risk of both death and major bleeding. In STEMI, major bleeding, but not mortality, was increased. Conclusions: CLD is a common comorbidity and is independently associated with an increased risk of major bleeding after MI. In NSTEMI, CLD is also associated with fewer evidence-based medications, less timely angiography and revascularization, and increased in-hospital mortality. Close attention should be given to this high-risk subgroup for the prevention and management of bleeding complications after MI, and further investigation is needed to determine the reasons for the treatment and outcome disparities in NSTEMI.


2019, Vol 6 (Supplement_2), pp. S271-S271
Author(s): Gauri Chauhan, Nikunj M Vyas, Todd P Levin, Sungwook Kim

Abstract Background: Vancomycin-resistant enterococci (VRE) infections occur with increased frequency in hospitalized patients and are usually associated with poor clinical outcomes. The purpose of this study was to evaluate the risk factors and clinical outcomes of patients with VRE infections. Methods: This study was an IRB-approved multi-center retrospective chart review conducted at a three-hospital health system between August 2016 and November 2018. Inclusion criteria were age ≥18 years and admission for ≥24 hours with cultures positive for VRE. Patients who were pregnant or merely colonized with VRE were excluded. The primary endpoint was the association of potential risk factors with all-cause in-hospital mortality (ACM) and 30-day readmission; a subgroup analysis focused on the association of risk factors with VRE bacteremia. The secondary endpoint was the impact of treatment group, high-dose daptomycin (HDD, ≥10 mg/kg/day) vs. low-dose daptomycin (LDD, <10 mg/kg/day) vs. linezolid (LZD), on ACM and 30-day readmission, with a subgroup analysis of differences in length of stay (LOS), length of therapy (LOT), duration of bacteremia (DOB), and clinical success (CS) between the treatment groups. Results: There were 81 patients included for analysis; overall mortality was 16%. In multivariate logistic regression analyses, patients presenting from long-term care facilities (LTCFs) had an increased risk of mortality (OR 4.125, 95% CI 1.149–14.814). No specific risk factors were associated with 30-day readmission. Previous exposure to fluoroquinolones (FQ) and cephalosporins (CPS), nosocomial exposure, and a history of heart failure (HF) were associated with VRE bacteremia. ACM was similar between HDD, LDD, and LZD (16.7% vs. 15.4% vs. 0%, P = 0.52), and no differences were seen in LOS, LOT, CS, or DOB between the groups. Conclusion: Admission from an LTCF was a risk factor for in-hospital mortality in VRE patients. Individuals with a history of FQ, CPS, and nosocomial exposure, as well as a history of HF, had an increased risk of acquiring VRE bacteremia. There was no difference in ACM, LOS, LOT, or DOB between HDD, LDD, and LZD. Disclosures: All authors: No reported disclosures.
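The LTCF finding above is an adjusted odds ratio from logistic regression, but the crude version of the statistic can be computed directly from a 2×2 table. A minimal odds-ratio calculation with Woolf's confidence interval follows; the counts are hypothetical, chosen only for illustration, and will not reproduce the study's adjusted OR of 4.125.

```python
from math import log, sqrt, exp

def odds_ratio(a, b, c, d, z=1.96):
    """Crude odds ratio for a 2x2 table [[a, b], [c, d]] with Woolf's 95% CI."""
    or_ = (a * d) / (b * c)
    se = sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR), Woolf's method
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, lo, hi

# Hypothetical 2x2 counts (died/survived by LTCF admission) for illustration:
or_, lo, hi = odds_ratio(6, 14, 7, 54)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The wide interval Woolf's method produces for small cell counts mirrors the wide CI reported in the abstract (1.149–14.814).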


Author(s): Francesco Costa, Marco Valgimigli, Philippe Gabriel Steg, Deepak L Bhatt, Stefan H Hohnloser, et al.

Abstract Aims: Patients with atrial fibrillation undergoing coronary intervention are at higher bleeding risk due to the concomitant need for oral anticoagulation and antiplatelet therapy. The RE-DUAL PCI trial demonstrated better safety with dual antithrombotic therapy (DAT: dabigatran 110 or 150 mg b.i.d. plus clopidogrel or ticagrelor) compared with triple antithrombotic therapy (TAT: warfarin, clopidogrel or ticagrelor, and aspirin). We explored the impact of baseline bleeding risk, based on the PRECISE-DAPT score, on decision-making regarding DAT vs. TAT. Methods and results: A score ≥25 points qualified as high bleeding risk (HBR). Comparisons were made for the primary safety endpoint of International Society on Thrombosis and Haemostasis major or clinically relevant non-major bleeding, and for the composite efficacy endpoint of death, thrombo-embolic events, or unplanned revascularization, analysed by time-to-event analysis. PRECISE-DAPT was available in 2336/2725 patients, of whom 37.9% were HBR. Compared with TAT, DAT with dabigatran 110 mg reduced bleeding risk in both non-HBR [hazard ratio (HR) 0.42, 95% confidence interval (CI) 0.31–0.57] and HBR (HR 0.70, 95% CI 0.52–0.94) patients, with a greater magnitude of benefit among non-HBR (Pint = 0.02). DAT with dabigatran 150 mg vs. TAT reduced bleeding in non-HBR patients (HR 0.60, 95% CI 0.45–0.80), with a trend toward less benefit in HBR patients (HR 0.92, 95% CI 0.63–1.34; Pint = 0.08). The risk of ischaemic events was similar on DAT with dabigatran (both 110 and 150 mg) vs. TAT in non-HBR and HBR patients (Pint = 0.45 and Pint = 0.56, respectively). Conclusions: The PRECISE-DAPT score appeared useful for identifying AF patients undergoing PCI at further increased risk of bleeding complications and may help clinicians select the antithrombotic regimen intensity with the best benefit–risk ratio for an individual patient.
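The risk stratification used above reduces to a single threshold: a PRECISE-DAPT score ≥25 points qualifies as HBR. A trivial sketch of that rule applied to a hypothetical list of scores (illustrative values only; the score itself is computed from age, creatinine clearance, haemoglobin, white-cell count, and prior bleeding, which is out of scope here):

```python
def is_high_bleeding_risk(precise_dapt_score):
    """HBR per the trial's cut-off: a PRECISE-DAPT score of >= 25 points."""
    return precise_dapt_score >= 25

# Stratify a hypothetical cohort of scores (illustrative values only):
scores = [12, 27, 25, 18, 31, 9, 24]
hbr_share = sum(is_high_bleeding_risk(s) for s in scores) / len(scores)
print(f"{hbr_share:.1%} HBR")
```

In the trial cohort this stratification classified 37.9% of the 2336 scored patients as HBR.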


Open Heart, 2021, Vol 8 (2), pp. e001726
Author(s): Anthony P Carnicelli, Ruth Owen, Stuart J Pocock, David B Brieger, Satoshi Yasuda, et al.

Objective: Atrial fibrillation (AF) and myocardial infarction (MI) are commonly comorbid and associated with adverse outcomes. Little is known about the impact of AF on quality of life and outcomes post-MI. We compared characteristics, quality of life, and clinical outcomes in stable post-MI patients with and without AF. Methods/results: The prospective, international, observational TIGRIS (long Term rIsk, clinical manaGement and healthcare Resource utilization of stable coronary artery dISease) registry included 8406 patients aged ≥50 years with ≥1 atherothrombotic risk factor who were 1–3 years post-MI. Patient characteristics were summarised by history of AF. Quality of life was assessed at baseline using EQ-5D. Clinical outcomes over 2 years of follow-up were compared. A history of AF was present in 702/8277 (8.5%) registry patients, and incident AF was diagnosed in 244/7575 (3.2%) over 2 years. Those with AF were older and had more comorbidities than those without AF. After multivariable adjustment, patients with AF had lower self-reported quality-of-life scores (EQ-5D UK-weighted index, visual analogue scale, usual activities, and pain/discomfort) than those without AF. A CHA2DS2-VASc score ≥2 was present in 686/702 (97.7%) patients with AF, although only 348/702 (49.6%) were on oral anticoagulants at enrolment. Patients with AF had higher rates of all-cause hospitalisation over 2 years (adjusted rate ratio 1.25 [1.06–1.46], p=0.008) than those without AF, but similar rates of mortality. Conclusions: Among stable post-MI patients, those with AF were commonly undertreated with oral anticoagulants, had poorer quality of life, and had a higher risk of adverse clinical outcomes than those without AF. Trial registration number: ClinicalTrials.gov NCT01866904.
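The CHA2DS2-VASc score cited above is a simple additive formula: 1 point each for heart failure, hypertension, diabetes, vascular disease, female sex, and age 65–74, and 2 points each for age ≥75 and prior stroke/TIA/thromboembolism. A minimal sketch of the scoring, applied to a hypothetical post-MI patient:

```python
def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                 stroke_tia, vascular_disease):
    """CHA2DS2-VASc stroke-risk score (0-9) for atrial fibrillation."""
    score = 2 if age >= 75 else (1 if age >= 65 else 0)
    score += 1 if female else 0
    score += 1 if chf else 0
    score += 1 if hypertension else 0
    score += 1 if diabetes else 0
    score += 2 if stroke_tia else 0
    score += 1 if vascular_disease else 0
    return score

# Hypothetical patient: a post-MI history itself counts as vascular disease,
# which is why nearly all AF patients in this registry scored >= 2.
print(cha2ds2_vasc(age=68, female=False, chf=False, hypertension=True,
                   diabetes=False, stroke_tia=False, vascular_disease=True))  # prints 3
```

Since every patient here is post-MI (vascular disease = 1 point) and aged ≥50, reaching the score ≥2 anticoagulation threshold takes only one further risk factor, consistent with 97.7% of the AF patients qualifying.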


Blood, 2021, Vol 138 (Supplement 1), pp. 5039-5039
Author(s): Hazim Ababneh, Fadwa Alqadi, Mohammad Ma'akoseh, Khalid Halahleh, Layan Abo Abed, et al.

Abstract Introduction: COVID-19 infection has devastating clinical outcomes among immunocompromised individuals, particularly those with malignancies. The impact of the coronavirus pandemic on patients with hematological malignancies in low- and middle-income countries is not well studied. Herein, we report the clinical outcomes of COVID-19 infection in patients with hematological malignancies treated at a single institution. Methods: Electronic medical records of patients diagnosed with hematological malignancies (leukemia, lymphoma, and multiple myeloma) were reviewed. Patients diagnosed with laboratory-confirmed SARS-CoV-2 infection by real-time polymerase chain reaction test between April 2020 and October 2020 were the subjects of this study. Demographic data, tumor characteristics, laboratory results, anti-cancer treatments, and patient outcomes (need for hospitalization, ICU admission, complications, and mortality) were extracted and analyzed. Results: We identified 89 patients with hematological malignancies who were infected with COVID-19 during the eligibility period. The median age at diagnosis was 54 years (range, 19-80 years). Fifty-two patients (58%) were male and 37 (42%) were female. Of the 89 cases, 41 (46%) had lymphoma, 27 (30%) had leukemia, and 21 (24%) had multiple myeloma. Eighty-four patients (94%) had received prior anti-cancer treatment: chemotherapy (n=47, 53%), immunotherapy (n=4, 4%), chemoimmunotherapy (n=26, 29%), and tyrosine kinase inhibitors (n=3, 3%). At the time of COVID-19 diagnosis, 52 patients (58%) had active malignancy, while 37 (42%) were in remission. Fifty-nine patients (66%) had comorbidities, with hypertension (n=32, 36%) being the most commonly reported, followed by diabetes mellitus (n=25, 28%) and ischemic heart disease (n=8, 9%). The most common presentations were fever (n=32, 36%), followed by cough (n=31, 35%), shortness of breath (n=21, 23%), aches (n=6, 7%), fatigue (n=6, 7%), and ageusia (n=6, 7%). Forty subjects (45%) were hospitalized, and 11 patients (12%) were eventually admitted to the intensive care unit (ICU). Notably, hospitalization and ICU admission rates were higher among patients older than 53 years (n=24, 59% and n=9, 82%, respectively). Complications were recognized in 36% of the patients (n=32), with sepsis (n=12, 13%), acute kidney injury (n=11, 12%), and cardiovascular complications (n=3, 3%) being the most prevalent. The median interval between the date of COVID-19 diagnosis and the last follow-up was 3 months (range, 2 days-6.4 months). At the last follow-up, 64 patients (72%) remained alive and 25 (28%) had succumbed to COVID-related complications. Conclusion: COVID-19 infection worsened clinical outcomes among patients with hematological malignancies, which could be attributed to the high incidence of infections, increased risk of hospitalization/ICU admission, and other COVID-related complications. Such high morbidity and mortality rates warrant future studies to outline the potential risk factors for COVID-related complications and modifications to the plan of care, including evaluation of the effect of vaccination on outcomes in these patients. Disclosures: No relevant conflicts of interest to declare.


Blood, 2014, Vol 124 (21), pp. 2572-2572
Author(s): Aleksandr Lazaryan, Tao Wang, Stephen R. Spellman, Hai-Lin Wang, Carlheinz R. Müller, et al.

Abstract: The diversity of HLA class I and II alleles can be simplified by consolidating them into fewer supertype clusters based on functional or predicted structural similarities in the epitope-binding grooves of HLA molecules. HLA class I and II supertypes have been increasingly studied in association with immune susceptibility to infection and cancer, with potential implications for vaccine development. However, the significance of individual allele mismatching within and outside of HLA class I or II supertypes remains unknown in the context of hematopoietic cell transplantation (HCT). We therefore studied the impact of HLA supertype disparities on the clinical outcomes of 1934 patients with AML (45%), ALL (31%), CML (14%), or MDS (9%) who underwent 7/8 unrelated donor (URD) myeloablative conditioning HCT from 1999 to 2011 and were registered with the CIBMTR. Median age at transplant was 35 years (range, 1-70); 53% were male; 81% were Caucasian; 56% received peripheral blood grafts; 50% were ABO-mismatched; 36% had in-vivo T-cell depletion; 62% received tacrolimus- and 36% cyclosporine A-based GVHD prophylaxis; 72% had male or non-parous female donors; and the median follow-up of survivors was 54 months (range, 3-149). Supertype assignment methods based on (1) revised main HLA anchor specificities (Sydney, 2008) and (2) bioinformatics (Doytchinova, 2004-05) were used to categorize single mismatched alleles into 6 HLA-A (A01, A01A03, A01A24, A02, A03, A24), 6 HLA-B (B07, B08, B27, B44, B58, B62), 2 HLA-C (C1, C2), and 5 DRB1 (DR1, DR3, DR4, DR5, DR9) supertypes. Overall survival (OS), disease-free survival (DFS), relapse, treatment-related mortality (TRM), acute graft-versus-host disease (aGVHD), and chronic GVHD were compared across matched vs. mismatched HLA-A (265 vs. 429), -B (230 vs. 92), -C (365 vs. 349), and -DRB1 (153 vs. 51) supertypes. We used a predetermined α=0.01 for statistical significance, as multiple exploratory analyses were conducted using Kaplan-Meier, Gray, and Cox proportional hazards methods. In the multivariable analysis, supertype B-mismatch was associated with an increased risk of grade II-IV aGVHD (HR=1.78; 95% CI, 1.23-2.59; p=0.0025); however, no difference was found for grade III-IV aGVHD or other clinical outcomes compared with supertype B-matches. Supertype DRB1-mismatch was associated with delayed neutrophil recovery (HR=0.51; 95% CI, 0.36-0.71; p=0.0001), as well as a trend toward inferior OS (HR=1.58; 95% CI, 1.04-2.38; p=0.037) and higher TRM (HR=1.64; 95% CI, 0.99-2.74; p=0.0565), compared with DRB1 matches within supertypes. There was no increased risk of GVHD with DRB1 supertype mismatch. No associations were observed between HLA-A and -C supertypes, or between aggregate supertype-matched vs. -mismatched groups, and any outcome. Our analysis demonstrates a differential influence of HLA supertype-based allele matching at the -B and -DRB1 loci on clinical outcomes after myeloablative 7/8 URD HCT. Disclosures: No relevant conflicts of interest to declare.
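The survival comparisons above rest on the Kaplan-Meier estimator, which at each distinct event time multiplies the running survival estimate by the fraction of at-risk subjects who do not have the event at that time; censored subjects simply leave the risk set. A minimal pure-Python sketch on toy data (not the study's data):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates.

    times: follow-up time per subject; events: 1 = event, 0 = censored.
    Returns a list of (t, S(t)) pairs at each distinct event time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = n = 0
        while i < len(data) and data[i][0] == t:
            d += data[i][1]   # events occurring at time t
            n += 1            # subjects leaving the risk set at t
            i += 1
        if d:                 # survival only drops at event times
            surv *= 1 - d / n_at_risk
            curve.append((t, surv))
        n_at_risk -= n
    return curve

# Toy follow-up data (months, 1 = event / 0 = censored) for illustration:
km = kaplan_meier([2, 3, 3, 5, 8, 8, 12], [1, 1, 0, 1, 0, 1, 0])
print(km)
```

Group comparisons such as the log-rank test, and the adjusted hazard ratios the study reports, build on top of this estimator but require the full covariate data.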


2021, Vol 15 (3), pp. e0009174
Author(s): Helen Ngum Ntonifor, Julius Suh Chewa, Mahamat Oumar, Hermann Desire Mbouobda

This study aimed to determine the impact of intestinal helminths on malaria parasitaemia, anaemia, and pyrexia, considering levels of IL-1β, among outpatients in Bamenda. A cohort of 358 consenting participants aged three years and above, both male and female, presenting for malaria consultation was recruited into the study. At enrolment, patients' axillary body temperatures were measured and recorded. Venous blood was collected for haemoglobin concentration and malaria parasitaemia determination, and blood plasma was used to measure human IL-1β levels with a human ELISA kit. The Kato-Katz technique was used to process stool samples. Five species of intestinal helminths were identified: Ascaris lumbricoides (6.4%), Enterobius vermicularis (5.0%), Taenia species (4.2%), Trichuris trichiura (1.1%), and hookworms (0.8%). The overall prevalence of Plasmodium falciparum and intestinal helminths was 30.4% (109/358) and 17.6% (63/358), respectively, and the prevalence of intestinal helminths among malaria patients was 17.4% (19/109). A significantly higher geometric mean parasite density (GMPD ± SD) was observed in patients co-infected with Enterobius vermicularis (5548 ± 2829/μL, p = 0.041) or Taenia species (6799 ± 4584/μL, p = 0.020) than in patients infected with Plasmodium falciparum alone (651 ± 6076/μL). Higher parasitaemias of 1393 ± 3031/μL and 3464 ± 2828/μL were recorded in patients co-infected with Ascaris lumbricoides and hookworms, respectively, but these differences were not significant (p > 0.05). The prevalence of anaemia and pyrexia was 27.1% (97/358) and 33.5% (120/358), respectively. Malaria patients co-infected with Enterobius vermicularis or Ascaris lumbricoides had an increased risk of anaemia (OR = 13.712, p = 0.002 and OR = 16.969, p = 0.014, respectively) and pyrexia (OR = 18.07, p = 0.001 and OR = 22.560, p = 0.007, respectively) compared with their counterparts. Significantly increased levels of IL-1β were observed in anaemic (148.884 ± 36.073 pg/mL, t = 7.411, p < 0.001) and pyretic (127.737 ± 50.322 pg/mL, t = 5.028, p < 0.001) patients compared with non-anaemic (64.335 ± 38.995 pg/mL) and apyretic (58.479 ± 36.194 pg/mL) patients. Malaria patients co-infected with each species of intestinal helminths recorded higher IL-1β levels (IL-1β > 121.68 ± 58.86 pg/mL), and the overall mean (139.63 ± 38.33 pg/mL) was higher than the levels in patients infected with malaria (121.68 ± 58.86 pg/mL) or helminths (61.78 ± 31.69 pg/mL) alone. Intestinal helminths exacerbated the clinical outcomes of malaria in these patients, and increased levels of IL-1β were observed in co-infected patients with anaemia, pyrexia, and higher parasitaemia.
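The GMPD figures above are geometric means, i.e. the exponential of the mean of the log densities, which is the standard summary for right-skewed parasite counts. A minimal sketch using hypothetical densities (illustrative values, not the study's data):

```python
from math import exp, log

def geometric_mean(densities):
    """Geometric mean parasite density; all input values must be > 0."""
    logs = [log(d) for d in densities]
    return exp(sum(logs) / len(logs))

# Hypothetical parasite densities in parasites/uL, chosen for illustration:
gmpd = geometric_mean([400, 800, 1600])
print(round(gmpd))  # prints 800
```

Unlike the arithmetic mean, the geometric mean is not dominated by a few extreme counts, which is why parasitology studies routinely report GMPD rather than a plain average.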


2018, Vol 39 (46), pp. 4101-4108
Author(s): Deborah N Kalkman, Melissa Aquino, Bimmer E Claessen, Usman Baber, Paul Guedeney, et al.
