Incidence of Unintentional Flow of Contrast into the Facet Joints During Fluoroscopy-Guided Cervical Interlaminar Epidural Injections: A Retrospective Cohort Study

Pain Medicine, 2020, Vol 21(7), pp. 1362-1368
Author(s):  
Yoo Jung Park ◽  
Joon-Yong Jung ◽  
Gyuho Choe ◽  
Yu Jung Lee ◽  
Jiyoung Lee ◽  
...  

Abstract Objective We sometimes encounter unintentional flow of contrast into the facet joints during cervical interlaminar epidural injection, which can produce a false-positive epidurogram. The purposes of this study were to evaluate the rate of facet flow of contrast and to investigate factors associated with injection into the space of Okada during fluoroscopy-guided cervical interlaminar epidural injection. Setting and Subjects Images from consecutive cases of fluoroscopy-guided cervical interlaminar epidural injection performed at a single institution between July 2015 and July 2018 were obtained and reviewed. Methods Cases of epidural injection were classified as either facet flow or no facet flow. Multivariate logistic regression was used to identify predictive factors of unintended injection into the Okada space. Results A total of 2,006 cases were included. Intra-articular flow was identified in 6.0% of cases (121/2,006). All cases of flow of contrast into the facet joints were recognized, and appropriate epidurograms were obtained during the procedures. The highest rate of unintended facet flow of contrast (10.1%, 44/436) occurred at C5–6. Cervical interlaminar epidural injection at C5–6 and above (adjusted odds ratio [aOR] = 1.929, P = 0.001) and the paramidline approach for epidural injection (aOR = 2.427, P < 0.001) were associated with injection into the space of Okada. Conclusions We detected injection into the space of Okada during fluoroscopy-guided cervical interlaminar epidural injection in 6.0% of procedures. Cervical interlaminar epidural injection at C5–6 and above and the paramidline approach were positive predictors of unintentional facet flow of contrast.
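The crude incidence and the level-specific rate reported above follow directly from the stated counts. As an illustration only (not the study's adjusted model, which the abstract does not fully specify), the sketch below computes the 6.0% incidence and a crude odds ratio with a Wald 95% CI for the C5–6 level, using a 2×2 table reconstructed from the reported counts (44/436 at C5–6 vs. 77/1,570 at other levels); note this crude OR is not the adjusted aOR of 1.929, which applies to "C5–6 and above."

```python
import math

def incidence_pct(events, total):
    """Crude incidence as a percentage."""
    return 100.0 * events / total

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a Wald 95% CI from a 2x2 table:
    a = exposed with event, b = exposed without event,
    c = unexposed with event, d = unexposed without event."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Incidence of facet flow reported in the abstract: 121 of 2,006 cases
print(round(incidence_pct(121, 2006), 1))  # 6.0

# Reconstructed 2x2 table: facet flow at C5-6 (44 of 436)
# vs. all other levels (77 of 1,570)
or_, lo, hi = odds_ratio_ci(44, 392, 77, 1493)
```

The same exponentiate-the-log-estimate pattern underlies the Wald intervals quoted throughout these abstracts.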

Author(s):  
Daiki Sakai ◽  
Wataru Matsumiya ◽  
Sentaro Kusuhara ◽  
Makoto Nakamura

Abstract Purpose To evaluate the factors associated with the development of ocular candidiasis (OC) and the ocular prognosis with echinocandin therapy for candidemia. Methods The medical records of 56 consecutive patients with a positive blood culture for Candida species between November 2016 and October 2019 were retrospectively reviewed. Information on patient characteristics, isolated Candida species, treatment details for candidemia, and ocular findings was extracted to identify factors associated with OC development. Results The leading pathogen of candidemia was Candida albicans (C. albicans) (41.1%). Of 56 patients, 18 (32.1%) were diagnosed with chorioretinitis, categorized as either probable (8 patients) or possible OC (10 patients). There was no case of endophthalmitis with vitritis. The incidence of probable OC was not significantly different between the groups treated with echinocandins and other antifungal drugs (15.2% vs. 11.1%, p = 1.00). In all probable OC cases, systemic antifungal therapy was switched from echinocandins to azoles, and no case progressed to endophthalmitis. A multivariate logistic analysis revealed that female sex (adjusted odds ratio [aOR], 8.93; 95% confidence interval [CI], 1.09–72.9) and C. albicans (aOR, 23.6; 95% CI, 1.8–281) were independent factors associated with the development of probable OC. Conclusion One in seven patients with candidemia developed probable OC. Given that female sex and C. albicans were associated with OC development, careful ophthalmologic management is warranted for patients with candidemia who have these factors. Although echinocandins showed no correlation with OC development and did not worsen the ocular prognosis, further investigation is required.


2020, Vol 8, pp. 205031212097800
Author(s):  
Damtew Asrat ◽  
Atsede Alle ◽  
Bekalu Kebede ◽  
Bekalu Dessie

Background: Over the last 100 years, the development and mass production of chemically synthesized drugs have revolutionized health care in most parts of the world. However, large sections of the population in developing countries still depend on traditional medicines for their primary health care needs. More than 88% of Ethiopian parents use different forms of traditional medicine for their children. Therefore, this study aimed to determine factors associated with parental traditional medicine use for children in Fagita Lekoma Woreda. Methods: A community-based cross-sectional study was conducted from 1 to 30 March 2019 in Fagita Lekoma Woreda. The data collection tool was a structured interviewer-administered questionnaire. Both descriptive and inferential statistics were used to present the data. Odds ratios and binary and multiple logistic regression analyses were used to measure the relationships between dependent and independent variables. Results: Among 858 participants, 71% of parents had used traditional medicine for their children within the last 12 months. Parents who could not read and write (adjusted odds ratio = 6.42, 95% confidence interval = 2.1–19.7), parents with low monthly income (adjusted odds ratio = 4.38, 95% confidence interval = 1.58–12.1), and those who had access to traditional medicine (adjusted odds ratio = 2.21, 95% confidence interval = 1.23–3.98) were more likely to use traditional medicine for their children. Urban residents (adjusted odds ratio = 0.20, 95% confidence interval = 0.11–0.38) and members of community-based health insurance (adjusted odds ratio = 0.421, 95% confidence interval = 0.211–0.84) were less likely to use traditional medicine for their children. Conclusions: Our study revealed that the prevalence of traditional medicine use remains high.
Educational status, monthly income, residence, access to traditional medicine, and membership in community-based health insurance were predictors of traditional medicine use. Therefore, the integration of traditional medicine with modern medicine should be strengthened. Community education and further study of the efficacy and safety of traditional medicines should also be given great attention.


2021
Author(s):  
Chang-Soon Lee ◽  
Young Jae Park ◽  
Jee Youn Moon ◽  
Yong-Chul Kim

Background Deep spinal infection is a devastating complication after epidural injection. This study aimed primarily to investigate the incidence of deep spinal infection after outpatient single-shot epidural injection for pain. Secondarily, it assessed national trends in the procedure and risk factors for this infection. Methods Using South Korea's National Health Insurance Service sample cohort database, the 10-yr national trend of single-shot epidural injections for pain and the incidence rate of deep spinal infection after the procedure, with its risk factors, were determined. New-onset deep spinal infections were defined as those occurring within 90 days of the most recent outpatient single-shot epidural injection for pain, requiring hospitalization for at least 1 night, and treated with at least a 4-week course of antibiotics. Results The number of outpatient single-shot epidural injections per 1,000 persons in pain practice doubled from 40.8 in 2006 to 84.4 in 2015 in South Korea. Among the 501,509 injections performed between 2007 and 2015, 52 cases of deep spinal infection were detected within 90 days postprocedurally (0.01% per injection). In multivariable analysis, age of 65 yr or more (odds ratio, 2.91; 95% CI, 1.62 to 5.5; P = 0.001), living in a rural area (odds ratio, 2.85; 95% CI, 1.57 to 5.0; P < 0.001), complicated diabetes (odds ratio, 3.18; 95% CI, 1.30 to 6.7; P = 0.005), multiple epidural injections (three times or more) within the previous 90 days (odds ratio, 2.34; 95% CI, 1.22 to 4.2; P = 0.007), and recent use of immunosuppressants (odds ratio, 2.90; 95% CI, 1.00 to 6.7; P = 0.025) were significant risk factors for postprocedural infection. Conclusions Deep spinal infection within 90 days of an outpatient single-shot epidural injection for pain is very rare (0.01%). The data identify high-risk patients and procedure characteristics that may inform healthcare provider decision-making.


2013, Vol 31(3), pp. 306-314
Author(s):  
Edson Theodoro dos S. Neto ◽  
Eliana Zandonade ◽  
Adauto Oliveira Emmerich

OBJECTIVE To analyze the factors associated with breastfeeding duration using two statistical models. METHODS A population-based cohort study was conducted with 86 mothers and newborns from two areas primarily covered by the National Health System, with high rates of infant mortality, in Vitória, Espírito Santo, Brazil. Over 30 months, 67 (78%) children and mothers were visited seven times at home by trained interviewers, who filled out survey forms. Data on feeding and sucking habits and on socioeconomic and maternal characteristics were collected. Variables were analyzed with Cox regression models, considering duration of breastfeeding as the dependent variable, and with logistic regression (the dependent variable was the presence of a breastfeeding child at different postnatal ages). RESULTS In the logistic regression model, pacifier sucking (adjusted odds ratio: 3.4; 95% CI 1.2-9.55) and bottle feeding (adjusted odds ratio: 4.4; 95% CI 1.6-12.1) increased the chance of weaning before one year of age. Variables associated with breastfeeding duration in the Cox regression model were pacifier sucking (adjusted hazard ratio 2.0; 95% CI 1.2-3.3) and bottle feeding (adjusted hazard ratio 2.0; 95% CI 1.2-3.5). However, protective factors (maternal age and family income) differed between the two models. CONCLUSIONS Risk and protective factors associated with cessation of breastfeeding may be analyzed with different statistical regression models. Cox regression models are adequate to analyze such factors in longitudinal studies.
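Both models in this abstract report exponentiated regression coefficients: the logistic model yields odds ratios and the Cox model yields hazard ratios, and in each case the point estimate and Wald CI come from exp(beta ± 1.96·SE). A minimal sketch of that shared mechanics, with coefficients back-calculated from the reported pacifier-sucking estimates purely for illustration (the abstract does not give the raw coefficients):

```python
import math

def ratio_with_ci(beta, se, z=1.96):
    """Exponentiate a regression coefficient into a ratio estimate
    (odds ratio for logistic regression, hazard ratio for Cox
    regression) with a Wald 95% confidence interval."""
    est = math.exp(beta)
    return est, math.exp(beta - z * se), math.exp(beta + z * se)

# Coefficients chosen to roughly reproduce the reported
# pacifier-sucking estimates (hypothetical, not from the study):
# logistic model for weaning before 12 months -> OR ~ 3.4 (1.2-9.55)
or_, or_lo, or_hi = ratio_with_ci(beta=1.22, se=0.53)
# Cox model for time to weaning -> HR ~ 2.0 (1.2-3.3)
hr_, hr_lo, hr_hi = ratio_with_ci(beta=0.69, se=0.26)
```

The contrast the authors draw is about interpretation, not arithmetic: the OR summarizes odds at one fixed postnatal age, while the HR summarizes the instantaneous weaning rate over the whole follow-up, which is why the Cox model suits longitudinal data.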


Author(s):  
Rekib Sacaklidir ◽  
Ekim Can Ozturk ◽  
Savas Sencan ◽  
Osman Hakan Gunduz

Background: As fluoroscopy-guided interventional therapies have grown significantly in recent years, exposure to ionizing radiation (IR) for both patients and medical staff has become a critical issue. IR exposure varies according to the physician's experience, the patient's body mass index (BMI), the imaging technique, and the type of procedure performed. The purpose of this study was to calculate reference IR doses for fluoroscopy-guided epidural injections per procedure and per BMI, to provide reference doses for potential use in future dose-reduction strategies. Methods: A retrospective evaluation of patients who received epidural steroid injections between January 2015 and December 2020 in a university hospital interventional pain management center was performed. This observational study included patients aged 18 or older who underwent 3,711 epidural injections via cervical interlaminar, lumbar interlaminar, lumbar transforaminal, and caudal approaches. The recorded IR dose for each patient was also divided by the patient's BMI to obtain the dose per BMI. Results: The highest IR dose per procedure was found in caudal epidural injection (0.218 mGy m2) and the lowest in cervical interlaminar epidural injection (0.057 mGy m2). The IR dose per procedure was 0.123 mGy m2 for lumbar transforaminal and 0.191 mGy m2 for lumbar interlaminar epidural injection. Caudal epidural injection also had the highest IR dose per BMI (0.00749) and cervical interlaminar epidural injection the lowest (0.00214). Conclusions: We propose reference IR dose levels for four epidural injection approaches, obtained from 3,711 injections performed in a university hospital pain medicine clinic. Patients' BMI was taken into account by also reporting the dose per BMI for each approach.
Multicenter research with standardized techniques would provide more reliable reference levels to guide pain physicians in self-assessing their own radiation exposure.
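The per-BMI figures above are simply the per-procedure dose divided by BMI. A small sketch of that normalization; note that the implied mean BMI values are inferences from the two reported quantities, not figures the study itself reports:

```python
def dose_per_bmi(dose_mgy_m2, bmi):
    """Normalize a fluoroscopy dose-area product (mGy m2) by BMI."""
    return dose_mgy_m2 / bmi

# Dividing the reported per-procedure dose by the reported per-BMI
# dose recovers the implied average BMI of each group (an inference
# for illustration, not a value given in the abstract):
mean_bmi_caudal = 0.218 / 0.00749    # caudal group, ~29.1
mean_bmi_cervical = 0.057 / 0.00214  # cervical group, ~26.6
```

Reporting dose per BMI this way lets centers with heavier or lighter patient populations compare their fluoroscopy exposure against the proposed reference levels on a more even footing.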


PLoS ONE, 2021, Vol 16(1), pp. e0245528
Author(s):  
Almaz Tefera Gonete ◽  
Bogale Kassahun ◽  
Eskedar Getie Mekonnen ◽  
Wubet Worku Takele

Background Stunting at birth is a chronic form of undernutrition largely attributable to poor prenatal nutrition, which can persist into a child's later life and impair physical and cognitive health. Although multiple studies have been conducted in Ethiopia on the magnitude of stunting and its associated factors, all have concentrated on children aged 6 to 59 months. Therefore, this study was done to determine the prevalence and associated factors of stunting at birth among newborns delivered at the University of Gondar Comprehensive Specialized Referral Hospital, Northwest Ethiopia. Methods An institution-based cross-sectional study was conducted from February 26th to April 25th, 2020. A systematic random sampling technique was used to select a total of 422 newborn-mother pairs. Binary logistic regression was employed to identify factors associated with stunting, and all independent variables were entered into the multivariable logistic regression model to adjust for confounders. Variables with a significant association were identified based on a p-value < 0.05, and the adjusted odds ratio with its respective 95% confidence interval was used to determine the strength and direction of the association. Results About 30.5% (95% CI: 26.3%, 35.1%) of newborns were stunted at birth. Male sex [adjusted odds ratio (AOR) = 2.9 (1.62, 5.21)], conception in Kiremt (the rainy season) [AOR = 2.7 (1.49, 4.97)], and low birth weight [AOR = 3.1 (1.64, 6.06)] were factors associated with stunting at birth. Likewise, newborns born to short-stature mothers [AOR = 2.8 (1.21, 6.62)] and chronically malnourished mothers [AOR = 15.3 (8.12, 29.1)] were at greater risk of being stunted. Conclusion Just under a third of newborns are stunted at birth, implying a pressing public health problem. Newborns born to chronically malnourished and short-stature mothers were more likely to be stunted.
Stunting was also prevalent among male neonates, newborns conceived in Kiremt, and those with low birth weight. Thus, policymakers and nutrition programmers should work on preventing maternal undernutrition through nutrition education to reduce the burden of low birth weight and stunting. Further, due attention should be paid to improving the nutritional status of newborns conceived in the Kiremt season.


2016, Vol 4(1)
Author(s):  
Ikwo K. Oboho ◽  
Anna Bramley ◽  
Lyn Finelli ◽  
Alicia Fry ◽  
Krow Ampofo ◽  
...  

Abstract Background Data on oseltamivir treatment among hospitalized community-acquired pneumonia (CAP) patients are limited. Methods Patients hospitalized with CAP at 6 hospitals during the 2010−2012 influenza seasons were included. We assessed factors associated with oseltamivir treatment using logistic regression. Results Oseltamivir treatment was provided to 89 of 1627 (5%) children (<18 years) and 143 of 1051 (14%) adults. Among those with positive clinician-ordered influenza tests, 39 of 61 (64%) children and 37 of 48 (77%) adults received oseltamivir. Among children, oseltamivir treatment was associated with hospital A (adjusted odds ratio [aOR], 2.76; 95% confidence interval [CI], 1.36−4.88), clinician-ordered testing performed (aOR, 2.44; 95% CI, 1.47−5.19), intensive care unit (ICU) admission (aOR, 2.09; 95% CI, 1.27−3.45), and age ≥2 years (aOR, 1.43; 95% CI, 1.16−1.76). Among adults, oseltamivir treatment was associated with clinician-ordered testing performed (aOR, 8.38; 95% CI, 4.64−15.12), hospitals D and E (aOR, 3.46−5.11; 95% CI, 1.75−11.01), Hispanic ethnicity (aOR, 2.06; 95% CI, 1.18−3.59), and ICU admission (aOR, 2.05; 95% CI, 1.34−3.13). Conclusions Among patients hospitalized with CAP during influenza season, oseltamivir treatment was moderate overall and associated with clinician-ordered testing, severe illness, and specific hospitals. Increased clinician education is needed to include influenza in the differential diagnosis for hospitalized CAP patients and to test and treat patients empirically if influenza is suspected.


2019, Vol 130(6), pp. 912-922
Author(s):  
Jean Guglielminotti ◽  
Ruth Landau ◽  
Guohua Li

Abstract Background Compared with neuraxial anesthesia, general anesthesia for cesarean delivery is associated with increased risk of maternal adverse events. Reducing avoidable general anesthetics for cesarean delivery may improve the safety of obstetric anesthesia care. This study examined adverse events, trends, and factors associated with potentially avoidable general anesthetics for cesarean delivery. Methods This retrospective study analyzed cesarean delivery cases without a recorded indication for general anesthesia or contraindication to neuraxial anesthesia in New York State hospitals, 2003 to 2014. Adverse events included anesthesia complications (systemic, neuraxial-related, and drug-related), surgical site infection, venous thromboembolism, and the composite of death or cardiac arrest. Anesthesia complications were defined as severe if associated with death, organ failure, or prolonged hospital stay. Results During the study period, 466,014 cesarean deliveries without a recorded indication for general anesthesia or contraindication to neuraxial anesthesia were analyzed; 26,431 (5.7%) were completed with general anesthesia. The proportion of avoidable general anesthetics decreased from 5.6% in 2003 to 2004 to 4.8% in 2013 to 2014 (a 14% reduction; P < 0.001). Avoidable general anesthetics were associated with significantly increased risk of anesthesia complications (adjusted odds ratio, 1.6; 95% CI, 1.4 to 1.9), severe complications (adjusted odds ratio, 2.9; 95% CI, 1.6 to 5.2), surgical site infection (adjusted odds ratio, 1.7; 95% CI, 1.5 to 2.1), and venous thromboembolism (adjusted odds ratio, 1.9; 95% CI, 1.3 to 3.0), but not of death or cardiac arrest. The labor neuraxial analgesia rate was one of the most actionable hospital-level factors associated with avoidable general anesthetics.
Relative to hospitals with a rate greater than or equal to 75%, the adjusted odds ratio of avoidable general anesthetics increased to 1.3 (95% CI, 1.2 to 1.4), 1.6 (95% CI, 1.5 to 1.7), and 3.2 (95% CI, 3.0 to 3.5) as the rate decreased to 50 to 74.9%, 25 to 49.9%, and less than 25%, respectively. Conclusions Compared with neuraxial anesthesia, avoidable general anesthetics are associated with increased risk of adverse maternal outcomes.


2015, Vol 36(11), pp. 1298-1304
Author(s):  
Jessica Reno ◽  
Saumil Doshi ◽  
Amy K. Tunali ◽  
Betsy Stein ◽  
Monica M. Farley ◽  
...  

BACKGROUND Patients with candidemia are at risk for other invasive infections, such as methicillin-resistant Staphylococcus aureus (MRSA) bloodstream infection (BSI). OBJECTIVE To identify the risk factors for, and outcomes of, BSI in adults with Candida spp. and MRSA at the same time or nearly the same time. DESIGN Population-based cohort study. SETTING Metropolitan Atlanta, March 1, 2008, through November 30, 2012. PATIENTS All residents with Candida spp. or MRSA isolated from blood. METHODS The Georgia Emerging Infections Program conducts active, population-based surveillance for candidemia and invasive MRSA. Medical records for patients with incident candidemia were reviewed to identify cases of MRSA coinfection, defined as incident MRSA BSI 30 days before or after candidemia. Multivariate logistic regression was performed to identify factors associated with coinfection in patients with candidemia. RESULTS Among 2,070 adult candidemia cases, 110 (5.3%) had coinfection within 30 days. Among these 110 coinfections, MRSA BSI usually preceded candidemia (60.9%; n=67) or occurred on the same day (20.0%; n=22). The incidence of coinfection per 100,000 population decreased from 1.12 to 0.53 between 2009 and 2012, paralleling the decreased incidence of all MRSA BSIs and candidemia. Thirty-day mortality was similarly high between coinfection cases and candidemia alone (45.2% vs 36.0%, P = .10). Only nursing home residence (odds ratio, 1.72 [95% CI, 1.03–2.86]) predicted coinfection. CONCLUSIONS A small but important proportion of patients with candidemia have MRSA coinfection, suggesting that heightened awareness is warranted after 1 major BSI pathogen is identified. Nursing home residents should be targeted in BSI prevention efforts.


2018, Vol 08(02), pp. e89-e94
Author(s):  
Tetsuya Kawakita ◽  
Chun-Chih Huang ◽  
Helain Landy

Objective The aim of the study was to examine the association between cervical exam at the time of artificial rupture of membranes (AROM) and cord prolapse. Study Design We conducted a retrospective cohort study using the data from the Consortium on Safe Labor. We included women with cephalic presentation and singleton pregnancies at ≥ 23 weeks' gestation who underwent AROM during the course of labor. Multivariable logistic regression was used to calculate the adjusted odds ratio (aOR) with 95% confidence interval (95% CI), controlling for prespecified covariates. Results Of 57,204 women who underwent AROM, cord prolapse occurred in 113 (0.2%). Compared with dilation 6 to 10 cm + station ≥ 0 at the time of AROM, <6 cm + any station and 6–10 cm + station ≤ −3 were associated with increased risks of cord prolapse (<6 cm + station ≤ −3 [aOR, 2.29; 95% CI, 1.02–5.40]; <6 cm + station −2.5 to −0.5 [aOR, 2.34; 95% CI, 1.23–4.97]; <6 cm + station ≥ 0 [aOR, 3.31; 95% CI, 1.39–8.09]; and 6–10 cm + station ≤ −3 [aOR, 5.47; 95% CI, 1.35–17.48]). Conclusion Cervical dilation < 6 cm with any station and 6 to 10 cm with station ≤ −3 were associated with a higher risk of cord prolapse.

