Evaluation of a mandatory phishing training program for high-risk employees at a US healthcare system

2019 ◽  
Vol 26 (6) ◽  
pp. 547-552 ◽  
Author(s):  
William J Gordon ◽  
Adam Wright ◽  
Robert J Glynn ◽  
Jigar Kadakia ◽  
Christina Mazzone ◽  
...  

Abstract
Objective: The study sought to understand the impact of a phishing training program on phishing click rates for employees at a single, anonymous US healthcare institution.
Materials and Methods: We stratified our population into 2 groups: offenders and nonoffenders. Offenders were defined as those who had clicked on at least 5 simulated phishing emails, and nonoffenders were those who had not. We calculated click rates for offenders and nonoffenders, before and after a mandatory training program for offenders was implemented.
Results: A total of 5416 unique employees received all 20 campaigns during the intervention period; 772 clicked on at least 5 emails and were labeled offenders. Only 975 (17.9%) of our set clicked on 0 phishing emails over the course of the 20 campaigns; 3565 (65.3%) clicked on at least 2 emails. There was a decrease in click rates for each group over the 20 campaigns. The mandatory training program, initiated after campaign 15, did not have a substantial impact on click rates, and the offenders remained more likely to click on a phishing simulation.
Discussion: Phishing is a common threat vector against hospital employees and an important cybersecurity risk to healthcare systems. Our work suggests that, under simulation, employee click rates decrease with repeated simulation, but a mandatory training program targeted at high-risk employees did not meaningfully decrease the click rates of this population.
Conclusions: Employee phishing click rates decrease over time, but a mandatory training program for the highest-risk employees did not decrease click rates when compared with lower-risk employees.
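The offender/nonoffender stratification described above can be sketched in a few lines of Python. The employee records, campaign counts, and the 3-click threshold below are illustrative placeholders, not the study's data or its 5-click cutoff.

```python
# Illustrative sketch (not study data): each employee maps to one boolean per
# simulated phishing campaign (True = clicked). Employees at or above the
# click threshold are "offenders"; the rest are "nonoffenders".

def click_rate(results):
    """Fraction of simulated phishing emails that were clicked."""
    return sum(results) / len(results)

def stratify(employee_results, threshold=5):
    """Split employees into offenders (>= threshold clicks) and nonoffenders."""
    offenders, nonoffenders = {}, {}
    for emp, results in employee_results.items():
        (offenders if sum(results) >= threshold else nonoffenders)[emp] = results
    return offenders, nonoffenders

# Toy data: two employees over four campaigns, with a lowered threshold of 3.
toy = {
    "a": [True, True, True, True],
    "b": [False, True, False, False],
}
off, non = stratify(toy, threshold=3)
```

The same pass over real campaign logs, computed once before and once after the training cutoff, yields the per-group before/after click rates the study compares.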

2021 ◽  
pp. 082585972110374
Author(s):  
Jee Y. You ◽  
Lie D. Ligasaputri ◽  
Adarsh Katamreddy ◽  
Kiran Para ◽  
Elizabeth Kavanagh ◽  
...  

Many patients admitted to intensive care units (ICUs) are at high risk of dying. We hypothesized that focused training sessions for ICU providers by palliative care (PC) certified experts would decrease aggressive medical interventions at the end of life. We designed and implemented a 6-session PC training program in communication skills and goals of care (GOC) meetings for ICU teams, including house staff, critical care fellows, and attendings. We then reviewed charts of ICU patients treated before and after the intervention. Forty-nine of 177 (28%) and 63 of 173 (38%) patients were identified to be at high risk of death in the pre- and postintervention periods, respectively, and were included based on the study criteria. Inpatient mortality (45% vs 33%; P = .24) and need for mechanical ventilation (59% vs 44%, P = .13) were slightly higher in the preintervention population, but the differences were not statistically significant. The proportion of patients in whom the decision not to initiate renal replacement therapy was made because of poor prognosis was significantly higher in the postintervention population (14% vs 67%, P = .05). There was a nonstatistically significant trend toward earlier GOC discussions (median time from ICU admission to GOC 4 vs 3 days) and fewer critical care interventions such as tracheostomies (17% vs 4%, P = .19). Our study demonstrates that directed PC training of ICU teams has the potential to reduce end-of-life critical care interventions in patients with a poor prognosis.


2021 ◽  
Vol 108 (Supplement_7) ◽  
Author(s):  
Pierre Montauban ◽  
Charannya Balakumar ◽  
Jaideep Rait ◽  
Prizzi Zarsadias ◽  
Sara Iqbal ◽  
...  

Abstract
Background: Effective training is vital when facing viral outbreaks such as the SARS Coronavirus 2 (SARS-CoV-2) outbreak of 2019. The objective of this study was to measure the impact of in-situ simulation on the confidence of the surgical teams of two hospitals in assessing and managing acutely unwell surgical patients who are high-risk or confirmed to have COVID-19.
Methods: This was a quasi-experimental study with a pretest-posttest design. The surgical teams at each hospital participated in multi-disciplinary simulation sessions to explore the assessment and management of a patient requiring emergency surgery who is high risk for COVID-19. The participants were surveyed before and after receiving simulation training to determine their level of confidence on a Visual Analog Scale (VAS) for the premise stated in each of the nine questions in the survey, which represented multiple aspects of the care of these patients.
Results: 27 participants responded to the pre-simulation survey and 24 to the post-simulation survey. The levels of confidence (VAS scores) were statistically significantly higher for all nine questions after the simulation. Specific themes were identified for further training and changes in policy.
Conclusion: In-situ simulation is an effective training method. Its versatility allows it to be set up quickly as rapid-response training in the face of an imminent threat. In this study, it improved the preparedness of two surgical teams for the challenges of the COVID-19 pandemic.


2018 ◽  
Vol 10 (4) ◽  
pp. 140
Author(s):  
Mohammad Ahmad Bairat ◽  
Akef Abdullah Al-Khateeb

The study aimed to build a training program for the families of students with learning disabilities in order to activate familial participation, reduce aspects of learning disability, and develop the academic achievement of these students. The study sample comprised 46 families and 46 male and female students from these families. To achieve the objectives of the study, the researchers prepared a checklist to assess familial participation, applied to the families before and after the training period; they used the scale of Sartawi (1995), applied before and after the training period, to reveal the learning disabilities of the children, as well as the scale of academic achievement (educational packages, 2010); moreover, they built the suggested program to activate familial participation. The study concluded that there were statistically significant differences between the pre-measurement and post-measurement, in favor of the post-measurement, in the students' performance on the learning disability aspects. It also showed statistically significant differences between the pre-measurement and post-measurement, in favor of the post-measurement, in the students' performance on the academic achievement scale (educational packages, 2010). Furthermore, there were statistically significant positive correlations between familial participation and the learning disability aspects, and between familial participation and academic achievement.


2019 ◽  
Author(s):  
Pierre Delanaye ◽  
François Krzesinski ◽  
Bernard E Dubois ◽  
Alexandre Delcour ◽  
Sébastien Robinet ◽  
...  

Abstract
Background: Sudden death is frequent in haemodialysis (HD) patients. Both hyperkalaemia and the change in plasma potassium (K) concentrations induced by HD could explain this. The impact of increasing dialysate K by 1 mEq/L on plasma K concentrations and electrocardiogram (ECG) results before and after HD sessions was studied.
Methods: Patients with pre-dialysis K >5.5 mEq/L were excluded. ECG and K measurements were obtained before and after the first session of the week for 2 weeks. Then, K in the dialysate was increased (from 1 or 3 to 2 or 4 mEq/L, respectively). Blood and ECG measurements were repeated after 2 weeks of this change.
Results: Twenty-seven prevalent HD patients were included. As expected, a significant decrease in K concentrations was observed after the dialysis session, but this decrease was significantly smaller after the switch to an increased dialysate K. The pre-dialysis K concentrations were not different after the change, but post-dialysis K concentrations were higher after switching (P < 0.0001), with a lower incidence of post-dialysis hypokalaemia. Regarding ECG, before switching, the QT interval (QT) dispersion increased during the session, whereas no difference was observed after switching. One week after switching, post-dialysis QT dispersion [38 (34–42) ms] was lower than post-dialysis QT dispersion 2 weeks and 1 week before switching [42 (38–57) ms, P = 0.0004; and 40 (35–50) ms, P = 0.0002].
Conclusions: A simple increase of 1 mEq/L of K in the dialysate is associated with a lower risk of hypokalaemia and a lower QT dispersion after the dialysis session. Further study is needed to determine whether such a strategy is associated with a lower risk of sudden death.


Blood ◽  
2019 ◽  
Vol 134 (Supplement_1) ◽  
pp. 2588-2588
Author(s):  
Zaid Abdel Rahman ◽  
Michael G. Heckman ◽  
Kevin C. Miller ◽  
Patricia Greipp ◽  
Matthew R Spiegel ◽  
...  

Introduction: Novel high-risk groups have been identified in adult ALL, including secondary ALL (sALL) and Philadelphia-like ALL (Ph-like, based on CRLF2, IgH, ABL2, JAK2 and other tyrosine kinase translocations), and those with minimal residual disease >0.1% (MRD+) after induction therapy. Novel targeted therapies are now routinely incorporated into 1st-line regimens, including tyrosine kinase inhibitors (BCR-ABL1-pos), rituximab (CD20+) and blinatumomab (Blina) for MRD+. The impact of these novel high-risk groups and therapies after alloHCT is unknown; therefore, we evaluated their impact on overall survival (OS), relapse rate (REL), non-relapse mortality (NRM) and acute and chronic GVHD.
Methods: We evaluated pts receiving a 1st alloHCT for ALL at Mayo Clinic (Rochester, Phoenix, and Jacksonville) from 2008-2018 for outcomes of interest, specifically the impact of novel therapies and risk groups. Associations of patient factors with outcomes were examined using univariable (UVA) and multivariable (MVA) Cox proportional hazards regression models, where the cause-specific hazard of the given outcome was modeled to account for the competing risk of death.
Results: We identified 261 consecutive alloHCT recipients during the study period. Median age at transplant was 48 years (18-72) and 147 (56.3%) were male. The median comorbidity (HCT-CI) score was 2 (0-8). 213 pts (81.6%) had B-lineage ALL, of which 85 (32.6%) were BCR-ABL-pos, 17 (6.5%) Ph-like (identified by FISH), 16 (6.1%) hypoploidy/near triploidy (Hy/Tri), and 67 (25.7%) pre-B ALL NOS. The remaining 48 (18.4%) had T-ALL. 30 pts (11.5%) had sALL (i.e. prior chemo/radiotherapy for another malignancy). HyperCVAD was the most common 1st-line regimen (68.2%). 243 (93.1%) pts achieved complete remission (CR1) after induction therapy, and 203 (77.8%) were in CR1 at the time of alloHCT.
Blina was administered for MRD+ in 14 pts (5.4%), and for relapsed/refractory ALL (R/R) in 13 (27% of R/R pts), 7 of whom received Blina as initial therapy for R/R. Donors were matched unrelated in 149 (57.1%), matched related in 98 (37.5%), and haploidentical in 14 (5.4%). Peripheral blood (PB) grafts were used in 233 (89.3%). 103 (54.5%) were donor:recipient (D:R) sex-matched, and 86 were D:R mismatched [47 (24.9%) M:F; and 39 (20.6%) F:M]. Myeloablative conditioning was used for the majority (78.5%), mostly with Cy/TBI (60.5%). Standard GVHD prophylaxis regimens were used.
Outcomes: Median follow-up after transplant was 22.4 months (0.5-135), and 51 (19.5%) had REL. The 1-, 2- and 5-year survival rates were 71.9%, 64.9%, and 54.1%, respectively (Figure 1). Acute GVHD developed in 144 (55.2%) and chronic GVHD in 100 (38.3%). Ph-like ALL, Blina for MRD+, Blina for R/R, sALL and CD20-pos had no independent impact on OS. In contrast, age >60, Hy/Tri, and >CR1 at alloHCT were associated with worse OS in UVA; however, in MVA only pre-B ALL NOS was associated with better OS. Female:male D:R status was associated with inferior OS. Blina for R/R disease was associated with an increased risk of REL in UVA [HR 5.26, 95% CI (1.33, 20.00), p=0.017], whereas other novel high-risk groups had no impact on REL. In contrast, T-ALL, Hy/Tri and >CR1 at alloHCT were associated with increased REL in UVA, but only T-ALL and Hy/Tri continued to predict for increased REL in MVA. Secondary ALL was associated with increased NRM in UVA [HR 1.96, 95% CI (1.07, 3.57), p=0.028], whereas other novel high-risk groups had no impact on NRM. In contrast, age >60, >CR1 at alloHCT and D:R sex mismatch were associated with higher NRM in UVA, but only sex mismatch and >CR1 at alloHCT were associated with higher NRM in MVA. TBI use was associated with a higher risk of acute GVHD (p=0.008) and ATG use with a lower risk of chronic GVHD (p<0.001).
Similarly, non-PB grafts were associated with a lower risk of chronic GVHD (p=0.005). Results of the OS, REL, NRM, and acute and chronic GVHD analyses are shown in Table 1.
Conclusion: Novel high-risk groups (CD20+, Ph-like and sALL) do not appear to adversely impact OS after alloHCT, although sALL was associated with an increased risk of NRM. Interestingly, pre-B ALL NOS appears to be associated with favorable OS. Novel targeted therapies also do not independently predict outcome, with the exception of Blina for R/R ALL, which may be associated with REL after subsequent alloHCT (a subgroup for whom novel maintenance strategies should be explored). Our analysis highlights the importance of alloHCT for novel high-risk ALL subgroups.
Disclosures: Patnaik: Stem Line Pharmaceuticals: Membership on an entity's Board of Directors or advisory committees. Kharfan-Dabaja: Daiichi Sankyo: Consultancy; Pharmacyclics: Consultancy. Foran: Agios: Honoraria, Research Funding.


Blood ◽  
2009 ◽  
Vol 114 (22) ◽  
pp. 3313-3313
Author(s):  
Frederic Baron ◽  
Myriam Labopin ◽  
Mohamad Mohty ◽  
Nadezda Basara ◽  
Dietger Niederwieser ◽  
...  

Abstract
Abstract 3313, Poster Board III-201
RIC allo-SCT has been increasingly used as treatment for AML patients (pts) ineligible for myeloablative allo-SCT. Previous studies have observed a lower risk of relapse in pts who experienced chronic GVHD after RIC allo-SCT than in those who did not. The objective of the current study was to further investigate the association between chronic GVHD and relapse in a large cohort of pts given RIC allo-SCT as treatment for AML. Data from 1188 AML pts in first or second CR transplanted between 2000 and 2008 following a RIC regimen at EBMT-affiliated centers were analyzed. Patients were given PBSC from HLA-identical siblings (MRD, n=879) or from HLA-matched unrelated donors (MUD, n=309). RIC was defined as Busulfan-based conditioning regimens containing ≤8 mg/kg total dose, or TBI <6 Gy. Median pt age at transplantation was 55 (range, 18-76) yrs in pts given grafts from MRD, versus 57 (range, 19-72) yrs in those given grafts from MUD. 54 pts had good-risk (4.5%), 564 standard-risk (47.5%), and 116 high-risk (9.8%) cytogenetics, while cytogenetics were unknown in 454 pts (38.2%). The impact of chronic GVHD on relapse risk, non-relapse mortality (NRM) and leukemia-free survival (LFS) was assessed by time-dependent multivariate Cox models and in a landmark analysis. Three-yr incidences of relapse, NRM and LFS were 35 ± 2%, 14 ± 2%, and 50 ± 2%, respectively, while the 2-yr incidence of chronic GVHD was 49 ± 2%. In a landmark analysis at 18 months after allo-SCT, 5-year relapse rates were 10 ± 2% versus 19 ± 3% for patients with or without chronic GVHD (P=0.04), respectively. In multivariate Cox models, CR2 versus CR1 (P=.003), pt age > 55 yrs (P=.008), alemtuzumab use in the RIC (P=.048), TBI-based RIC (P=.006), high-risk cytogenetics (P=.001), and absence of chronic GVHD (P=.015) were each associated with a higher risk of relapse.
Factors associated with higher NRM were MUD versus MRD (P=.003), grade II-IV acute GVHD (P<.001), and chronic GVHD (P=.002). Factors associated with lower LFS were CR2 versus CR1 (P=.003), pt age > 55 yrs (P=.007), alemtuzumab use in the RIC (P=.012), and high-risk cytogenetics (P=.003). In conclusion, in this cohort of AML patients transplanted in remission, chronic GVHD was associated with a lower risk of relapse, while profound in-vivo T-cell depletion with alemtuzumab was associated with a higher relapse rate, suggesting that GVL effects play a role in preventing AML relapse in patients given RIC allo-SCT. Therefore, close surveillance of patients in this setting who do not develop chronic GVHD, and strategies such as decreasing immunosuppression, should be further investigated. Disclosures: No relevant conflicts of interest to declare.
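A landmark analysis like the one described above (relapse after 18 months, conditioned on being alive and relapse-free at the landmark) can be sketched as follows. All patient records below are invented for illustration; none are study data, and the field names are hypothetical.

```python
# Minimal sketch of a landmark analysis: only patients still at risk (alive
# and relapse-free) at the landmark time are kept, then later relapse is
# compared by chronic-GVHD status at the landmark. Records are invented.

LANDMARK_MONTHS = 18

def landmark_relapse_rate(patients, cgvhd_at_landmark):
    """Relapse fraction among patients at risk at the landmark, by cGVHD status."""
    at_risk = [
        p for p in patients
        if p["followup_months"] >= LANDMARK_MONTHS                        # observed past landmark
        and not (p["relapsed"] and p["relapse_month"] < LANDMARK_MONTHS)  # no earlier relapse
        and p["cgvhd_by_landmark"] == cgvhd_at_landmark
    ]
    if not at_risk:
        return None
    return sum(p["relapsed"] for p in at_risk) / len(at_risk)

patients = [
    {"followup_months": 60, "relapsed": False, "relapse_month": None, "cgvhd_by_landmark": True},
    {"followup_months": 40, "relapsed": True,  "relapse_month": 30,   "cgvhd_by_landmark": False},
    {"followup_months": 12, "relapsed": True,  "relapse_month": 10,   "cgvhd_by_landmark": False},
    {"followup_months": 60, "relapsed": False, "relapse_month": None, "cgvhd_by_landmark": False},
]
```

The design choice behind the landmark cut is to avoid immortal-time bias: chronic GVHD can only be observed in patients who survive long enough to develop it, so comparisons must start from a common time origin.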


Blood ◽  
2013 ◽  
Vol 122 (21) ◽  
pp. 518-518 ◽  
Author(s):  
Hideki Makishima ◽  
Thomas LaFramboise ◽  
Bartlomiej P Przychodzen ◽  
Kenichi Yoshida ◽  
Matthew Ruffalo ◽  
...  

Abstract
Chromosomal aberrations and somatic mutations constitute key elements of the pathogenesis of myelodysplastic syndromes (MDS), a clonal hematologic malignancy characterized by cytopenias, a dysplastic bone marrow and a propensity to clonal evolution. Next-generation sequencing (NGS) enables definition of somatic mutational patterns and clonal architecture as a discovery platform, and for clinical applications. We systematically applied NGS to 707 cases of MDS and MDS-related disorders. 205 cases (low-risk MDS: N=78, high-risk MDS: N=42, MDS/MPN: N=48 and sAML: N=37) were tested by whole exome sequencing (WES). For validation in an additional 502 patients (low-risk MDS: N=192, high-risk MDS: N=104, MDS/MPN: N=111 and sAML: N=95), targeted deep NGS was applied for 60 index genes which were most commonly affected in the cohort analyzed by WES. For NGS data analysis, a statistical pipeline was developed to focus on: i) identification of the most relevant somatic mutations, and ii) minimization of false positive results. We studied serial samples from 21 exemplary informative patients. We also compared somatic mutational patterns to those seen in the primary AML TCGA cohort (N=201). Given the size of the cohort, there was, for example, an 87% chance of seeing mutations at a frequency of 1% and a 98% chance of seeing those with a frequency of 2%. While focusing on the most common events, we observed 1117 somatic mutations in 199 genes. The 88 genes mutated in >1% of cases with MDS carried 388 mutations in MDS+sAML (2.5/case), 128 in MDS/MPN (2.7/case) and 398 in pAML (2.0/case). The average number of mutations per case increased during progression (2.2 in lower-risk, 2.8 in higher-risk MDS, 3.4 in sAML). In MDS, at least one of the 30 most frequently affected genes was mutated in 70% of patients. The 30 most frequently mutated genes in MDS/MPN were mutated in 82% of patients. Individual mutations were also sub-grouped according to their function.
When we compared the three MDS subcategories (lower-risk MDS, higher-risk MDS and sAML) in a cross-sectional view, RTK family, RAS family, IDH family and cohesin family mutations were more frequently detected in the sAML group than in the MDS group. In contrast, the frequency of DNMT family, TET2 and ASXL family gene mutations did not increase in the sAML cohort. In addition to better definition of the mutational patterns of known genes, we have also defined new mutations, including in the RNA helicase family and the BRCC3 pathway. Clonal architecture analysis indicates that mutations of TET2, DNMT3A, ASXL1, and U2AF1 most likely represent ancestral/originator events, while those of the IDH family, RTK family and cohesin family are typical secondary events. Establishment of mutational patterns may improve the precision of morphologically based diagnosis. The comparison between MDS-related diseases (MDS+sAML) and pAML revealed a notably different mutational pattern suggestive of a distinct molecular derivation of these two disease groups. While RTK, IDH family and NPM1 mutations were more frequently observed in the pAML cohort, mutations of SF3B1 and SRSF2 were more common in MDS+sAML. With regard to the connections between individual mutation combinations, RTK mutations were strongly associated with DNMT, but not with RAS family, mutations in the pAML cohort, while mutual associations between TET2 and the PRC2 family, the cohesin family and RUNX1 were encountered in the MDS+sAML cohort. Individual mutations may have prognostic significance, including an impact on survival, either within the entire cohort or within specific subgroups. In the combined MDS cohort, TP53 family mutations were associated with a poor prognosis (HR 3.65, 95% CI 1.90-7.01, P<.0001) by univariate analysis. Similar results were found for mutations in TCF4 (HR 7.98, 95% CI 1.58-10.1, P<.0007).
Such an individual approach does not allow for assessment of the impact of less common mutational events. In conclusion, our study continues to indicate the power of NGS in the molecular analysis of MDS. MDS and related disorders show a great deal of pathogenetic molecular overlap, consistent with their morphologic and clinical pictures, but also distinct molecular differences in mutational patterns. Some of the specific mutations are pathognomonic for specific subtypes while some may convey a prognostic rather than discriminatory value. Disclosures: Makishima: Scott Hamilton CARES grant: Research Funding; AA & MDS international foundation: Research Funding. Polprasert:MDS foundation: Research Funding.
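The detection probabilities quoted in the abstract above (87% at a mutation frequency of 1%, 98% at 2%) are consistent with a simple binomial model applied to the 205-patient WES discovery cohort; the sketch below reproduces them under that assumption.

```python
# Binomial detection model: the chance of observing at least one mutated case
# among n patients, when the true population mutation frequency is f, is
# 1 - (1 - f)**n. With n = 205 (the WES cohort above), this reproduces the
# quoted 87% and 98% figures.

def detection_probability(freq, n):
    """P(at least one carrier observed in a cohort of size n)."""
    return 1 - (1 - freq) ** n

n_wes = 205
p1 = detection_probability(0.01, n_wes)  # ~0.87 for a 1% mutation frequency
p2 = detection_probability(0.02, n_wes)  # ~0.98 for a 2% mutation frequency
```

This is why, as the text notes, the approach cannot assess rarer mutational events: at a 0.1% frequency the same formula gives only about a 19% chance of seeing the mutation even once in the cohort.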


Author(s):  
Asmaa M. Elbandrawy ◽  
Sara G. Mahmoud ◽  
Mohamed F. AboElinin ◽  
Amel M. Yousef

The purpose of this study was to explore the impact of aerobic walking exercise on stress urinary incontinence (SUI) among postmenopausal women. Thirty females diagnosed with SUI participated in the research. Participants were randomly assigned to two groups: the usual care group (UC) and the UC plus aerobic walking exercise (TMT) group. The UC group performed pelvic floor muscle (PFM) training only, while the TMT group performed PFM training in addition to aerobic exercise. Myomed biofeedback was used to assess PFM strength both before and after a 12-week period. The Revised Urinary Incontinence Scale was used to assess changes in incontinence symptom severity after the intervention. Findings revealed a significant increase in PFM strength in both the UC and TMT groups (p = .011 and p = .010, respectively) and a significant reduction in their Revised Urinary Incontinence Scale scores (p = .011 and p = .001, respectively) at the end of the 12-week training program. In addition, there was a significantly greater increase in PFM strength in the TMT group than in the UC group (p = .010) and a significantly greater decrease in Revised Urinary Incontinence Scale score (p = .011) after 12 weeks of the training program. This study concluded that aerobic walking exercise with PFM training is more effective than PFM training alone in increasing PFM strength and improving symptoms of SUI in postmenopausal women with SUI.


2014 ◽  
Vol 9 (2) ◽  
pp. 265-272 ◽  
Author(s):  
Iker Muñoz ◽  
Stephen Seiler ◽  
Javier Bautista ◽  
Javier España ◽  
Eneko Larumbe ◽  
...  

Purpose: To quantify the impact of training-intensity distribution on 10K performance in recreational athletes.
Methods: 30 endurance runners were randomly assigned to a training program emphasizing a low-intensity, sub-ventilatory-threshold (VT), polarized endurance-training distribution (PET) or a moderately high-intensity (between-thresholds) endurance-training program (BThET). Before the study, the subjects performed a maximal exercise test to determine VT and respiratory-compensation threshold (RCT), which allowed training to be controlled based on heart rate during each training session over the 10-wk intervention period. Subjects performed a 10-km race on the same course before and after the intervention period. Training was quantified based on the cumulative time spent in 3 intensity zones: zone 1 (low intensity, <VT), zone 2 (moderate intensity, between VT and RCT), and zone 3 (high intensity, >RCT). The contribution of total training time in each zone was controlled to have more low-intensity training in PET (±77/3/20), whereas for BThET the distribution was higher in zone 2 and lower in zone 1 (±46/35/19).
Results: Both groups significantly improved their 10K time (39min18s ± 4min54s vs 37min19s ± 4min42s, P < .0001 for PET; 39min24s ± 3min54s vs 38min0s ± 4min24s, P < .001 for BThET). Improvements were 5.0% vs 3.6%, a ~41-s difference at the post-intervention test. This difference was not significant. However, a subset analysis comparing the runners who best adhered to the PET (n = 6) and BThET (n = 16) distributions showed greater improvement in PET by 1.29 standardized Cohen effect-size units (90% CI 0.31–2.27, P = .038).
Conclusions: Polarized training can stimulate greater training effects than between-thresholds training in recreational runners.
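The percentage improvements reported above (5.0% for PET, 3.6% for BThET, and the ~41-s gap between the post-test means) can be reproduced directly from the reported mean 10K times:

```python
# Reproduces the improvement figures from the reported group-mean 10K times.

def to_seconds(minutes, seconds):
    return minutes * 60 + seconds

def pct_improvement(pre_s, post_s):
    """Percentage reduction in race time from pre- to post-intervention."""
    return 100 * (pre_s - post_s) / pre_s

pet = pct_improvement(to_seconds(39, 18), to_seconds(37, 19))   # ~5.0%
bthet = pct_improvement(to_seconds(39, 24), to_seconds(38, 0))  # ~3.6%
gap_s = to_seconds(38, 0) - to_seconds(37, 19)                  # 41 s between post-test means
```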


2017 ◽  
Vol 29 (3) ◽  
pp. 273-277
Author(s):  
RI Gilson ◽  
DJ Clutterbuck ◽  
ZE Chen

There is a lack of data on the ability and willingness of men who have sex with men (MSM) to self-fund HIV pre-exposure prophylaxis (PrEP). We aimed to explore how many eligible (PROUD study criteria) men may want PrEP and how many lower-risk MSM would be willing and able to self-fund this intervention. A self-completed anonymous questionnaire was distributed to MSM attending services. Of 377 participants, 81.5% were aware of PrEP. Fifty-three (15.5%) were eligible, of whom 43 (81%) were very/extremely likely to want it. Of those ineligible, 229 (80%) were aware of PrEP and 106 (37.3%) were very/extremely likely to want it. Of eligible respondents, 23% would be willing and able to pay at least £50 a month for PrEP; of ineligible respondents, this proportion was 21%. Our survey revealed high levels of awareness, understanding and willingness to take PrEP among MSM at high and lower risk of HIV acquisition. It indicated that over 70% of high-risk men would be unwilling or unable to self-fund PrEP, should it not be available on the NHS. For lower-risk MSM, we estimated that capacity requirements for monitoring self-funded PrEP would be 50% higher than the numbers eligible for PrEP. These factors will need to be taken into account when planning services.

