Developing a dashboard to meet the needs of residents in a competency-based training program: A design-based research project

Author(s):  
Robert Carey ◽  
Grayson Wilson ◽  
Venkat Bandi ◽  
Debajyoti Mondal ◽  
Lynsey Martin ◽  
...  

Background: Canadian specialty programs are implementing Competence By Design, a competency-based medical education (CBME) program which requires frequent assessments of entrustable professional activities. To be used for learning, the large amount of assessment data needs to be interpreted by residents, but little work has been done to determine how visualizing and interacting with this data can be supported. Within the University of Saskatchewan emergency medicine residency program, we sought to determine how our residents’ CBME assessment data should be presented to support their learning and to develop a dashboard that meets our residents’ needs. Methods: We utilized a design-based research process to identify and address resident needs surrounding the presentation of their assessment data. Data was collected within the emergency medicine residency program at the University of Saskatchewan via four resident focus groups held over 10 months. Focus group discussions were analyzed using a grounded theory approach to identify resident needs. This guided the development of a dashboard which contained elements (data, analytics, and visualizations) that support their interpretation of the data. The identified needs are described using quotes from the focus groups as well as visualizations of the dashboard elements. Results: Resident needs were classified under three themes: (1) Provide guidance through the assessment program, (2) Present workplace-based assessment data, and (3) Present other assessment data. Seventeen dashboard elements were designed to address these needs. Conclusions: Our design-based research process identified resident needs and developed dashboard elements to meet them. This work will inform the creation and evolution of CBME assessment dashboards designed to support resident learning.

Author(s):  
Yusuf Yilmaz ◽  
Robert Carey ◽  
Teresa Chan ◽  
Venkat Bandi ◽  
Shisong Wang ◽  
...  

Background: Canadian specialist residency training programs are implementing a form of competency-based medical education (CBME) that requires frequent assessments of entrustable professional activities (EPAs). Faculty struggle to provide helpful feedback and assign appropriate entrustment scores. CBME faculty development initiatives rarely incorporate teaching metrics. Dashboards could be used to visualize faculty assessment data to support faculty development. Methods: Using a design-based research process, we identified faculty development needs related to CBME assessments and designed a dashboard containing elements (data, analytics, and visualizations) meeting these needs. Data was collected within the emergency medicine residency program at the University of Saskatchewan through interviews with program leaders, faculty development experts, and faculty participating in development sessions. Two investigators thematically analyzed interview transcripts to identify faculty needs that were audited by a third investigator. The needs were described using representative quotes and the dashboard elements designed to address them. Results: Between July 1, 2019 and December 11, 2020 we conducted 15 interviews with nine participants (two program leaders, three faculty development experts, and four faculty members). Three needs emerged as themes from the analysis: analysis of assessments, contextualization of assessments, and accessible reporting. We addressed these needs by designing an accessible dashboard to present contextualized quantitative and narrative assessment data for each faculty member. Conclusions: We identified faculty development needs related to EPA assessments and designed dashboard elements to meet them. The resulting dashboard was used for faculty development sessions. This work will inform the development of CBME assessment dashboards for faculty.


Author(s):  
Brent Thoma ◽  
Venkat Bandi ◽  
Robert Carey ◽  
Debajyoti Mondal ◽  
Robert Woods ◽  
...  

Background: Competency-based programs are being adopted in medical education around the world. Competence Committees must visualize learner assessment data effectively to support their decision-making. Dashboards play an integral role in decision support systems in other fields. Design-based research allows the simultaneous development and study of educational environments. Methods: We utilized a design-based research process within the emergency medicine residency program at the University of Saskatchewan to identify the data, analytics, and visualizations needed by its Competence Committee, and developed a dashboard incorporating these elements. Narrative data were collected from two focus groups, five interviews, and the observation of two Competence Committee meetings. Data were qualitatively analyzed to develop a thematic framework outlining the needs of the Competence Committee and to inform the development of the dashboard. Results: The qualitative analysis identified four Competence Committee needs (Explore Workplace-Based Assessment Data, Explore Other Assessment Data, Understand the Data in Context, and Ensure the Security of the Data). These needs were described with narratives and represented through visualizations of the dashboard elements. Conclusions: This work addresses the practical challenges of supporting data-driven decision making by Competence Committees and will inform the development of dashboards for programs, institutions, and learner management systems.


CJEM ◽  
2019 ◽  
Vol 21 (S1) ◽  
pp. S48
Author(s):  
L. Costello ◽  
N. Argintaru ◽  
A. Wong ◽  
R. Simard ◽  
M. Chacko ◽  
...  

Innovation Concept: Emergency medicine (EM) programs have restructured their training using a Competence by Design model. This model emphasizes entrustable professional activities (EPAs) that residents must fulfill before advancing in their training. The first EPA (EPA 1) for the transition to discipline (TTD) stage involves managing the unstable patient. Data from the University of Toronto (U of T) program suggests residents lack enough exposure to these patient presentations during TTD – creating a disconnect between anticipated clinical exposure and the expectation for residents to achieve competence in EPA 1. Methods: To overcome this gap, U of T EM faculty specifically targeted EPA 1 while designing the TTD curriculum. Kern's six-step approach to curriculum development in medical education was used. This six-step approach involves: problem identification, needs assessment, goals and objectives, education strategies, implementation and evaluation. To maximize feasibility of the new curriculum, existing sessions were mapped against EPAs and required training activities to identify synchrony where possible. Residents were scheduled on EM rotations with weekly academic days that included this novel curriculum. Curriculum, Tool or Material: Didactic lectures, procedural workshops and simulation were closely integrated in TTD to address EPA 1. Lectures introduced approaches to cardinal presentations. An interactive workshop introduced ACLS and PALS algorithms and defibrillator use. Three simulation sessions focused on ACLS, shock, airway, trauma and the altered patient. A final simulation session allowed spaced-repetition and integration of these topics. After the completion of TTD, residents participated in a six-scenario simulation OSCE directly assessing EPA 1. Conclusion: The curriculum was evaluated using a multifaceted approach including surveys, self-assessments, faculty feedback and OSCE performance. 
Overall, the curriculum achieved its goal in addressing EPA 1. It was well-received by faculty and residents. Residents rated the sessions highly, and self-reported improved confidence in assessing unstable patients and adhering to ACLS algorithms. The simulation OSCE demonstrated expected competency by residents in EPA 1. One limitation identified was the lack of a pediatric simulation session which has now been incorporated into the curriculum. Moving forward, this innovative curriculum will undergo continuous cycles of evaluation and improvement with a goal of applying a similar design to other stages of CBD.


2021 ◽  
Vol 8 ◽  
pp. 237428952110417
Author(s):  
Bronwyn H. Bryant

Entrustable professional activities are an intuitive form of workplace-based assessment that can support competency-based medical education. Many entrustable professional activities have been written and published, but few studies describe the feasibility or implementation of entrustable professional activities in graduate medical education. The frozen section entrustable professional activity was introduced into pathology residency training at the University of Vermont for postgraduate year 1 residents at the start of their training in frozen section. The feasibility of the entrustable professional activity was evaluated based on 3 criteria: (a) utilization, (b) support of frozen section training, and (c) generation of data to support entrustment decisions about residents’ readiness to take call. The entrustable professional activity was well utilized and satisfactory to residents, faculty, pathologists’ assistants, and Clinical Competency Committee members. Most members of the Clinical Competency Committee agreed they had sufficient data and noted higher confidence in assessing resident readiness to take call with the addition of the entrustable professional activity to the residents’ assessment portfolio. Residents did not endorse that it helped them prepare for call; however, the interruption to frozen section training due to the COVID-19 pandemic was a significant contributing factor. The frozen section entrustable professional activity is a feasible addition to pathology resident training based on utilization, support of training, and generation of data to support entrustment decisions for graduated responsibilities. The implementation and integration of the entrustable professional activity into pathology training at our institution is described with discussion of adjustments for future use.


2021 ◽  
Author(s):  
Cynthia R Peng ◽  
Kimberly A Schertzer ◽  
Holly A Caretta-Weyer ◽  
Stefanie S Sebok-Syer ◽  
William Lu ◽  
...  

BACKGROUND The 13 Core Entrustable Professional Activities (EPAs) are key competency-based learning outcomes in the transition from undergraduate to graduate medical education. Five of these EPAs (EPA2: prioritizing differential, EPA3: recommending and interpreting tests, EPA4: entering orders and prescriptions, EPA5: documenting clinical encounters, and EPA10: recognizing urgent and emergent conditions) are uniquely suited for online assessment. OBJECTIVE For this pilot study, we created a web-based simulation platform for diagnostic assessment of these EPAs and examined its feasibility and acceptability. METHODS Four simulation cases underwent three rounds of consensus panels and pilot testing. Incoming emergency medicine interns (n=15) completed all cases, and up to 4 “look for” statements, which encompassed specific EPAs, were generated for each participant: 1) harmful or missing actions, 2) a narrow differential or wrong final diagnosis, 3) errors in documentation, and 4) failure to recognize and stabilize urgent diagnoses. Finally, we interviewed a sample of interns (n=5) and residency leadership (n=5) and analyzed the responses using thematic analysis. RESULTS All participants had at least 1 missing critical action, and 40% of participants performed at least one harmful action across all 4 cases. The final diagnosis was not included in the differential diagnosis in more than half of assessments (53%). Other errors included choosing the incorrect documentation (40%) and indiscriminately applying oxygen (60%). Themes from the interviews included: psychological safety of the interface, ability to assess learning, and fidelity of cases. The most valuable feature cited was the ability to place orders in a realistic electronic medical record interface. CONCLUSIONS This study demonstrates the feasibility and acceptability of this platform for diagnostic assessment of specific EPAs. 
This approach rapidly identifies potential areas of concern for incoming interns using an asynchronous format, provides this feedback in a manner appreciated by residency leadership, and informs individualized learning plans.


CJEM ◽  
2019 ◽  
Vol 21 (S1) ◽  
pp. S100
Author(s):  
E. Stoneham ◽  
L. Witt ◽  
Q. Paterson ◽  
L. Martin ◽  
B. Thoma

Innovation Concept: Competence by Design (CBD) was implemented nationally for Emergency Medicine (EM) residents beginning training in 2018. One challenge is the need to introduce residents to Entrustable Professional Activities (EPAs) that are assessed across numerous clinical rotations. The Royal College's resources detail these requirements, but do not map them to specific rotations or present them in a succinct format. This is problematic as trainees are less likely to succeed when expectations are unclear. We identified a need to create practical resources that residents can use at the bedside. Methods: We followed an intervention mapping framework to design two practical, user-friendly, low-cost, aesthetically pleasing resources that could be used by residents and observers at the bedside to facilitate competency-based assessment. Curriculum, Tool or Material: First, we designed a set of rotation- and stage-specific EPA reference cards for the use of residents and observers at the bedside. These cards list EPAs and clinical presentations likely to be encountered during various stages of training and on certain rotations. Second, we developed a curriculum board to organize the EPA reference cards by stage based upon our program's curriculum map. The curriculum board allows residents to view the program's curriculum map and the EPAs associated with each clinical rotation at a glance. It also contains hooks to hang and store extra cards in an organized manner. Conclusion: We believe that these practical and inexpensive tools facilitated our residency program's transition to competency-based EPA assessments. Anecdotally, the residents are using the cards and completing the suggested rotation-specific EPAs. We hope that the reference cards and curriculum board will be successfully incorporated into other residency programs to facilitate the introduction of their EPA-based CBD assessment system.


CJEM ◽  
2019 ◽  
Vol 22 (1) ◽  
pp. 95-102 ◽  
Author(s):  
Jonathan Sherbino ◽  
Glen Bandiera ◽  
Ken Doyle ◽  
Jason R. Frank ◽  
Brian R. Holroyd ◽  
...  

ABSTRACT: Canadian specialist emergency medicine (EM) residency training is undergoing the most significant transformation in its history. This article describes the rationale, process, and redesign of EM competency-based medical education. The rationale for this evolution in residency education includes 1) improved public trust through increased transparency of the quality and rigour of residency education, 2) improved fiscal accountability to government and institutions regarding specialist EM training, 3) improved assessment systems to replace poorly functioning end-of-rotation assessment reports and an overemphasis on high-stakes, end-of-training examinations, and 4) tailored learning for residents to address individualized needs. A working group with geographic and stakeholder representation convened over a 2-year period. A consensus process for decision-making was used. Four key design features of the new residency education design include 1) specialty EM-specific outcomes to be achieved in residency; 2) designation of four progressive stages of training, linked to required learning experiences and entrustable professional activities to be achieved at each stage; 3) tailored learning that provides residency programs and learners flexibility to adapt to local resources and learner needs; and 4) programmatic assessment that emphasizes systematic, longitudinal assessments from multiple sources and sampling of sentinel abilities. Required future study includes a program evaluation of this complex education intervention to ensure that intended outcomes are achieved and unintended outcomes are identified.


CJEM ◽  
2020 ◽  
Vol 22 (S1) ◽  
pp. S33-S34
Author(s):  
H. Alrimawi ◽  
S. Sample ◽  
T. Chan

Introduction: With the transition of Emergency Medicine into competency-based medical education (CBME), entrustable professional activities (EPAs) are used to evaluate residents on performed clinical duties. This study aimed to determine whether incorporating a case-based orientation, designed to increase recognition of available EPAs, into CBME orientation would help residents increase the number of EPAs completed. Methods: We designed an intervention consisting of clinical cases that were reviewed by national EPA experts, who identified which EPAs could be assessed from each case. A case-based session was incorporated into the 2019 CBME orientation for the McMaster Emergency Medicine Program. Postgraduate Year (PGY) 1 residents read the cases and discussed which EPAs could be obtained with PGY2/faculty facilitators. The number of EPAs completed in the first two blocks of PGY1 was determined from local program data, and Student's t-test was used to compare averages between cohorts. Results: We analyzed data from 22 trainees (7 in 2017, 8 in 2018, and 7 in 2019). In the first two blocks of PGY1, the intervention cohort (2019) had a significantly higher average number of EPAs completed per trainee (47.4 [SD 11.8]) than the historical cohort (25.3 [SD 6.7]) (p < 0.001, Cohen's d = 2.3). No significant difference existed in the number of EPAs obtained between the 2017 and 2018 cohorts, with averages of 24.3 [SD 6.8] and 26.1 [SD 7.0] per trainee respectively (p = 0.6). Conclusion: Implementation of a case-based orientation led by CBME-experienced facilitators nearly doubled the EPA acquisition rate of our PGY1s. The consistent EPA acquisition by the 2017 and 2018 cohorts suggests that the post-intervention increase was not solely due to developing familiarity with the CBME curriculum.


CJEM ◽  
2019 ◽  
Vol 21 (S1) ◽  
pp. S108
Author(s):  
S. Segeren ◽  
L. Shepherd ◽  
R. Pack

Introduction: For many years, Emergency Medicine (EM) educators have used narrative comments to assess their learners on each shift, either in isolation or combined with some type of Likert scale ranking. Competency-based medical education (CBME), soon to be fully implemented throughout Canadian EM educational programs, encourages this type of frequent, low-stakes narrative assessment. It is important to understand what information is currently garnered from existing narrative assessments in order to transition successfully and smoothly to the CBME system. The purpose of this study was to explore how one Canadian undergraduate EM program's narrative assessment comments mapped to two competency frameworks: one traditional and CanMEDS-based, and one competency-based, built on entrustable professional activities (EPAs). Methods: A qualitative and quantitative content analysis of 1925 retrospective narrative assessments was conducted for the 2015/2016 and 2016/2017 academic years. The unprompted comments were mapped to the Royal College CanMEDS framework and the Association of Faculties of Medicine of Canada EPA Framework. Using an iterative coding process as per accepted qualitative methodologies, additional codes were generated to classify comments and identify themes that were not captured by either framework. Results: 93% and 85% of the unprompted narrative assessments contained comments that mapped to at least one CanMEDS role or EPA competency, respectively. The most common CanMEDS role commented upon was Medical Expert (86%), followed by Communicator, Collaborator, and Scholar (all at 23%). The most common EPA competency mentioned related to history and physical findings (62%), followed by management plan (33%) and differential diagnosis (33%). However, 75% of the narrative comments included ideas that did not fall into either framework but were repeated frequently enough to suggest importance. 
The experiential characteristics of working with a learner were commented upon by 22% of preceptors. Other unmapped themes included contextual information, generalities and platitudes, and directed feedback for next steps to improve. Conclusion: While much of the currently captured data can be mapped to established frameworks, important information for both learner and assessor may be lost by limiting comments to the competencies described within a particular framework, suggesting caution when transitioning to a CBME assessment program.

