Medical School Performance
Recently Published Documents

Total documents: 39 (five years: 3)
H-index: 14 (five years: 0)

BMJ Open ◽  
2021 ◽  
Vol 11 (8) ◽  
pp. e046615
Author(s):  
Ricky Ellis ◽  
Duncan S G Scrimgeour ◽  
Peter A Brennan ◽  
Amanda J Lee ◽  
Jennifer Cleland

Background: Identifying predictors of success in postgraduate examinations can help guide the career choices of medical students and may aid early identification of trainees requiring extra support to progress in specialty training. We assessed whether performance on the educational performance measure (EPM) and situational judgement test (SJT) used for selection into foundation training predicted success at the Membership of the Royal College of Surgeons (MRCS) examination.

Methods: This was a longitudinal cohort study using data from the UK Medical Education Database (https://www.ukmed.ac.uk). UK medical graduates who had attempted Part A (n=2585) and Part B (n=755) of the MRCS between 2014 and 2017 were included. Chi-squared and independent t-tests were used to examine the relationship of medical school performance and sociodemographic factors with first-attempt success at MRCS Parts A and B. Multivariate logistic regression was used to identify independent predictors of MRCS performance.

Results: The odds of passing the MRCS increased by 55% for Part A (OR 1.55 (95% CI 1.48 to 1.61)) and 23% for Part B (OR 1.23 (95% CI 1.14 to 1.32)) for every additional EPM decile point gained. For every point awarded for additional degrees in the EPM, candidates were 20% more likely to pass MRCS Part A (OR 1.20 (95% CI 1.13 to 1.29)) and 17% more likely to pass Part B (OR 1.17 (95% CI 1.04 to 1.33)). For every point awarded for publications in the EPM, candidates were 14% more likely to pass MRCS Part A (OR 1.14 (95% CI 1.01 to 1.28)). SJT score was not a statistically significant independent predictor of MRCS success.

Conclusion: This study demonstrated the EPM's independent predictive power and found that medical school performance deciles are the strongest predictor of later success in the MRCS. These findings can be used by medical schools, training boards and workforce planners to inform evidence-based, contemporary selection and assessment strategies.


2021 ◽  
Author(s):  
Ahmet Murt ◽  
David Hope ◽  
Recep Ozturk ◽  
Helen Cameron

Abstract

Background: Medical educators and assessors like to include predictive validity in their validity arguments, but relevant evidence may be difficult to find. External standardized examinations may have a role in validating both the educational process of medical schools and their assessment results and outcomes. A strong correlation between medical school performance and performance on an external examination may also lend evidence of validity to the external examination. This work from one of Turkey's top medical schools explored the correlations between students' medical school performance and their scores on the Specialization in Medicine Exam (TUS), a postgraduate national ranking examination.

Methods: Records of 246 students from two programs of a single medical school, which share an identical curriculum but have different admission scores, were studied retrospectively. Students' year-based Grade Point Averages (GPAs) and end-of-school (graduating) GPAs were calculated using a weighted mean method. Bivariate correlations were calculated between year-specific GPAs, graduating GPAs and TUS scores.

Results: Students' inter-year GPAs showed strong, significant correlations (r ranging from 0.59 to 0.86, p < 0.001). Graduating GPA also correlated strongly and significantly with TUS scores (r = 0.65, p < 0.001). Linear regression models confirmed a significant relationship between medical school performance and post-graduation national exam performance.

Conclusion: Student success is highly consistent throughout medical school, and students' performance across all domains of assessment in the undergraduate program may be a good predictor of cognitive performance in an external national examination in the early postgraduate phase.


2019 ◽  
Vol 13 (2) ◽  
pp. 122-125
Author(s):  
Abebe Ayalew BEKEL ◽  
Dawit Habte WOLDEYES ◽  
Yibeltal Wubale ADAMU ◽  
Mengstu Desalegn KIROS ◽  
Shibabaw Tedila TRUNEH ◽  
...  

2019 ◽  
Vol 11 (4) ◽  
pp. 475-478 ◽  
Author(s):  
Judith M. Brenner ◽  
Thurayya Arayssi ◽  
Rosemarie L. Conigliaro ◽  
Karen Friedman

ABSTRACT

Background: The Medical Student Performance Evaluation (MSPE) is an important factor in applications to residency programs. Many medical schools are incorporating recent recommendations from the Association of American Medical Colleges MSPE Task Force into their letters. To date, there has been no feedback from the graduate medical education community on the impact of this effort.

Objective: We surveyed individuals involved in residency candidate selection for internal medicine programs to understand their perceptions of the new MSPE format.

Methods: A survey was distributed in March and April 2018 using the Association of Program Directors in Internal Medicine listserv, which comprises 4220 individuals from 439 residency programs. Responses were analyzed, and themes were extracted from open-ended questions.

Results: A total of 140 individuals, predominantly program directors and associate program directors, from across the United States completed the survey. Most were aware of the existence of the MSPE Task Force. Respondents read a median of 200 to 299 letters each recruitment season. The majority reported observing evidence of adoption of the new format in more than one quarter of all medical schools. Nearly half of respondents reported that the new format made the MSPE more important in decision-making about a candidate. Within the MSPE, respondents identified the following areas as most influential: academic progress, summary paragraph, graphic representation of class performance, academic history, and overall adjective of performance indicator (rank).

Conclusions: The internal medicine graduate medical education community finds value in many components of the new MSPE format, while recognizing that there are further opportunities for improvement.


BMJ Open ◽  
2018 ◽  
Vol 8 (5) ◽  
pp. e020291 ◽  
Author(s):  
Lazaro M Mwandigha ◽  
Paul A Tiffin ◽  
Lewis W Paton ◽  
Adetayo S Kasim ◽  
Jan R Böhnke

2016 ◽  
Vol 91 (3) ◽  
pp. 388-394 ◽  
Author(s):  
Paul George ◽  
Yoon Soo Park ◽  
Julianne Ip ◽  
Philip A. Gruppuso ◽  
Eli Y. Adashi
