Graduates of different UK medical schools show substantial differences in performance on MRCP(UK) Part 1, Part 2 and PACES examinations

McManus, I. C.; Elder, Andrew T.; de Champlain, Andre; Dacre, Jane E.; Mollon, Jennifer; Chis, Liliana
January 2008
BMC Medicine;2008, Vol. 6, Special section p1
Academic Journal
Background: The UK General Medical Council has emphasized the lack of evidence on whether graduates of different UK medical schools perform differently in their clinical careers. Here we assess the performance of UK graduates who have taken MRCP(UK) Part 1 and Part 2, which are multiple-choice assessments, and PACES, an assessment of clinical examination and communication skills using real and simulated patients, and we explore the reasons for the differences between medical schools.

Method: We performed a retrospective analysis of the performance of 5827 doctors graduating from UK medical schools who took Part 1, Part 2 or PACES for the first time between 2003/2 and 2005/3, and of 22,453 candidates who took Part 1 from 1989/1 to 2005/3.

Results: Graduates of UK medical schools performed differently in the MRCP(UK) examination between 2003/2 and 2005/3. Part 1 and Part 2 performance of Oxford, Cambridge and Newcastle-upon-Tyne graduates was significantly better than average, and the performance of Liverpool, Dundee, Belfast and Aberdeen graduates was significantly worse than average. In the PACES (clinical) examination, Oxford graduates performed significantly above average, and Dundee, Liverpool and London graduates significantly below average. About 60% of the between-school variance was explained by differences in pre-admission qualifications, although the remaining variance was still significant, with graduates from Leicester, Oxford, Birmingham, Newcastle-upon-Tyne and London overperforming at Part 1, and graduates from Southampton, Dundee, Aberdeen, Liverpool and Belfast underperforming relative to pre-admission qualifications. The ranking of schools at Part 1 in 2003/2 to 2005/3 correlated 0.723, 0.654, 0.618 and 0.493 with performance in 1999-2001, 1996-1998, 1993-1995 and 1989-1992, respectively.
Conclusion: Candidates from different UK medical schools perform differently in all three parts of the MRCP(UK) examination, with the ordering consistent across the parts of the exam and with the differences in Part 1 performance being consistent from 1989 to 2005. Although pre-admission qualifications explained some of the medical school variance, the remaining differences do not seem to result from career preference or other selection biases, and are presumed to result from unmeasured differences in ability at entry to medical school or from differences between medical schools in teaching focus, content and approaches. Exploration of causal mechanisms would be enhanced by results from a national medical qualifying examination.
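The between-period consistency reported in the Results (e.g. r = 0.723 between school orderings at Part 1 in 2003/2-2005/3 and 1999-2001) rests on a plain correlation between schools' mean marks in two exam periods. A minimal sketch of that computation, using invented school names and marks (not the study's data):

```python
# Hypothetical sketch: correlating schools' mean Part 1 marks across two
# exam periods. All school names and mark values below are invented for
# illustration; they are NOT taken from the study.

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Mean Part 1 mark per school (hypothetical), in two exam periods.
period_a = {"SchoolA": 1.2, "SchoolB": 0.4, "SchoolC": -0.3, "SchoolD": -0.9}
period_b = {"SchoolA": 0.9, "SchoolB": 0.1, "SchoolC": -0.2, "SchoolD": -0.6}

schools = sorted(period_a)
r = pearson([period_a[s] for s in schools], [period_b[s] for s in schools])
print(f"between-period correlation r = {r:.3f}")
```

A value of r near 1 would indicate, as the study found, that the ordering of schools is stable across periods; correlating ranks instead of raw means (a Spearman correlation) follows the same pattern with ranked inputs.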


Related Articles

  • Assessment methods in surgical training in the United Kingdom. Evgeniou, Evgenios; Peter, Loizou; Tsironi, Maria; Iyer, Srinivasan // Journal of Educational Evaluation for Health Professions;2013, Vol. 10, p1 

    A career in surgery in the United Kingdom demands a commitment to a long journey of assessment. The assessment methods used must ensure that the appropriate candidates are selected into a programme of study or a job and must guarantee public safety by regulating the progression of surgical...

  • Team-Based Learning in a UK Medical School: Using Mobile-Friendly Technology to Support the In-class Individual Readiness Assurance Test. Khogali, Shihab; Smithies, Alisdair; Gray, Alison; Manca, Annalisa; Lafferty, Natalie // Proceedings of the European Conference on e-Learning;2014, p273 

    Team-based learning (TBL) provides opportunities for application of knowledge and problem-solving. The TBL strategy incorporates structured individual and teamwork activities and multiple small groups in a single classroom setting. Students are required to prepare individually before attending...

  • Cardiac auscultation via simulation: a survey of the approach of UK medical schools. Owen, Samantha Jayne; Wong, Kenneth // BMC Research Notes;9/11/2015, Vol. 8 Issue 1, p1 

    Background: A decline in clinical skills of medical students and junior doctors is well documented. We aim to determine how the 32 UK medical schools utilise simulated heart sounds to develop medical students' cardiac auscultation skills. Methods: Representatives of all 32 UK medical schools...

  • Current teaching of paediatric musculoskeletal medicine within UK medical schools--a need for change. Jandial, Sharmila; Rapley, Tim; Foster, Helen // Rheumatology;May2009, Vol. 48 Issue 5, p587 

    Objectives. Doctors involved in the assessment of children have low confidence in their clinical skills within paediatric musculoskeletal (pMSK) medicine and demonstrate poor performance in clinical practice. Core paediatric clinical skills are taught within undergraduate child health teaching...

  • MRCGP Q&A. Harris, Emma // InnovAiT;Mar2012, Vol. 5 Issue 3, p187 

    The article provides an answer to a question concerning the preparation of one's own examination questions for a Clinical Skills Assessment (CSA) study group.

  • GP training gets tougher as pass rates fall under nMRCGP. Praities, Nigel // Pulse;7/9/2008, Vol. 68 Issue 24, p4 

    The article reports on pass rates for general practitioner (GP) training in Great Britain. Completing training has become harder for those seeking to enter the field since the introduction of the new membership of the Royal College of General Practitioners...

  • La importancia de la evaluación por competencias en contextos clínicos dentro de la docencia universitaria en salud. Correa Bautista, Jorge Enrique // Revista Ciencias de la Salud;Apr2012, Vol. 10 Issue 1, p73 

    Competency assessment (CA) has renewed the way to determine the clinical performance of health professionals. To this end, the university teaching requires conceptual and methodological domain on the various techniques of formative assessment. This article reports the main technical competency...

  • Validity of final examinations in undergraduate medical training. van der Vleuten, Cees // BMJ: British Medical Journal (International Edition);11/11/2000, Vol. 321 Issue 7270, p1217 

    Examines the viability of continuous assessment and final examinations in medical education in Great Britain. Questions regarding the validity and reliability of final examinations; Lack of feedback and opportunities for correction associated with such exams; Need for more professionally...

  • Structured assessments of clinical competence. Boursicot, Katharine A. M. // British Journal of Hospital Medicine (17508460);Jun2010, Vol. 71 Issue 6, p342 

    Clinical teachers are often involved in assessing clinical competence in the workplace, in universities and colleges. Assessments commonly used to formally assess clinical competence include long and short cases and the objective structured clinical examination which, if well designed, is a fair...

