
A summative, Objective, Structured, Clinical Examination in ENT used to assess postgraduate doctors after one year of ENT training, as part of the Diploma of Otorhinolaryngology, Head and Neck Surgery

Published online by Cambridge University Press: 16 July 2009

A B Drake-Lee* (ENT Department, University Hospital NHS Trust, Birmingham, UK)
D Skinner (ENT Department, Royal Shrewsbury Hospital, UK)
M Hawthorne (ENT Department, James Cook University Hospital, Middlesbrough, UK)
R Clarke (ENT Department, Royal Liverpool Children's Hospital, UK)

*Address for correspondence: Mr Adrian Drake-Lee, Queen Elizabeth Hospital, Edgbaston, Birmingham B15 2TH, UK. E-mail: [email protected]

Abstract

Context:

‘High-stakes’ postgraduate medical examinations should conform to current educational standards. In the UK and Ireland, national assessments in surgery are devised and managed through the examination structure of the Royal Colleges of Surgeons, but this work is not reported in the medical education literature. In the current paper, we aim to clarify this process.

Objectives:

To replace the clinical section of the Diploma of Otorhinolaryngology with an Objective Structured Clinical Examination, and to set the level of the assessment at one year of postgraduate training in the specialty.

Methods:

After ‘blueprinting’ against the whole curriculum, a 25-station Objective Structured Clinical Examination was constructed, comprising six clinical stations and 19 other stations covering written case histories, instruments, test results, written communication skills and interpretation skills. The pass mark was set using a modified borderline method, cross-checked against other standard-setting methods, and the results were analysed statistically.
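For readers unfamiliar with borderline standard setting, the idea is that examiners record a global judgement (pass, borderline, fail) alongside each candidate's checklist score, and the pass mark is derived from the scores of candidates judged borderline. The following is a minimal sketch with invented data; the modified borderline method used for this examination may differ in detail.

```python
# Minimal sketch of a borderline-group standard-setting calculation.
# All data below are hypothetical; the examination's actual procedure
# was a "modified borderline method", whose details may differ.
from statistics import mean

# (checklist score as a percentage, examiner's global rating) per candidate
candidates = [
    (91, "pass"),
    (74, "borderline"),
    (70, "borderline"),
    (66, "borderline"),
    (55, "fail"),
    (83, "pass"),
]

# Pass mark = mean checklist score of candidates judged "borderline"
borderline_scores = [score for score, rating in candidates if rating == "borderline"]
pass_mark = mean(borderline_scores)
print(f"Pass mark: {pass_mark:.1f}%")  # 70.0% for this toy data
```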

Results:

The results of nine examinations held between May 2004 and May 2008 are presented. The pass mark varied between 68 and 82 per cent. Internal consistency was good, with a Cronbach's α of 0.99 across all examinations and split-half statistics varying from 0.96 to 0.99. Different standard-setting methods gave similar pass marks.
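As background to these reliability statistics: Cronbach's α measures internal consistency from the station-by-candidate score matrix, and a split-half coefficient correlates the two halves of the examination, corrected for test length with the Spearman-Brown formula. A minimal sketch with invented scores (not the study's data) follows.

```python
# Hypothetical sketch: Cronbach's alpha and Spearman-Brown split-half
# reliability for an OSCE, computed from invented station scores.
# Requires Python 3.10+ for statistics.correlation.
from statistics import pvariance, correlation

# Rows = candidates, columns = stations (scores on a common scale; toy data)
scores = [
    [7, 8, 6, 9, 7, 8],
    [5, 6, 5, 6, 5, 6],
    [9, 9, 8, 9, 9, 9],
    [4, 5, 4, 5, 4, 4],
    [6, 7, 6, 7, 6, 7],
]

k = len(scores[0])                     # number of stations
totals = [sum(row) for row in scores]  # total score per candidate
item_vars = [pvariance([row[i] for row in scores]) for i in range(k)]

# Cronbach's alpha: k/(k-1) * (1 - sum of station variances / variance of totals)
alpha = (k / (k - 1)) * (1 - sum(item_vars) / pvariance(totals))

# Split-half: correlate odd- vs even-numbered stations, then apply the
# Spearman-Brown correction for test length.
odd = [sum(row[0::2]) for row in scores]
even = [sum(row[1::2]) for row in scores]
r = correlation(odd, even)
split_half = 2 * r / (1 + r)

print(f"Cronbach's alpha: {alpha:.2f}")              # ~0.99 for this toy data
print(f"Split-half (Spearman-Brown): {split_half:.2f}")
```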

Conclusions:

We have developed, and herein report, a summative Objective Structured Clinical Examination for doctors training in otorhinolaryngology. The objectives and standards for a high-quality assessment were met.

Type: Main Articles
Copyright: © JLO (1984) Limited 2009

