
Peak performance: Simulation and the nature of expertise in emergency medicine

Published online by Cambridge University Press: 28 January 2019

Christopher M. Hicks*
Affiliation:
St. Michael's Hospital, Toronto, ON; Li Ka Shing Knowledge Institute, Toronto, ON; Department of Medicine, University of Toronto, Toronto, ON
Andrew Petrosoniak
Affiliation:
St. Michael's Hospital, Toronto, ON; Li Ka Shing Knowledge Institute, Toronto, ON; Department of Medicine, University of Toronto, Toronto, ON
*
Correspondence to: Dr. Christopher M. Hicks, Dept. Emergency Medicine, Bond Wing, St. Michael's Hospital, 30 Bond Street, Toronto, Canada M5B 1W8; Email: [email protected]

Type: Commentary
Copyright © Canadian Association of Emergency Physicians 2019

INTRODUCTION

Expertise in emergency medicine is difficult to define. Justice Potter Stewart described the threshold for obscenity as "I know it when I see it"; similar proclamations have been made about the nature, characteristics, and process of developing expert performance. While this has colloquial appeal, there remains a need to objectively describe and quantify expertise in emergency medicine to guide the development of peak expert performance. Anders Ericsson holds that "establishing a science of superior performance starts with the accumulation of a body of reproducible empirical phenomena," and that the process that drives the development of mastery in a given domain is deliberate practice: engaging in purposeful, directed rehearsal and execution, with the provision of timely feedback on performance.1

There is a weak relationship between experience, reputation, and skill – experience, it seems, does not always equal expertise. Thus, a better understanding of expert performance in a given domain bears relevance for both the trainee, still learning the craft, and the seasoned practitioner pushing towards mastery. The shift towards competency-based medical education has taught us much about the metrics of competence in healthcare, but the public expectation is that of expertise, both as an outcome of training and a fluid end point of a career in emergency medicine. This poses a challenge for colleges and accrediting bodies, which have an interest in responding to public expectation with meaningful data about the experts they develop.

Enter simulation-based medical education: a training, assessment, and research tool uniquely positioned to inform and accelerate the conversation about expert performance in emergency medicine. Simulation offers a controlled, standardized, reproducible training environment that poses no threat to patients. Full-scale, computer-operated mannequins create immersive clinical environments sufficient to generate the psychological "buy-in" needed to explore team-based, non-technical skills. Task trainers allow for focused micro-skill development. Standardized patients and actors facilitate interpersonal and psychological skills training. Layer this with focused, specific feedback and repetition across varying contexts and levels of difficulty, and you have a powerful means by which to explore, quantify, and accelerate expert performance.2

New techniques to augment simulation-based training and assessment hold significant promise for enhancing our understanding of expertise in emergency medicine. Szulewski et al.3 have demonstrated that specific patterns of gaze tracking are associated with performance during simulated resuscitation scenarios, highlighting differences between high- and low-performing residents. This expands on a prior body of work using gaze tracking and pupillometry to assess attention and cognitive load during high-stakes clinical events.4 Gaze tracking is a surrogate for attention, allowing researchers to quantify what healthcare providers are paying attention to and what they are ignoring, which in turn yields insight into what is deemed relevant, irrelevant, or excluded altogether. Szulewski et al.5 have used pupillometry and task-evoked pupillary responses to show that, compared with experts, novice physicians exert greater mental effort to answer clinical reasoning questions, even when they answer correctly. Both gaze tracking and pupillometry appear to be reliable techniques for differentiating levels of expertise among healthcare providers.
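
As a rough illustration of how a task-evoked pupillary response might be quantified, the sketch below subtracts a pre-task baseline from the mean pupil diameter recorded during a task and compares a hypothetical novice with a hypothetical expert; the sampling rate, baseline window, and traces are assumptions for illustration only, not the protocol of the cited studies.

```python
"""Minimal sketch of a task-evoked pupillary response (TEPR) comparison.

All values are synthetic; the 60 Hz sampling rate and 1-second baseline
window are illustrative assumptions, not the cited protocol.
"""
import numpy as np


def tepr(pupil_mm: np.ndarray, sample_hz: float, baseline_s: float = 1.0) -> float:
    """TEPR: mean pupil diameter during the task minus the mean diameter
    over a pre-task baseline window."""
    n_base = int(baseline_s * sample_hz)
    return pupil_mm[n_base:].mean() - pupil_mm[:n_base].mean()


# Hypothetical 10-second pupil traces (mm) for one clinical reasoning question.
rng = np.random.default_rng(0)
novice_trace = 3.5 + 0.05 * rng.standard_normal(600) + np.linspace(0, 0.6, 600)
expert_trace = 3.5 + 0.05 * rng.standard_normal(600) + np.linspace(0, 0.2, 600)

print(f"novice TEPR: {tepr(novice_trace, sample_hz=60):+.2f} mm")
print(f"expert TEPR: {tepr(expert_trace, sample_hz=60):+.2f} mm")
```

In this framing, a larger positive response stands in for the greater pupil dilation, and hence greater cognitive effort, reported in novices.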

Few specialties in medicine are as reliant on expert pattern recognition as emergency pediatrics, where the subjective "sick or not sick" characterization is often decided in a matter of milliseconds, at an unconscious level informed by prior experience and robust illness scripts. Damji et al. have explored the feasibility of using eye tracking in pediatric trauma resuscitations.6 Defining a visual fixation as >0.2 seconds, the authors describe the dwell times of attending pediatric emergency medicine physicians, who focus primarily on the mannequin but shift periodically to other structural (monitors, checklists) or social (team members) elements of the clinical environment. This preliminary exploration helps set the stage for future work characterizing the nature of expertise and decision-making in pediatric resuscitation.
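
A minimal sketch of the dwell-time analysis this implies is shown below, assuming a hypothetical gaze stream already labelled by area of interest (AOI) and applying the >0.2-second fixation criterion noted above; real fixation detection also requires spatially stable gaze, which is simplified away here.

```python
"""Sketch of dwell-time analysis per area of interest (AOI).

The (timestamp, AOI label) sample format and the example stream are
hypothetical; only the >0.2 s fixation threshold comes from the text above.
"""


def dwell_times(samples, min_fixation_s=0.2):
    """Sum time spent in contiguous runs on the same AOI lasting longer than
    min_fixation_s. samples: ordered list of (timestamp_s, aoi_label)."""
    totals = {}
    run_start, run_aoi = None, None
    for t, aoi in samples:
        if aoi != run_aoi:
            # Close out the previous run if it met the fixation threshold.
            if run_aoi is not None and t - run_start > min_fixation_s:
                totals[run_aoi] = totals.get(run_aoi, 0.0) + (t - run_start)
            run_start, run_aoi = t, aoi
    # Close the final run at the last recorded timestamp.
    if run_aoi is not None and samples and samples[-1][0] - run_start > min_fixation_s:
        totals[run_aoi] = totals.get(run_aoi, 0.0) + (samples[-1][0] - run_start)
    return totals


# Hypothetical 60 Hz gaze stream: mannequin, then monitor, then a brief glance
# at a team member that falls below the fixation threshold.
stream = (
    [(i / 60, "mannequin") for i in range(60)]
    + [(1.0 + i / 60, "monitor") for i in range(30)]
    + [(1.5 + i / 60, "team member") for i in range(6)]
)
print(dwell_times(stream))  # e.g., {'mannequin': 1.0, 'monitor': 0.5}
```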

Performance analysis and skill development do not end with the acquisition of expert skill; rather, they define it. Ultra-elite athletes are typified by a devotion to skill development, facilitated by video and computer-assisted feedback on biometrics, movement, and task execution from both practice and game day. The relevance of psychological skill development to performance execution in athletics cannot be overstated; a similar relationship probably exists in other high-stress, high-stakes domains like emergency medicine.6 Salivary cortisol and heart rate variability measurements of acute stress during simulated resuscitations correlate with performance, and feedback using these tools forms the basis for training interventions to reduce acute stress during high-stakes clinical events.7,8
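
To make the measurement concrete, the sketch below computes RMSSD (root mean square of successive RR-interval differences, a common heart rate variability index) and correlates it with simulated-resuscitation performance scores; every number is hypothetical and is not drawn from the cited studies.

```python
"""Sketch relating a stress physiology marker to simulation performance.

RMSSD is a standard heart rate variability index; the RR intervals and
checklist scores below are fabricated for illustration only.
"""
import numpy as np


def rmssd(rr_ms: np.ndarray) -> float:
    """RMSSD (ms) from an ordered sequence of RR intervals in milliseconds."""
    diffs = np.diff(rr_ms)
    return float(np.sqrt(np.mean(diffs ** 2)))


# Hypothetical per-participant RR recordings during a simulated resuscitation.
rr_recordings = [
    np.array([820, 810, 835, 790, 845, 805, 830]),  # more variable
    np.array([700, 702, 698, 701, 699, 700, 703]),  # low variability under stress
    np.array([760, 770, 740, 780, 755, 765, 750]),
]
performance = np.array([78, 52, 66])  # hypothetical checklist scores (%)

hrv = np.array([rmssd(rr) for rr in rr_recordings])
r = np.corrcoef(hrv, performance)[0, 1]
print(f"RMSSD (ms): {np.round(hrv, 1)}; correlation with performance r = {r:.2f}")
```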

To truly push the boundaries of expertise, we must examine it where it lives: the in situ clinical environment. Similar to gaze tracking, in situ movement analysis provides insight into how healthcare team members move within their clinical space when responding to a complex trauma resuscitation.9 The Operating Room Black Box project, currently ongoing at St. Michael's Hospital in Toronto, records audio, video, and patient monitoring data from live operating room events.10 With artificial intelligence-enhanced analysis and plans to expand to the new St. Michael's trauma bay, the possibilities for exploring and improving expertise in emergency situations are bounded only by imagination. Fast forward a few years, and research combining measurements of acute stress, gaze tracking, pupillometry, and movement analysis, all under the analytic watch of black box technology, may become a reality. This, in turn, may help answer some of the most persistently vexatious questions about the nature of expertise in emergency medicine in a reliable, reproducible, and quantifiable way.
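
As one small example of the kind of metric in situ movement analysis can yield, the sketch below totals the distance each team member travels from position traces; the trace format, roles, and coordinates are hypothetical and do not represent the tracing tool described by Petrosoniak et al.

```python
"""Sketch of a basic movement-analysis metric: total path length per role.

Position traces are hypothetical (x, y) coordinates in metres within a
trauma bay; no real tracking system or dataset is assumed.
"""
import math


def path_length(trace):
    """Total path length (m) from an ordered list of (x, y) positions."""
    return sum(math.dist(trace[i], trace[i + 1]) for i in range(len(trace) - 1))


traces = {
    "team leader": [(2.0, 1.0), (2.1, 1.0), (2.1, 1.2), (2.0, 1.1)],
    "airway physician": [(0.5, 0.5), (1.5, 0.5), (1.5, 2.0), (0.5, 2.0), (0.5, 0.5)],
}
for role, trace in traces.items():
    print(f"{role}: {path_length(trace):.1f} m travelled")
```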

Competing interests

None declared.

REFERENCES

1. Ericsson KA. Deliberate practice and acquisition of expert performance: a general overview. Acad Emerg Med 2008;15:988-94. doi:10.1111/j.1553-2712.2008.00227.x
2. Issenberg SB, McGaghie WC, Petrusa ER, et al. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005;27(1):10-28. doi:10.1080/01421590500046924
3. Szulewski A, Egan R, Hall R, et al. A new way to look at simulation-based assessment: the relationship between gaze-tracking and exam performance. CJEM 2019;21(1):129-37. doi:10.1017/cem.2018.391
4. Szulewski A, Kelton D, Howes D. Pupillometry as a tool to study expertise in medicine. Frontline Learn Res 2017;5(3):53-63. doi:10.14786/flr.v5i3.256
5. Szulewski A, Fernando SM, Bayles J, Howes D. Increasing pupil size is associated with increasing cognitive processing demands: a pilot study using a mobile eye-tracking device. Open J Emerg Med 2014;2:8-11. doi:10.4236/ojem.2014.21002
6. Damji O, Lee-Nobbee P, Borkenhagen D, Cheng A. Analysis of eye-tracking behaviours in a pediatric trauma simulation. CJEM 2019;21(1):138-40. doi:10.1017/cem.2018.450
7. LeBlanc VR. The effects of acute stress on performance: implications for health professions education. Acad Med 2009;84(10 Suppl):S25-33. doi:10.1097/ACM.0b013e3181b37b8f
8. Hicks CM, Leblanc V. Hardened tendencies: persistence of initial appraisals following simulation-based stress training. CJEM 2018;20(Suppl 1):S81. doi:10.1017/cem.2018.267
9. Petrosoniak A, Almeida R, Pozzobon LD, et al. Tracking workflow during high-stakes resuscitation: the application of a novel clinician movement tracing tool during in situ trauma simulation. BMJ Simul Technol Enhanc Learn 2018; epub. doi:10.1136/bmjstel-2017-000300
10. Jung JJ, Jüni P, Lebovic G, Grantcharov T. First-year analysis of the operating room black box study. Ann Surg 2018; epub. doi:10.1097/SLA.0000000000002863