
Faculty development in the age of competency-based medical education: A needs assessment of Canadian emergency medicine faculty and senior trainees

Published online by Cambridge University Press:  22 May 2019

Alexandra Stefan
Affiliation:
Division of Emergency Medicine, University of Toronto, ON
Justin N. Hall
Affiliation:
Division of Emergency Medicine, University of Toronto, ON
Jonathan Sherbino
Affiliation:
Division of Emergency Medicine, Department of Medicine, McMaster University, Hamilton, ON.
Teresa M. Chan*
Affiliation:
Division of Emergency Medicine, Department of Medicine, McMaster University, Hamilton, ON.
Correspondence to: Dr. Teresa Chan, 2nd Floor, McMaster Clinics, 237 Barton St. E., Hamilton, ON L8L 2X2; Email: [email protected]; Twitter: @TChanMD

Abstract

Objectives

The Royal College of Physicians and Surgeons of Canada (RCPSC) emergency medicine (EM) programs transitioned to the Competence by Design training framework in July 2018. Prior to this transition, a nation-wide survey was conducted to gain a better understanding of EM faculty and senior resident attitudes towards the implementation of this new program of assessment.

Methods

A multi-site, cross-sectional needs assessment survey was conducted. We aimed to document perceptions about competency-based medical education, attitudes towards implementation, and perceived, prompted, and unperceived faculty development needs. EM faculty and senior residents were nominated by program directors across RCPSC EM programs. Simple descriptive statistics were used to analyse the data.

Results

Between February and April 2018, 47 participants completed the survey (58.8% response rate). Most respondents (89.4%) thought learners should receive feedback during every shift; 55.3% felt that they provided adequate feedback. Many respondents (78.7%) felt that the ED would allow for direct observation, and most (91.5%) participants were confident that they could incorporate workplace-based assessments (WBAs). Although a fair number of respondents (44.7%) felt that Competence by Design would not impact patient care, some (17.0%) were worried that it may negatively impact it. Perceived faculty development priorities included feedback delivery, completing WBAs, and resident promotion decisions.

Conclusions

RCPSC EM faculty have positive attitudes towards competency-based medical education-relevant concepts such as feedback and opportunities for direct observation via WBAs. Perceived threats to Competence by Design implementation included concerns that patient care and trainee education might be negatively impacted. Faculty development should concentrate on further developing supervisors’ teaching skills, focusing on feedback using WBAs.


Type: Original Research
Copyright © Canadian Association of Emergency Physicians 2019

CLINICIAN'S CAPSULE

What is known about the topic?

Emergency medicine faculty attitudes regarding Competence by Design have not been investigated.

What did this study ask?

We surveyed baseline perceptions about Competence by Design, attitudes towards implementation, and perceived, prompted, and unperceived faculty development needs.

What did this study find?

Emergency medicine education is well-situated to transition to Competence by Design given clinicians’ positive attitudes towards feedback, direct observation, and workplace-based assessments.

Why does this study matter to clinicians?

Education leaders should concentrate on supervision and observation skills but also address concerns regarding how Competence by Design can best complement/augment clinical care.

INTRODUCTION

Competency-based medical education represents a major shift from the traditional educational model in postgraduate medical education towards one that is less time-dependent and hinges more on the achievement of the necessary abilities to practise emergency medicine (EM).1–10 Competency-based medical education seeks to remediate some of the traditional training system's perceived flaws by tailoring teaching and assessment to the learner's needs and desired outcomes and ensuring that progression in training is based on achievement of clearly defined competencies required for practice.11,12 Multiple factors have led to the rapid adoption of competency-based medical education by many national bodies: the need to demonstrate sustained physician competence in an age of increased accountability to the public, variation in the quality of training of physicians in the current system, the prevalence of the “failure to fail” culture in medical education, and the need to ensure smoother transitions from training into clinical practice.4,12–15

Competence by Design was developed by the Royal College of Physicians and Surgeons of Canada (RCPSC) as a change initiative to transform specialty medical education to a competency-based medical education model of teaching and assessment. EM is among six specialties that transitioned to Competence by Design in July 2018. The RCPSC Competence by Design model maintains the current 5-year duration of training but blends in competency-based medical education concepts such as refreshed outcomes required of EM graduates, new stages of training (transition to discipline, foundations, core, transition to practice) with associated milestones, tailored learning approaches, and programmatic assessment.

For the frontline clinical teacher, competency-based medical education implies increased direct observation of trainees’ clinical performance and assessment via entrustable professional activities (EPAs) – stage-specific, key tasks of the discipline.16 Though the EPA construct is new, the EM faculty community is well-versed in direct observation and assessment. Previous research regarding EM faculty perceptions of the 2005 CanMEDS framework (a paradigm that defines high-level competencies of specialist practice) showed that frontline clinicians found the structure helpful in organizing feedback and increasing awareness of the breadth of competencies expected of an independent practitioner.17,18 However, they struggled with the abstract nature of the CanMEDS roles and key competencies.17 Daily encounter cards have been used in EM to provide timely feedback to learners around all CanMEDS competencies, although one drawback was a tendency towards leniency bias.19,20 Though these initiatives suggest that EM faculty are generally well-prepared for observation and feedback, no Canadian studies to date have examined frontline faculty opinions on how these concepts will change during the transition to Competence by Design.

Given that successful implementation depends on a full understanding of the target audience, we designed a multi-centre needs assessment survey to determine the specific Canadian EM faculty and senior trainee development requirements during the transition to Competence by Design.

METHODS

A multi-site, cross-sectional digital survey was conducted between February and April 2018 with a sample of EM faculty and senior residents across RCPSC EM programs in Canada. Senior residents were included in this survey because they will be potential supervisors for the first Competence by Design cohort.

Survey design

The survey was modelled on a previously described multi-phase needs assessment.21 Survey domains included baseline perceptions about Competence by Design, attitudes towards implementation, and perceived, prompted, and unperceived faculty development needs. The survey was initially developed by members of our team (AS, JH, TC) and subsequently revised based on expert feedback (JS). The needs assessment (Appendix A) was created in Google Forms (Mountain View, CA, USA).

The first section of the survey explored frontline clinicians’ practice and perceptions around feedback and workplace-based assessments (WBAs). Respondents were asked to reflect on the current state of feedback within the training environment and identify barriers to optimal feedback delivery. We also asked about their pre-existing perceptions of Competence by Design.

In the second section, we asked respondents to select, from a provided list, topics that they believed would aid in their delivery of Competence by Design, and we invited them to suggest additional topics via free text. We also sought to harness the power of stories to determine what prompted needs might emerge from respondents’ descriptions of difficult scenarios. This technique has previously been used in other needs assessment studies.21,22

Finally, the third section used multiple-choice quiz questions (Appendix B) to examine which competency-based medical education terms and concepts respondents found confusing, as a means of determining unperceived educational needs.

This study was reviewed and received exemption by the Hamilton Integrated Research Ethics Board.

Participants

An invitation with the link to the online survey was distributed by email to all of the RCPSC program directors (PDs) via the EM specialty committee electronic mailing list. PDs were invited to nominate local faculty members and senior residents who represented an adequate cross-sectional sampling of EM clinician teachers in the pre-, early-, mid-, and late-career phases. We did not restrict the number of nominees per PD. Nominees were approached by email, and a link to the needs assessment was provided. Nominated individuals were contacted using a modified Dillman technique (three times, approximately 3 weeks apart) to maximize survey completion.23
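For illustration only (this sketch is not part of the original study materials), a three-contact reminder schedule of this kind could be generated as follows, assuming a fixed 3-week interval and a hypothetical start date:

```python
from datetime import date, timedelta

def contact_dates(first_contact: date, n_contacts: int = 3,
                  interval_weeks: int = 3) -> list[date]:
    """Generate contact dates spaced a fixed number of weeks apart
    (a simplified, modified Dillman-style reminder schedule)."""
    return [first_contact + timedelta(weeks=interval_weeks * i)
            for i in range(n_contacts)]

# Hypothetical first contact near the survey's opening in February 2018
print(contact_dates(date(2018, 2, 5)))
# [datetime.date(2018, 2, 5), datetime.date(2018, 2, 26), datetime.date(2018, 3, 19)]
```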

Analysis

We used Microsoft Excel (Redmond, WA, USA) to calculate simple descriptive statistics of our survey responses.
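The analysis itself was performed in Excel; purely as an illustrative sketch (using made-up responses, not the study data), the same kind of simple descriptive statistics could be computed as follows:

```python
from collections import Counter

# Hypothetical responses to one survey item (illustrative only; not the study data)
feedback_frequency = [
    "every shift", "every shift", "weekly", "every shift", "weekly",
]

def describe(responses):
    """Simple descriptive statistics: count and percentage for each response option."""
    n = len(responses)
    counts = Counter(responses)
    return {option: (count, round(100 * count / n, 1))
            for option, count in counts.most_common()}

print(describe(feedback_frequency))
# {'every shift': (3, 60.0), 'weekly': (2, 40.0)}

# Response rate reported in the study: 47 completed surveys out of 80 nominees
print(round(100 * 47 / 80, 1))  # 58.8
```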

RESULTS

Between February and April 2018, 47 participants (40 faculty, 7 residents) from 11 Canadian schools with EM postgraduate training programs completed the needs assessment survey (58.8% response rate, based on the original 80 nominated individuals). We lacked full participation of francophone and Atlantic sites. All participants (100%) trained in Canada. Most (66.0%) worked full-time at an academic centre. A description of the demographic data for responding participants is displayed in Table 1.

Table 1. Respondent demographics

Current practice

Just over half (55.3%) of the respondents felt they were currently providing high-quality feedback, whereas 23.4% felt they did not provide adequate feedback and 8.5% felt that the feedback they provided was not always honest. Most (89.4%) respondents reported that meaningful learner feedback should be provided on each shift, with the remaining 10.6% reporting that it should be provided weekly (every three to four shifts). Additionally, most respondents (76.6%) indicated that they had received teacher training on providing feedback.

Barriers to feedback and WBAs

Because Competence by Design emphasizes the use of WBAs (in the form of EPAs), it was important to explore the participants’ perceptions around existing barriers to providing adequate feedback in the workplace. The top five barriers identified by respondents when providing feedback in the clinical setting were time constraints (80.4%), perceived learner disinterest (43.5%), fear of assessment repercussions (41.3%), lack of training (28.3%), and difficulty giving negative feedback (13.0%).

Perceptions of Competence by Design impact

The largest group of respondents (44.7%) felt that the transition to Competence by Design would have no impact on patient care. A minority (17.0%) perceived it might have a negative impact, whereas another group (12.8%) felt that it might have a positive impact. Participants who felt patient care would be negatively affected cited concerns that direct observation, feedback, and documentation of WBAs would slow down patient flow. Those who felt Competence by Design would improve patient care believed that more direct observation would result in safer care and that more feedback might result in more competent trainees. This finding warrants consideration and exploration. Prompted needs from free-text responses tended to concern the logistics of Competence by Design and the practicality of carrying out increased direct observation in higher-acuity, higher-volume settings.

Respondents’ confidence in the implementation of Competence by Design

Most respondents (91.5%) were confident that they could incorporate WBAs into their emergency department (ED) environments. Most respondents also reported that direct observation is required for adequate and meaningful feedback (19.1% rated it “critical,” and 78.7% rated it “somewhat” or “very important”). A strong majority (78.7%) indicated that the ED environment offered above-average opportunity for direct observation compared with other clinical environments. A majority (55.3%) of respondents indicated that Competence by Design would lead to improved feedback to trainees, whereas 17.0% indicated that feedback would not be improved. Many (44.7%) reported that Competence by Design would lead to a better educational experience (38.3% were uncertain; 17.0% indicated that it might have negative effects).

Self-reported needs

We polled participants about which faculty development topics they thought would be most valuable during the transition to Competence by Design. Table 2 displays these topics in descending order of importance. Despite the overall perception that current feedback practice is of high quality, a majority (63.8%) of respondents were interested in additional professional development on how to deliver high-quality feedback. Completing assessments after a clinical encounter (57.4%) and the principles by which resident promotion decisions are made (55.3%) were the next two most requested topics for faculty development. A smaller proportion of respondents (25.5%) indicated that they would like more information on Competence by Design terminology.

Table 2. Priority professional development topics as identified by survey respondents

Note: All other items with one or fewer endorsements were excluded. These included documentation of feedback; how to help residents implement personal learning plans; how direct observations can be done (different techniques and approaches); how to integrate many direct observations into one feedback session; how to help the resident in difficulty; and how to recruit staff to be more excited about CBD and the idea of big change.

Unperceived needs

In the unperceived needs (testing) phase of the needs assessment, only 44.6% of respondents could correctly identify key competency-based medical education terms (Appendix B). This suggests a gap between what respondents perceive they need and what they may actually require.

DISCUSSION

As we enter into a new era of postgraduate EM education in Canada, the findings of our study will help inform faculty development. We found that surveyed EM faculty members and senior residents valued direct observation, frequent feedback, and increased assessment opportunities – all of which are crucial for implementing Competence by Design.

Despite concerns about time constraints and concurrently balancing patient care and educational needs, respondents perceived that the ED offered an above-average opportunity for direct observation and regular feedback. This is consistent with prior literature.19,24 Because the competency-based medical education model hinges on extensive direct observation, this is an important advantage and suggests that teachers are culturally ready to transition to Competence by Design.

Implications for EM faculty developers in the age of competency-based medical education

A majority (63.8%) of those surveyed stated that they wanted to receive further faculty development in delivering high-quality feedback, although 53.1% of respondents felt that residents were already receiving adequate feedback at their centre. This finding is similar to previous literature showing that a majority of academic EM physicians have a strong interest in improving their educational skillset.25 Other key areas of faculty development identified by our survey are concepts unique to Competence by Design (e.g., completing new WBAs and understanding how resident performance is linked to promotion decisions). These areas may not have been well covered in faculty development to date and may be high-yield when training existing educators and new faculty.

Potential pitfalls

Interestingly, many respondents appear unsure whether better educational experiences and better feedback will translate into better patient care, given that 44.7% felt that Competence by Design would not impact patient care. We believe this may be because faculty are unsure how their observations and instruction are linked to promotion decisions. This inconsistency is important to explore further in qualitative studies because it may affect buy-in from faculty and residents during the transition to the competency-based medical education model.

Our survey also identified other potential problems that could affect the transition to Competence by Design. Respondents were less unified in their opinions on whether Competence by Design would lead to a better educational experience or better feedback, and on the impact that this paradigm shift may have on patient care. Programs must ensure that the impact of this transition on training and clinical care is monitored and that the results are shared with faculty in order to address these concerns.

Faculty and senior residents were unaware of some of their terminology and knowledge gaps regarding Competence by Design fundamentals, yet they are unlikely to be interested in further development in this area. One potential solution would be to bundle information regarding competency-based medical education concepts and terminology with faculty development sessions that target areas of higher interest, such as completion of WBAs. Though poor adoption of official Competence by Design language is unlikely to cause major problems for frontline teachers, it may be useful to incorporate these terms into faculty development to ensure a common lexicon.

Time constraints in the clinical environment were identified by over 80% of respondents as the most common barrier to delivering high-quality feedback. This is consistent with prior literature on teaching in EM.26–28 Fear of repercussions when providing honest feedback and perceived learner disinterest were also major barriers. If we want Competence by Design to succeed, it is paramount to acknowledge these issues in faculty and resident development sessions and to begin finding solutions that address these concerns.

Next steps

This needs assessment marks the beginning of what is likely to be an ongoing and continuous task. We will need to continually assess faculty and trainee engagement over time, especially when the Competence by Design EPAs become the standard. Previous work in competency-based medical education has suggested that these changes can lead to cultural shifts29 or implementation issues.30 In Canada, we can gain a glimpse into how faculty and resident perceptions and practices may change by watching “early adopter” programs.29,31–35 Will we create the desired shift in learning culture towards a growth mindset36 at an early stage in the trainees’ development (which may decrease the fear of feedback and repercussions)? Or will the constant surveillance and monitoring result merely in increased accounting and documentation without educational benefit? Continued flexibility and innovation will also be imperative; we will need to remain vigilant as we implement and refine our specialty's EPAs.

Limitations

For our study, we focused on surveying the needs of frontline faculty. As such, we deliberately excluded questions regarding other important competency-based medical education concepts, such as curriculum reform or competence committees, which would be of interest to educational leaders. Our survey response rate was only 58.8%, with representation gaps from the Atlantic provinces and Québec; this may hinder the generalizability of the study findings. Specific to Québec, we did not have a French-language version of the survey, although the strong participation from one French-language program suggests that this may not have been the main barrier.

Our sampling technique of PD nomination may also have introduced some bias into our data – those who were highly engaged were likely to be nominated, so the voices of those who object to or resist Competence by Design may not have been captured. Even so, our findings from these highly engaged teachers suggest that there is ample room for growth and faculty development. This survey is also limited in its generalizability to community teaching sites with fewer trainees; our focus in this study was on faculty in academic centres, which are responsible for the majority of postgraduate training and may be impacted the most by Competence by Design.

Finally, the types of responses we were able to elicit were limited by the survey methodology (categorical responses to pre-developed closed questions with limited free-text answers). Interviews using open-ended questions might have revealed more nuance about participants’ questions and concerns regarding competency-based medical education or Competence by Design, but this methodology was not chosen because of logistical challenges.

CONCLUSIONS

As we move into the era of competency-based medical education, it is critical to understand front-line educators’ perceptions and attitudes towards this new model of education. This cross-sectional needs assessment survey of faculty and senior residents in Canadian RCPSC EM training programs can provide a roadmap for faculty developers going forward. We will need to attend to threats to Competence by Design implementation, including concerns about effects on patient care and learner education.

Acknowledgements

Thank you to our participants for their honesty and answers. We would also like to acknowledge and thank the program directors from all of the participating sites, and, in particular, Dr. Nazanin Meshkat, who encouraged the team to go forward to complete this scholarly project.

Supplementary material

The supplementary material for this article can be found at https://doi.org/10.1017/cem.2019.343

Ethics

This project received ethical exemption from the institutional review board (Hamilton Integrated Research Ethics Board).

Financial support

TC holds a PSI Foundation Knowledge Translation Fellowship Award. TC and JS have also previously received funding from the Royal College of Physicians and Surgeons of Canada for various unrelated projects.

Competing interests

None declared.

Previous presentations

This work has not been previously presented in a national forum, but a white paper with the results from our survey was created and distributed to the participating sites for their own faculty development purposes.

REFERENCES

1. Sherbino, J, Snell, L, Dath, D, et al. A national clinician-educator program: a model of an effective community of practice. Med Educ Online 2010;15:18.
2. Frank, JR, Snell, LS, ten Cate, O, et al. Competency-based medical education: theory to practice. Med Teach 2010;32(8):638–45.
3. Snell, LS, Frank, JR. Competencies, the tea bag model, and the end of time. Med Teach 2010;32(8):629–30.
4. Carraccio, CL, Englander, R. From Flexner to competencies: reflections on a decade and the journey ahead. Acad Med 2013;88(8):1067–73.
5. Ferguson, PC, Caverzagie, KJ, Nousiainen, MT, Snell, L. Changing the culture of medical training: an important step toward the implementation of competency-based medical education. Med Teach 2017;39(6):599–602.
6. Holmboe, ES, Sherbino, J, Englander, R, et al. A call to action: the controversy of and rationale for competency-based medical education. Med Teach 2017;39(6):574–81.
7. Caverzagie, KJ, Nousiainen, MT, Ferguson, PC, et al. Overarching challenges to the implementation of competency-based medical education. Med Teach 2017;39(6):588–93.
8. Frank, JR, Snell, L, Englander, R, Holmboe, ES. Implementing competency-based medical education: moving forward. Med Teach 2017;39(6):568–73.
9. Lockyer, J, Carraccio, C, Chan, MK, et al. Core principles of assessment in competency-based medical education. Med Teach 2017;39(6):609–16.
10. Hodges, BD. A tea-steeping or i-Doc model for medical education? Acad Med 2010;85(9):S34–44.
11. Carraccio, C, Englander, R, Van Melle, E, et al. Advancing competency-based medical education: a charter for clinician-educators. Acad Med 2016;91(5):645–9.
12. Cooney, R, Chan, TM, Gottlieb, M, et al. Academic Primer Series: key papers about competency-based medical education. West J Emerg Med 2017;18(4):713–20.
14. Royal College of Physicians and Surgeons of Canada. Competence by Design (CBD); 2014. Available at: http://www.royalcollege.ca/portal/page/portal/rc/common/documents/canmeds/cbd/what_is_cbd_e.pdf (accessed April 8, 2019).
16. Royal College of Physicians and Surgeons of Canada. Competency by Design; 2015. Available at: http://www.royalcollege.ca/portal/page/portal/rc/common/documents/canmeds/cbd/what_is_cbd_e.pdf (accessed February 28, 2016).
17. Bandiera, G, Lendrum, D. Dispatches from the front: emergency medicine teachers’ perceptions of competency-based education. CJEM 2011;13(3):155–61.
18. Frank, JR, Snell, L, Sherbino, J. The draft CanMEDS 2015 physician competency framework – series IV. Ottawa: The Royal College of Physicians and Surgeons of Canada; 2014.
19. Bandiera, G, Lendrum, D. Daily encounter cards facilitate competency-based feedback while leniency bias persists. CJEM 2008;10(1):44–50.
20. Sherbino, J, Kulasegaram, K, Worster, A, Norman, GR. The reliability of encounter cards to assess the CanMEDS roles. Adv Health Sci Educ 2013;18(5):987–96.
21. Chan, TM, Jo, D, Shih, AW, et al. The Massive Online Needs Assessment (MONA) to inform the development of an emergency haematology educational blog series. Perspect Med Educ 2018;7(3):219–23.
22. Tseng, EK, Jo, D, Shih, AW, et al. Window to the unknown: using storytelling to identify learning needs for the intrinsic competencies within an online needs assessment. AEM Educ Train 2019;3:179–87.
23. Dillman, DA. Mail and Internet surveys: the tailored design method. New York, NY: Wiley; 2000.
24. Bandiera, G. How do I improve the quality of in-training assessment of learners? CJEM 2011;13(4):267–72.
25. Brown, GM, Lang, E, Patel, K, et al. A national faculty development needs assessment in emergency medicine. CJEM 2016;18(3):161–82.
26. Thurgur, L, Bandiera, G, Lee, S, Tiberius, R. What do emergency medicine learners want from their teachers? A multicenter focus group analysis. Acad Emerg Med 2005;12(9):856–61.
27. Bandiera, G, Lee, S, Tiberius, R. Creating effective learning in today's emergency departments: how accomplished teachers get it done. Ann Emerg Med 2005;45(3):253–61, doi:10.1016/j.annemergmed.2004.08.007.
28. Chan, TM, Van Dewark, K, Sherbino, J, et al. Failure to flow: an exploration of learning and teaching in busy, multi-patient environments using an interpretive description method. Perspect Med Educ 2017;6(6):380–7.
29. Li, S, Sherbino, J, Chan, TM. McMaster Modular Assessment Program (McMAP) through the years: residents’ experience with an evolving feedback culture over a 3-year period. AEM Educ Train 2017;1(1):5–14.
30. Dayal, A, O'Connor, DM, Qadri, U, Arora, VM. Comparison of male vs female resident milestone evaluations by faculty during emergency medicine residency training. JAMA Intern Med 2017;177(5):651.
31. Chan, T, Sherbino, J. The McMaster Modular Assessment Program (McMAP). Acad Med 2015;90(7):900–5.
32. Sebok-Syer, SS, Klinger, DA, Sherbino, J, Chan, TM. Mixed messages or miscommunication? Investigating the relationship between assessors’ workplace-based assessment scores and written comments. Acad Med 2017;92(12):1774–9.
33. McConnell, M, Sherbino, J, Chan, TM. Mind the gap: the prospects of missing data. J Grad Med Educ 2016;8(5):708–12.
34. Chan, TM, Sherbino, J, Mercuri, M. Nuance and noise: lessons learned from longitudinal aggregated assessment data. J Grad Med Educ 2017;9(6):724–9.
35. Hall, AK, Rich, J, Dagnone, J, et al. P061: implementing in emergency medicine: lessons learned from the first 6 months of transition at Queen's University. CJEM 2018;20(S1):S78.
36. Dweck, CS. Mindset. 1st ed. New York: Penguin Random House LLC; 2003.