N. S. Hoang1, J. Lau1. 1Stanford University, Department of General Surgery, Palo Alto, CA, USA
Introduction:
Competency-based medical education (CBME) is being widely implemented as medical education moves away from the Halstedian master-apprenticeship model (MAM). Despite the recent incorporation of milestones and entrustable professional activities (EPAs) into CBME frameworks, major challenges such as reductionism and the loss of authenticity persist. It has been argued that concretely defining every aspect of a qualified physician is an impossible task, and there is concern that reducing higher-order competencies to discrete, measurable items can lead to the overfitting of data. If curricula become ‘overfit’ to existing data, essential qualities required of the modern-day physician will not be targeted for development. Data science principles concern fitting models to data in ways that generalize to real-world needs, and they can provide insight into the development of curricula and assessment tools to advance the implementation of CBME.
Methods:
A thought experiment was conducted treating competencies as data points on a coordinate plane, with each point representing one aspect of a competent physician. A complex polynomial, representing a CBME curriculum, was formulated to capture each competency perfectly. New data points were then added to represent the qualitative aspects of competence that are absent from the atomized constructions typically seen in checklists. The complex polynomial, ‘overfit’ to the existing data, failed to predict the new points. Data science principles were applied to consider how overfitting can be prevented in CBME curricula and assessment tools.
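The thought experiment can be sketched numerically. The following is a minimal illustration, not part of the original abstract: the underlying curve, noise level, and polynomial degree are all assumptions chosen only to show that a polynomial fitted exactly through the existing points fails on held-out ones.

```python
import numpy as np

rng = np.random.default_rng(0)

# Existing "competency" data points: 8 measured points on a coordinate plane.
x = np.linspace(0.0, 1.0, 8)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.1, x.size)

# A degree-7 polynomial through 8 points captures every existing point exactly.
coeffs = np.polyfit(x, y, deg=7)
train_error = np.abs(np.polyval(coeffs, x) - y).max()

# New points (the qualitative aspects missing from checklists) lie between the
# fitted points; the overfit polynomial fails to predict them.
x_new = np.linspace(0.05, 0.95, 7)
y_new = np.sin(2 * np.pi * x_new)
test_error = np.abs(np.polyval(coeffs, x_new) - y_new).max()

print(f"error on existing points: {train_error:.2e}")
print(f"error on new points:      {test_error:.2e}")
```

The fitted curve has essentially zero error on the points it was built from, yet a much larger error on the new points, which is the sense in which a curriculum ‘overfit’ to predefined competencies misses qualities it never enumerated.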
Results:
Curricular measures to prevent overfitting include retaining subjectivity in competency definitions (e.g., not completely devaluing time as a contributor to competence) and considering non-competency domains (e.g., cultural competency and stewardship). Mixed-methods assessment by multiple assessors in differing contexts should be frequent and embedded within the curriculum to encourage self-reflection. To prevent overfitting, the authors propose a multifaceted assessment program involving the triangulation of methods, such that qualitative assessment tools can enrich the impressions formed by quantitative tools and uncover divergent dimensions of behavior not captured by either tool alone. Dedicated faculty assessors, physician coaches, and assessment-data summary techniques are described as a compromise with the current form of CBME, which has alienated some educators through checklists, jargon, and demands for uncompensated assessor training for faculty.
Conclusion:
Data science principles can be effectively applied to competency definitions to address major challenges of CBME and to facilitate the development of curricula and assessment tools. Mixed-methods assessment answers the criticisms lodged against CBME and, if properly incorporated, can advance its implementation.