Searched for: person:burkrj01, in-biosketch:true
Total Results: 45

Foreword: The Next Era of Assessment and Precision Education

Schumacher, Daniel J; Santen, Sally A; Pugh, Carla M; Burk-Rafel, Jesse
PMID: 38109655
ISSN: 1938-808X
CID: 5612462

Learner Assessment and Program Evaluation: Supporting Precision Education

Richardson, Judee; Santen, Sally A; Mejicano, George C; Fancher, Tonya; Holmboe, Eric; Hogan, Sean O; Marin, Marina; Burk-Rafel, Jesse
Precision education (PE) systematically leverages data and advanced analytics to inform educational interventions that, in turn, promote meaningful learner outcomes. PE does this by incorporating analytic results back into the education continuum through continuous feedback cycles. These data-informed sequences of planning, learning, assessing, and adjusting foster competence and adaptive expertise. PE cycles occur at individual (micro), program (meso), or system (macro) levels. This article focuses on program- and system-level PE. Data for PE come from a multitude of sources, including learner assessment and program evaluation. The authors describe the link between these data and the vital role evaluation plays in providing evidence of educational effectiveness. By including prior program evaluation research supporting this claim, the authors illustrate the link between training programs and patient outcomes. They also describe existing national reports providing feedback to programs and institutions, as well as 2 emerging, multiorganization program- and system-level PE efforts. The challenges encountered by those implementing PE and the continuing need to advance this work illuminate the necessity for increased cross-disciplinary collaborations and a national cross-organizational data-sharing effort. Finally, the authors propose practical approaches for funding a national initiative in PE as well as potential models for advancing the field of PE. Lessons learned from successes by others illustrate the promise of these recommendations.
PMID: 38166211
ISSN: 1938-808X
CID: 5736972

Precision Education: The Future of Lifelong Learning in Medicine

Desai, Sanjay V; Burk-Rafel, Jesse; Lomis, Kimberly D; Caverzagie, Kelly; Richardson, Judee; O'Brien, Celia Laird; Andrews, John; Heckman, Kevin; Henderson, David; Prober, Charles G; Pugh, Carla M; Stern, Scott D; Triola, Marc M; Santen, Sally A
The goal of medical education is to produce a physician workforce capable of delivering high-quality equitable care to diverse patient populations and communities. To achieve this aim amidst explosive growth in medical knowledge and increasingly complex medical care, a system of personalized and continuous learning, assessment, and feedback for trainees and practicing physicians is urgently needed. In this perspective, the authors build on prior work to advance a conceptual framework for such a system: precision education (PE). PE is a system that uses data and technology to transform lifelong learning by improving personalization, efficiency, and agency at the individual, program, and organization levels. PE "cycles" start with data inputs proactively gathered from new and existing sources, including assessments, educational activities, electronic medical records, patient care outcomes, and clinical practice patterns. Through technology-enabled analytics, insights are generated to drive precision interventions. At the individual level, such interventions include personalized just-in-time educational programming. Coaching is essential to provide feedback and increase learner participation and personalization. Outcomes are measured using assessment and evaluation of interventions at the individual, program, and organizational level, with ongoing adjustment for repeated cycles of improvement. PE is rooted in patient, health system, and population data; promotes value-based care and health equity; and generates an adaptive learning culture. The authors suggest fundamental principles for PE, including promoting equity in structures and processes, learner agency, and integration with workflow (harmonization). Finally, the authors explore the immediate need to develop consensus-driven standards: rules of engagement between people, products, and entities that interact in these systems to ensure interoperability, data sharing, replicability, and scale of PE innovations.
PMID: 38277444
ISSN: 1938-808X
CID: 5625442

The Next Era of Assessment: Can Ensuring High-Quality, Equitable Patient Care Be the Defining Characteristic?

Schumacher, Daniel J; Kinnear, Benjamin; Burk-Rafel, Jesse; Santen, Sally A; Bullock, Justin L
Previous eras of assessment in medical education have been defined by how assessment is done, from knowledge exams popularized in the 1960s to the emergence of work-based assessment in the 1990s to current efforts to integrate multiple types and sources of performance data through programmatic assessment. Each of these eras was a response to why assessment was performed (e.g., assessing medical knowledge with exams; assessing communication, professionalism, and systems competencies with work-based assessment). Despite the evolution of assessment eras, current evidence highlights that trainees graduate with foundational gaps in their ability to provide high-quality care to patients presenting with common problems, and training program leaders report graduating trainees they would not trust to care for themselves or their loved ones. In this article, the authors argue that the next era of assessment should be defined by why assessment is done: to ensure high-quality, equitable care. Assessment should focus on demanding that graduates possess the knowledge, skills, attitudes, and adaptive expertise to meet the needs of all patients, and on ensuring that graduates can do so equitably. The authors explore 2 patient-focused assessment approaches that could help realize the promise of this envisioned era: entrustable professional activities (EPAs) and resident-sensitive quality measures (RSQMs)/TRainee Attributable and Automatable Care Evaluations in Real-time (TRACERs). These examples illustrate how the envisioned next era of assessment can leverage existing and new data to provide precision education assessment that delivers formative and summative feedback to trainees in a manner that prepares them to ensure high-quality, equitable patient outcomes.
PMID: 38109659
ISSN: 1938-808X
CID: 5612472

A Theoretical Foundation to Inform the Implementation of Precision Education and Assessment

Drake, Carolyn B; Heery, Lauren M; Burk-Rafel, Jesse; Triola, Marc M; Sartori, Daniel J
Precision education (PE) uses personalized educational interventions to empower trainees and improve learning outcomes. While PE has the potential to represent a paradigm shift in medical education, a theoretical foundation to guide the effective implementation of PE strategies has not yet been described. Here, the authors introduce a theoretical foundation for the implementation of PE, integrating key learning theories with the digital tools that allow them to be operationalized. Specifically, the authors describe how the master adaptive learner (MAL) model, transformative learning theory, and self-determination theory can be harnessed in conjunction with nudge strategies and audit and feedback dashboards to drive learning and meaningful behavior change. The authors also provide practical examples of these theories and tools in action by describing precision interventions already in use at one academic medical center, concretizing PE's potential in the current clinical environment. These examples illustrate how a firm theoretical grounding allows educators to most effectively tailor PE interventions to fit individual learners' needs and goals, facilitating efficient learning and, ultimately, improving patient and health system outcomes.
PMID: 38113440
ISSN: 1938-808X
CID: 5612362

Leveraging Electronic Health Record Data and Measuring Interdependence in the Era of Precision Education and Assessment

Sebok-Syer, Stefanie S; Small, William R; Lingard, Lorelei; Glober, Nancy K; George, Brian C; Burk-Rafel, Jesse
PURPOSE: The era of precision education is increasingly leveraging electronic health record (EHR) data to assess residents' clinical performance. But precision in what the EHR-based resident performance metrics are truly assessing is not fully understood. For instance, there is limited understanding of how EHR-based measures account for the influence of the team on an individual's performance, or conversely, how an individual contributes to team performances. This study aims to elaborate on how the theoretical understandings of supportive and collaborative interdependence are captured in residents' EHR-based metrics. METHOD: Using a mixed methods study design, the authors conducted a secondary analysis of 5 existing quantitative and qualitative datasets used in previous EHR studies to investigate how aspects of interdependence shape the ways that team-based care is provided to patients. RESULTS: Quantitative analyses of 16 EHR-based metrics found variability in faculty and resident performance (both between and within resident). Qualitative analyses revealed that faculty lack awareness of their own EHR-based performance metrics, which limits their ability to act interdependently with residents in an evidence-informed fashion. The lens of interdependence elucidates how resident practice patterns develop across residency training, shifting from supportive to collaborative interdependence over time. Joint displays merging the quantitative and qualitative analyses showed that residents are aware of variability in faculty's practice patterns and that viewing resident EHR-based measures without accounting for the interdependence of residents with faculty is problematic, particularly within the framework of precision education. CONCLUSIONS: To prepare for this new paradigm of precision education, educators need to develop and evaluate theoretically robust models that measure interdependence in EHR-based metrics, affording more nuanced interpretation of such metrics when assessing residents throughout training.
PMID: 38207084
ISSN: 1938-808X
CID: 5686572
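The study above reports variability in EHR-based performance metrics both between and within residents. As a rough illustration of what a between- versus within-resident variance decomposition can look like, the sketch below uses synthetic encounter-level data and a hypothetical metric; it is not the authors' analysis, which drew on 16 real EHR-based metrics plus qualitative data.

```python
# Illustrative sketch (not the study's analysis): decomposing variability in a
# single EHR-based performance metric into between-resident and within-resident
# components. The metric, encounter-level rows, and data are all synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
# Hypothetical encounter-level data: each row is one patient encounter,
# attributed to one resident, with one metric value (e.g., order turnaround time).
residents = np.repeat([f"res_{i}" for i in range(20)], 50)
resident_effect = np.repeat(rng.normal(0, 1.0, 20), 50)   # stable between-resident differences
encounter_noise = rng.normal(0, 2.0, residents.size)      # within-resident, encounter-level variation
df = pd.DataFrame({"resident": residents,
                   "metric": 10 + resident_effect + encounter_noise})

grand_mean = df["metric"].mean()
by_res = df.groupby("resident")["metric"]
between_var = ((by_res.mean() - grand_mean) ** 2).mean()   # variance of resident means
within_var = by_res.var(ddof=1).mean()                     # average variance within each resident

print(f"between-resident variance: {between_var:.2f}")
print(f"within-resident variance:  {within_var:.2f}")
print(f"share attributable to the resident: {between_var / (between_var + within_var):.2%}")
```

A fuller treatment of the interdependence the authors describe would also model the supervising faculty (e.g., with crossed random effects), since metrics attributed to a resident partly reflect the attendings they work with.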

A New Tool for Holistic Residency Application Review: Using Natural Language Processing of Applicant Experiences to Predict Interview Invitation

Mahtani, Arun Umesh; Reinstein, Ilan; Marin, Marina; Burk-Rafel, Jesse
PROBLEM: Reviewing residency application narrative components is time intensive and has contributed to nearly half of applications not receiving holistic review. The authors developed a natural language processing (NLP)-based tool to automate review of applicants' narrative experience entries and predict interview invitation. APPROACH: Experience entries (n = 188,500) were extracted from 6,403 residency applications across 3 application cycles (2017-2019) at 1 internal medicine program, combined at the applicant level, and paired with the interview invitation decision (n = 1,224 invitations). NLP identified important words (or word pairs) with term frequency-inverse document frequency, which were used to predict interview invitation using logistic regression with L1 regularization. Terms remaining in the model were analyzed thematically. Logistic regression models were also built using structured application data and a combination of NLP and structured data. Model performance was evaluated on never-before-seen data using area under the receiver operating characteristic and precision-recall curves (AUROC, AUPRC). OUTCOMES: The NLP model had an AUROC of 0.80 (vs chance decision of 0.50) and AUPRC of 0.49 (vs chance decision of 0.19), showing moderate predictive strength. Phrases indicating active leadership, research, or work in social justice and health disparities were associated with interview invitation. The model's detection of these key selection factors demonstrated face validity. Adding structured data to the model significantly improved prediction (AUROC 0.92, AUPRC 0.73), as expected given reliance on such metrics for interview invitation. NEXT STEPS: This model represents a first step in using NLP-based artificial intelligence tools to promote holistic residency application review. The authors are assessing the practical utility of using this model to identify applicants screened out using traditional metrics. Generalizability must be determined through model retraining and evaluation at other programs. Work is ongoing to thwart model "gaming," improve prediction, and remove unwanted biases introduced during model training.
PMID: 36940395
ISSN: 1938-808X
CID: 5708082
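The entry above describes the core modeling recipe: TF-IDF features over applicants' combined experience entries, an L1-regularized logistic regression, and evaluation with AUROC and AUPRC on held-out data. The sketch below shows one plausible scikit-learn implementation of that recipe; the column names, hyperparameters, and toy data are assumptions, not the authors' actual pipeline or dataset.

```python
# Minimal sketch of the described approach: TF-IDF over experience text,
# L1-regularized logistic regression, AUROC/AUPRC on held-out data.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, average_precision_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

# Hypothetical frame: one row per applicant, experience entries already concatenated.
df = pd.DataFrame({
    "experience_text": ["led a student-run free clinic ...",
                        "research assistant in a health disparities lab ...",
                        "scribe in the emergency department ..."] * 100,
    "invited": [1, 1, 0] * 100,
})

X_train, X_test, y_train, y_test = train_test_split(
    df["experience_text"], df["invited"], test_size=0.2,
    random_state=42, stratify=df["invited"])

model = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), min_df=2)),          # words and word pairs
    ("clf", LogisticRegression(penalty="l1", solver="liblinear", C=1.0)),  # L1 keeps few terms
])
model.fit(X_train, y_train)

scores = model.predict_proba(X_test)[:, 1]
print("AUROC:", roc_auc_score(y_test, scores))
print("AUPRC:", average_precision_score(y_test, scores))
```

Inspecting the nonzero coefficients of the fitted L1 model is one way to surface the kinds of themes (leadership, research, health disparities work) the authors report analyzing.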

Identifying Meaningful Patterns of Internal Medicine Clerkship Grading Distributions: Application of Data Science Techniques Across 135 U.S. Medical Schools

Burk-Rafel, Jesse; Reinstein, Ilan; Park, Yoon Soo
PROBLEM/OBJECTIVE: Residency program directors use clerkship grades for high-stakes selection decisions despite substantial variability in grading systems and distributions. The authors apply clustering techniques from data science to identify groups of schools for which grading distributions were statistically similar in the internal medicine clerkship. APPROACH/METHODS: Grading systems (e.g., honors/pass/fail) and distributions (i.e., percent of students in each grade tier) were tabulated for the internal medicine clerkship at U.S. MD-granting medical schools by manually reviewing Medical Student Performance Evaluations (MSPEs) in the 2019 and 2020 residency application cycles. Grading distributions were analyzed using k-means cluster analysis, with the optimal number of clusters selected using model fit indices. OUTCOMES/RESULTS: Among the 145 medical schools with available MSPE data, 64 distinct grading systems were reported. Among the 135 schools reporting a grading distribution, the median percent of students receiving the highest and lowest tier grade was 32% (range: 2%-66%) and 2% (range: 0%-91%), respectively. A 4-cluster solution was optimal (η² = 0.8): cluster 1 (45% [highest grade tier]-45% [middle tier]-10% [lowest tier], n = 64 [47%] schools), cluster 2 (25%-30%-45%, n = 40 [30%] schools), cluster 3 (20%-75%-5%, n = 25 [19%] schools), and cluster 4 (15%-25%-25%-25%-10%, n = 6 [4%] schools). The findings suggest internal medicine clerkship grading systems may be more comparable across institutions than previously thought. NEXT STEPS/CONCLUSIONS: The authors will prospectively review reported clerkship grading approaches across additional specialties and are conducting a mixed-methods analysis, incorporating a sequential explanatory model, to interview stakeholder groups on the use of the patterns identified.
PMID: 36484555
ISSN: 1938-808X
CID: 5378842
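The abstract above describes k-means clustering of schools' clerkship grading distributions, with the number of clusters chosen by a model fit index (η² is reported). The sketch below illustrates that general approach on synthetic three-tier distributions; the data, the η² implementation, and the candidate range of k are assumptions for illustration, not the study's exact procedure.

```python
# Illustrative sketch: k-means over grading distributions (percent of students
# per grade tier), with eta-squared (between-cluster / total sum of squares)
# used as a fit index for choosing the number of clusters.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical data: rows are schools, columns are percent of students receiving
# the highest, middle, and lowest grade tier (rows sum to roughly 100).
X = np.vstack([
    rng.normal([45, 45, 10], 5, size=(60, 3)),   # honors-heavy pattern
    rng.normal([25, 30, 45], 5, size=(40, 3)),   # lowest-tier-heavy pattern
    rng.normal([20, 75, 5], 5, size=(25, 3)),    # middle-heavy pattern
])

def eta_squared(X, labels):
    """Proportion of total variance explained by cluster membership."""
    grand_mean = X.mean(axis=0)
    total_ss = ((X - grand_mean) ** 2).sum()
    between_ss = sum(
        (labels == k).sum() * ((X[labels == k].mean(axis=0) - grand_mean) ** 2).sum()
        for k in np.unique(labels)
    )
    return between_ss / total_ss

for k in range(2, 7):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    print(f"k={k}: eta^2 = {eta_squared(X, labels):.2f}")
```

In practice one would pick the smallest k at which η² is high and adding further clusters explains little additional variance, an elbow-style choice consistent with the fit-index selection the authors describe.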

Reimagining the Transition to Residency: A Trainee Call to Accelerated Action

Lin, Grant L; Guerra, Sylvia; Patel, Juhee; Burk-Rafel, Jesse
The transition from medical student to resident is a pivotal step in the medical education continuum. For applicants, successfully obtaining a residency position is the actualization of a dream after years of training and has life-changing professional and financial implications. These high stakes contribute to a residency application and Match process in the United States that is increasingly complex and dysfunctional, and that does not effectively serve applicants, residency programs, or the public good. In July 2020, the Coalition for Physician Accountability (Coalition) formed the Undergraduate Medical Education-Graduate Medical Education Review Committee (UGRC) to critically assess the overall transition to residency and offer recommendations to solve the growing challenges in the system. In this Invited Commentary, the authors reflect on their experience as the trainee representatives on the UGRC. They emphasize the importance of trainee advocacy in medical education change efforts; reflect on opportunities, concerns, and tensions with the final UGRC recommendations (released in August 2021); discuss factors that may constrain implementation; and call for the medical education community, and the Coalition member organizations in particular, to accelerate full implementation of the UGRC recommendations. By seizing the momentum created by the UGRC, the medical education community can create a reimagined transition to residency that reshapes its approach to training a more diverse, competent, and growth-oriented physician workforce.
PMID: 35263298
ISSN: 1938-808X
CID: 5220952

The Undergraduate to Graduate Medical Education Transition as a Systems Problem: A Root Cause Analysis

Swails, Jennifer L; Angus, Steven; Barone, Michael A; Bienstock, Jessica; Burk-Rafel, Jesse; Roett, Michelle A; Hauer, Karen E
The transition from undergraduate medical education (UME) to graduate medical education (GME) constitutes a complex system with important implications for learner progression and patient safety. The transition is currently dysfunctional, requiring students and residency programs to spend significant time, money, and energy on the process. Applications and interviews continue to increase despite stable match rates. Although many in the medical community acknowledge the problems with the UME-GME transition and learners have called for prompt action to address these concerns, the underlying causes are complex and have defied easy fixes. This article describes the work of the Coalition for Physician Accountability's Undergraduate Medical Education to Graduate Medical Education Review Committee (UGRC) to apply a quality improvement approach and systems thinking to explore the underlying causes of dysfunction in the UME-GME transition. The UGRC performed a root cause analysis using the 5 whys and an Ishikawa (or fishbone) diagram to deeply explore problems in the UME-GME transition. The root causes of problems identified include culture, costs and limited resources, bias, systems, lack of standards, and lack of alignment. Using the principles of systems thinking (components, connections, and purpose), the UGRC considered interactions among the root causes and developed recommendations to improve the UME-GME transition. Several of the UGRC's recommendations stemming from this work are explained. Sustained monitoring will be necessary to ensure interventions move the process forward to better serve applicants, programs, and the public good.
PMID: 36538695
ISSN: 1938-808X
CID: 5426192