Characterizing Residents' Clinical Experiences-A Step Toward Precision Education
Burk-Rafel, Jesse; Drake, Carolyn B; Sartori, Daniel J
PMID: 39693075
ISSN: 2574-3805
CID: 5764502
Foreword: The Next Era of Assessment and Precision Education
Schumacher, Daniel J; Santen, Sally A; Pugh, Carla M; Burk-Rafel, Jesse
PMID: 38109655
ISSN: 1938-808X
CID: 5612462
Learner Assessment and Program Evaluation: Supporting Precision Education
Richardson, Judee; Santen, Sally A; Mejicano, George C; Fancher, Tonya; Holmboe, Eric; Hogan, Sean O; Marin, Marina; Burk-Rafel, Jesse
Precision education (PE) systematically leverages data and advanced analytics to inform educational interventions that, in turn, promote meaningful learner outcomes. PE does this by incorporating analytic results back into the education continuum through continuous feedback cycles. These data-informed sequences of planning, learning, assessing, and adjusting foster competence and adaptive expertise. PE cycles occur at individual (micro), program (meso), or system (macro) levels. This article focuses on program- and system-level PE. Data for PE come from a multitude of sources, including learner assessment and program evaluation. The authors describe the link between these data and the vital role evaluation plays in providing evidence of educational effectiveness. By including prior program evaluation research supporting this claim, the authors illustrate the link between training programs and patient outcomes. They also describe existing national reports providing feedback to programs and institutions, as well as 2 emerging, multiorganization program- and system-level PE efforts. The challenges encountered by those implementing PE and the continuing need to advance this work illuminate the necessity for increased cross-disciplinary collaborations and a national cross-organizational data-sharing effort. Finally, the authors propose practical approaches for funding a national initiative in PE as well as potential models for advancing the field of PE. Lessons learned from successes by others illustrate the promise of these recommendations.
PMID: 38166211
ISSN: 1938-808X
CID: 5736972
A Theoretical Foundation to Inform the Implementation of Precision Education and Assessment
Drake, Carolyn B; Heery, Lauren M; Burk-Rafel, Jesse; Triola, Marc M; Sartori, Daniel J
Precision education (PE) uses personalized educational interventions to empower trainees and improve learning outcomes. While PE has the potential to represent a paradigm shift in medical education, a theoretical foundation to guide the effective implementation of PE strategies has not yet been described. Here, the authors introduce a theoretical foundation for the implementation of PE, integrating key learning theories with the digital tools that allow them to be operationalized. Specifically, the authors describe how the master adaptive learner (MAL) model, transformative learning theory, and self-determination theory can be harnessed in conjunction with nudge strategies and audit and feedback dashboards to drive learning and meaningful behavior change. The authors also provide practical examples of these theories and tools in action by describing precision interventions already in use at one academic medical center, concretizing PE's potential in the current clinical environment. These examples illustrate how a firm theoretical grounding allows educators to most effectively tailor PE interventions to fit individual learners' needs and goals, facilitating efficient learning and, ultimately, improving patient and health system outcomes.
PMID: 38113440
ISSN: 1938-808X
CID: 5612362
The Next Era of Assessment: Can Ensuring High-Quality, Equitable Patient Care Be the Defining Characteristic?
Schumacher, Daniel J; Kinnear, Benjamin; Burk-Rafel, Jesse; Santen, Sally A; Bullock, Justin L
Previous eras of assessment in medical education have been defined by how assessment is done, from knowledge exams popularized in the 1960s to the emergence of work-based assessment in the 1990s to current efforts to integrate multiple types and sources of performance data through programmatic assessment. Each of these eras was a response to why assessment was performed (e.g., assessing medical knowledge with exams; assessing communication, professionalism, and systems competencies with work-based assessment). Despite the evolution of assessment eras, current evidence highlights the graduation of trainees with foundational gaps in the ability to provide high-quality care to patients presenting with common problems, and training program leaders report they graduate trainees they would not trust to care for themselves or their loved ones. In this article, the authors argue that the next era of assessment should be defined by why assessment is done: to ensure high-quality, equitable care. Assessment should focus on demanding that graduates possess the knowledge, skills, attitudes, and adaptive expertise to meet the needs of all patients, and on ensuring that graduates are able to do this in an equitable fashion. The authors explore 2 patient-focused assessment approaches that could help realize the promise of this envisioned era: entrustable professional activities (EPAs) and resident-sensitive quality measures (RSQMs)/TRainee Attributable and Automatable Care Evaluations in Real-time (TRACERs). These examples illustrate how the envisioned next era of assessment can leverage existing and new data to provide precision education assessment that focuses on providing formative and summative feedback to trainees in a manner that ensures their learning outcomes prepare them to deliver high-quality, equitable patient outcomes.
PMID: 38109659
ISSN: 1938-808X
CID: 5612472
Precision Education: The Future of Lifelong Learning in Medicine
Desai, Sanjay V; Burk-Rafel, Jesse; Lomis, Kimberly D; Caverzagie, Kelly; Richardson, Judee; O'Brien, Celia Laird; Andrews, John; Heckman, Kevin; Henderson, David; Prober, Charles G; Pugh, Carla M; Stern, Scott D; Triola, Marc M; Santen, Sally A
The goal of medical education is to produce a physician workforce capable of delivering high-quality equitable care to diverse patient populations and communities. To achieve this aim amidst explosive growth in medical knowledge and increasingly complex medical care, a system of personalized and continuous learning, assessment, and feedback for trainees and practicing physicians is urgently needed. In this perspective, the authors build on prior work to advance a conceptual framework for such a system: precision education (PE). PE is a system that uses data and technology to transform lifelong learning by improving personalization, efficiency, and agency at the individual, program, and organization levels. PE "cycles" start with data inputs proactively gathered from new and existing sources, including assessments, educational activities, electronic medical records, patient care outcomes, and clinical practice patterns. Through technology-enabled analytics, insights are generated to drive precision interventions. At the individual level, such interventions include personalized just-in-time educational programming. Coaching is essential to provide feedback and increase learner participation and personalization. Outcomes are measured using assessment and evaluation of interventions at the individual, program, and organizational level, with ongoing adjustment for repeated cycles of improvement. PE is rooted in patient, health system, and population data; promotes value-based care and health equity; and generates an adaptive learning culture. The authors suggest fundamental principles for PE, including promoting equity in structures and processes, learner agency, and integration with workflow (harmonization). Finally, the authors explore the immediate need to develop consensus-driven standards: rules of engagement between people, products, and entities that interact in these systems to ensure interoperability, data sharing, replicability, and scale of PE innovations.
PMID: 38277444
ISSN: 1938-808X
CID: 5625442
Leveraging Electronic Health Record Data and Measuring Interdependence in the Era of Precision Education and Assessment
Sebok-Syer, Stefanie S; Small, William R; Lingard, Lorelei; Glober, Nancy K; George, Brian C; Burk-Rafel, Jesse
PURPOSE:The era of precision education is increasingly leveraging electronic health record (EHR) data to assess residents' clinical performance. But what these EHR-based resident performance metrics are truly assessing is not fully understood. For instance, there is limited understanding of how EHR-based measures account for the influence of the team on an individual's performance, or conversely, how an individual contributes to team performance. This study aims to elaborate on how the theoretical understandings of supportive and collaborative interdependence are captured in residents' EHR-based metrics. METHOD:Using a mixed methods study design, the authors conducted a secondary analysis of 5 existing quantitative and qualitative datasets used in previous EHR studies to investigate how aspects of interdependence shape the ways that team-based care is provided to patients. RESULTS:Quantitative analyses of 16 EHR-based metrics found variability in faculty and resident performance (both between and within resident). Qualitative analyses revealed that faculty lack awareness of their own EHR-based performance metrics, which limits their ability to act interdependently with residents in an evidence-informed fashion. The lens of interdependence elucidates how resident practice patterns develop across residency training, shifting from supportive to collaborative interdependence over time. Joint displays merging the quantitative and qualitative analyses showed that residents are aware of variability in faculty's practice patterns and that viewing resident EHR-based measures without accounting for the interdependence of residents with faculty is problematic, particularly within the framework of precision education.
CONCLUSIONS:To prepare for this new paradigm of precision education, educators need to develop and evaluate theoretically robust models that measure interdependence in EHR-based metrics, affording more nuanced interpretation of such metrics when assessing residents throughout training.
PMID: 38207084
ISSN: 1938-808X
CID: 5686572
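The "between and within resident" variability this study reports can be illustrated with a toy variance decomposition. This sketch is not the authors' analysis; the function, variable names, and example values are assumptions, and the real study used 16 EHR-based metrics and mixed methods rather than this simplified split.

```python
from statistics import mean, pvariance

def variance_components(metric_by_resident):
    """Split a performance metric's variance into between- and within-resident
    parts (unweighted toy decomposition; assumes equal counts per resident).

    metric_by_resident: dict mapping resident id -> list of per-encounter values.
    """
    all_values = [v for vals in metric_by_resident.values() for v in vals]
    grand_mean = mean(all_values)
    # Between: how far each resident's average sits from the overall average.
    between = mean((mean(vals) - grand_mean) ** 2
                   for vals in metric_by_resident.values())
    # Within: how much each resident varies around their own average.
    within = mean(pvariance(vals) for vals in metric_by_resident.values())
    return between, within

# Two hypothetical residents with identical averages but different consistency:
# all of the variability here is within-resident, none is between-resident.
b, w = variance_components({"res_a": [2.0, 2.0], "res_b": [1.0, 3.0]})
```

A large within-resident component is one quantitative signature of interdependence: the same resident's metric shifts depending on the team and supervising faculty around them.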
A New Tool for Holistic Residency Application Review: Using Natural Language Processing of Applicant Experiences to Predict Interview Invitation
Mahtani, Arun Umesh; Reinstein, Ilan; Marin, Marina; Burk-Rafel, Jesse
PROBLEM:Reviewing residency application narrative components is time intensive and has contributed to nearly half of applications not receiving holistic review. The authors developed a natural language processing (NLP)-based tool to automate review of applicants' narrative experience entries and predict interview invitation. APPROACH:Experience entries (n = 188,500) were extracted from 6,403 residency applications across 3 application cycles (2017-2019) at 1 internal medicine program, combined at the applicant level, and paired with the interview invitation decision (n = 1,224 invitations). NLP identified important words (or word pairs) with term frequency-inverse document frequency, which were used to predict interview invitation using logistic regression with L1 regularization. Terms remaining in the model were analyzed thematically. Logistic regression models were also built using structured application data and a combination of NLP and structured data. Model performance was evaluated on never-before-seen data using area under the receiver operating characteristic and precision-recall curves (AUROC, AUPRC). OUTCOMES:The NLP model had an AUROC of 0.80 (vs chance decision of 0.50) and AUPRC of 0.49 (vs chance decision of 0.19), showing moderate predictive strength. Phrases indicating active leadership, research, or work in social justice and health disparities were associated with interview invitation. The model's detection of these key selection factors demonstrated face validity. Adding structured data to the model significantly improved prediction (AUROC 0.92, AUPRC 0.73), as expected given reliance on such metrics for interview invitation. NEXT STEPS:This model represents a first step in using NLP-based artificial intelligence tools to promote holistic residency application review. The authors are assessing the practical utility of using this model to identify applicants screened out using traditional metrics. 
Generalizability must be determined through model retraining and evaluation at other programs. Work is ongoing to thwart model "gaming," improve prediction, and remove unwanted biases introduced during model training.
PMID: 36940395
ISSN: 1938-808X
CID: 5708082
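The feature step of the pipeline described above (term frequency-inverse document frequency weighting of experience-entry text, later fed to logistic regression with L1 regularization) can be sketched in miniature. This is an illustrative toy, not the study's code; the function name, the sklearn-style smoothed IDF, and the example tokens are assumptions.

```python
import math
from collections import Counter

def tf_idf(docs):
    """TF-IDF weights for tokenized documents (toy sketch).

    Uses smoothed IDF, idf = ln((1 + n) / (1 + df)) + 1, so terms appearing
    in every document still get a nonzero weight. In the study, weights like
    these became features for an L1-regularized logistic regression
    predicting interview invitation.
    """
    n = len(docs)
    df = Counter()                      # document frequency per term
    for doc in docs:
        df.update(set(doc))
    weights = []
    for doc in docs:
        tf = Counter(doc)
        weights.append({
            term: (count / len(doc)) * (math.log((1 + n) / (1 + df[term])) + 1)
            for term, count in tf.items()
        })
    return weights

# Hypothetical experience-entry tokens: a term unique to one applicant
# ("leadership") outweighs a term shared by all applicants ("research").
docs = [["research", "leadership"], ["research", "clinic"]]
w = tf_idf(docs)
```

L1 regularization then zeroes out the weights of uninformative terms, which is why the surviving terms in the model could be analyzed thematically.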
Precision Medical Education
Triola, Marc M; Burk-Rafel, Jesse
Medical schools and residency programs are increasingly incorporating personalization of content, pathways, and assessments to align with a competency-based model. Yet, such efforts face challenges involving large amounts of data, sometimes struggling to deliver insights in a timely fashion for trainees, coaches, and programs. In this article, the authors argue that the emerging paradigm of precision medical education (PME) may ameliorate some of these challenges. However, PME lacks a widely accepted definition and a shared model of guiding principles and capacities, limiting widespread adoption. The authors propose defining PME as a systematic approach that integrates longitudinal data and analytics to drive precise educational interventions that address each individual learner's needs and goals in a continuous, timely, and cyclical fashion, ultimately improving meaningful educational, clinical, or system outcomes. Borrowing from precision medicine, they offer an adapted shared framework. In the P4 medical education framework, PME should (1) take a proactive approach to acquiring and using trainee data; (2) generate timely personalized insights through precision analytics (including artificial intelligence and decision-support tools); (3) design precision educational interventions (learning, assessment, coaching, pathways) in a participatory fashion, with trainees at the center as co-producers; and (4) ensure interventions are predictive of meaningful educational, professional, or clinical outcomes. 
Implementing PME will require new foundational capacities: flexible educational pathways and programs responsive to PME-guided dynamic and competency-based progression; comprehensive longitudinal data on trainees linked to educational and clinical outcomes; shared development of requisite technologies and analytics to effect educational decision-making; and a culture that embraces a precision approach, with research to gather validity evidence for this approach and development efforts targeting new skills needed by learners, coaches, and educational leaders. Anticipating pitfalls in the use of this approach will be important, as will ensuring it deepens, rather than replaces, the interaction of trainees and their coaches.
PMID: 37027222
ISSN: 1938-808X
CID: 5537182