Review of the Medical Student Performance Evaluation: analysis of the end-users' perspective across the specialties
Bird, Jeffrey B; Friedman, Karen A; Arayssi, Thurayya; Olvet, Doreen M; Conigliaro, Rosemarie L; Brenner, Judith M
The Medical Student Performance Evaluation (MSPE) is an important tool of communication used by program directors to make decisions in the residency application process. The aim of this study was to understand the perspective and usage of the MSPE across multiple medical specialties, both now and in anticipation of the planned change in USMLE Step 1 score-reporting. A survey instrument including quantitative and qualitative measures was developed and piloted. The final survey was distributed to residency programs across 28 specialties in 2020 via the main contact on the ACGME listserv. Of the 28 specialties surveyed, at least one response was received from 26 (93%). Eight percent of all programs (364/4675) responded to the survey, with most respondents being program directors. Usage of the MSPE varied among specialties. Approximately one-third of end-users stated that the MSPE is very or extremely influential in their initial screening process. Slightly less than half agreed or strongly agreed that they trust the information to be an accurate representation of applicants, though slightly more than half agreed that the MSPE will become more influential once USMLE Step 1 becomes pass/fail. Professionalism was rated as the most important component, and noteworthy characteristics were among the least important in the decision-making process. Performance in the internal medicine clerkship was rated as the most influential, while performance in the neurology and psychiatry clerkships was rated as less influential. Overwhelmingly, respondents suggested that including comparative performance and/or class rank would make the MSPE more useful once USMLE Step 1 becomes pass/fail. MSPE end-users across a variety of specialties utilize this complex document in different ways and value it differentially in their decision-making processes. Despite this, mistrust of the MSPE persists. A better understanding of end-users' perceptions of the MSPE offers the UME community an opportunity to transform the MSPE into a highly valued, trusted document of communication.
PMCID:7899642
PMID: 33606615
ISSN: 1087-2981
CID: 4823282
Can Content Experts Rely on Others to Reliably Score Open-Ended Questions on Summative Exams?
Olvet, Doreen M; Bird, Jeffrey B; Fulton, Tracy B; Kruidering, Marieke; Papp, Klara K; Qua, Kelli; Willey, Joanne M; Brenner, Judith M
PMID: 34705711
ISSN: 1938-808X
CID: 5473692
Current State of the Medical Student Performance Evaluation: A Tool for Reflection for Residency Programs
Brenner, Judith M; Bird, Jeffrey B; Brenner, Jason; Orner, David; Friedman, Karen
BACKGROUND: The Medical Student Performance Evaluation (MSPE) provides important information to residency programs. Despite recent recommendations for standardization, it is not clear how much variation exists in MSPE content among schools. OBJECTIVE: We describe the current section content of the MSPE in US allopathic medical schools, with a particular focus on variations in the presentation of student performance. METHODS: A representative MSPE was obtained from 95.3% (143 of 150) of allopathic US medical schools through residency applications to select programs at the Zucker School of Medicine at Hofstra/Northwell for the 2019-2020 academic year. A manual data abstraction tool was piloted in 2018-2019. After training, it was used to code all portions of the MSPE in this study. The results were analyzed, and descriptive statistics were reported. RESULTS: In the preclinical years, 30.8% of MSPEs reported data regarding student performance beyond achieving "passes" in a pass/fail curriculum. Only half referenced performance in the fourth year, including electives, acting internships, or both. About two-thirds of schools included an overall descriptor of comparative performance in the final paragraph. Among these schools, a majority provided adjectives such as "outstanding/excellent/very good/good," while one-quarter reported numerical data categories. Regarding clerkship grades, numerous nomenclature systems were used. CONCLUSIONS: This analysis demonstrates extreme variability in the content of MSPEs submitted by US allopathic medical schools in the 2019-2020 cycle, including the components and nomenclature of grades and descriptors of comparative performance, the display of data, and the inclusion of data across all years of the medical education program.
PMCID:8370358
PMID: 34434519
ISSN: 1949-8357
CID: 5473672
Changing Medical Education, Overnight: The Curricular Response to COVID-19 of Nine Medical Schools
Binks, Andrew P; LeClair, Renée J; Willey, Joanne M; Brenner, Judith M; Pickering, James D; Moore, Jesse S; Huggett, Kathryn N; Everling, Kathleen M; Arnott, John A; Croniger, Colleen M; Zehle, Christa H; Krane, N Kevin; Schwartzstein, Richard M
Issue: Calls to change medical education have been frequent, persistent, and generally limited to alterations in content or structural re-organization. Self-imposed barriers have prevented adoption of more radical pedagogical approaches, so recent predictions of the 'inevitability' of medical education transitioning to online delivery seemed unlikely. Then, in March 2020, the COVID-19 pandemic forced medical schools to overcome established barriers overnight and make the most rapid curricular shift in medical education's history. We share the collated reports of nine medical schools and postulate how recent responses may influence future medical education. Evidence: While extraneous pandemic-related factors make it impossible to scientifically distinguish the impact of the curricular changes, some themes emerged. The rapid transition to online delivery was made possible because all schools already had learning management systems and key electronic resources blended into their curricula; we were closer to online delivery than anticipated. Student engagement with online delivery varied with the pedagogies used, and the importance of social learning, interaction, and autonomy in learning was apparent. These are factors known to enhance online learning, and the student-centered modalities (e.g. problem-based learning) that included them appeared to be more engaging. Assumptions that the new online environment would be easily adopted and embraced by 'technophilic' students did not always hold true. Achieving true distance medical education will take longer than this 'overnight' response, but adhering to best practices for online education may open a new realm of possibilities. Implications: While this experience did not confirm that online medical education is really 'inevitable,' it revealed that it is possible. Thoughtfully blending more online components into a medical curriculum will allow us to take advantage of this environment's strengths, such as efficiency and support for asynchronous, autonomous learning that engages and fosters intrinsic learning in our students. While maintaining aspects of social interaction, online learning could enhance pre-clinical medical education by allowing integration and collaboration among classes of medical students, with other health professionals, and even between medical schools. What remains to be seen is whether COVID-19 provided the experience, vision, and courage for medical education to change, or whether the old barriers will rise again when the pandemic is over.
PMID: 33706632
ISSN: 1532-8015
CID: 4823432
Third year medical students impersonalize and hedge when providing negative upward feedback to clinical faculty
Olvet, Doreen M; Willey, Joanne M; Bird, Jeffrey B; Rabin, Jill M; Pearlman, R Ellen; Brenner, Judith
Medical students provide clinical teaching faculty with feedback on their skills as educators through anonymous surveys at the end of their clerkship rotations. Because faculty are in a position of power, students are hesitant to provide candid feedback. Our objective was to determine whether medical students were willing to provide negative upward feedback to clinical faculty and to describe how they conveyed that feedback. A qualitative analysis of third-year medical students' open-ended comments from evaluations of six clerkships was performed using politeness theory as a conceptual framework. Students were asked to describe how the clerkship enhanced their learning and how it could be improved. Midway through the academic year, an instruction to provide the full names of faculty/residents was added. Overall, there were significantly more comments on what worked well than suggestions for improvement regarding faculty/residents. Instructing students to name names increased the rate of naming from 35% to 75% for what worked well and from 13% to 39% for suggestions for improvement. Hedging language was included in 61% of suggestions for improvement but in only 2% of comments on what worked well. Students described the variability of their experience, used passive language, and qualified negative experiences with positive ones. Medical students may use linguistic strategies, such as impersonalizing and hedging, to mitigate the impact of negative upward feedback. Working toward a culture that supports upward feedback would allow students to feel more comfortable providing candid comments about their experience.
PMID: 33657329
ISSN: 1466-187X
CID: 4823332
Innovation in Leadership Development in Undergraduate Medical Education
Jordan, Tiffany M; Willey, Joanne M; Brenner, Judith M
In response to the need for physician leaders, the Donald and Barbara Zucker School of Medicine at Hofstra/Northwell developed the Klar Leadership Development and Innovation Management program. This novel program leverages its partnership with a large Northeast health system to longitudinally provide students with leadership fundamentals and mentored experiences.
PMCID:8368466
PMID: 34457857
ISSN: 2156-8650
CID: 5473682
Twelve tips for assessing medical knowledge with open-ended questions: Designing constructed response examinations in medical education
Hauer, Karen E; Boscardin, Christy; Brenner, Judith M; van Schaik, Sandrijn M; Papp, Klara K
Medical knowledge examinations employing open-ended (constructed response) items can be useful for assessing medical students' factual and conceptual understanding. Modern-day curricula that emphasize active learning in small groups and other interactive formats lend themselves to an assessment format that prompts students to share conceptual understanding, explain, and elaborate. The open-ended question examination format can provide faculty with insights into learners' abilities to apply information to clinical or scientific problems, and can reveal learners' misunderstandings about essential content. To implement formative or summative assessments with open-ended questions in a rigorous manner, educators must design systems for exam creation and scoring. This includes systems for constructing exam blueprints, items, and scoring rubrics, and procedures for scoring and standard setting. Information gained through review of students' responses can guide future educational sessions and curricular changes in a cycle of continuous improvement.
PMID: 31282798
ISSN: 1466-187X
CID: 5473642
Pandemics Past and Present: A Guided Inquiry Approach
Willey, Joanne M; Olvet, Doreen M; Bird, Jeffrey B; Brenner, Judith M
Background: COVID-19 exposed undergraduate medical education curricular gaps in exploring historical pandemics, in critically consuming scientific literature and squaring it with the lay press, and in grappling with emerging ethical issues. In addition, as medical students were dismissed from clinical environments, their capacity to build community and promote professional identity formation was compromised. Methods: A course on pandemics past and present was developed using a modified guided inquiry approach. Students met daily for 2 weeks in groups of 15 to 18 with a process facilitator. During the first week, students reported on lessons learned from past pandemics; in the second week, students discussed ethical concerns surrounding COVID-19 clinical trials, heard from physicians who provided patient care in the HIV and COVID-19 pandemics, and concluded with an opportunity for reflection. Following the course, students were asked to complete an anonymous, voluntary survey to assess their perceptions of the course. Results: With a response rate of 69%, an overwhelming majority of students agreed or strongly agreed that learning about historical pandemics helped them understand COVID-19 (72, 99%). The course successfully helped students understand current and potential COVID-19 management strategies, as 66 (90%) agreed or strongly agreed that they developed a better understanding of nonpharmacological interventions and new pharmacological treatments. Students also gained insight into the experiences of healthcare providers who cared for patients with HIV and COVID-19. Qualitative analysis of the open-ended comments yielded 5 main themes: critical appraisal of resources, responsibility of the physician, humanism, knowledge related to pandemics, and learning from history. Conclusions: The onset of the COVID-19 crisis illustrated curricular gaps that could be remedied by introducing the history and biology of pandemics earlier in the curriculum. It was also apparent that learners need more practice in critically reviewing literature and comparing scientific literature with the lay press. The flexible format of the course promotes the development of future iterations that could cover evolving topics related to COVID-19. The course could also be repurposed for a graduate or continuing medical education audience.
PMCID:7705775
PMID: 33294621
ISSN: 2382-1205
CID: 4722452
Patients don't come with multiple choice options: essay-based assessment in UME
Bird, Jeffrey B; Olvet, Doreen M; Willey, Joanne M; Brenner, Judith
Curricular revision efforts have resulted in learner-centered programs that value content integration and active learning. Yet less attention has been paid to assessment methods that are learner-centered and promote assessment for learning. The use of context-rich short-answer question (CR-SAQ) exams in the preclinical years of medical school was evaluated to determine whether this format aligns with the criteria for assessment for learning. Medical students and preclinical faculty members were sent a survey comprising closed- and open-ended questions about their experience using CR-SAQ exams. Data were analyzed using a mixed-methods design. Open-ended responses were evaluated using thematic analysis within the framework of criteria for assessment for learning. A total of 274 students (94%) and 24 faculty (75%) completed the survey. Fifty-four percent of students reported preferring a CR-SAQ exam format over a multiple choice question (MCQ) format. Quantitative data and qualitative comments by students supported that CR-SAQ exams aligned with criteria for assessment for learning, including acceptability, authenticity, educational effect, and the cueing effect. Student concerns included preparation for the USMLE Step 1 exam, as well as the validity and reproducibility of CR-SAQ assessments. Faculty largely agreed with the benefits of the CR-SAQ but were concerned about feasibility, acceptability, and reproducibility. The CR-SAQ exam format supports assessment for learning in an undergraduate medical education setting. Both benefits and drawbacks of this method are presented; however, students and faculty describe a broader impact of this assessment method on their development as physicians.
PMCID:6720218
PMID: 31438809
ISSN: 1087-2981
CID: 4175022
The Revised Medical School Performance Evaluation: Does It Meet the Needs of Its Readers?
Brenner, Judith M; Arayssi, Thurayya; Conigliaro, Rosemarie L; Friedman, Karen
BACKGROUND: The Medical School Performance Evaluation (MSPE) is an important factor for application to residency programs. Many medical schools are incorporating recent recommendations from the Association of American Medical Colleges MSPE Task Force into their letters. To date, there has been no feedback from the graduate medical education community on the impact of this effort. OBJECTIVE: We surveyed individuals involved in residency candidate selection for internal medicine programs to understand their perceptions of the new MSPE format. METHODS: A survey was distributed in March and April 2018 using the Association of Program Directors in Internal Medicine listserv, which comprises 4220 individuals from 439 residency programs. Responses were analyzed, and themes were extracted from open-ended questions. RESULTS: A total of 140 individuals, predominantly program directors and associate program directors, from across the United States completed the survey. Most were aware of the existence of the MSPE Task Force. Respondents read a median of 200 to 299 letters each recruitment season. The majority reported observing evidence of adoption of the new format in more than one quarter of all medical schools. Among respondents, nearly half reported that the new format made the MSPE more important in decision-making about a candidate. Within the MSPE, respondents recognized the following areas as most influential: academic progress, summary paragraph, graphic representation of class performance, academic history, and overall adjective of performance indicator (rank). CONCLUSIONS: The internal medicine graduate medical education community finds value in many components of the new MSPE format, while recognizing there are further opportunities for improvement.
PMCID:6699531
PMID: 31440345
ISSN: 1949-8357
CID: 5473652