Searched for: in-biosketch:true person:mannd01
Total Results: 172


Design thinking for implementation science: A case study employing user-centered digital design methodology to create usable decision support

Chokshi, Sara; Belli, Hayley; Troxel, Andrea; Schwartz, Jessica; Blecker, Saul; Blaum, Caroline; Szerencsy, Adam; Testa, Paul; Mann, Devin
[S.l.] : 11th Annual Conference on the Science of Dissemination and Implementation in Health, 2018
(Website)
CID: 4256142

Bridging the Gap Between Academic Research and Pragmatic Needs in Usability: A Hybrid Approach to Usability Evaluation of Health Care Information Systems

Mann, Devin M; Chokshi, Sara Kuppin; Kushniruk, Andre
BACKGROUND: Technology is increasingly embedded into the full spectrum of health care. This movement has benefited from the application of software development practices such as usability testing and agile development processes. These practices are frequently applied in both commercial or operational and academic settings. However, the relative importance placed on rapid iteration, validity, reproducibility, generalizability, and efficiency differs between the 2 settings and the needs and objectives of academic versus pragmatic usability evaluations. OBJECTIVE: This paper explores how usability evaluation typically varies on key dimensions in pragmatic versus academic settings that impact the rapidity, validity, and reproducibility of findings and proposes a hybrid approach aimed at satisfying both pragmatic and academic objectives. METHODS: We outline the characteristics of pragmatic versus academically oriented usability testing in health care, describe the tensions and gaps resulting from differing contexts and goals, and present a model of this hybrid process along with 2 case studies of digital development projects in which we demonstrate this integrated approach to usability evaluation. RESULTS: The case studies presented illustrate design choices characteristic of our hybrid approach to usability evaluation. CONCLUSIONS: Designed to leverage the strengths of both pragmatically and academically focused usability studies, a hybrid approach allows new development projects to efficiently iterate and optimize from usability data while preserving the ability of these projects to produce deeper insights via thorough qualitative analysis to inform further tool development and usability research by way of academically focused dissemination.
PMID: 30487119
ISSN: 2292-9495
CID: 3500652

INFUSING DESIGN THINKING INTO A HEALTH SYSTEM ONE DIGITAL INTERVENTION AT A TIME: THE AMBULATORY PRODUCT RESEARCH AND INNOVATION LAB (APRIL) AT NYU LANGONE HEALTH [Meeting Abstract]

Mann, Devin; Chokshi, Sara
ISI:000431185201157
ISSN: 0883-6612
CID: 3113972

USER-CENTERED DESIGN IN BEHAVIORAL MEDICINE-METHODS FOR PRAGMATIC INTERVENTIONS [Meeting Abstract]

Mullane, Sarah L.; Takemoto, Michelle L.; Mullane, Sarah L.; Ellis, Rebecca J. Bartlett; Mann, Devin
ISI:000431185201153
ISSN: 0883-6612
CID: 3113982

"The only advantage is it forces you to click 'dismiss'": Usability testing for interruptive versus non-interruptive clinical decision support [Meeting Abstract]

Blecker, S; Pandya, R K; Stork, S; Mann, D M; Austrian, J
Background: Clinical decision support (CDS) has been shown to improve compliance with evidence-based care but its impact is often diminished due to issues such as poor usability, insufficient integration into workflow, and alert fatigue. Non-interruptive CDS may be less subject to alert fatigue but there has been little assessment of its usability. The purpose of this study was to perform usability testing on interruptive and non-interruptive versions of a CDS. Methods: We conducted a usability study of a CDS tool that recommended prescribing an angiotensin converting enzyme (ACE) inhibitor for inpatients with heart failure. We developed two versions of the CDS that varied in format: an interruptive alert, in which the CDS popped up at the time of order entry, and a non-interruptive alert, which was displayed in a checklist section of the Electronic Health Record (EHR). We recruited inpatient providers to use both versions in a laboratory setting. We randomly assigned providers to first trigger the interruptive or non-interruptive alert. Providers were given a clinical scenario and asked to "think aloud" as they worked through the CDS; we then conducted a brief semi-structured interview about usability. We used a constant comparative analysis informed by the Five Rights of CDS framework to analyze the interviews. Inpatient providers from different disciplines were recruited until thematic saturation was reached. Results: Of 12 providers who participated in usability testing, seven used the interruptive followed by the non-interruptive CDS and five used the non-interruptive CDS initially. We categorized codes into four themes related to the Five Rights of CDS and determined some codes to be general to the CDS while others were specific to the interruptive or non-interruptive version (Table). Providers noted that the interruptive alert was readily noticed but generally impeded workflow. Providers found the non-interruptive CDS to be less annoying but noted that it had lower visibility; although they liked the ability to address the non-interruptive CDS at any time, some providers questioned whether it would ultimately be used. Conclusions: Providers expressed annoyance in working with an interruptive CDS. Although the non-interruptive CDS was more appealing, providers admitted that it may not be used unless integrated with workflow. One potential solution was a combination of the two formats: supplementing a non-interruptive alert with an occasional, well-timed interruptive alert if uptake was insufficient.
EMBASE:622328861
ISSN: 1525-1497
CID: 3138052

Live usability testing of two complex clinical decision support tools [Meeting Abstract]

Richardson, S; Feldstein, D; McGinn, T; Park, L S; Khan, S; Hess, R; Smith, P D; Mishuris, R G; McCullagh, L; Mann, D M
Background: The potential of the electronic health record (EHR) and clinical decision support (CDS) to improve the practice of medicine has been significantly tempered by poor design and the resulting burden they place on health care providers. CDS is rarely tested in the real clinical environment. As a result, many tools are hard to use, placing strain on providers and resulting in low adoption rates. This is the first study to evaluate CDS usability and the provider-computer-patient interaction in the real clinical environment. The objective of this study was to further understand barriers and facilitators of meaningful CDS usage within a real clinical context. Methods: This qualitative observational study was conducted with three primary care providers during a total of six patient care sessions. In patients with the chief complaint of sore throat, the Centor Score was used to stratify the risk of group A strep pharyngitis. In patients with a chief complaint of cough or upper respiratory infection, the Heckerling Rule was used to stratify the risk of pneumonia. During usability testing, all human-computer interactions, including audio and continuous screen capture, were recorded using Camtasia software. Participants' comments and interactions with the tool during patient care sessions and participant comments during a brief post-session interview were placed into coding categories and analyzed for generalizable themes. Results: In the 6 encounters observed, primary care providers toggled between addressing either the computer or the patient during the visit. Minimal time was spent listening to the patient without engaging the EHR. Participants almost always used the CDS tool with the patient, asking questions to populate the calculator and discussing the results of the risk assessment; they reported the ability to do this as the major benefit of the tool. All primary care providers were interrupted during their use of the CDS tool by the need to refer to other sections of the chart. In half of the visits, patients' clinical symptoms challenged the applicability of the clinical prediction rule to calculate the risk of bacterial infection. Primary care providers rarely used the incorporated incentives for CDS usage, including progress notes and patient instructions/documentation. Conclusions: Live usability testing of these CDS tools generated insights about their role in the patient-provider interaction. CDS may contribute to the interaction by being simultaneously viewed by provider and patient. CDS can improve usability and lessen the strain it places on providers by being short, flexible and customizable to unique provider workflow. A useful component of CDS is being as widely applicable as possible and ensuring that its functions represent the fastest way to perform a particular task.
EMBASE:622329670
ISSN: 1525-1497
CID: 3138942

PILOT AND FEASIBILITY TEST OF A MOBILE HEALTH-SUPPORTED INTERVENTION FOR STOPPING HYPERTENSION [Meeting Abstract]

Weerahandi, Himali; Quintiliani, Lisa M.; Paul, Soaptarshi; Chokshi, Sara K.; Mann, Devin M.
ISI:000442641401118
ISSN: 0884-8734
CID: 4181052

"Think aloud" and "Near live" usability testing of two complex clinical decision support tools

Richardson, Safiya; Mishuris, Rebecca; O'Connell, Alexander; Feldstein, David; Hess, Rachel; Smith, Paul; McCullagh, Lauren; McGinn, Thomas; Mann, Devin
OBJECTIVES: Low provider adoption continues to be a significant barrier to realizing the potential of clinical decision support. "Think Aloud" and "Near Live" usability testing were conducted on two clinical decision support tools. Each was composed of an alert, a clinical prediction rule that estimated risk of either group A Streptococcus pharyngitis or pneumonia, and an automatic order set based on risk. The objective of this study was to further understanding of the facilitators of usability and to evaluate the types of additional information gained from proceeding to "Near Live" testing after completing "Think Aloud". METHODS: This was a qualitative observational study conducted at a large academic health care system with 12 primary care providers. During "Think Aloud" testing, participants were provided with written clinical scenarios and asked to verbalize their thought process while interacting with the tool. During "Near Live" testing participants interacted with a mock patient. Morae usability software was used to record full screen capture and audio during every session. Participant comments were placed into coding categories and analyzed for generalizable themes. Themes were compared across usability methods. RESULTS: "Think Aloud" and "Near Live" usability testing generated similar themes under the coding categories visibility, workflow, content, understandability and navigation. However, they generated significantly different themes under the coding categories usability, practical usefulness and medical usefulness. During both types of testing participants found the tool easier to use when important text was distinct in its appearance, alerts were passive and appropriately timed, content was up to date, language was clear and simple, and each component of the tool included obvious indicators of next steps. Participant comments reflected higher expectations for usability and usefulness during "Near Live" testing. For example, visit aids, such as automatically generated order sets, were felt to be less useful during "Near Live" testing because they would not be all-inclusive for the visit. CONCLUSIONS: These complementary types of usability testing generated unique and generalizable insights. Feedback during "Think Aloud" testing primarily helped to improve the tools' ease of use. The additional feedback from "Near Live" testing, which mimics a real clinical encounter, was helpful for eliciting key barriers and facilitators to provider workflow and adoption.
PMCID:5679128
PMID: 28870378
ISSN: 1872-8243
CID: 2687782

Problem-oriented charting: A review

Chowdhry, Shilpa M; Mishuris, Rebecca G; Mann, Devin
Problem-oriented charting is a form of medical documentation that organizes patient data by a diagnosis or problem. In this review, we discuss the history and current use of problem-oriented charting by critically evaluating the literature on the topic. We provide insights with regard to our own institutional use of problem-oriented charting and potential opportunities for research.
PMID: 28551008
ISSN: 1872-8243
CID: 2581212

Design and implementation of electronic health record integrated clinical prediction rules (iCPR): a randomized trial in diverse primary care settings

Feldstein, David A; Hess, Rachel; McGinn, Thomas; Mishuris, Rebecca G; McCullagh, Lauren; Smith, Paul D; Flynn, Michael; Palmisano, Joseph; Doros, Gheorghe; Mann, Devin
BACKGROUND: Clinical prediction rules (CPRs) represent a method of determining individual patient risk to help providers make more accurate decisions at the point of care. Well-validated CPRs are underutilized but may decrease antibiotic overuse for acute respiratory infections. The integrated clinical prediction rules (iCPR) study builds on a previous single clinic study to integrate two CPRs into the electronic health record and assess their impact on practice. This article discusses study design and implementation of a multicenter cluster randomized control trial of the iCPR clinical decision support system, including the tool adaptation, usability testing, staff training, and implementation study to disseminate iCPR at multiple clinical sites across two health care systems. METHODS: The iCPR tool is based on two well-validated CPRs, one for strep pharyngitis and one for pneumonia. The iCPR tool uses the reason for visit to trigger a risk calculator. Provider completion of the risk calculator provides a risk score, which is linked to an order set. Order sets guide evidence-based care and include progress note documentation, tests, prescription medications, and patient instructions. The iCPR tool was refined based on interviews with providers, medical assistants, and clinic managers, and two rounds of usability testing. "Near live" usability testing with simulated patients was used to ensure that iCPR fit into providers' clinical workflows. Thirty-three Family Medicine and General Internal Medicine primary care clinics were recruited at two institutions. Clinics were randomized to academic detailing about strep pharyngitis and pneumonia diagnosis and treatment (control) or academic detailing plus use of the iCPR tool (intervention). The primary outcome is the difference in antibiotic prescribing rates between the intervention and control groups with secondary outcomes of difference in rapid strep and chest x-ray ordering. Use of the components of the iCPR will also be assessed. DISCUSSION: The iCPR study uses a strong user-centered design and builds on the previous initial study to assess whether CPRs integrated in the electronic health record can change provider behavior and improve evidence-based care in a broad range of primary care clinics. TRIAL REGISTRATION: Clinicaltrials.gov (NCT02534987).
PMCID:5351194
PMID: 28292304
ISSN: 1748-5908
CID: 2488562