Provider impressions of the use of a mobile crowdsourcing app in medical practice

Health Informatics J. 2016 Jun;22(2):221-31. doi: 10.1177/1460458214545896. Epub 2014 Aug 28.

Abstract

In our prior work, we conducted a field trial of the mobile application DocCHIRP (Crowdsourcing Health Information Retrieval Protocol for Doctors), designed to help clinicians problem-solve at the point of care by crowdsourcing answers from their peers. Here, we present the results of our post-trial survey, which investigated the impressions of participating clinicians regarding the use of medical crowdsourcing and identified factors influencing adoption of the technology. In all, 72 valid surveys were received from 85 registered users (85% response rate). The majority of clinicians (>80%) felt crowdsourcing would be useful for diagnosing unusual cases, facilitating patient referrals, and problem-solving at the point of care. Perceived barriers to adoption included interruptions to workflow and reluctance to publicly expose knowledge gaps. Although clinicians considered crowdsourcing a useful alternative to existing methods, future studies are needed to investigate whether the approach and application can be modified to address these barriers effectively, and to determine whether crowdsourcing can enhance provider performance and the quality of care delivered.

Keywords: IT design and development; IT health care evaluation; clinical decision-making; collaborative work practices; information and knowledge management.

MeSH terms

  • Clinical Competence / standards*
  • Cooperative Behavior
  • Crowdsourcing / methods*
  • Health Personnel / psychology*
  • Humans
  • Internet*
  • Mobile Applications*
  • Referral and Consultation
  • Surveys and Questionnaires