Can Clinical Questions Be Crowdsourced?

https://ucsf.co1.qualtrics.com/SE/?SID=SV_dcaWXoy8xeH3h3f 

The Project – Invitation

The Library at San Francisco General Hospital invites NCNMLG members to participate in a small study to see whether Quora, an online social Q&A platform, can help answer clinical questions. Please share this survey with healthcare providers, health science students, and researchers at your institutions.

The Problem

Research has shown that although clinicians raise multiple questions in the course of a typical patient encounter, many of those questions go unresearched and unanswered. When providers do pursue an answer, they often turn to colleagues for a quick response rather than to the established research literature, which they consider too time-consuming to search. (1, 2)

Previous Attempts

There have been several failed attempts to address this problem by crowdsourcing clinical questions and medical evidence. Medpedia, a “Wikipedia of medicine” that only physicians and biomedical researchers could edit, abruptly closed early in 2013, even though it had backing from prominent institutions such as Harvard Medical School, Stanford School of Medicine, and the UC Berkeley School of Public Health. Smaller-scale attempts in academic settings haven’t expanded beyond their local contexts. In 2004, a team of researchers at the National Library of Medicine developed and piloted the “Virtual Evidence Cart”, an open, online platform that enabled clinicians to submit and answer questions, but it didn’t attract widespread use outside the pilot groups. (3) A team of physicians and researchers at UNC Chapel Hill created the Critical Appraisal Resource (CAR) as a teaching tool for medical residents. Although the residents used the tool heavily – 625 clinical questions entered and 1,035 searches over the 10-month study period – CAR relied on mandatory curriculum integration and never expanded beyond the UNC internal medicine residency program. (4)

How We’re Different

We hypothesize that previous attempts to crowdsource clinical questions failed to persist or expand beyond their local scope primarily because they focused on a specific group of people or a specific institution. By contrast, Quora is open to all, has a culture of expert-based answers, and lets users upvote and downvote both questions and answers. Posting questions on Quora could generate answers from a broad range of experts, and the most highly ranked answers could reach a large audience of clinicians as well as the general public. Omnicurious physicians may be incentivized to read and contribute to Quora in a way that they aren’t with platforms limited to medical or health sciences topics.

Our plan is to collect clinical questions, post them to Quora, and solicit participation from librarians, physicians, and researchers across the country. We plan to present our findings at the annual meeting of the Medical Library Association in May 2014.

Please consider promoting this survey at your institutions, using the link above!


1. Cook DA, Sorensen KJ, Wilkinson JM, Berger RA. Barriers and decisions when answering clinical questions at the point of care: a grounded theory study. JAMA Internal Medicine. 2013;173(21):1962-9.

2. Ely JW, Osheroff JA, Chambliss ML, Ebell MH, Rosenbaum ME. Answering physicians’ clinical questions: obstacles and potential solutions. Journal of the American Medical Informatics Association. 2005;12(2):217-24.

3. Liu F, Fontelo P, Muin M, Ackerman M. Virtual Evidence Cart – RP (VEC-RP). AMIA Annual Symposium Proceedings. 2005:1034.

4. Crowley SD, Owens TA, Schardt CM, Wardell SI, Peterson J, Garrison S, et al. A Web-based compendium of clinical questions and medical evidence to educate internal medicine residents. Academic Medicine. 2003;78(3):270-4.