
Leveraging the Power of the Crowd in Research and Data Analysis

Summary: 
This year at Health Datapalooza 2018, the Department of Health and Human Services (HHS) has put together a panel highlighting the impact of biomedical crowdsourcing on the scientific community.

By Katrina Theisz, Program Analyst, National Cancer Institute, National Institutes of Health

This year at Health Datapalooza 2018, the Department of Health and Human Services (HHS) has put together a panel highlighting the impact of biomedical crowdsourcing on the scientific community. Featuring talks from Jennifer Couch (National Institutes of Health), Sandeep Patel (HHS IDEA Lab), Stephanie Devaney (National Institutes of Health), Pietro Michelucci (Human Computation Institute), and Matt Biggerstaff (Centers for Disease Control and Prevention), the panel will delve into the different ways engaging the public in scientific research can complement traditional research methods while moving the field forward.

By engaging with people who may not normally participate in scientific endeavors, you can gain insights and creative solutions you might not reach through standard scientific approaches. Citizen science is an example of a collaborative approach to research involving the public not just as subjects of or advisors to the research, but as direct collaborators and partners. People know their own lives, their health, and their communities, and by working in partnership with them, researchers stand to gain a great deal. The word partnership is important here: depending on the project, the questions being asked, and how the study is performed, much of this work starts at the individual or community level.

At its heart, true citizen science is bottom-up, not top-down.

Image: the words "create, collaborate, connect" arranged in a circle around the words "citizen science."

Crowdsourcing, on the other hand, tends to start with researchers and filter down, typically in one of two ways: (1) voluntary participation or contributions solicited from unknown individuals (aka "the crowd," be they experts or not); and (2) opening a line of scientific inquiry to a group of experts (typically achieved through prizes and challenges). People are motivated to help science for a variety of reasons: some because the research may directly impact their lives, others simply because they like science. Sometimes adding a game-like or competitive component to a project is enough to draw people in. An added bonus of tapping into the power of the crowd is gaining access to insights you don't expect. In Galaxy Zoo, for example, participants sort through telescope imagery to identify different classes of galaxies; by giving participants a forum to share their thoughts and converse with one another, the project enabled its citizen scientists to discover an entirely new kind of galaxy.

But it's not without its hurdles. Biomedical citizen science and crowdsourcing come packaged with issues that don't commonly plague other types of projects that engage the public in scientific research. When it comes to sharing personal health data, for instance, there are data privacy and security issues you won't find in astronomy citizen science projects. Trust and transparency are therefore key to the success of any such project, and they start with consent: consent that is easy to understand, with no law degree needed and no lengthy fine print. What data do the project leaders need? What will they be used for? Who can access them? Addressing those questions clearly up front (and sticking to the answers) is a great way to avoid issues later. Want to reuse those data later for a different project? Great! Re-consent your participants. Concerned that not everyone in your study is comfortable with the language being used? Consistent iconography can help. We're an increasingly visual society, and familiar icons and images can convey content clearly without resorting to wordiness.

And then there are the never-ending questions about data quality. How can traditional researchers ensure that donated data are accurate? What kinds of quality control methods work best? Available data suggest that the crowd is as accurate as, and sometimes more accurate than, individual experts. Additionally, humans are quite adept at inference, visual perception, and abstract thought, and when paired with computers, those skills can help train algorithms to recognize particular objects or patterns.
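The intuition behind crowd accuracy can be illustrated with a small simulation (a hypothetical sketch, not something from the panel): if each labeler is independently right 70% of the time, a majority vote across several labelers is right far more often. The function name and parameters below are illustrative, not from any cited project.

```python
import random

def simulate_majority_accuracy(n_labelers, p_correct, n_items, seed=0):
    """Estimate how often a majority vote of independent labelers,
    each correct with probability p_correct, labels an item correctly."""
    rng = random.Random(seed)
    correct_majorities = 0
    for _ in range(n_items):
        # Count how many of the n_labelers vote for the correct label.
        votes = sum(1 for _ in range(n_labelers) if rng.random() < p_correct)
        if votes > n_labelers / 2:
            correct_majorities += 1
    return correct_majorities / n_items

individual = 0.70  # assumed accuracy of a single labeler
crowd = simulate_majority_accuracy(n_labelers=11, p_correct=individual,
                                   n_items=10000)
print(f"individual: {individual:.2f}, 11-person majority: {crowd:.3f}")
```

With these assumed numbers, the 11-person majority lands above 90% accuracy, well ahead of any single 70%-accurate labeler; the effect depends on labelers' errors being reasonably independent, which is why real projects often combine voting with other quality checks.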

On Friday, April 27th, at Health Datapalooza 2018, we will delve into these concepts and much more. For more information on the conference, please go to: http://www.academyhealth.org/events/2018-04/2018-health-datapalooza.

Editor's Note: You can register for the Health Datapalooza here. HHS is a sponsor of the conference.
