Academia and the Ethics of Crowdsourced Research

BY HANNAH JOHNSTON, M. SIX SILBERMAN, AND JAMIE WOODCOCK

Dana Barr put the kids to bed, opened the computer, and logged in to Amazon Mechanical Turk to look for work. After about a minute, Dana saw a task from Albaventura University Cogsci. “Ten minute survey about political beliefs, $1.00.” Six dollars an hour—not the best rate, but not the worst either. With rent due next week, Dana couldn’t ignore it.

Dana started the task, answering each question carefully. Nearly thirty minutes later, Dana submitted the survey. It had taken longer than advertised, but Dana took pride in doing good work.

It was late. Dana closed the computer and got up.

The next morning, Dana logged back on to see if the payment had come through. It hadn’t; the work had been rejected. Dana wouldn’t be paid; there was no explanation.

Dana contacted the researcher through Mechanical Turk to ask what happened, but never got a reply.

*

Dana is not a real person, nor is Albaventura a real university. But the experience described above is common among “crowdworkers.”

Amazon Mechanical Turk is one of several “crowdsourcing platforms” used by academic researchers. These platforms let researchers post tasks to anonymous “crowds.” Workers are typically paid per task rather than hourly. The payment is set by the researcher, called a “requester.”

Crowdsourcing has become a key research tool in several disciplines. Social and behavioral scientists use it to recruit diverse participants quickly and cheaply for surveys and experiments. Computer scientists use it for data processing tasks, including producing “training data” for AI systems. Crowdsourcing offers major cost and speed advantages over traditional data processing strategies, such as hiring research assistants.

Research shows that crowdworkers do crowdwork for the money. Most tasks are completed by a small fraction of workers, who are effectively professional crowdworkers. These “pros” often need to work from home without a fixed schedule, sometimes because of illness, disability, care obligations, or limited local job opportunities. Other crowdworkers use crowdwork to supplement income from a “main job,” but often still rely on it to cover basic needs such as rent, food, and healthcare.

Yet there are also risks for crowdworkers. Two of the most important have to do with payment.

Crowdworkers are often underpaid. Studies have repeatedly found that many requesters, even researchers based at universities in countries such as the United States, pay less than the US federal minimum wage of $7.25 per hour. Additionally, tasks that take longer than advertised (like the survey Dana completed) reduce workers’ effective hourly wages even further.
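To make the arithmetic concrete, here is a minimal sketch in Python using Dana’s fictional survey as the example; the figures are illustrative, not data from our research.

```python
def effective_hourly_rate(payment_usd: float, actual_minutes: float) -> float:
    """The rate a worker actually earns: payment divided by the time the task really took."""
    return payment_usd / (actual_minutes / 60)

# Dana's fictional survey: advertised as ten minutes for $1.00 ...
print(effective_hourly_rate(1.00, 10))  # 6.0 -- the advertised $6 per hour
# ... but it actually took about thirty minutes.
print(effective_hourly_rate(1.00, 30))  # 2.0 -- an effective $2 per hour
```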

In some cases, crowdworkers’ work is “rejected” and they aren’t paid at all. While rejection is intended to protect requesters from low-quality work, some requesters abuse the feature and decline to pay for work they nonetheless use. Others run into technical problems when evaluating submissions and reject work by accident. When this happens, workers can contact the requester, but the requester is not obligated to reply.

People unfamiliar with labor law sometimes ask how it can be legal not to pay people for their work. Crowdsourcing platforms write their own legal terms, which typically stipulate that workers are employees of neither the platform nor the requester. Rather, they are independent contractors, and therefore not entitled to employment protections such as the minimum wage.

Because crowdsourcing platforms are complex, researchers unfamiliar with these problems may unintentionally contribute to them. Past research has found that crowdsourcing requesters can improve workers’ experiences by pricing tasks so that workers earn at least minimum wage, responding to workers’ questions (especially about rejected work), and acting on worker feedback (for example, about task directions and time estimates). Perhaps most fundamentally, requesters should remember that crowdworkers are people, and that many rely on crowdwork income.
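As a rough illustration of the first recommendation, the sketch below prices a task from a pilot estimate of its duration and a target hourly wage. The $15 target and the 25 percent buffer for underestimated completion times are assumptions chosen for the example, not figures from our research.

```python
def price_task(estimated_minutes: float,
               target_hourly_wage: float = 15.00,  # assumed target; adjust to your own context
               time_buffer: float = 1.25) -> float:  # assumed allowance for tasks running long
    """Suggested per-task payment so workers earn at least the target hourly wage,
    with a buffer because tasks routinely take longer than pilot runs suggest."""
    hours = (estimated_minutes * time_buffer) / 60
    return round(target_hourly_wage * hours, 2)

# A survey that pilots at ten minutes:
print(price_task(10))  # 3.12 -- compared with the $1.00 Dana was offered
```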

Many initiatives by workers and researchers have sought to improve working conditions and worker-researcher relations on crowdsourcing platforms. For example, the worker-organizers of Turkopticon, a system used by Mechanical Turk workers to rate requesters, are calling on Amazon to limit the impact of rejections on workers’ ability to get new work.

But despite growing awareness of the challenges facing crowdworkers, research indicates that wages have not meaningfully increased.

This is not a call for researchers to stop using crowdsourcing, or Mechanical Turk specifically. Such a call would only harm the workers who rely on crowdwork income to make ends meet. And although there are alternatives to Mechanical Turk, workers report that Mechanical Turk is still one of the only platforms where someone can sign up and start earning money immediately. Our message is that researchers should simply pay crowdworkers more.

The “Crowdsourcing Wage Pledge” is an attempt to help researchers do that.

In 2018 and 2021 we conducted surveys of academic users of crowdsourcing and found that most respondents would be willing to publicly and voluntarily commit to paying workers a minimum hourly wage.

We have built a prototype website that lets researchers do exactly that: commit, on a per-project basis, to paying workers a fair wage.

While we understand that academic researchers are not the only requesters on crowdsourcing platforms, we hope that this initiative leads to better earnings for the crowdworkers who provide such crucial inputs to academic work.

To learn more about the Crowdsourcing Wage Pledge, visit wagepledge.org.

Hannah Johnston is a postdoctoral researcher at Northeastern University in Boston. M. Six Silberman is a software engineer at Organise Platform, a London-based social enterprise that supports workers’ rights campaigns. Jamie Woodcock, a senior lecturer at the Open University and a researcher based in London, is the author of several books on labor and the gig economy. This blog post reflects the views of the authors and is not endorsed or supported by their employers. The Crowdsourcing Wage Pledge is supported by Not-Equal, a network funded by UK Research and Innovation.