There’s a Problem With ‘Crowd Labor’

Online outsourcing comes with costs.

If the online labor platform Mechanical Turk is any indication, the workplace of the future has no walls, no national borders, and virtually no limits to growth. Mechanical Turk, or MTurk, is a rapidly expanding “crowd labor” service run by Amazon, which harnesses the networked nature of our world to create a mammoth transnational hiring hall. But is this digital workplace without limits also operating without sensible labor rules?

A new survey of MTurk’s workforce data by Pew Research Center reveals an “on-demand” labor market that looks a lot like sweatshop-via-telecommute. MTurk users, or “Turkers,” connect to “requesters” through an online platform assigning short-term “intelligence task” projects, ranging from web research to answering questionnaires, in exchange for small electronic payments. It’s piecework for the information age.

The workforce seemingly represents the “future of work,” embodying both its endemic inequities and its social promise: younger and more educated than average, with about half holding college degrees, and primarily white. But despite their credentials, MTurk earnings often dip well below the legal minimum: “about half of workers (52%) in our sample who were asked about their incomes report earning a rate of less than $5 an hour…. Only 8% say they earn $8 per hour or more.” Those rates stand in striking contrast with the national figures: Only 4 percent of wage workers earn at or below the federal minimum of $7.25 per hour, a rate that drops to just 2 percent for workers with bachelor’s degrees.

For many, MTurk isn’t just a side gig. Two-thirds of surveyed workers reported that they used MTurk’s platform daily, and a quarter rely on the site for “all” or “most” of their incomes.

A century ago, the “home work” system earned women pennies an hour cutting trimmings in their kitchens. Today’s Turkers grind away at “short, repetitive ‘micro tasks’ that…could be completed in a few minutes,” earning a shocking dime or less a pop. Work is churned out of isolated environments fanned out across countries with vastly different labor standards.

This gig infrastructure outstrips domestic regulations, says civic-media researcher Denise Cheng via e-mail:

Typically, on-demand services require supply and demand to exist within the same region…. It’s much harder to regulate globally distributed marketplaces…. While mTurk is similar to on-demand services like Uber, it isn’t as easy to regulate because it is global.

The social opportunity cost may be immeasurable: endless hours of soul-deadening rote keyboard clicking, with virtually no labor or occupational-safety protections.

Millions of micro tasks add up to big consequences for some fields. According to Pew, “Online outsourcing holds a particular appeal for academics and nonprofit research organizations–many of whom have limited resources compared with corporate America.” MTurk offers minimally regulated, remote access to a bottomless supply of research subjects to fill out huge volumes of survey forms; Pew found nearly nine in 10 requests were for survey completion. According to Google, in 2015 alone, MTurk produced data for more than 800 published academic studies, “ranging from medical research to social science.”

Overall, the crowd labor sector employs roughly 48 million registered workers worldwide, with a current value of some $2 billion set to expand more than tenfold within four years.

Much of that business will come from cash-strapped social scientists pressured to produce more with less. Academics originate just over a third of task requests, a slightly larger share than businesses (whose requests are concentrated among several large companies); many are gunning for credentials in a “publish or perish” field, chasing hot research topics on dwindling fellowship funds. Turkers working on academic research aren’t sorting spreadsheets for corporations; they’re shaping serious scholarship with clinical and ethical implications. Industrial-scale data harvesting may lead to sloppy or exploitative methodologies, hurting participants and researchers alike.

According to Science magazine, data analyses indicate that Turker respondents typically skew American, younger, single, more urban, and politically liberal (though surveyors can intentionally limit pools by region and other criteria). Although traditional survey methodologies are also imperfect, particularly in relying heavily on college students as respondents, in-person study participants regularly earn more than MTurkers, about $8 to $10 per hour, with tighter regulatory oversight. Beyond demographics, the pool of MTurk respondents could be limited in critical ways, raising questions about representation, bias, and even the integrity of the responses, Science reports:

Seven psychology labs in the United States, Europe, and Australia ran 114,000 experimental sessions over a 3-year period. The number of unique people among the subjects came to only 30,000. Rather than a pool of half-a-million subjects always on tap…the true number of Turkers that are willing to take part in an experiment at any one time is only about 7300.

There’s been much criticism of the monopolistic control over academic publishing wielded by distributors like JSTOR, but what about the consolidation of research platforms in one private company? For both gig workers and ad hoc employers, crowd labor in the academy awkwardly fuses the furiously profit-driven on-demand economy with research fields that are, ideally, based on deliberation, peer review, and transparency.

Kristy Milland, a veteran Turker, psychology researcher, and crowd-labor advocate, says via e-mail that she has warned her academic colleagues that ethical review protocols are failing to grapple with the ramifications of crowd labor:

we’ve become professional participants…. not only are they exploiting [MTurk] workers; they’re getting data with biases they have not explored adequately enough. When it comes to psychological research, for example, they are often running CLINICAL surveys on a population who might be lying in order to get paid!

If researchers’ scholarly production is based on exploiting MTurk’s hired guns, she adds, “their use of it blindly just to save time and money is disgusting, both to their discipline and to the workers they’re abusing.”

But as an academic herself, Milland points to what’s really driving the gigification of scholarship: a globalizing academic landscape that markets both knowledge and labor as packets of data. “The real issue is publish or perish, as that is what is driving them to make these potentially fatal mistakes in order to pursue a paper. If we don’t change that system, they’ll just find another non-representative population to exploit.”

MTurk didn’t invent the problem, but it does scale it up. While intellectual capital is produced and processed at unprecedented rates, what we don’t know is also growing exponentially.
