‘Overlooked’ workers who train AI can face harsh conditions, advocates say

May 20, 2024


(NEW YORK) — Krystal Kauffman, a gig worker, spent one workday watching footage captured by a camera that had been placed on a baby’s head, labeling objects as they came into view, she said. For another job, she said she looked at images of feet, while on another, she marked aerial photographs of animals.

Over nearly a decade, Kauffman has performed thousands of tiny tasks that have helped companies assemble the immense data sets used to train artificial intelligence (AI), she said.

“It’s supposed to look like these products are magic,” Kauffman, who performs tasks on the platform Amazon Mechanical Turk and advocates for workers as a lead organizer for the group Turkopticon, told ABC News. “People don’t know that behind all of this is a workforce – a human workforce.”

AI has reshaped everything from medical diagnoses to wedding vows to stock market gains, but the technology wouldn’t be possible without gig workers across the globe, like Kauffman.

However, analysts and advocates said the workers whose efforts help train AI are often denied knowledge of the end product they help create, or the company behind it. They also risk rejection of their work after it has been completed, which can leave them without pay or recourse to collect it.

“If we want to build a better society, we can’t ignore the tens of millions of people who are doing this work,” Sonam Jindal, who leads work on AI, labor and the economy at the Partnership on AI, a coalition of AI organizations, told ABC News. “If they’re overlooked and facing precarious conditions, that’s a problem,” she said.

To mimic human discernment, AI products typically use an algorithm that responds to queries based on lessons learned from scanning a large quantity of text, images or video. An AI tool that helps doctors diagnose cancer, for instance, may train on digital copies of CT scans.

The training material, however, oftentimes must first be curated by human workers, who make the content legible for an AI model, Jindal said.

“AI models don’t know on their own how to distinguish between cats and dogs, whether someone has cancer or not, whether something is a stop sign or not,” Jindal explained. “People are very heavily involved in building these datasets.”
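As a rough illustration of what that human work produces (not any particular company’s pipeline), a labeled dataset is essentially a long list of raw examples paired with judgments that workers supplied by hand. The simplified Python sketch below shows the idea; the file names and labels are invented for illustration.

labeled_examples = [
    {"image": "frame_0001.jpg", "label": "dog"},        # a human, not the model, decided "dog"
    {"image": "scan_0417.png", "label": "no_tumor"},     # e.g., a reviewed medical scan
    {"image": "street_088.jpg", "label": "stop_sign"},   # e.g., a street photo for a driving system
]

# A model later learns from these (input, label) pairs; without the human
# judgments on the right-hand side, there is nothing for it to learn from.
for example in labeled_examples:
    print(example["image"], "->", example["label"])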

A worldwide gig workforce began to swell a decade ago, in part to complete such AI-related tasks, according to a report published in 2021 by Open Research Europe. Roughly 14 million workers have obtained work through online platforms like Amazon Mechanical Turk and Upwork, which operate as go-betweens for freelance workers and tech firms, the study found.

Many of those global workers live in the U.S. Roughly 96% of workers on Amazon Mechanical Turk, for example, log in from the U.S., according to data site MTurk Tracker.

Online gig workers in the U.S. retain flexible schedules, but their tasks carry many of the key characteristics of a “bad job,” Matt Beane, an assistant professor in the Technology Management Program at the University of California, Santa Barbara, told ABC News.

“A bad job basically is one that doesn’t give you a lot of autonomy around what you get to do,” Beane said. “In other words, you don’t feel like there’s a meaningful connection between what you’re doing and some valuable output in the world.”

The lack of meaning stems in part from the mundane nature of the tasks, and the dearth of information provided to freelance workers about the product being developed or the company making it, Jindal said.

“Transparency is a huge problem,” Jindal said. “This partially has to do with a very utilitarian approach to building AI models. People will say, ‘I just need the data.'”

“It gets passed on to someone else who may not have the full context,” Jindal added.

In addition to a lack of clarity about the final product, AI gig workers run the risk of what they call “mass rejection,” in which a company declines a batch of work after it has been completed.

In such cases, a worker both loses out on pay and lacks a means of appealing the judgment, Kauffman said, while the company keeps the data the worker produced. Sometimes, she added, companies offering work on Amazon Mechanical Turk reject the data without cause and change their usernames to avoid accountability.

Workers consequently not only lose out on the immediate income, but they also suffer a blow to their approval rating on the platform, which determines the quality of work made available to them, Kauffman said.

“So the more rejections you have, the worse your approval rating gets,” Kauffman explained. “Something like that can take away a person’s entire livelihood.”

In response to ABC News’ request for comment, Amazon said Mechanical Turk monitors for mass rejections and takes appropriate action if it encounters them, up to and including suspension.

The average rejection rate on the platform is less than 1%, Amazon added. Further, the company said it has a Participation Agreement and an Acceptable Use Policy to ensure there is no abuse in the marketplace by either those requesting work or those agreeing to do tasks.

Through Turkopticon, Kauffman and other workers have put pressure on Amazon to improve conditions for the AI gig workforce, she said. The explosion in the popularity of AI products, she added, has generated a surge in public attention around the challenges such workers face.

“It just feels like the power is building and the awareness is building,” Kauffman said. “It’s this incredible feeling.”

Copyright © 2024, ABC Audio. All rights reserved.




