Alexej Savreux, a 34-year-old from Kansas City, says he has done all sorts of jobs over the years. He has prepared fast-food sandwiches, worked as a janitor and garbage hauler, and handled technical sound work for live theater.
Nowadays, however, his work is less hands-on: He is a trainer for artificial intelligence.
Savreux is part of a hidden army of contract workers who, in recent years, have been working behind the scenes to teach AI systems how to analyze data so they can generate the kind of text and images that have impressed users of newly popular products like ChatGPT. To improve the accuracy of AI, he has labeled photos and made predictions about what text the apps should generate next.
The pay: $15 an hour and up, with no additional benefits.
Out of the spotlight, Savreux and other contractors have spent countless hours over the past few years teaching OpenAI’s systems to give better responses in ChatGPT. Their feedback fulfills an urgent and endless need for the company and its AI competitors: a steady stream of sentences, labels, and other information to serve as training data.
“We are auxiliary workers, but without us, there would be no AI language systems,” said Savreux, who has worked for technology startups, including OpenAI, the San Francisco-based company that released ChatGPT in November and sparked a wave of hype about generative AI.
“You can design all the neural networks you want, you can involve all the researchers you want, but without labelers, you don’t have ChatGPT. You have nothing,” said Savreux.
It’s not a job that will bring Savreux fame or wealth, but it is an essential and often overlooked role in the field of AI, where the apparent magic of a new technological frontier can overshadow the work of contract workers.
“Many discussions around AI are very self-congratulatory,” said Sonam Jindal, program director for AI, Labor and Economy at the Partnership on AI, a non-profit organization based in San Francisco that promotes research and education in the field of artificial intelligence.
“But we’re missing a big part of the story: that this is still heavily reliant on a large human workforce,” she said.
The technology industry has relied on the work of thousands of less skilled, lower-paid workers for decades to build its computing empires: from punch card operators in the 1950s to more recent Google contractors who have complained about their second-class status, including yellow badges that distinguish them from full-time employees. Online contract work through websites like Amazon Mechanical Turk became even more popular during the pandemic.
Now the emerging AI industry is following a similar approach.
The work is characterized by its uncertain, on-demand nature, with people employed either directly by a company or through a third-party provider specializing in temporary work or outsourcing, via written contracts. Benefits such as health insurance are rare or nonexistent – which means lower costs for technology companies – and the work is usually anonymous, with all recognition going to startup managers and researchers.
The Partnership on AI warned in a 2021 report of rising demand for so-called “data enrichment work.” It recommended that the industry commit to fair wages and other improved practices, and last year it published voluntary guidelines for companies.
DeepMind, a Google AI subsidiary, is so far the only technology company that has publicly committed to these guidelines.
“Many people have realized that this is important to do. The challenge now is to get companies to implement it,” said Jindal.
“This is a new job being created by AI,” she added. “We have the potential for this to be a high-quality job and for the workers who do this work to be respected and appreciated for their contributions to enabling this progress.”
Demand has risen, and some AI contract workers are demanding more. In Nairobi, Kenya, on Monday, more than 150 people who worked on AI for Facebook, TikTok, and ChatGPT voted to form a union, citing low wages and the mental strain of the work, Time magazine reported. Facebook and TikTok did not immediately respond to requests for comment on the vote. OpenAI declined to comment.
So far, AI contract work in the US has not triggered a similar movement among Americans who silently build AI systems word by word.
Savreux, who works from home on a laptop, got into AI contract work through an online job posting. He credits AI contract work – along with a previous job at sandwich chain Jimmy John’s – with helping him escape homelessness.
“Sometimes these necessary, tedious jobs are downplayed,” he said. “It’s the necessary, entry-level area of machine learning.” The $15 an hour is more than the minimum wage in Kansas City.
Job postings for AI contractors point both to the allure of working in a cutting-edge industry and to the sometimes tedious nature of the work. An ad from Invisible Technologies, a temporary staffing firm, for an “Advanced AI Data Trainer” notes that the job is entry-level and starts at $15 an hour, but also that it could be “useful to humanity.”
“Think of it like a language teacher or personal tutor for some of the world’s most influential technologies,” the job posting says. Invisible’s client is not named, but the ad says the new employee “would be working according to protocols developed by the world’s leading AI researchers.” Invisible did not immediately respond to a request for further information about its postings.
There is no definitive tally of how many contractors work for AI companies, but it is an increasingly common form of work worldwide. Time magazine reported in January that OpenAI relied on low-wage Kenyan workers to label texts containing hate speech or sexually abusive language, so that its apps could better identify toxic content on their own.
OpenAI hired about 1,000 remote contractors in regions such as Eastern Europe and Latin America in January to label data or train corporate software in computer engineering tasks, according to the online news platform Semafor.
OpenAI is still a small company, with about 375 employees in January, as CEO Sam Altman tweeted, but this number does not include contractors and does not reflect the full scope of the operation or its ambitions. A spokesperson for OpenAI said no one was available to answer questions about the use of AI contractors.
The work of creating data for training AI models is not always simple, and it is sometimes complex enough to attract budding AI entrepreneurs.
Jatin Kumar, a 22-year-old in Austin, Texas, said he has been working as an AI contractor for a year, since graduating college with a degree in computer science. He said the work has given him insight into where generative AI technology is headed.
“It allows you to think about ways to use this technology before it hits the public market,” said Kumar. He is also working on his own technology startup, Bonsai, which develops software to support hospital billing.
As a conversation trainer, Kumar said his main work involves generating prompts: participating in a back-and-forth conversation with chatbot technology, which is part of the lengthy process of training AI systems. The tasks have become more complex with experience, but they started very simply.
“Every 30 or 45 minutes, you would get a new task and generate new prompts,” he said. The prompts could be as simple as “What is the capital of France?” he said.
Kumar said he worked with about 100 other contractors on tasks to generate training data, correct responses, and refine the model through feedback on its responses.
He said other employees handled “flagged” conversations: reviewing examples submitted by ChatGPT users who, for one reason or another, reported the chatbot’s response to the company for review. When a flagged conversation comes in, he said, it is sorted by the type of error involved and then used to further train AI models.
“At first, it was a way for me to help out OpenAI and learn more about existing technologies,” said Kumar. “But now I can’t imagine stepping out of this role.”
Source: nbcnews.com