Digital Sweatshops: The Dark Side of Artificial Intelligence


How is artificial intelligence (AI) trained? What are digital sweatshops? What can be done to prevent exploitation?

With the artificial intelligence market projected to hit $1.3 trillion over the next decade, companies are scrambling to train their AI products. However, training AI comes with hidden costs, including exploitative labor practices in developing countries and refugee camps.

Continue reading to learn more about the dark side of AI.

AI Training Practices Are Exploitative

With the AI market projected to grow from $40 billion in 2022 to $1.3 trillion or more in the next decade, companies want to position themselves to get a piece of the pie. That means continuously improving their AI products by training them. But what exactly does it mean to train AI, and who’s tasked with doing it?

What Does It Mean to Train AI?

Many people know AI as the tool that makes it easier to write emails, generate code, and create images just by typing a description into a chat box. But the technology can do much more in other fields, from energy production to health care. All of this is possible because AI is trained on massive amounts of data. For example, you can ask OpenAI’s ChatGPT to write in a particular literary style because it has been fed vast volumes of text from books and articles, and self-driving cars can navigate on their own because they’ve been trained to recognize objects like traffic signs and pedestrians.

But an AI model’s output is only as good as its input, so the data it’s fed has to be of good quality. This is where humans come in: Scores of people are now tasked with data labeling (or data annotation), which entails reviewing and tagging text, images, audio, or video to help machines make sense of the data.
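
To make this concrete, here’s a minimal, hypothetical sketch (in Python) of the kind of record a single labeling task might produce for the self-driving-car example above. The field names, categories, and structure are illustrative assumptions, not the format of any real annotation platform.

```python
# A hypothetical sketch of one data-labeling task's output. Field names and
# label categories are illustrative only, not any real platform's format.

annotation = {
    "image_id": "frame_000123",      # the raw item a worker is asked to review
    "annotator_id": "worker_42",     # the human who did the labeling
    "labels": [
        # Each entry pairs a region of the image with a category the model
        # should learn to recognize (bbox = x, y, width, height in pixels).
        {"category": "stop_sign",  "bbox": [412, 88, 46, 46]},
        {"category": "pedestrian", "bbox": [120, 200, 60, 150]},
    ],
    "time_spent_seconds": 95,        # work like this is often paid per task
}

print(len(annotation["labels"]), "objects labeled in", annotation["image_id"])
```

Multiply a record like this by millions of images, each reviewed by a low-paid human, and you get the labeled data that models learn from.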

Who Is Training AI?

Sifting through and labeling so much data requires a lot of manpower, which can be expensive, so companies often try to minimize labor costs. Instead of employing local workers full time, they save money by outsourcing AI training to workers in the Global South, in refugee camps, or even in prisons.

The Global South

Companies like OpenAI and Meta have used outsourcing firms to fill their AI training needs. These firms, in turn, have found armies of workers in the Global South, a term for the developing countries of Africa, Asia, and Latin America.

Various outlets have reported that AI trainers in countries like Kenya, Venezuela, and the Philippines toil in “digital sweatshops”—crowded internet cafes or offices—where they’re often paid a pittance (around $1-3 an hour in Kenya, for example) for data labeling. That’s if they’re even paid at all.

Aside from enduring exploitative labor practices, workers may also be forced to view traumatic content: They’re tasked with labeling examples of, say, violence and sexual abuse so that AI can learn to identify them.

Refugee Camps

Companies are also finding AI trainers in refugee camps in Africa and the Middle East. They package the jobs as opportunities for refugees, a way to help them stay afloat without giving them handouts (one company’s motto: “Give work, not aid”). As in the Global South, where making a living can be a struggle, refugee camps offer a ready supply of workers who are willing to take what they can get, even without any assurance of rights, job security, or a living wage.

Prisons

Offshore workers largely train AI in English, which means companies that need to train AI in other languages have to look elsewhere. In Finland, where wages are high and unemployment benefits provide a sufficient safety net, the pay for AI training isn’t enough to entice local Finnish-speaking workers, so companies are turning to the country’s prisons for cheap labor. At three Finnish prisons, data labeling is now one of the paid jobs that inmates can volunteer for (along with more traditional prison jobs like making road signs).

Companies say this is a way to empower inmates and prepare them for the outside world, but it remains to be seen whether the jobs actually give inmates transferable skills.

What Can Be Done to Prevent Exploitation?

While there are new and developing regulations surrounding AI, they largely center on protecting the rights of users. To prevent exploitation, regulations would also need to protect the workers who train AI, ensuring fair and transparent wage policies and humane working conditions.


