LISA DESJARDINS: Artificial intelligence comes in many forms, and whether you realize it or not, it is now woven into things we see and touch every day. Things like Googling anything, tagging a friend on Facebook, filing your taxes with TurboTax, and even using your phone's speech-to-text feature. AI is also all over the headlines these days as societies confront the growing technology. This week, the U.S. Copyright Office ruled that AI images cannot be legally protected. China launched a crackdown on the popular AI program ChatGPT. And The New York Times recently published a rant by Microsoft's new AI chat feature, which tried to convince the writer to leave his wife. Thought by many to be powered only by computers, AI often relies on a massive human workforce, and there are significant questions about the treatment of those workers. Joining us now to discuss is Sonam Jindal with the Partnership on AI, a nonprofit coalition committed to the responsible use of artificial intelligence. I want to start with just understanding the basics here. What exactly are these jobs that people are doing to keep artificial intelligence working?

SONAM JINDAL, Program Lead, Partnership on AI: Yeah, absolutely. Well, artificial intelligence products and models require massive amounts of data. To put it very simply, a computer system or an AI algorithm doesn't know on its own whether something is a cat or a dog. It needs to be told what a cat is and what a dog is. So there are people who go through images and label them cat, cat, dog, dog, and over time the algorithm can understand it. And that is true of all AI models. All of them require data, and humans to look at that data and classify it.

LISA DESJARDINS: Can you help us understand where these people work? I thought maybe this would be call centers, but it's not. What parts of the world and what kinds of workplaces are we talking about?

SONAM JINDAL: Yeah, this work is done across the globe, but a lot of it is done in the Global South because, frankly, there's cheaper labor there, and a lot of it is done over digital platforms. So people are doing this work sometimes in their own homes, sometimes they go in person, but a lot of it is digitally mediated.

LISA DESJARDINS: What do we know about the actual content that these workers are sifting through and trying to get a computer to understand?

SONAM JINDAL: Yeah, so it really depends on the AI model that companies are trying to build. For example, if a health care company is trying to build an AI model to determine whether someone has breast cancer based on their images, people might be looking at radiology images to judge whether someone has cancer or not, and then train the model so that the algorithm can actually understand and make predictions from those images. But people have to look through that data first. A lot of this work also means interacting with toxic content. One of the reasons search and social media are usable is that we see the information we want to see, but that requires people looking through the information we don't want to see.

LISA DESJARDINS: Talking about the conditions, then, for this industry: am I right that there are really no standards for how these workers are treated? And what about the pay? How does that work?

SONAM JINDAL: Yeah, I think part of the problem is that, when we think about artificial intelligence, we get really excited. There's technology involved, it's automated, and we assume there aren't people involved. Part of that narrative is that we forget that humans are central to this work. Oftentimes they face very low pay. Sometimes this work is done for pennies. They have uncertainty about whether they're going to get paid, and there's really not a lot of power to contest those decisions.

LISA DESJARDINS: These workers are almost in a sort of isolated intellectual factory. You know, part of the thing in my mind is that we often think AI means machines taking over human jobs. What do we know about how many jobs AI could actually provide, how important this area could be in terms of labor around the world?

SONAM JINDAL: Absolutely. Yeah, as AI becomes a bigger part of our economy, these are the jobs that are going to enable AI to be built. And I think one of the biggest takeaways here is that what we call artificial intelligence is not really artificial intelligence. It's human intelligence that we're embedding into data so that we can all benefit from that collective intelligence. So it's really important that we start valuing that intelligence for what it's worth. This is a labor force that's going to grow, or the demand for this labor is going to grow. And it's important that we recognize the important contributions these workers are making so that we can develop better AI.

LISA DESJARDINS: Briefly, do you see a path toward figuring out standards, or even basic workplace rules, for these workers?

SONAM JINDAL: Yeah, absolutely. One of the things my organization, the Partnership on AI, does is work with stakeholders across industry, academia, and civil society to figure out how we can actually improve conditions for these workers and what guidelines we should be following. Ultimately, it's important that this becomes regulated, codified, and mandated, and that there are formal protections for this class of workers. But in the meantime, we have a set of guidelines and resources available for any AI developer to start using today, so they can incorporate considerations for workers into their day-to-day practices as they set up their projects.

LISA DESJARDINS: Sonam Jindal of the Partnership on AI, we certainly have benefited from your human intelligence today. Thank you.

SONAM JINDAL: Thank you.