Our first Learning Lab was on Thursday the 23rd of May (8am PDT | 11am EDT | 6pm EAT) and focused on the AI Supply Chain & Labour. In this session, we illustrated how tech companies obfuscate human labour, leading consumers to believe that it’s all AI and automation.
Speakers for the May Learning Lab:
Catherine Bracy (CEO of TechEquity)
James (Moyez) Oyange (union organizer and former TikTok content moderator)
Martha Dark (director of Foxglove)
Andrea Dehlendorf (organizer and adviser on political economy of AI issues)
Alix Dunn, moderator (CEO of Computer Says Maybe)
Watch the recording
Password: AI
AI & Labour: how can we prevent tech firms from outsourcing accountability?
Many of the points raised in this session were grounded in the blatant asymmetry of power in tech labour: the fact that tech companies exert outsized influence over their industry (and others!) with barely a smidge of accountability, while their workers have very little leverage.
Sama is a content moderation company that Meta, OpenAI, and others rely on for outsourced labour. When these workers tried to unionise, Meta declared them all ‘redundant’ and took its business elsewhere. This story is key because it opens up two top-level considerations for anyone who wants to meaningfully address this kind of worker exploitation and union-busting.
The first is that this false ‘redundancy’ makes clear that these workers were not redundant, but a nuisance. For a social media platform, content moderation is a core business practice; in fact, as Martha mentioned, social media is almost exclusively content moderation. Facebook employs more content moderators than engineers, and moderators are essential safety workers for any platform.
The second consideration is that content moderators represent a sizeable foundational slab of invisible labour that props up big tech firms. It’s extremely important to understand the role of content moderators in the ecosystem, because this specific type of labour (offshore, cheap, and invisible) is used in many ways, beyond moderating content:
To mask a lack of actual innovation, such as ‘AI-powered’ checkout systems, as we saw recently in Amazon Fresh stores.
To provide services that are fundamental to the big tech business model, but without the accountability of an employment relationship.
To skirt regulatory scrutiny by classifying this work as a ‘service’ rather than a product, since a product would have its supply chain scrutinised. Think about the people who label training data for AI models; their work is very similar to content moderation.
There is a long and precarious supply chain that sits behind the products of these technology companies, and making it visible would be an effective lever for driving change in this area. Andrea mentioned the 2013 Rana Plaza factory collapse in Bangladesh, which killed over a thousand workers. That event catalysed a labour-led movement to improve worker conditions in the garment industry; the AI supply chain, and tech workers in general, should not require a similar catastrophe to push regulation forward.
Much of this comes down to a lack of transparency: the outsourced tech workforce is diffused across global-majority countries and underpins the proliferation of digital technologies. Because these are non-physical products, the systems involved in creating them are far harder to keep track of. A factory, at least, is visible: all the workers are in one place and accounted for. Data on outsourced workers, by contrast, is not reported, making it very difficult to even know how many there are or what specific areas they work in.
The discussion made clear that making these processes transparent and giving workers more power are the best starting points for dismantling the outsized influence tech firms have on the labour market.
Links that were shared in the session
Open Letter to President Biden from Tech Workers in Kenya (Foxglove)
Tech firms’ Kenyan contractors lobby Biden for labor protections (WaPo)
So, Amazon’s ‘AI-powered’ cashier-free shops use a lot of … humans. Here’s why that shouldn’t surprise you (The Guardian)
Google scraps minimum wage, benefits rules for suppliers and staffing firms (Reuters)
More from our panellists
Catherine Bracy wanted to share TechEquity's initial research on contract workers in tech and their research update from earlier this year. See also their Responsible Contracting Standard for tech workers.
Andrea Dehlendorf wanted to share the following:
“I am working on a project with Brian Kettenring on landscaping and laying the foundation for global collaboration between those working across disciplines on AI with a political economy framework at the recently launched Global Fund for a New Economy (GFNE). We are holding a Global Learning and Strategy Convening on the Political Economy of AI, Data, and Digital Technologies next month in Mexico City, including a conversation on AI Supply Chain with support from both Omidyar and the Ford Foundation. Our goal is to deepen relationships, develop a shared analysis, and seed interest in some new collaborations.”
GFNE was founded in early 2024 with two main objectives. The first is to substantially increase the resource base of the new economy field: a nascent international field of think tanks, universities, community organizations, businesses, advocates, and media. The second is to dramatically strengthen the institutional infrastructure of the new economy sector through a number of activities, including investment in transnational communication that accelerates coordination and impact.