In 2025, Meta, the world’s largest social media company, has taken a defiant new tone on the question of whether, and to what extent, it accepts responsibility for the real-world harm its platforms enable.
This has been widely understood as a gambit to curry favor with President Donald Trump’s administration, and Meta CEO and founder Mark Zuckerberg all but said as much in a January 7 video announcing the end of third-party fact-checking.
“We are going to work with President Trump to push back on governments around the world going after American companies and pushing to censor more,” Zuckerberg said, giving his product decisions a distinct geopolitical flavour.
To justify the company’s decisions to do away with fact-checking and scale back content moderation on its platforms, Zuckerberg and Meta have appealed to the United States’ constitutional protection of the right to freedom of expression. Fortunately for those of us living in the countries Meta has vowed to “push back on”, we have constitutions, too.
In Kenya, for example, where I represent a group of former Meta content moderators in a class-action lawsuit against the company, the post-independence constitution differs from those of the US and Western Europe in making fundamental human rights and freedoms explicit. The constitutions of a great many nations with colonial histories share this feature, a response to how those rights were violated when their peoples were first pressed into the global economy.
We are now beginning to see how these constitutions can be brought to bear on the global technology industry. In a landmark decision last September, the Kenyan Court of Appeal ruled that content moderators could bring their human rights violations case against Meta in the country’s labour courts.
Few in the West will have understood the importance of this ruling. Meta, for its part, surely does, which is why it fought tooth and nail against it in court and continues to use every diplomatic tool at its disposal to resist the content moderators’ demands for redress. Meta has signalled its intention to appeal the decision to the Supreme Court.
Meta and other major US companies maintain a convoluted corporate architecture to avoid exposure to taxes and regulation in the dozens of countries where they do business. They commonly claim not to operate in countries where they count millions of users and employ hundreds of people to refine their products. Until now, these claims have rarely been challenged in court.
The case the content moderators have presented in court is that they were hired by a business process outsourcing (BPO) company called Sama and put to work exclusively as content moderators on Facebook, Instagram, WhatsApp and Messenger from 2019 to 2023, when much of the moderation for Africa on these platforms was performed in Nairobi. Meta disavows these workers and insists they were employed solely by Sama, an issue currently being litigated before the courts in Kenya.
These workers know that Meta’s apparent reversal on content moderation is anything but. As presented in their grievance to the court, the company has never taken the issue seriously. Not seriously enough to stop the civil and ethnic conflicts, political violence, and mob attacks against marginalised communities that thrive on its platforms. Not seriously enough to pay fair wages to the people tasked with making sure it does not. The harm travels both ways: toxic content inflames real-world horrors, and those horrors engender more toxic content, which saturates the platforms.
Content moderators are digital cannon fodder for Meta in a war against harmful content that the company was never really committed to fighting. The case presented by the Nairobi content moderators explains how they accepted jobs they thought would involve call centre and translation work. Instead, they ended up in Meta’s content moderation hub in Nairobi, where they spent their days subjected to an endless torrent of streamed violence and abuse.
Many of them were forced to view atrocities committed in their home countries in order to protect Meta’s users from the harm of seeing these images and footage. They absorbed that trauma so others in their communities did not have to, and many found this to be a noble calling.
But this work has also taken its toll on their mental health. More than 140 former content moderators have been diagnosed with PTSD, depression, or anxiety arising from their time on the job. A separate case addresses how their efforts to unionise to advocate for better mental healthcare were thwarted. What followed were mass layoffs and the relocation of Facebook content moderation elsewhere.
This left behind hundreds of trauma-affected people and a trail of human rights violations. Meta argues that it never employed the Facebook content moderators and bore no responsibility towards them. The litigation is ongoing, and the moderators now rely on the courts to untangle the complexities of their employment arrangements.
While fighting the case in court, in March 2024, the company sent a delegation led by its then president of global affairs, Nick Clegg, a former British deputy prime minister, to meet with Kenyan President William Ruto and legislators to discuss, among other topics, the company’s vision of partnership with the government in bringing the “generative AI revolution” to the continent. At a town hall event in December, Ruto assured Sama, Meta’s former content moderation partner: “Now we have changed the law, so no one can ever take you to court again on any matter,” referring to a bill passed in Kenya’s Parliament that shields Big Tech companies from future cases such as ours.
All this pushback occurred well before Trump was re-elected, and these efforts appeared to be attempts to evade accountability for the company’s labour practices and the effects of its products. But something remarkable happened, which opens a door for others around the world who labour on behalf of the tech industry but whom the industry itself disavows: the court ruled that our case can proceed to trial.
The fact that the case has advanced despite vigorous legal and political challenges is a testament to the revolutionary nature of post-colonial constitutions, which prioritise human rights above all else.
As our case in Kenya continues, I hope it can offer inspiration to tech workers in other post-colonial nations that they, too, can pursue accountability in the countries where they have been harmed. The right to freedom of expression is an important human right, but we will continue to remind Big Tech that equally important are the rights to dignity and freedom from exploitation.
The views expressed in this article are the author’s own and do not necessarily reflect Al Jazeera’s editorial stance.