By Jean-Andre Deenik | ADRF

South Africa’s seventh democratic elections in May 2024 marked a critical turning point — not just in the political sphere, but in the digital one too. For the first time in our democracy’s history, the information space surrounding an election was shaped more by algorithms, platforms, and private tech corporations than by public broadcasters or community mobilisation.

We have entered an era where the ballot box is not the only battleground for democracy. The online world — fast-moving, largely unregulated, and increasingly dominated by profit-driven platforms — has become central to how citizens access information, express themselves, and participate politically.

At the Legal Resources Centre (LRC), we knew we could not stand by as these forces influenced the lives, choices, and rights of South Africans — particularly those already navigating inequality and exclusion. Between May 2024 and April 2025, with support from the African Digital Rights Fund (ADRF), we implemented the Democratising Big Tech project: an ambitious effort to expose the harms of unregulated digital platforms during elections and advocate for transparency, accountability, and justice in the digital age.

Why This Work Mattered

The stakes were high. In the run-up to the elections, political content flooded platforms like Facebook, YouTube, TikTok, and X (formerly Twitter). Some of it was civic-minded and constructive — but much of it was misleading, inflammatory, and harmful.

Our concern wasn’t theoretical. We had already seen how digital platforms contributed to offline violence during the July 2021 unrest, and how coordinated disinformation campaigns were used to sow fear and confusion. Communities already marginalised — migrants, sexual minorities, women — bore the brunt of online abuse and harassment.

South Africa’s Constitution guarantees freedom of expression, dignity, and access to information. Yet these rights are being routinely undermined by algorithmic systems and opaque moderation policies, most of which are designed and governed far beyond our borders. Our project set out to change that.

Centering People: A Public Education Campaign

The project was rooted in a simple truth: rights mean little if people don’t know they have them — or don’t know when they’re being violated. One of our first goals was to build public awareness around digital harms and the broader human rights implications of tech platforms during the elections.

We launched Legal Resources Radio, a podcast series designed to unpack the real-world impact of technologies like political microtargeting, surveillance, and facial recognition. Our guests — journalists, legal experts, academics, and activists — helped translate technical concepts into grounded, urgent conversations.

Alongside the podcasts, we hosted conversations on Instagram.

Holding Big Tech to Account

A cornerstone of the project was our collaboration with Global Witness, Mozilla, and the Centre for Intellectual Property and Information Technology Law (CIPIT). Together, we set out to test whether major tech companies (TikTok, YouTube, Facebook, and X) were prepared to protect the integrity of South Africa’s 2024 elections. To do this, we designed and submitted controlled test advertisements that mimicked real-world harmful narratives, including xenophobia, gender-based disinformation, and incitement to violence. These ads were submitted in multiple South African languages to assess whether the platforms’ content moderation systems, both automated and human, could detect and block them. The findings revealed critical gaps in platform preparedness and informed both advocacy and public awareness efforts ahead of the elections.

The results were alarming.

  • Simulated ads with xenophobic content were approved in multiple South African languages;
  • Gender-based harassment ads directed at women journalists were not removed;
  • False information about voting — including the wrong election date and processes — was accepted by TikTok and YouTube.
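An audit of this kind ultimately reduces to a simple tally: which harmful test ads each platform approved, broken down by language and harm category. The sketch below is purely illustrative — the records, field names, and figures are invented, not the project's actual data — but it shows how such results might be structured and summarised:

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class TestAd:
    platform: str       # e.g. "TikTok", "YouTube"
    language: str       # South African language the ad was written in
    harm_category: str  # e.g. "xenophobia", "election_disinfo"
    approved: bool      # True if the platform's moderation accepted the ad

# Hypothetical sample records for illustration only.
submissions = [
    TestAd("TikTok",   "isiZulu",   "xenophobia",        True),
    TestAd("TikTok",   "English",   "election_disinfo",  True),
    TestAd("YouTube",  "Afrikaans", "election_disinfo",  True),
    TestAd("YouTube",  "English",   "xenophobia",        False),
    TestAd("Facebook", "Sesotho",   "gender_harassment", True),
    TestAd("X",        "English",   "xenophobia",        True),
]

def approval_rates(ads):
    """Share of harmful test ads each platform approved."""
    totals, approved = defaultdict(int), defaultdict(int)
    for ad in ads:
        totals[ad.platform] += 1
        if ad.approved:
            approved[ad.platform] += 1
    return {p: approved[p] / totals[p] for p in totals}

rates = approval_rates(submissions)
```

With the invented records above, the tally would show, for instance, that every TikTok test ad was approved while YouTube blocked half — the kind of per-platform, per-language comparison that grounded the project's advocacy.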

These findings confirmed what many civil society organisations have long argued: that Big Tech neglects the Global South, failing to invest in local language moderation, culturally relevant policies, or meaningful community engagement. These failures are not just technical oversights. They endanger lives, and they undermine the legitimacy of our democratic processes.

Building an Evidence Base for Reform

Beyond exposing platform failures, we also produced a shadow human rights impact assessment. This report examined how misinformation, hate speech, and algorithmic discrimination disproportionately affect marginalised communities. It documented how online disinformation isn’t simply digital noise — it often translates into real-world harm, from lost trust in electoral systems to threats of violence and intimidation.

We scrutinised South Africa’s legal and policy frameworks and found them severely lacking. Despite the importance of online information ecosystems, there are no clear laws regulating how tech companies should act in our context. Our report recommends:

  • Legal obligations for platforms to publish election transparency reports;
  • Stronger data protection and algorithmic transparency;
  • Content moderation strategies inclusive of all South African languages and communities;
  • Independent oversight mechanisms and civil society input.

This work is part of a longer-term vision: to ensure that South Africa’s digital future is rights-based, inclusive, and democratic.

Continental Solidarity

In April 2025, we took this work to Lusaka, Zambia, where we presented at the Digital Rights and Inclusion Forum (DRIF) 2025. We shared lessons from South Africa and connected with allies across the continent who are also working to make technology accountable to the people it impacts.

What became clear is that while platforms may ignore us individually, there is power in regional solidarity. From Kenya to Nigeria, Senegal to Zambia, African civil society is uniting around a shared demand: that digital technology must serve the public good — not profit at the cost of people’s rights.

What Comes Next?

South Africa’s 2024 elections have come and gone. But the challenges we exposed remain. The online harms we documented did not begin with the elections, and they will not end with them.

That’s why we see the Democratising Big Tech project not as a one-off intervention, but as the beginning of a sustained push for digital justice. We will continue to build coalitions, push for regulatory reform, and educate the public. We will work with journalists, technologists, and communities to resist surveillance, expose disinformation, and uphold our rights online.

Because the fight for democracy doesn’t end at the polls. It must also be fought — and won — in the digital spaces where power is increasingly wielded, often without scrutiny or consequence.

Final Reflections

At the LRC, we do not believe in technology for technology’s sake. We believe in justice — and that means challenging any system, digital or otherwise, that puts people at risk or threatens their rights. Through this project, we’ve seen what’s possible when civil society speaks with clarity, courage, and conviction.

The algorithms may be powerful. But our Constitution, our communities, and our collective will are stronger.