Applications are Open for a New Round of Africa Digital Rights Funding!

Announcement

The Collaboration on International ICT Policy for East and Southern Africa (CIPESA) is calling for proposals to support digital rights work across Africa.

This call for proposals is the 10th under the CIPESA-run Africa Digital Rights Fund (ADRF) initiative that provides rapid response and flexible grants to organisations and networks to implement activities that promote digital rights and digital democracy, including advocacy, litigation, research, policy analysis, skills development, and movement building.

The current call is particularly interested in proposals for work related to:

  • Data governance including aspects of data localisation, cross-border data flows, biometric databases, and digital ID.
  • Digital resilience for human rights defenders, other activists and journalists.
  • Censorship and network disruptions.
  • Digital economy.
  • Digital inclusion, including aspects of accessibility for persons with disabilities.
  • Disinformation and related digital harms.
  • Technology-Facilitated Gender-Based Violence (TFGBV).
  • Platform accountability and content moderation.
  • Implications of Artificial Intelligence (AI).
  • Digital Public Infrastructure (DPI).

Grant amounts available range between USD 5,000 and USD 25,000 per applicant, depending on the need and scope of the proposed intervention. Cost-sharing is strongly encouraged, and the grant period should not exceed eight months. Applications will be accepted until November 17, 2025. 

Since its launch in April 2019, the ADRF has provided initiatives across Africa with more than USD 1 million and contributed to building capacity and traction for digital rights advocacy on the continent.

Application Guidelines

Geographical Coverage

The ADRF is open to organisations/networks based or operational in Africa and with interventions covering any country on the continent.

Size of Grants

Grant size shall range from USD 5,000 to USD 25,000. Cost sharing is strongly encouraged.

Eligible Activities

The activities that are eligible for funding are those that protect and advance digital rights and digital democracy. These may include but are not limited to research, advocacy, engagement in policy processes, litigation, digital literacy and digital security skills building. 

Duration

The grant funding shall be for a period not exceeding eight months.

Eligibility Requirements

  • The Fund is open to organisations and coalitions working to advance digital rights and digital democracy in Africa. This includes but is not limited to human rights defenders, media, activists, think tanks, legal aid groups, and tech hubs. Entities working on women’s rights, or with youth, refugees, persons with disabilities, and other marginalised groups are strongly encouraged to apply.
  • The initiatives to be funded will preferably have formal registration in an African country, but in some circumstances, organisations and coalitions that do not have formal registration may be considered. Such organisations need to show evidence that they are operational in a particular African country or countries.
  • The activities to be funded must be implemented in, or focused on, an African country or countries.

Ineligible Activities

  • The Fund shall not fund any activity that does not directly advance digital rights or digital democracy.
  • The Fund will not support travel to attend conferences or workshops, except in exceptional circumstances where such travel is directly linked to an activity that is eligible.
  • Costs that have already been incurred are ineligible.
  • The Fund shall not provide scholarships.
  • The Fund shall not support equipment or asset acquisition.

Administration

The Fund is administered by CIPESA. A panel of internal and external experts will decide on beneficiaries based on the following criteria:

  • Whether the proposed intervention fits within the Fund’s digital rights priorities.
  • Relevance to the given context/country.
  • The applicant’s commitment to, and experience in, advancing digital rights and digital democracy.
  • The potential impact of the intervention on digital rights and digital democracy policies or practices.

The deadline for submissions is Monday, November 17, 2025. The application form can be accessed here.

Democratising Big Tech: Lessons from South Africa’s 2024 Election

By Jean-Andre Deenik | ADRF

South Africa’s seventh democratic elections in May 2024 marked a critical turning point — not just in the political sphere, but in the digital one too. For the first time in our democracy’s history, the information space surrounding an election was shaped more by algorithms, platforms, and private tech corporations than by public broadcasters or community mobilisation.

We have entered an era where the ballot box is not the only battleground for democracy. The online world — fast-moving, largely unregulated, and increasingly dominated by profit-driven platforms — has become central to how citizens access information, express themselves, and participate politically.

At the Legal Resources Centre (LRC), we knew we could not stand by as these forces influenced the lives, choices, and rights of South Africans — particularly those already navigating inequality and exclusion. Between May 2024 and April 2025, with support from the Africa Digital Rights Fund (ADRF), we implemented the Democratising Big Tech project: an ambitious effort to expose the harms of unregulated digital platforms during elections and advocate for transparency, accountability, and justice in the digital age.

Why This Work Mattered

The stakes were high. In the run-up to the elections, political content flooded platforms like Facebook, YouTube, TikTok, and X (formerly Twitter). Some of it was civic-minded and constructive — but much of it was misleading, inflammatory, and harmful.

Our concern wasn’t theoretical. We had already seen how digital platforms contributed to offline violence during the July 2021 unrest, and how coordinated disinformation campaigns were used to sow fear and confusion. Communities already marginalised — migrants, sexual minorities, women — bore the brunt of online abuse and harassment.

South Africa’s Constitution guarantees freedom of expression, dignity, and access to information. Yet these rights are being routinely undermined by algorithmic systems and opaque moderation policies, most of which are designed and governed far beyond our borders. Our project set out to change that.

Centering People: A Public Education Campaign

The project was rooted in a simple truth: rights mean little if people don’t know they have them — or don’t know when they’re being violated. One of our first goals was to build public awareness around digital harms and the broader human rights implications of tech platforms during the elections.

We launched Legal Resources Radio, a podcast series designed to unpack the real-world impact of technologies like political microtargeting, surveillance, and facial recognition. Our guests — journalists, legal experts, academics, and activists — helped translate technical concepts into grounded, urgent conversations.

Alongside the podcasts, we also engaged audiences on Instagram.

Holding Big Tech to Account

A cornerstone of the project was our collaboration with Global Witness, Mozilla, and the Centre for Intellectual Property and Information Technology Law (CIPIT). Together, we set out to test whether major tech companies (TikTok, YouTube, Facebook, and X) were prepared to protect the integrity of South Africa’s 2024 elections. To do this, we designed and submitted controlled test advertisements that mimicked real-world harmful narratives, including xenophobia, gender-based disinformation, and incitement to violence. These ads were submitted in multiple South African languages to assess whether the platforms’ content moderation systems, both automated and human, could detect and block them. The findings revealed critical gaps in platform preparedness and informed both advocacy and public awareness efforts ahead of the elections.

The results were alarming.

  • Simulated ads with xenophobic content were approved in multiple South African languages;
  • Gender-based harassment ads directed at women journalists were not removed;
  • False information about voting — including the wrong election date and processes — was accepted by TikTok and YouTube.

These findings confirmed what many civil society organisations have long argued: that Big Tech neglects the Global South, failing to invest in local language moderation, culturally relevant policies, or meaningful community engagement. These failures are not just technical oversights. They endanger lives, and they undermine the legitimacy of our democratic processes.

Building an Evidence Base for Reform

Beyond exposing platform failures, we also produced a shadow human rights impact assessment. This report examined how misinformation, hate speech, and algorithmic discrimination disproportionately affect marginalised communities. It documented how online disinformation isn’t simply digital noise — it often translates into real-world harm, from lost trust in electoral systems to threats of violence and intimidation.

We scrutinised South Africa’s legal and policy frameworks and found them severely lacking. Despite the importance of online information ecosystems, there are no clear laws regulating how tech companies should act in our context. Our report recommends:

  • Legal obligations for platforms to publish election transparency reports;
  • Stronger data protection and algorithmic transparency;
  • Content moderation strategies inclusive of all South African languages and communities;
  • Independent oversight mechanisms and civil society input.

This work is part of a longer-term vision: to ensure that South Africa’s digital future is rights-based, inclusive, and democratic.

Continental Solidarity

In April 2025, we took this work to Lusaka, Zambia, where we presented at the Digital Rights and Inclusion Forum (DRIF) 2025. We shared lessons from South Africa and connected with allies across the continent who are also working to make technology accountable to the people it impacts.

What became clear is that while platforms may ignore us individually, there is power in regional solidarity. From Kenya to Nigeria, Senegal to Zambia, African civil society is uniting around a shared demand: that digital technology must serve the public good — not profit at the cost of people’s rights.

What Comes Next?

South Africa’s 2024 elections have come and gone. But the challenges we exposed remain. The online harms we documented did not begin with the elections, and they will not end with them.

That’s why we see the Democratising Big Tech project not as a one-off intervention, but as the beginning of a sustained push for digital justice. We will continue to build coalitions, push for regulatory reform, and educate the public. We will work with journalists, technologists, and communities to resist surveillance, expose disinformation, and uphold our rights online.

Because the fight for democracy doesn’t end at the polls. It must also be fought — and won — in the digital spaces where power is increasingly wielded, often without scrutiny or consequence.

Final Reflections

At the LRC, we do not believe in technology for technology’s sake. We believe in justice — and that means challenging any system, digital or otherwise, that puts people at risk or threatens their rights. Through this project, we’ve seen what’s possible when civil society speaks with clarity, courage, and conviction.

The algorithms may be powerful. But our Constitution, our communities, and our collective will are stronger.

Kenyan Journalists Trained on Digital Rights and Addressing Online Harms

By Lyndcey Oriko

The National Cohesion and Integration Commission (NCIC) and the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) have trained 60 Kenyan journalists on addressing digital harms such as hate speech and disinformation.

The training in Naivasha in June 2024 targeted journalists and media workers based in Nakuru County, which the Commission has identified as a conflict hotspot. The journalists were equipped with the knowledge and skills to navigate the complexities of reporting on digital rights and online harms in a more professional and ethical way, particularly during sensitive periods such as conflicts and protests.

The training happened at a time when Kenya was experiencing protests and demonstrations dubbed #RejectTheFinanceBill2024. The protests saw significant mobilisation and engagement on social media platforms, predominantly TikTok and X. The country had also experienced internet throttling, despite assurances by the communications regulator that it had no plans to switch off the internet and calls by civil society actors for the government not to interrupt internet services.

In his opening remarks, NCIC’s Commissioner, Dr. Danvas Makori, underscored the critical role journalists play in mitigating hate speech and fostering peace, particularly during sensitive periods such as conflicts and protests. He highlighted the importance of ethical reporting, particularly in the face of rising disinformation and online hate speech.

Dr. Wairagala Wakabi from CIPESA discussed the challenges to internet freedom, including increased censorship and harassment of journalists and independent content creators. He challenged participants to conduct research to inform their reporting and to leverage emerging technologies to verify and fact-check information as a way of combating disinformation and online hate speech.

The workshops included in-depth sessions on balancing freedom of expression, which is guaranteed by Article 33 of the Constitution of Kenya 2010, with necessary limitations, such as those aimed at combating hate speech, as stipulated in the National Cohesion and Integration (NCI) Act, 2008. The training emphasised the importance of protecting rights both offline and online, and the journalists were reminded of their responsibility to uphold rights and freedoms while avoiding content that could harm others.

With reference to the #RejectTheFinanceBill2024 protests, the discussions also tackled various forms of online harm, emphasising the importance of civic education, policy enforcement, and ethical reporting.

For his part, Kyalo Mwengi, the Director of Legal Services at NCIC, emphasised the fundamental role of journalists in fostering peace. He noted that the training was essential to equip journalists with the skills to verify information, understand the nuances of conflict-sensitive reporting, and use social media effectively to promote cohesion rather than division, so as to ensure that the public receives reliable and truthful information.

Liban Guyo, Director of Peace Building and Reconciliation at the Commission, highlighted the importance of contextualising stories, especially those about conflicts. He said the media can escalate or de-escalate a conflict through their reporting, which underscores the need for conflict-sensitive reporting.

Mwengi also presented some of the Commission’s recommendations to the Parliamentary Cohesion and Equality Committee, which is considering amendments to the 2008 Act through the National Cohesion and Integration Bill, 2023. He noted that because the NCI Act was enacted prior to the passage of the 2010 Constitution, it lacked constitutional powers, thereby affecting its performance and effectiveness. Accordingly, the Commission was proposing that the NCIC be anchored within the Constitution, like other commissions, with clear funding mechanisms and guaranteed independence. In addition, the amendments should consider the prevailing digital landscape in order to craft robust online hate speech regulations.

In her remarks, Lucy Mwangi from the Media Council of Kenya (MCK) urged journalists to apply the training’s teachings daily, emphasising ethical standards and the promotion of peace and accuracy, both online and offline. She stressed the importance of being registered and carrying press cards to uphold professional integrity and to help ensure their personal safety.

Some of the key issues raised by the participants included the high cost of verifying information, low digital literacy, lack of awareness of conflict-sensitive reporting, and the reactive approach of social media platforms to hate speech and misinformation, which allows harmful content to spread quickly. The workshop not only provided valuable insights into the responsibilities of journalists in the digital age but also fostered a collaborative spirit among media professionals to address the challenges posed by online harms. Given the recent protests against proposed tax hikes in Kenya, the timing of the training was particularly relevant, underscoring the need for responsible reporting amidst heightened social tensions. Overall, the initiative represents a proactive step towards promoting ethical journalism and safeguarding digital rights in Kenya.

Inspiring Inclusion on Women’s Day 2024

By Juliet Nanfuka

Today, the world celebrates International Women’s Day 2024 under the theme of #InspireInclusion, which encourages the realisation of a gender-equal world free of bias, stereotypes and discrimination. However, amidst the global celebration, it is crucial to spotlight the persistent challenges faced by African female journalists, both online and offline.

A 2020 global survey conducted by UNESCO confirmed a disturbing trend: online attacks targeting women journalists are rising at an alarming rate. These attacks are part of a deliberate strategy to intimidate, degrade, and silence women in the media industry. Such violence aims to instil fear, undermine professionalism, discredit journalistic integrity, erode trust in factual reporting, and ultimately stifle women’s active participation in public discourse. The attacks do not affect only the targeted journalists: they also impact their sources and audiences and encourage self-censorship, creating a chilling effect on freedom of expression and access to information.

Research shows that the tactics used to attack women journalists are dominated by online trolling, which often takes the form of gendered and sexualised attacks, including body shaming. Trolling has also evolved into coordinated campaigns run by cyber armies, sometimes sponsored by government officials and other powerful political actors.

Online violence also shifts into offline spaces, with potentially deadly consequences. Yet a disturbing trend persists: African women journalists who experience online abuse often hesitate to seek justice and, when they do, encounter challenges in having their complaints taken seriously and thoroughly investigated.

Notably, low levels of digital security skills and the inadequacy of existing laws in tackling trolling and Technology-Facilitated Gender-Based Violence (TFGBV) only exacerbate the challenges African women journalists face in the profession.

African female journalists are instrumental in conveying key narratives, shedding light on issues of importance, and amplifying marginalised voices and concerns. However, the increasing affronts to their profession and to their presence in online discourse encourage self-censorship and have an immeasurable impact on access to information and on the freedom of expression of this key segment of society.

In the first Africa Media Freedom and Journalists’ Safety Report, released in 2022, the Collaboration on International ICT Policy for East and Southern Africa (CIPESA), in partnership with the United Nations Educational, Scientific and Cultural Organization (UNESCO), highlighted the growing prevalence of Technology-Facilitated Gender-Based Violence (TFGBV) as a deterrent to press freedom, especially for women.

It is against this backdrop that CIPESA has consistently pursued various interventions aimed at enhancing the safety and inclusion of women in online spaces. Some of these initiatives have specifically addressed the needs of African women journalists, such as a Media Masterclass and Reporting Grant and research into online safe spaces for women, both conducted under the WomenAtWeb project of Deutsche Welle (DW). Further, CIPESA has given grants aimed at enhancing gendered digital inclusion and women journalists’ safety under the Africa Digital Rights Fund to beneficiaries in Somalia, Malawi and Tanzania, as well as in Ghana and Nigeria.

This year, in partnership with the International Programme for the Development of Communication (IPDC) of UNESCO, CIPESA is supporting media development efforts to promote a safe, independent, and pluralistic press, including through addressing the gender dynamics of media freedom and journalists’ safety in Africa.

In recognition of Women’s Month, a series of workshops will be hosted alongside Digital Security Cafes for women journalists, media practitioners, and content producers in Ethiopia, Tanzania, and Uganda.

The workshops will include discussions based on the findings of the Africa Media Freedom and Journalists’ Safety Report, with a focus on raising awareness of what can be done to pursue more inclusive measures for women journalists.

Further Women’s Month efforts will include a webinar on African women in politics, aimed at highlighting the importance of women’s increased political inclusion. The webinar will highlight active online engagement as a key enabler for women in politics in various African countries and as a tool for meaningful participation in the information society. More importantly, it will cast a spotlight on how women in active politics across African countries are pushing back against negative narratives online, and on the role that actors such as policy makers and platforms have to play in addressing TFGBV associated with political spaces and discourse.

Register to participate in the webinar here.

New Law in Uganda Imposes Restrictions on Use of Internet

By Rodney Muhumuza

Ugandan President Yoweri Museveni has signed into law legislation criminalizing some internet activity despite concerns the law could be used to silence legitimate criticism.

The bill, passed by the legislature in September, was brought by a lawmaker who said it was necessary to punish those who hide behind computers to hurt others. That lawmaker argued in his bill that the “enjoyment of the right to privacy is being affected by the abuse of online and social media platforms through the sharing of unsolicited, false, malicious, hateful and unwarranted information.”

The new legislation increases restrictions in a controversial 2011 law on the misuse of a computer. Museveni signed the bill on Thursday, according to a presidential spokesman’s statement.

The legislation proposes jail terms of up to 10 years in some cases, including for offenses related to the transmission of information about a person without their consent as well as the sharing or intercepting of information without authorization.

Opponents of the law say it will stifle freedom of expression in a country where many of Museveni’s opponents, for years unable to stage street protests, often raise their concerns on Twitter and other online sites.

Others say it will kill investigative journalism.

The law is “a blow to online civil liberties in Uganda,” according to an analysis by a watchdog group known as Collaboration on International ICT Policy for East and Southern Africa, or CIPESA.

The Committee to Protect Journalists is among groups that urged Museveni to veto the bill, noting its potential to undermine press freedom.

“Ugandan legislators have taken the wrong turn in attempting to make an already problematic law even worse. If this bill becomes law, it will only add to the arsenal that authorities use to target critical commentators and punish independent media,” the group’s Muthoki Mumo said in a statement after lawmakers passed the bill.

Museveni, 78, has held power in this East African country since 1986 and won his current term last year.

Although Museveni is popular among some Ugandans who praise him for restoring relative peace and economic stability, many of his opponents often describe his rule as authoritarian.

This article was first published by the Washington Post on Oct 13, 2022.