CIPESA Endorses a New Deal for AI at the Global India AI Summit

At the Global India AI Summit held on February 16-21, 2026, the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) endorsed the call for ReGenAI: A New Deal for the AI Economy. Led by IT for Change, the initiative emerged out of the Towards Regenerative AI conference, which was held as a pre-summit event last year.

The endorsement followed the Roundtable on AI Governance: Redlines to Baselines discussion held on February 18, 2026. The roundtable convened more than 40 leading civil society organisations, researchers, and policy practitioners from across the Global South, alongside various institutions from the North, to rethink and reshape current debates on AI governance.

The roundtable was organised by the Global Digital Justice Forum (Data Privacy Brasil, Derechos Digitales, EngageMedia, ETC Group, IT for Change, Research ICT Africa, Tech Global Institute), the Ada Lovelace Institute, the Centre for Communication Governance – National Law University Delhi, the Planetary AI Network – University of Edinburgh, and The Future Society.

The discussion built on the framework outlined in ReGenAI: A New Deal for the AI Economy, which calls for a shift in the AI paradigm toward meaningful and dignified work, diversified economies, pluralistic knowledge societies, and planetary flourishing.

The meeting highlighted the need for a New Deal that advances an agenda that challenges the dominance of today’s prevailing AI models and calls for systems built upon justice, dignity, inclusion, and sustainability. It argues that the current AI paradigm is defined by the invisibility of workers, the concentration of data value in the hands of a few powerful actors, the extraction of knowledge from communities without recognition or reciprocity, and ecological harm driven by economic interests.

The New Deal aligns closely with the ambitions of the Africa Declaration on Artificial Intelligence and reflects CIPESA’s call for more participatory and inclusive AI governance.

By endorsing the New Deal for AI at the summit, CIPESA adds its voice to a growing movement demanding a more democratic and accountable AI future. It also serves as a reminder that Africa’s role in the global AI conversation should expand beyond supplying data, labour, or markets for technologies developed elsewhere.

Endorse the New Deal for AI here.

Uganda’s Election and the Lingering Legacy of Internet Blockage

By Juliet Nanfuka |

In two days, Uganda heads to its presidential and parliamentary elections slated for January 15, 2026. Citizens, civil society actors, journalists, and digital rights defenders are grappling with the question, “Will they shut down the internet again?” Or, this time, will we see a commitment to one of the basic fundamentals of digital democracy, with access to digital communications remaining open throughout the election?

In recent weeks, anxiety about an impending internet blackout has surged despite Dr. Aminah Zawedde, Permanent Secretary of the Ministry of ICT and National Guidance, and Hon. Nyombi Thembo, Executive Director of the Uganda Communications Commission (UCC), dismissing rumours of plans to shut down the internet, calling them “false and misleading”.

However, for many, these pronouncements have done little to quell suspicions, largely because of what transpired during the 2016 and 2021 elections. In both of those elections, access to digital communications was restricted, cutting off online communication, commerce, and key avenues for civic engagement.

Various actions in the lead-up to the polls have also served to compound the suspicions. In a report issued in January 2026, the United Nations Office of the High Commissioner for Human Rights (OHCHR) describes the arrests of state critics as “arbitrary and discriminatory” and outside the country’s constitutional guarantees.

“Despite the strong constitutional protection of rights, the human rights situation in Uganda during the period under review has been characterized by increasingly restrictive legislation and their arbitrary and discriminatory application. The Government of Uganda has continued to rely on legislation such as the Public Order Management Act (POMA), the Anti-Terrorism Act, the NGO Act, the Computer Misuse (Amendment) Act and the Penal Code Act to shrink civic and democratic space and further weaken political participation, particularly of political opponents and their supporters, as well as the work of civil society, including journalists and human rights defenders.” – OHCHR Report on Uganda

Meanwhile, independent media have come under increasing pressure in the lead-up to the elections, experiencing various forms of clampdown, including the denial of advertising spend. In October 2025, the independent outlets NTV Uganda and The Daily Monitor were denied accreditation to cover parliamentary and presidential proceedings. Reports of harassment, equipment confiscation, attacks on journalists during election campaign coverage, and raids on media offices have been commonplace – underscoring a deteriorating environment for media freedom.

Meanwhile, the satellite internet provider Starlink, whose services can operate independently of terrestrial networks, was halted in Uganda by a regulatory directive in early January 2026, rendering all Starlink terminals inactive ahead of polling day. The provider had been offering services without a valid local license. Critics nonetheless argue that the directive limits alternatives for connectivity in the event of broader restrictions on internet access, feeding anxieties about reduced access to independent channels of information.

The UCC has also come under fire following its warning to broadcasters and digital content creators against live coverage of riots, protests, or incidents that could disrupt public order. The regulator stated that only the Electoral Commission may declare election results, and sharing unverified results is illegal. Dr. Zawedde stated, “Media platforms must not be abused to incite violence, spread misinformation, or undermine the credibility of the electoral process.”

By the afternoon of January 13, 2026, a directive issued by the UCC to mobile network operators, ordering them to block public access to the internet effective 18:00, was circulating online.

In a public statement, Access Now and the global #KeepItOn coalition urged President Yoweri Museveni and relevant national authorities to ensure unrestricted internet access throughout the electoral period and to refrain from any disruptive measures that impede the free flow of information. The statement stressed the fundamental role that connectivity plays in inclusive participation, freedom of expression, and the credibility of the electoral process.

The African Commission on Human and Peoples’ Rights (ACHPR) likewise reaffirmed that internet access is a core human right and a necessary condition for free and fair elections, warning against restrictions that would stifle civic space. The Commission called on the Government of Uganda to ratify the African Charter on Democracy, Elections and Governance, which it signed on January 27, 2013, and which emphasises the importance of a culture of peaceful change of power based on regular, free, fair and transparent elections conducted by competent, independent and impartial electoral bodies.

For democracy to flourish in Uganda, authorities must demonstrate their commitment to open digital spaces. This means not only publicly guaranteeing uninterrupted internet access before, during, and after the elections but also building trust through transparency and accountability.  Citizens deserve to communicate freely, monitor the electoral process, and hold all actors accountable without fear of arbitrary disruption.

Ultimately, Uganda’s electoral credibility will not be judged by what happens at polling stations, but by whether the state resists the temptation to control information by disrupting digital access. In an era where civic participation, journalism, election transparency, and even livelihoods heavily rely on digital access, a disruption would signal a fear of accountability.

If the government chooses restraint in the coming hours, it would mark a major departure from a troubling past and offer Ugandans a rare assurance in the election process. If it does not, history will record yet another election where digital access was shut down, presumably to manage dissent rather than to protect democracy.

Inform Africa Expands OSINT Training and DISARM-Based Research With CIPESA

ADRF |

Information integrity work is only as strong as the methods behind it. In Ethiopia’s fast-changing information environment, fact-checkers and researchers are expected to move quickly while maintaining accuracy, transparency, and ethical care. Inform Africa has expanded two practical capabilities to address this reality: advanced OSINT-based fact-checking training and structured disinformation research using the DISARM framework, in collaboration with the Collaboration on International ICT Policy for East and Southern Africa (CIPESA).

This work was advanced with support from the Africa Digital Rights Fund (ADRF), administered by CIPESA. At a time when many civic actors face uncertainty, the fund’s adaptable support helped Inform Africa sustain day-to-day operations and protect continuity, while still investing in verification and research methods designed to endure beyond a single project cycle.

The collaboration with CIPESA was not only administrative. It was anchored in shared priorities around digital rights, information integrity, and capacity building. Through structured coordination and learning exchange, CIPESA provided a partnership channel that strengthened the work’s clarity and relevance, and helped position the outputs as reusable methods that can be applied beyond a single team. The collaboration also reinforced a regional ecosystem approach: improving practice in one context while keeping the methods legible for peer learning, adaptation, and future joint work.

The implementation followed a phased timetable across the project activity period from April through November 2025. Early work focused on scoping and method design, aligning the training and research approaches with practical realities in newsrooms and civil society. Mid-phase work concentrated on developing the OSINT module and applying DISARM as a structured research lens, with iterative refinement as materials matured. The final phase focused on consolidation, documentation discipline, and packaging the outputs to support repeatable use, including onboarding, internal training, and incident review workflows.

A central focus has been an advanced OSINT training module built to move beyond tool familiarity into a complete verification workflow. Verification is treated as a chain of decisions that must be consistent and auditable: how to intake a claim, determine whether it is fact-checkable, plan the evidence, trace sources, verify images and videos, confirm the place and time, and document each step clearly enough for an editor or peer to reproduce the work. The aim is not only to reach accurate conclusions but also to show the route taken, including which evidence was prioritized and how uncertainty was handled.

This documentation discipline is not bureaucracy. It is a trust technology. In high-risk information environments, preserved sources, verification logs, and clear decision trails protect credibility, strengthen editorial oversight, and reduce avoidable errors. The module prioritizes hands-on, production-style assignments that mirror real newsroom constraints and trains participants to avoid overclaiming, communicate uncertainty responsibly, and present evidence in ways that non-expert audiences can follow.
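As an illustration only (this is not Inform Africa’s actual tooling, and the field names are hypothetical), the kind of auditable verification log described above could be modelled as a simple record that captures each step, its evidence, and the resulting decision:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical schema sketching the audit trail described above:
# each step records what was done, what was found, and what was decided.
@dataclass
class VerificationStep:
    action: str      # e.g. "reverse image search", "geolocation check"
    evidence: str    # what was found, with an archived source reference
    decision: str    # how the finding affected the claim's status
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

@dataclass
class VerificationLog:
    claim: str
    fact_checkable: bool
    steps: list[VerificationStep] = field(default_factory=list)
    conclusion: str = "pending"
    uncertainty_notes: str = ""

    def add_step(self, action: str, evidence: str, decision: str) -> None:
        self.steps.append(VerificationStep(action, evidence, decision))

# Example run-through of a single claim.
log = VerificationLog(
    claim="Video shows protest at location X on polling day",
    fact_checkable=True,
)
log.add_step(
    action="reverse image search",
    evidence="Earliest match dated two years prior (archived copy kept)",
    decision="Video predates the election; claim downgraded",
)
log.conclusion = "false: footage is recycled"
```

A structure like this lets an editor or peer replay the route taken, including which evidence was prioritised and how uncertainty was handled.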

In parallel, Inform Africa has applied the DISARM framework to disinformation research. DISARM provides a shared language for describing influence activity through observable behaviors and techniques, without drifting into assumptions. The priority has been to remain evidence-bound: collecting and preserving artifacts responsibly, maintaining a structured evidence log, reducing harm by avoiding unnecessary reproduction of inflammatory content, and avoiding claims of attribution beyond what the evidence supports. This DISARM-informed approach has improved internal briefs, strengthened consistency, and made incidents easier to compare over time and across partners.
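To make the evidence-bound discipline concrete, an incident record of the kind described above might look like the sketch below. The technique tags are placeholders, not a claim about the exact DISARM catalogue entries or Inform Africa’s actual coding scheme:

```python
# Illustrative, evidence-bound incident record using DISARM-style behaviour
# tags. All identifiers and values here are hypothetical examples.
incident = {
    "incident_id": "2025-11-example-001",
    "summary": "Coordinated reposting of a misleading claim across pages",
    "observed_techniques": [
        # Tag only behaviours actually observed; do not infer intent.
        {"technique": "placeholder-tag-1", "label": "cross-posting of identical text"},
        {"technique": "placeholder-tag-2", "label": "recycled imagery"},
    ],
    "evidence_log": [
        # Artifacts are preserved and hashed; inflammatory content is not
        # reproduced in the brief itself.
        {"artifact": "archived post (URL withheld)", "hash": "sha256:<omitted>", "collected": "2025-11-02"},
    ],
    # Attribution stays empty unless the evidence supports it.
    "attribution": None,
}
```

Keeping attribution explicitly empty by default mirrors the principle of never claiming more than the evidence supports, and the shared tag vocabulary is what makes incidents comparable over time and across partners.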

Three lessons stand out from this work with CIPESA and ADRF. First, quality scales through workflow, not only through talent. Second, evidence discipline is a strategic choice that protects credibility and reduces harm in both fact-checking and research. Third, shared frameworks reduce friction by improving clarity and consistency across teams. Looking ahead, Inform Africa will integrate the OSINT module into routine training and onboarding and continue to apply DISARM-informed analysis in future incident reviews and deeper studies, reinforcing information integrity as a public good.

This article was first published by Inform Africa on December 15, 2025.

Towards Inclusive AI Policies in Africa’s Digital Transformation

By CIPESA Writer |

On November 13, 2025, the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) took part in the global PILnet summit on Artificial Intelligence (AI) and its impact on the work of Civil Society Organisations (CSOs). Over three days, the summit assembled stakeholders from across the world in Rome, Italy, to deliberate on various topics under the theme, “Amplifying Impact: Pro Bono & Public Interest Law in a Shifting World.”

CIPESA contributed to a session titled, “Pro bono for AI: Addressing legal risks and enhancing opportunities for CSOs”. The session focused on AI and its potential impacts on the work and operations of CSOs. CIPESA emphasised the need for a universally acceptable and adaptable framework to guide the increased application of AI in the fast-evolving technological era. Furthermore, CIPESA highlighted its efforts in developing a model policy on AI for CSOs in Africa, which is being undertaken with the support of the Thomson Reuters Foundation through its global pro bono network.

Edrine Wanyama, Programme Manager Legal at CIPESA, focused his discussion on ethical and rights-respecting AI adoption, and emphasised the necessity for CSOs to enhance their knowledge and accountability measures while navigating the AI ecosystem.

Mara Puacz, the Head of Impact at Tech To The Rescue, Ana de la Cruz Cubeiro, a Legal Officer at PILnet, and Megan King, a Senior Associate at Norton Rose Fulbright, shared similar sentiments on the benefits of AI, which include expanding advocacy work and initiatives of CSOs.

They noted the increased demand for transparency and accountability in AI development and use, and the need to minimise the harms that marginalised communities face from AI-enabled analysis of datasets that often perpetuate bias, data gaps, and limited or poorly digitalised language sets.

The session cited various benefits of AI for CSOs, such as enabling human rights monitoring, documentation, and reporting on various fronts, including the Universal Periodic Review; aiding democratic participation; and tracking and documenting trends. Others include facilitating and enhancing environmental protection, for instance through monitoring pollution, and providing real-time support to agri-business and the health sector by facilitating pest and disease identification and diagnosis.

However, funding constraints not only affect AI deployment but also capacity building to address the limited skills and expertise in AI deployment. In Africa, the inadequacy of relevant infrastructure, data sovereignty fears amongst states, and the irresponsible use of AI and related technologies present additional challenges.

Meanwhile, between October 23 and 24, 2025, CIPESA joined KTA Advocates and the Centre for Law, Policy and Innovation Initiative (CeLPII), to co-host the 8th Annual Symposium under the theme of “Digital Trade, AI and the Creative Economy as Drivers for Digital Transformation”.

The symposium explored the role of AI in misinformation and disinformation, as well as its potential to transform Uganda’s creative economy and digital trade. CIPESA emphasised the need to make AI central in discussions across all relevant sectors, including governments, innovators, CSOs and the private sector, so as to identify strategies, such as policy formulation and adoption, to check AI’s potential excesses.

Conversations at the PILnet summit and the KTA Symposium align with CIPESA’s ongoing initiatives across the continent, where countries and regional blocs are developing AI strategies and policies to guide national adoption and application. At the continental level, the African Union (AU) in 2024 adopted the Continental AI Strategy, which provides a unified framework for using AI to drive Africa’s digital transformation and socio-economic development.

Amongst the key recommendations from the discussions is the need for:

  • Wide adoption of policies guiding the use of AI by civil society organisations, firms, the private sector, and innovators.
  • Nationwide and global participation of individuals and stakeholders, including governments, CSOs, the private sector, and innovators, in AI processes, to ensure inclusive participation and that no one is left behind.
  • Awareness creation and continuous education of citizens, CSOs, innovators, firms, and the private sector on the application and value of AI in their work.
  • The adoption of policies and laws that specifically address the application of AI at national, regional and international levels and at organisational and institutional levels to mitigate the potential adverse impacts of AI rollout.

#BeSafeByDesign: A Call To Platforms To Ensure Women’s Online Safety

By CIPESA Writer |

Across Eastern and Southern Africa, activists, journalists, and women human rights defenders (WHRDs) are leveraging online spaces to mobilise for justice, equality, and accountability.  However, the growth of online harms such as Technology-Facilitated Gender-Based Violence (TFGBV), disinformation, digital surveillance, and Artificial Intelligence (AI)-driven discrimination and attacks has outpaced the development of robust protections.

Notably, human rights defenders, journalists, and activists face unique and disproportionate digital security threats, including harassment, doxxing, and data breaches, that limit their participation and silence dissent.

It is against this background that the Collaboration on International ICT Policy for East and Southern Africa (CIPESA), in partnership with Irene M. Staehelin Foundation, is implementing a project aimed at combating online harms so as to advance digital rights. Through upskilling, advocacy, research, and movement building, the initiative addresses the growing threats in digital spaces, particularly affecting women journalists and human rights defenders.

The first of the upskilling engagements kicked off in Nairobi, Kenya, at the start of December 2025, with 25 women human rights defenders and activists taking part in a three-day digital resilience skills-share workshop hosted by CIPESA and Digital Society Africa. Participants came from the Democratic Republic of Congo, Madagascar, Malawi, South Africa, Tanzania, Uganda, Zambia, and Zimbabwe. The workshop coincided with the 16 Days of Activism campaign, which this year is themed “Unite to End Digital Violence against All Women and Girls”.

According to the United Nations Population Fund (UNFPA), TFGBV is “an act of violence perpetrated by one or more individuals that is committed, assisted, aggravated, and amplified in part or fully by the use of information and communication technologies or digital media against a person based on their gender.” It includes cyberstalking, doxing, non-consensual sharing of intimate images, cyberbullying, and other forms of online harassment.

Women in Sub-Saharan Africa are 32% less likely than men to use the internet, with the key impediments being literacy and digital skills, affordability, safety, and security. On top of this gender digital divide, more women than men face various forms of digital violence. Accordingly, the African Commission on Human and Peoples’ Rights (ACHPR) Resolution 522 of 2022 has underscored the urgent need for African states to address online violence against women and girls.

Women who advocate for gender equality, feminism, and sexual minority rights face higher levels of online violence. Indeed, women human rights defenders, journalists and politicians are the most affected by TFGBV, and many of them have withdrawn from the digital public sphere due to gendered disinformation, trolling, cyber harassment, and other forms of digital violence. The online trolling of women is growing exponentially and often takes the form of gendered and sexualised attacks and body shaming.

Several specific challenges must be considered when designing interventions to combat TFGBV. These challenges are shaped by legal, social, technological, and cultural factors, which affect both the prevalence of digital harms and violence and the ability to respond effectively. They include weak and inadequate legal frameworks; a lack of awareness about TFGBV among policymakers, law enforcement officers, and the general public; the gender digital divide; and normalised online abuse against women, with victims often blamed rather than supported.

Moreover, there is a shortage of comprehensive response mechanisms and support services for survivors of online harassment, such as digital security helplines, psychosocial support, and legal aid. There is also limited regional and cross-sector collaboration between CSOs, government agencies, and the private sector (including tech companies).

A guiding strand for these efforts will be the #BeSafeByDesign campaign, which highlights the necessity of safe platforms for women as well as the consequences when safety is missing. The #BeSafeByDesign obligation shifts the burden of ensuring safety in online spaces away from women and places it on platforms, which are required to step up risk assessments, accessible and stronger reporting pathways, proactive detection of abuse, and transparent accountability mechanisms. The initiative will also involve practical upskilling of at-risk women in cybersecurity.