When Fighting Disinformation Becomes a Threat to Freedom

By Reyhana Masters |

The phrase “misinformation crisis” used to evoke images of shadowy troll farms and bot networks manipulating elections from afar. Today, the crisis is much closer to home – in WhatsApp groups, TikTok reels, and “breaking news” alerts that collapse under scrutiny. The more urgent question is no longer whether Africa faces a polluted information ecosystem but how the continent responds to it.

A February 2026 regional engagement convened by the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) gathered members of the judiciary, data protection authorities, communications regulators, law enforcement officers and National Human Rights Institutions (NHRIs) to examine the scale and impact of digital harms.

CIPESA’s Victor Kapiyo set the tone with a reminder that disinformation is not simply about false content; it is about power, intent, amplification, and impact. Discussions focused on responses that separate genuine harm from protected expression.

Disinformation has become sophisticated and professionalised, often backed by political or commercial interests with the resources to manipulate narratives at scale. It moves across borders, shielded by opaque algorithms and corporate structures that complicate national oversight.

Nigeria’s elections illustrate this phenomenon, with political contestation unfolding not only at rallies and ballot boxes, but across encrypted messaging platforms, influencer networks and algorithm-driven feeds.

Fabricated audio recordings, doctored endorsements, and deepfake videos circulated widely. One false claim suggested that President Donald Trump would intervene in Nigeria’s election – a fabrication designed to exploit geopolitical anxieties as well as domestic political and religious tensions.

What makes the Nigerian case instructive is not only the scale of falsehoods, but the architecture behind them. Influencers are reportedly paid significant sums to seed and normalise partisan narratives. Political actors assemble coordinated digital teams to produce, test and amplify content across multiple platforms simultaneously.

“Elections and armed conflicts are key drivers of disinformation. Governments have used both disinformation and the response to it to entrench themselves in power, shrink civic space, and target opponents and critics.” Source: Disinformation Pathways and Effects: Case Studies from Five African Countries.

Even trained journalists, facing financial strain in struggling media markets, are sometimes recruited into propaganda networks that blur the line between professional reporting and political messaging. Moreover, some foreign state actors invest in narrative campaigns to advance their geopolitical interests, viewing African electoral environments as arenas for strategic influence.

A Wider Continental Pattern

Across Africa, disinformation thrives at the intersection of several reinforcing vulnerabilities: intense political competition, widening economic inequality, weak and underfunded media ecosystems, gaps in platform governance, low levels of media literacy and the growing entanglement of foreign geopolitical interests in domestic affairs.

In many contexts, independent newsrooms struggle financially, leaving audiences vulnerable to cheaper, sensationalist content engineered for virality. Regulatory frameworks are often outdated or overly broad, oscillating between under-enforcement and heavy-handed crackdowns that conflate criticism with criminality.

Meanwhile, global technology platforms operate across borders with inconsistent content moderation standards, creating jurisdictional grey zones that undermine accountability.

Beyond Criminalisation

Experience from across the continent suggests that criminalising individual users for “false information” is a blunt and frequently counter-productive response. Without clear legal definitions, disinformation laws can be weaponised against journalists, opposition figures and ordinary citizens exercising legitimate expression.

Indeed, this has been witnessed in countries such as Kenya and Uganda, where laws on “false news” or “computer misuse” have been invoked to arrest and prosecute individuals over what appears to be protected speech.

Effective responses to disinformation require a more layered approach. Clear and precise legal definitions are essential to distinguish between harmful coordinated manipulation and protected speech. Safeguards must be embedded to prevent abuse of disinformation laws for political ends. Platform accountability mechanisms need strengthening, particularly around transparency in political advertising, algorithmic amplification, and coordinated inauthentic behaviour.

Equally critical is sustained investment in media literacy so that citizens are better equipped to interrogate sources and narratives. Independent journalism must be protected and financially supported as a public good. Oversight of coordinated political digital campaigns – including disclosure of funding sources and sponsorship structures – is necessary to illuminate the financial and logistical structures behind viral content.

Following the Money

Focusing on individual users such as those who forward or share content misses the deeper architecture of harm. Without tracing and addressing the networks that design, fund and amplify these campaigns, regulatory responses risk treating symptoms rather than causes.

Participants were urged to draw careful distinctions between misinformation (false information shared without harmful intent), disinformation (deliberate deception), and malinformation (genuine information used to cause harm). Yet these distinctions are often blurred in law. As Kapiyo explained, “when legislation uses vague terms like ‘false news’, ‘annoying’, or ‘offensive’, it creates a net so wide that legitimate criticism can be trapped within it.”

Across several African countries, disinformation laws have been invoked not to dismantle coordinated fraud networks, but to prosecute critics, journalists and opposition voices. This happens most often when governments’ political legitimacy is threatened, when electoral narratives are challenged, or when protest movements emerge.

However, the same urgency is not always visible when harmful misinformation spreads socially, when children are exposed to abuse content, or when online fraud syndicates operate at scale.

Several participants observed that enforcement patterns often mirror political anxieties rather than objective harm assessments. “We must ask ourselves,” one judicial officer reflected during the discussions, “are we responding to harm, or are we responding to discomfort?”

Another participant from an NHRI cautioned that credibility is eroded when states appear animated only by speech that threatens authority. “If citizens see that the law moves fastest against critics but slowest against fraudsters and child exploitation networks, trust collapses,” she noted. “And once trust collapses, regulation itself becomes suspect.”

Kapiyo urged the room to think beyond reactionary fixes and toward structural reform: “Digital harms are real but so are constitutional protections. The challenge is not choosing one over the other but instead the solution lies in designing responses that respect both.”

This tension between legitimate regulation and opportunistic control formed a key undercurrent throughout the engagement. Participants repeatedly returned to the same conclusion: a polluted ecosystem cannot be cleaned with contaminated tools. If the response lacks proportionality, clarity and fairness, it risks becoming part of the problem it seeks to solve.

Participants agreed that responses must balance addressing harm with protecting constitutional rights. The test of legality, legitimacy and proportionality remains essential: if a restriction fails one, it fails entirely.

From Discussion to Duty

As the engagement drew toward its close, the conversation shifted from diagnosis to responsibility. Who, precisely, must act and how?

For legislators, the recommendation was unequivocal: draft narrowly tailored laws grounded in clear definitions. Avoid vague formulations such as “false news” that collapse complex categories into blunt offences. Embed explicit safeguards against abuse, including independent oversight and sunset clauses that require periodic review.

For the judiciary, the charge was equally clear: rigorously interrogate executive claims of harm. Apply constitutional proportionality tests consistently. Insist on evidence of coordinated manipulation rather than speculative assertions of public disorder. Judicial independence, several participants noted, is the difference between regulation and repression.

Communications regulators and data protection authorities were urged to strengthen transparency requirements for political advertising and algorithmic amplification. “If money is shaping narratives,” one regulator observed, “then disclosure must follow the money.” Cross-border cooperation will be essential, particularly where coordinated campaigns operate across jurisdictions.

Law enforcement agencies were encouraged to prioritise organised fraud networks, child exploitation rings and coordinated digital criminal enterprises – areas where harm is demonstrable and urgent – rather than focusing disproportionate energy on individual expression. Capacity-building in digital forensics and evidence preservation was identified as critical.

And for civil society and media institutions, the focus is on resilience: invest in investigative capacity to expose coordinated campaigns, strengthen fact-checking networks, and expand media literacy initiatives so that citizens can interrogate viral narratives without defaulting to cynicism.

The Forum on Internet Freedom in Africa 2026 (FIFAfrica26) Heads to Mauritius!

By FIFAfrica |

The Collaboration on International ICT Policy for East and Southern Africa (CIPESA) is excited to announce that the 2026 Forum on Internet Freedom in Africa (#FIFAfrica26) will take place in Mauritius from September 28 to October 1, 2026. This marks the 13th edition of FIFAfrica, which serves as Africa’s premier platform for advancing digital rights, inclusion, and governance conversations.

Mauritius is one of Africa’s most stable democracies, with a strong rule of law and robust protection for freedom of expression, both offline and online. The country is also a pioneer in technology-driven governance and digital transformation. Hosting FIFAfrica in Mauritius offers a space for policy-oriented discussions on data governance, regulation of Artificial Intelligence (AI), platform accountability, and digital trade within a context that is progressively navigating these transitions.

Recognised by the Oxford Insights Government AI Readiness Index 2024 as a continental front-runner in AI adoption, Mauritius has established a National AI Unit, is steadily expanding digital public services, and is pursuing a national strategy for the African Continental Free Trade Area (AfCFTA) Digital Trade Protocol. The country is an emerging digital and financial services hub, with notable achievements and ambitions in fintech, cross-border data flows, and digital public services.

It is against this backdrop that over 500 participants will convene at FIFAfrica26 to delve into the evolving digital landscape in Africa and cast a light on the most pressing internet freedom issues today. This builds on a legacy of previous editions hosted in Uganda, South Africa, Ghana, Ethiopia, Zambia, Tanzania, Senegal, and Namibia.

FIFAfrica offers a unique, multi-stakeholder platform where key stakeholders, including policymakers, journalists, global platform operators, telecommunications companies, regulators, human rights defenders, academia, and law enforcement deliberate and craft rights-based responses for a resilient and inclusive digital society for Africans.

Key themes at FIFAfrica26 will include:

  • Digital Democracy and Civic Participation
  • Data Governance and Sovereignty
  • Artificial Intelligence and Emerging Technologies
  • Platform Accountability
  • Digital Inclusion
  • Digital Economy, Trade and Practices
  • Movement Building
  • Digital Security and Safety

The Forum will also serve to gather insights that will shape Africa’s voice in global digital governance processes like WSIS+20. These global processes represent critical opportunities for African voices to influence the emerging digital and AI governance agendas.

Additionally, the 2026 edition of the annual State of Internet Freedom in Africa report will be launched. The report series captures the evolving policy landscape through in-depth research and analysis.

Be part of the #InternetFreedomAfrica movement

Over the years, the Forum has been co-hosted with various government ministries, regional and national partners, and a vibrant network of allies and collaborators. Together, this community has demonstrated a commitment to building an inclusive digital rights and internet freedom ecosystem across the continent.

Partner with us, host a side event, or support the participation of individuals who might otherwise be unable to attend the Forum. Email us at [email protected] or schedule a call with us to discuss how to collaborate with FIFAfrica26.

Next Steps

In the lead-up to FIFAfrica26, please take note of the following important dates.

Important Dates

  • Opening of registration: April 7, 2026
  • Call for proposals and travel support announcement: April 7, 2026
  • Closure of proposal acceptance: June 5, 2026
  • Notification of successful applicants: July 7, 2026
  • Pre-event days: September 28–29, 2026
  • Main event days: September 30 – October 1, 2026

About FIFAfrica

Since its launch in 2014, the Forum on Internet Freedom in Africa (FIFAfrica) has grown into Africa’s premier multi-stakeholder convening on digital rights, digital democracy, and internet governance. The Forum has consistently shaped continental and global conversations on freedom of expression, access to information, privacy, data governance, and has integrated more recent shifts in the digital ecosystem including on topics like cryptocurrency, AI, platform accountability, and digital public infrastructure. Visit the FIFAfrica website for updates: https://internetfreedom.africa/

CIPESA Public Dialogue Series: Interrogating Digital Public Infrastructure in East Africa

By CIPESA Writer |

Digital transformation is reshaping governance, service delivery, and civic life across Africa. At the centre of this transformation is the growing adoption of Digital Public Infrastructure (DPI) — foundational interoperable digital systems such as digital identity programmes, payment systems, and data exchange frameworks that enable governments to deliver services at scale.

Across Eastern Africa, governments are increasingly investing in DPI as a core pillar of their digital transformation strategies. These systems promise to improve administrative efficiency, expand access to services, and support more inclusive digital economies.

However, DPI is not merely a technical infrastructure. It is also an institutional and political infrastructure. The way these systems are designed, governed, and implemented can shape power relations, accountability structures, privacy protections, and citizen participation in the digital state.

Despite the growing importance of DPI, public debate around these systems remains limited. A study by the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) into media coverage of DPI in Eastern Africa shows that reporting is largely government-centric and event-driven, focusing primarily on announcements and service delivery benefits while giving limited attention to governance arrangements, procurement processes, rights protections, and questions of inclusion.

Strengthening informed public discourse around DPI is therefore critical. Greater participation by civil society, journalists, policymakers, technologists, and citizens can help ensure that emerging digital systems are transparent, accountable, inclusive, and aligned with the public interest.

To contribute to this goal, CIPESA is convening a series of public dialogues on DPI in Eastern Africa. Through four in-depth discussions, the CIPESA public dialogue series will explore key dimensions of DPI implementation such as governance and accountability, data protection and trust, inclusion and equity, and cross-regional learning, while bringing together diverse stakeholders to deepen public understanding and encourage more critical engagement with the region’s digital transformation.

The details of the CIPESA Public Dialogue series are listed below. Be sure to mark your calendar for each dialogue!

Follow @cipesaug on social media and join the conversation using #DPIAfrica and #DPIJournalism.

Dialogue 1: Interrogating DPI: Governance, Power, and Accountability

Background: As governments across Eastern Africa accelerate the rollout of Digital Public Infrastructure systems, questions of governance, oversight, and accountability are becoming increasingly important.

While DPI initiatives are often presented as tools for efficiency and innovation, they also shape power relations within the digital state. Decisions about who designs these systems, who controls the data they generate, and how procurement and partnerships are structured can significantly influence how public digital systems operate and whom they ultimately serve.

Yet public scrutiny of these governance questions remains limited. Media coverage frequently focuses on the technical benefits of DPI, such as improved service delivery, while giving less attention to governance arrangements, procurement transparency, and mechanisms for accountability when systems fail.

This dialogue will examine the political economy of DPI, focusing on questions of governance, oversight, transparency, and accountability as the region expands its digital infrastructure.

Date: March 24, 2026 | 15:00 EAT (Nairobi) | Reserve your seat

Dialogue 2: Interrogating DPI: Data, Privacy, and Trust

Background: Digital Public Infrastructure systems depend heavily on the collection, processing, and exchange of large volumes of personal data. While these systems can improve efficiency and coordination across government services, they also raise significant questions about privacy, surveillance, and data protection.

Public discourse around DPI in Eastern Africa has largely focused on service delivery benefits, with relatively limited attention to the risks associated with data governance and citizen trust.

CIPESA’s media analysis similarly shows that journalists tend to under-report issues of data protection, surveillance, and the enforcement of privacy laws, despite growing public concerns about the misuse of personal data and weak institutional safeguards.

This dialogue will examine whether DPI systems in Eastern Africa are being designed and implemented in ways that protect rights and build public trust.

Date: March 31, 2026 | 15:00 EAT (Nairobi) | Reserve your seat

Dialogue 3: Interrogating DPI: Inclusion, Equity, and Gender

Background: Digital Public Infrastructure is often framed as inclusive by design. However, evidence from across Eastern Africa suggests that issues of equity, access, and representation remain underexplored in both policy debates and media coverage.

Media analysis conducted by CIPESA reveals limited reporting on how DPI systems affect citizens differently based on gender, geography, income, and digital access. It also highlights a significant gender imbalance in media sources, with roughly 80 percent of quoted sources being male.

Yet digital systems can inadvertently reinforce existing inequalities if barriers related to connectivity, digital literacy, affordability, identification documents, or social norms are not addressed.

This dialogue will explore whether DPI initiatives are truly delivering on their promise of inclusion, and who may be left behind by digital transformation.

Date: April 7, 2026 | 15:00 EAT (Nairobi) | Reserve your seat

Dialogue 4: Interrogating DPI: Cross-Regional Learning Session

Africa is undergoing a profound digital transformation. The African Union Digital Transformation Strategy (2020–2030) encourages member states to develop Digital Public Infrastructure and Digital Public Goods as foundations for inclusive service delivery, digital trade, and economic growth.

However, public participation in shaping these developments remains limited, partly due to insufficient public discourse and limited specialised reporting on DPI and DPGs.

To address this gap, Co-Develop partnered with regional organisations, including the Media Foundation for West Africa (MFWA) and CIPESA, to establish journalism fellowships focused on DPI reporting in West and Eastern Africa.

MFWA launched the first fellowship in West Africa in 2023, generating valuable lessons and case studies. CIPESA has since adapted the fellowship model for Eastern Africa, creating opportunities for cross-regional learning among journalists and ecosystem actors.

This session will bring fellows and stakeholders from both regions together to share lessons, experiences, and strategies for strengthening public discourse on DPI and DPGs.

Date: April 14, 2026 | 15:00 EAT (Nairobi) | Reserve your seat

Webinar: Advancing Platform Accountability for Women’s Online Safety in Africa

By CIPESA Writer |

The rights to freedom of expression, access to information, and democratic engagement belong equally to all citizens. Yet across Africa, women and girls continue to face significant barriers that prevent them from exercising these foundational rights in digital spaces. This calls for urgent legal and enforcement mechanisms to ensure that women and girls can safely access, contribute to, and participate in information ecosystems. Furthermore, social media and Artificial Intelligence (AI) platforms must address the gendered concerns that impact women’s experiences in online spaces, including disproportionate online harassment, algorithmic discrimination, and digital exclusion. See more here.

In support of this year’s International Women’s Day (IWD) commemoration, the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) is hosting a webinar titled “Advancing Platform Accountability for Women’s Online Safety in Africa” on Thursday, 19 March 2026 (14:00–15:30 EAT). The webinar resonates with this year’s IWD theme: “Rights. Justice. Action. For ALL Women and Girls.”

Register for the webinar here.

The webinar brings together experts to discuss what efforts are being made to enhance the safety of women in online spaces and to hold platforms accountable. It supports the sentiments of United Nations (UN) General Assembly President Annalena Baerbock, who last week, in her opening remarks at the 70th UN Commission on the Status of Women, noted that the gaps in legal rights afforded to women are deliberate. She stated that “these are not oversights but deliberate choices — choices that violate the UN Charter, the Universal Declaration of Human Rights, and 70 years of commitments made in this Commission.”

This webinar serves as a platform for discussion and insight into the various efforts needed to address these gaps. The discussion is open to all and features expert panelists from diverse backgrounds with a vested interest in advancing the safety of women in online spaces.

Meet the Panelists

Barbra Okafor | Founder and Lead Strategist, The Agency Lab

Barbra is the Founder of The Agency Lab, an initiative empowering African creatives and organisations to secure data ownership, protect Intellectual Property (IP), and navigate fair compensation in the AI economy. Drawing on her previous roles as Content Programming Lead at TikTok Sub-Saharan Africa and Senior Producer at BBC Media Action, Barbra translates complex digital transitions into actionable strategies. Her work sits at the critical intersection of the African creator economy, technology, and governance.

Marie-Simone Kadurira | Feminist researcher and communications strategist

Marie-Simone works at the intersection of gender, technology, and social justice. Her work focuses on technology-facilitated gender-based violence (TFGBV), digital rights, and the ways in which online platforms shape access, safety, and agency for women and marginalised communities. She has contributed to research and advocacy efforts examining how online harms disproportionately affect women, particularly in the Global South, and has worked with international organisations to develop policy and communications strategies aimed at strengthening platform accountability and advancing safer digital environments. Her work engages with questions of governance, content moderation, and the structural inequalities embedded within digital ecosystems. She is currently engaged in research on gender-based violence prevention and supports initiatives that centre community-led approaches to justice, safety, and digital inclusion.

Mercy Mutemi | Executive Director, The Oversight Lab Africa

Mercy’s work advocates for fair regulation and deployment of technology across Africa. She focuses on restorative and retributive justice solutions for those who have been harmed by technology, including workers who have been exploited to build and maintain technological systems and those who have been harmed by algorithms. She has worked on several cases and initiatives focused on addressing inequality and the unconsidered consequences of tech algorithms for African communities and societies. She currently represents a cohort of content moderators based throughout Africa in a suit over workplace human rights violations.

Abdul Waiswa | Head of Litigation, Prosecution and Legal Advisory, Uganda Communications Commission

Waiswa works as the Head of Litigation, Prosecution and Legal Advisory at the Uganda Communications Commission (UCC), the statutory regulator of the converged communications sector in Uganda, with a mandate to license, supervise, and facilitate the development of a robust communications sector. He oversees the legal advisory, licensing, and enforcement functions of UCC and supports the implementation of the Government of Uganda’s policies on ICT. Waiswa is a regular participant in national, regional, and international engagements on internet jurisdiction, data, and other ICT-related policy matters.

Lillian Nalwoga | Programme Manager, CIPESA

Lillian has several years of ICT policy research and advocacy experience, having joined CIPESA as a Policy Officer in 2007. She has facilitated and coordinated ICT policy workshops, including the East African Internet Governance Forum. Lillian holds a Bachelor’s degree in Development Studies (Makerere University, Uganda), a Postgraduate Diploma in Project Management, and advanced training in Internet studies, as well as a Master’s in Digital Media and Society from Uppsala University, Sweden. She is also a former President of the Internet Society (ISOC) Uganda.

The webinar forms part of the #BeSafeByDesign campaign, which calls for improved platform accountability in Africa. The campaign is part of a project supported by the Irene M. Staehelin Foundation. Since December 2025, the project has pursued a series of collaborative activities aimed at improving online safety and governance. These included a convening in Nairobi, Kenya, which served as the launch of the #BeSafeByDesign campaign. The convening assembled human rights defenders and activists from eight African countries for upskilling in digital resilience. In February 2026, a meeting held in Port Louis, Mauritius brought together 30 participants — including judges, magistrates, law enforcement officers, communications regulators, data protection authorities, and National Human Rights Institutions (NHRIs). Participants recommended that African governments strengthen their engagement with big tech companies through regional mechanisms, such as the African Union, to present a more coordinated voice on issues of platform accountability.

CIPESA Welcomes the Annulment of Sections of Uganda’s Computer Misuse Act

By Edrine Wanyama |

Uganda’s Constitutional Court has delivered a major ruling striking down several sections of the Computer Misuse Act, Cap 96, and ordering the government and its agencies to halt any further enforcement of the nullified provisions. These stringent provisions had significantly restricted the use of various communication platforms, including social media. The ruling marks an important step towards ending enduring limitations on freedom of expression, access to information and other online freedoms.

The Computer Misuse (Amendment) Act, 2022, which introduced a range of offences including unauthorised access, unauthorised sharing of information about children, hate speech, sharing of unsolicited and malicious information, and misuse of social media, has been outlawed in its entirety. These provisions were overly broad, vaguely worded and carried severe penalties.

In response to a number of petitions filed by individuals and civil society organisations, which were consolidated for determination, the Constitutional Court found that the Computer Misuse (Amendment) Bill, 2022, was passed into law without complying with rule 24(3) of the Rules of Procedure of Parliament, in contravention of articles 88 and 89 of the Constitution. Parliament’s rules of procedure and the Constitution require that quorum be ascertained before laws are passed.

The Collaboration on International ICT Policy for East and Southern Africa (CIPESA), which was a co-petitioner in the case, had in its analysis and comments to the Parliamentary Committee on Information and Communications Technology argued that while addressing cybercrime was necessary, overly broad laws risk shrinking the digital civic space by limiting freedom of expression and access to information.

Moreover, the law was passed long after the Supreme Court ruling in Charles Onyango Obbo and Another v Attorney General, which had outlawed the criminalisation of false news in section 50 of the Penal Code Act. CIPESA had raised concerns about this inconsistency in the law prior to the filing of the petition.

Importantly, the Constitutional Court also struck down sections 162 and 163 of the Penal Code Act, which criminalised defamation. The Court found that these provisions violate article 9 of the African Charter on Human and Peoples’ Rights and are a limitation to the right to freedom of expression, contrary to regional and international human rights standards.

In the lead judgement, delivered by Justice Irene Mulyagonja, the Court found that:

  • “Parliament passed the Computer Misuse (Amendment) Bill, 2022 into an Act of Parliament without complying with the provisions of rule 24(3) of the Rules of Procedure of Parliament made under Article 94 of the Constitution.
  • The enactment of the Computer Misuse (Amendment) Bill into an Act of Parliament without complying with rule 24(3) of the Rules of Procedure of Parliament was inconsistent with Articles 88 and 89 of the Constitution, and as a result, the Computer Misuse (Amendment) Act, 2022, was null and void.
  • The provisions of the Computer Misuse Act (2023 Edition) that were challenged in Constitutional Petitions 34, 37 and 42 of 2022 are therefore all null and void because they were enacted without following the law.
  • Section 162 of the Penal Code Act contravenes Article 9 of the African Charter on Human and Peoples’ Rights; and section 163 that defines the term “defamation” therein does not meet the standard of the law that is required by Article 9(2) of the Charter, and is inconsistent therewith to that extent and therefore null and void.”

Uganda has in recent years experienced significant restrictions on digital civic space. During the general elections in January 2026, the government shut down the internet for five days. In 2024, in the lead-up to elections, individuals were charged under the now-annulled law with sharing malicious information on X and insulting the President and the First Family. These actions are often justified on grounds such as preventing online misinformation and disinformation or safeguarding national security, but their broad application raises serious concerns for digital rights and the right to free expression.

Over the years, several civic actors, including journalists and media professionals, human rights defenders, and political opponents, have faced intimidation, arrest, and prosecution under these contentious provisions of the Computer Misuse Act.

Despite the Constitutional Court’s progressive decision, which is a positive step towards enhancing legislative accountability and reaffirming Uganda’s commitments under regional and international human rights instruments, there is no guarantee that the fundamental freedoms and civic liberties guaranteed by the Constitution will be respected.

It should be noted that the Court’s decision largely focused on procedural issues rather than examining the constitutional guarantees on freedom of expression and access to information. This leaves open the possibility that similar provisions could be reintroduced if proper legislative procedures are followed.

Continuous advocacy for progressive provisions remains necessary.

Given the volatile nature of Uganda’s digital space, there is a need for Parliament to ensure harmonisation of national laws with regional and international standards, conduct wide consultations on proposed laws, and undertake human rights impact assessments.

CIPESA welcomes the judgement as progressive but emphasises the need for decisive implementation of the court’s orders. Without sustained vigilance, restrictive provisions may re-emerge in different forms, whether under the Computer Misuse Act or under existing laws such as the Uganda Communications Act, the Public Order Management Act, the Uganda Peoples’ Defence Forces Act, the Regulation of Interception of Communications Act and the Anti-Terrorism Act.

The protection and promotion of civil liberties in digital spaces must remain a priority.