Kenya’s Digital Crossroads: Surveillance, Activism, and the Urgent Fight for Digital Rights in 2025

Victor Kapiyo |

In East Africa, Kenya has over the years been regarded as a model of excellence in digital rights. More recently, however, the country has been plagued by alarming practices that threaten its standing. These include a heightened crackdown on activism, marked by the abduction and intimidation of activists and journalists; politically motivated internet censorship; rising disinformation; cyber threats and data breaches; and a decline in media freedom. Nonetheless, it has not been all doom and gloom, as there are glimmers of hope aided by increasing internet use and a population that has displayed remarkable resilience and pushback against continued threats to digital rights.

In this brief, the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) explores these trends and presents some recommendations for consideration by stakeholders.

The key trends that the brief highlights include the following:

  1. Increased Internet Usage: Kenya’s internet and social media usage has been on the rise, with 22.7 million internet users, of whom 13 million are active on social media. Likewise, cellular mobile connections stood at 66 million in 2024. However, digital access remains uneven across the country, with urban areas reporting the highest adoption and affordability remaining the main barrier to access.
  2. Growing Crackdowns on Activism: The #OccupyParliament and #RejectFinanceBill2024 protests were met with excessive force, arrests, abductions, and crackdowns on organisers and participants. The government also invoked various laws and deployed sophisticated digital tools to monitor protestors without adequate oversight.
  3. Spiralling Censorship of Online Speech: The government imposed a nationwide internet shutdown during the #RejectFinanceBill2024 protests and later blocked access to Telegram for two weeks to prevent cheating during the national examination period. Further, government officials issued several warnings to the public over the “irresponsible use of social media” and threatened to regulate social media platforms and block websites.
  4. Disinformation Persists: Kenya’s disinformation enterprise remains sophisticated, lucrative and largely funded by political actors that exploit the divisions around ideological, ethnic, economic, and demographic lines while harnessing the power of social media. However, government responses to disinformation through the enforcement of the Computer Misuse and Cybercrimes Act (2018) continue to raise concerns about censorship due to the misapplication of the law to muzzle legitimate speech.
  5. Gaps in Access to Information: Access to information about key government projects has remained inconsistent, with widespread secrecy, delays or outright refusal characterising projects. However, the delivery of government services online through the eCitizen portal has facilitated enhanced access to information and services even as the digital divide expands.
  6. Growing Data Breaches and Cyber Threats: As the population embraces the digital economy, the number of cyber threats recorded increased in 2024, with the majority being system vulnerabilities, malware and brute force attacks. Also, there are concerns over unchecked state surveillance, and the adequacy of safeguards to protect citizens’ data amidst rising data breaches.
  7. Media under Siege: Kenya’s media rankings have declined, with the Media Council of Kenya reporting a total of 74 cases of press freedom violations in 2024. The violations included cases of harassment, intimidation, and arbitrary arrests of journalists, particularly those reporting on politically sensitive topics such as corruption, protests, and human rights abuses.
  8. Change of Guard at the MoICDE: William Kabogo was appointed Cabinet Secretary for the Ministry of Information, Communications, and the Digital Economy (MoICDE), the third person to hold the office in a span of two years. The billionaire politician announced his readiness to regulate social media and shut down the internet if national security is threatened.

In conclusion, the brief calls for continued vigilance and action to stem the downward spiral. 

Summary of Recommendations:

  1. The government should commit to maintaining free, open and secure internet access in line with international human rights standards.
  2. The government should take measures to expand ICT infrastructure in rural and underserved areas.
  3. The Computer Misuse and Cybercrimes Act should be amended to narrow its scope and ensure that response measures comply with the three-part test and the law is not used to censor or suppress digital rights.
  4. The capacity of the Office of the Data Protection Commissioner should be strengthened to ensure the compliance of all government online services and digital initiatives with the Data Protection Act.
  5. Kenya should address its cybersecurity constraints.
  6. Stakeholders should work to strengthen legal protections for journalists and media outlets.
  7. The government, including at the county level, should continue to invest in the digitisation of public records and services to facilitate efficiency, transparency and accountability.

Read the full brief here.

Co-Develop & CIPESA Empowering East Africa’s Journalists to Shape the Future of Digital Public Infrastructure Discourse

Partnership |

What if the future of digital public infrastructure (DPI) was shaped not just by policymakers and technologists, but by the stories that journalists tell? At Co-Develop, we believe that the stories told about DPI can drive awareness, accountability, and action. This is why Co-Develop is pleased to announce a new partnership with the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) to launch a DPI Journalism Fellowship in Eastern Africa. 

Building on the success of our West African program, this initiative will equip journalists across Burundi, the Democratic Republic of the Congo, Ethiopia, Kenya, Rwanda, Somalia, South Sudan, Tanzania, and Uganda with the skills, resources, and networks needed to report on DPI and Digital Public Goods (DPGs) with depth and clarity. By promoting informed public discourse, the program aims to make digital transformation more accessible and transparent to the communities it serves. 

Digital public infrastructure refers to the foundational digital systems, platforms, and services that enable secure, efficient, and inclusive delivery of public and private services. Examples of DPI include digital identity (ID) systems, instant payment platforms, data exchange frameworks, open data platforms, and eGovernment platforms. 

The West African fellowship trained 20 journalists who produced over 115 impactful stories, leading to policy debates and meaningful reforms, such as the digitisation of birth certificates in Nigeria following a corruption exposé. Inspired by this success, we are excited to bring this model to Eastern Africa, where digital transformation is rapidly advancing, yet public engagement remains limited.

Despite significant DPI developments, many citizens remain unaware of their implications. The media plays a critical role in bridging this gap. The new journalism fellowship will empower journalists to report on DPI in ways that highlight its benefits, challenges, and impact on everyday lives.

Through a structured program of training, reporting grants, and collaborations with regional media houses, the fellowship will amplify DPI narratives, encourage investigative journalism, and promote greater transparency and accountability. Drawing from insights gained in West Africa, we aim to build a scalable and sustainable model that strengthens public understanding of DPI, ensuring it is not just a policy conversation but a lived reality for millions. 

“We welcome this timely initiative, which will strengthen the role of journalism in shaping the future of DPI in Eastern Africa and we look forward to seeing the transformative impact of this collaboration”, said Dr. Wairagala Wakabi, the CIPESA Executive Director. “As digital transformation accelerates, journalists can play a vital role in ensuring that the public remains informed about the opportunities and challenges associated with DPI.”

He added that the fellowship will help bridge the gap between policymakers and the public, enhancing transparency and accountability in digital governance.

“This fellowship aligns with CIPESA’s commitment to promoting and protecting digital rights, the digital economy, and inclusive digital societies across Africa,” Dr. Wakabi said. “We expect this program to produce a new cadre of investigative journalists who will drive meaningful public discourse on DPI and DPGs. We hope that this will in turn contribute to policies and practices that ensure digital public infrastructure serves all citizens equitably while remaining rights-respecting.” 

This article was first published on the Co-Develop website.

CIPESA and Partners Advocate for Inclusion of Technology-Facilitated Gender-Based Violence in Uganda’s Sexual Offences Bill

By Ainembabazi Patricia |

On February 18, 2025, the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) alongside Pollicy and the Gender Tech Initiative appeared before Members of Uganda’s Parliament to advocate for the inclusion of Technology-Facilitated Gender-Based Violence (TFGBV) in the Sexual Offences Bill 2024.

The rapid evolution of digital technologies has reshaped societal interactions, leading to increased perpetration of online violence. In Uganda, online users increasingly face digital forms of abuse that often mirror or escalate offline sexual offences, yet efforts to combat gender-based violence are met with both legal and practical challenges.

The Sexual Offences Bill aims to address sexual offences by providing for the effective prevention of sexual violence, enhanced punishment of sexual offenders, the protection of victims during the trial of sexual offences, and the extra-territorial application of the law.

In the presentation to the Committee on Legal and Parliamentary Affairs and the Committee on Gender, Labour, and Social Development, CIPESA and partners emphasised the necessity of closing the policy gap between digital and physical sexual offences in the Bill, to ensure that Uganda’s legal system is responsive to the realities of technological advancement and related violence. We argued that while the Bill is timely and addresses real issues of sexual violence, especially against women, some pertinent aspects have been left out and should be included.

According to the United Nations Population Fund (UNFPA), TFGBV is “an act of violence perpetrated by one or more individuals that is committed, assisted, aggravated, and amplified in part or fully by the use of information and communication technologies or digital media, against a person on the basis of their gender.” It includes cyberstalking, doxing, non-consensual sharing of intimate images, cyberbullying, and other forms of online harassment.

In Uganda, TFGBV is not addressed by existing laws, including the Penal Code Act and the Computer Misuse Act. Adding TFGBV to the Bill will provide an opportunity to bridge this legal gap by explicitly incorporating TFGBV as a prosecutable offence.

CIPESA and partners’ recommendations to the Committees were to:

1. Include and Explicitly Define TFGBV

Under Part I (Preliminary), the Bill provides definitions for various terms related to sexual offences, including references to digital and online platforms. However, it does not explicitly define TFGBV or recognise its various manifestations. This omission limits the Bill’s effectiveness in addressing emerging forms of online sexual offences.

We propose an introduction of a new clause under Part I defining TFGBV, to ensure the Bill adequately addresses offences committed via digital means. The definition should align with international standards, such as the UNFPA’s definition of TFGBV, and should ensure consistency with Uganda’s digital policy frameworks, including the Constitution of the Republic of Uganda 1995, the Data Protection and Privacy Act, 2019, the Computer Misuse (Amendment) Act 2022, Penal Code Act Cap 120, and the Uganda Communications Act 2013.

2. Recognising Various Forms of TFGBV

Clause 7 of the Bill provides for the penalisation of indecent communication or the transmission of sexual content without consent. It criminalises the sharing of unsolicited material of a sexual nature, including the unauthorised distribution of nude images or videos. However, the provision does not explicitly mention cyber harassment, online grooming, sextortion, or non-consensual intimate image sharing (commonly known as “revenge porn”). As such, we recommended the expansion of Clause 7 to explicitly recognise and define offences such as cyber harassment, non-consensual intimate image sharing, online grooming, and sextortion. This addition will clarify legal pathways for victims and broaden the scope of protection against digital sexual exploitation.

3. Replacing “Online Platform” with “Technology-Facilitated Gender-Based Violence”

In Clause 1, the Bill defines “on-line platform” as any computer-based technology that facilitates the sharing of information, ideas, or other forms of expression. This encompasses social media sites, websites, and other digital communication tools. Clause 6 addresses the offence of indecent exposure, criminalising the intentional display of one’s sexual organs in public or through electronic means, including online platforms, while Clause 7 pertains to the non-consensual sharing of intimate content. However, these provisions do not comprehensively categorise TFGBV as a distinct form of sexual offence. Accordingly, “Online Platform” should be replaced with “Technology-Facilitated Gender-Based Violence” to ensure the Bill adequately captures all digital gender-based offences, including deepfake pornography, cyberstalking, and sexual exploitation through content generated by artificial intelligence.

4. Criminalising Voyeurism

The Bill does not explicitly criminalise voyeurism, which refers to the act of secretly observing, recording, or distributing images or videos of individuals in private settings without their consent, often for sexual gratification. There is increasing prevalence of voyeurism through hidden cameras, non-consensual recordings, and live-streamed sexual abuse. Voyeurism should be criminalised, with a clear definition provided under Clause 1 and the scope and penalty defined under Part II of the Bill.

5. Strengthening Accountability for Technology Platforms

The Bill does not impose specific responsibilities on digital platforms and service providers in cases of TFGBV. We argued for the addition of a new clause under Part III (Procedures and Evidential Provisions) mandating digital platforms and service providers to cooperate in investigations related to TFGBV and to provide relevant data and evidence upon request by law enforcement. The provision should also oblige platforms to ensure data protection compliance and to implement proactive measures to detect, remove, and report sexual exploitation content. This will enhance accountability and facilitate the prosecution of perpetrators.

6. Aligning Uganda’s Legislation with Regional and International Frameworks

The Bill does not explicitly state its alignment with regional and international human rights instruments addressing sexual violence and digital rights. We recommend the addition of a new clause under Part I (Preliminary) stating that the Bill shall be interpreted in a manner that aligns with the African Commission on Human and Peoples’ Rights (ACHPR) Resolution 522 (2022) and the Convention on the Elimination of All Forms of Discrimination Against Women (CEDAW). This will reinforce Uganda’s commitment to and application of international best practices in combating sexual offences.

7. Enhancing Legal Remedies for Survivors

Clause 42 (Settlement in Capital Sexual Offences) prohibits compromise settlements in cases of rape, aggravated rape, and defilement, prescribing a 10-year prison sentence for offenders who attempt to settle such cases outside court. However, the Bill does not provide civil remedies for victims of TFGBV-related crimes, nor does it ensure access to psychosocial support. We recommend the expansion of Clause 42 to include civil remedies, including compensation for victims of TFGBV; psychosocial and legal support, ensuring survivors receive necessary rehabilitation; and mandatory reporting obligations for online platforms hosting TFGBV-related content.

The inclusion of TFGBV in the Sexual Offences Bill 2024 will not only strengthen the fight against gender-based violence but also ensure that survivors can access justice. The proposed legislative changes will reinforce Uganda’s commitment to upholding digital rights and gender equality in the digital age. The country will also join the ranks of pioneers such as South Africa, which has taken legislative steps to criminalise online gender-based violence.

By incorporating the proposed provisions and amendments, the Sexual Offences Bill, 2024 will clearly define online sexual offences, bring perpetrators of online violence to book, and provide protection for survivors of digital sexual offences. It will also contribute to building and strengthening accountability for technology platforms. Once enacted, the law will go a long way towards ensuring that Uganda’s legal framework aligns with regional and international human rights standards on the protection of survivors, while guaranteeing the effective prosecution of perpetrators of technology-facilitated sexual offences.

Download the full report here.

Policy Alternatives for an Artificial Intelligence Ecosystem in Uganda

CIPESA |

Economic projections show that by 2030, artificial intelligence (AI) will add USD 15.7 trillion to the global economy. Of this, USD 1.2 trillion will be generated in Africa and could boost the continent’s Gross Domestic Product by 5.6%. Despite AI’s transformative potential, there are concerns about the risks it poses to individuals’ rights and freedoms. There is therefore a need to foster a trusted and ethical AI ecosystem that elicits people’s confidence while guaranteeing an enabling atmosphere for innovation, to best harness AI for the greater public good.

The discussion on AI in Uganda is still in its early stages. Nonetheless, the country needs to develop a comprehensive, AI-specific legal and institutional governance framework to provide regulatory oversight over AI and the diverse actors in the AI ecosystem. Currently, various pieces of legislation, which mainly focus on general-purpose technologies, constitute the legal framework relevant to AI. However, these laws do not provide sufficient regulatory cover for AI, its associated benefits, or the mitigation of risks to human security, rights and freedoms.

In a new policy brief, the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) reviews the AI policymaking journeys of various countries, such as Kenya, South Africa, Singapore, Luxembourg, France and Germany, and proposes 11 actions Uganda could take to fulfil its aspiration to effectively regulate and harness AI.

The existing key policy frameworks include the Uganda Vision 2040, which emphasises the importance of Science, Technology, Engineering and Innovation (STEI) as critical drivers of economic growth and social transformation; and the National Fourth Industrial Revolution (4IR) Strategy, which aims to accelerate Uganda’s development into an innovative, productive and competitive society using 4IR technologies, with emphasis on using AI in the public sector to improve financial management and tax revenue collection. Meanwhile, the third National Development Plan (NDP III) identifies the promotion of digital transformation and the adoption of 4IR technologies, including AI, as critical components for achieving Uganda’s vision of becoming a middle-income country.

The legal frameworks that impact AI-related oversight include the Constitution, which lays out crucial benchmarks for the regulation of AI. It provides for the role of the state in stimulating agricultural, industrial, technological and scientific development by adopting appropriate policies and enacting enabling legislation. The Constitution also provides for the right to privacy, freedom from discrimination, and the right to equality.

Other key laws include the Data Protection and Privacy Act of 2019 which, even though it was not drafted with AI in mind, is directly relevant to the regulation of AI technologies through the lens of data protection. The Computer Misuse Act of 2011 provides a framework that addresses the unlawful use of computers and electronic systems. Relevant to the governance of AI is section 12, which criminalises unauthorised access to a computer or electronic system.

The National Information Technology Authority, Uganda (NITA-U) Act established NITA-U, the body responsible for regulating, coordinating, and promoting information technology in the country, and offers a foundation for improving infrastructure to support AI regulation efforts.

Overall, the current policy and legal framework, however fragmented, provides a starting point for enacting comprehensive, AI-specific legislation.

The growing adoption of AI brings a host of opportunities that positively impact society, including improved productivity and efficiency for individuals, the health sector, civil society organisations, the media, financial institutions, manufacturing industries, supply chains, agriculture, climate and weather research, and academia. AI is also being used by public agencies such as the Uganda Revenue Authority to support more effective revenue collection. Uganda’s telecommunications operators are also utilising AI, for example to send targeted messages that encourage users to subscribe to loan offers such as Airtel Wewole and MTN MoKash.

Prospects for AI Regulation in Uganda

As Uganda’s journey of AI adoption and usage gains traction, the following guiding actions that underlie progressive AI frameworks across various countries could help quicken and offer direction to Uganda’s AI aspirations.

  1. Establishment of an AI governance institutional framework to guide the national adoption and usage of AI.
  2. Development and implementation of a “living” framework of best practices on AI that operates across the diverse sectors affected by AI. Singapore provides a best practice in this regard where, as a national agenda, best practices are consistently codified to inform the safe evolution of AI in the different spheres. Such a best practices framework complements the regulatory framework. By adopting it, Uganda would keep pace with the evolution of AI without necessarily undertaking statutory amendments, which is especially valuable given the rapid changes in AI and technology.
  3. Implementation of checks and balances through the creation of specific policies, regulations, guidelines, and laws to manage AI effectively and address the existing significant gaps in its regulation and oversight. To address this, key stakeholders – including the Ministry of ICT and National Guidance, the Uganda Communications Commission, NITA-U, and the Personal Data Protection Office – must collaborate to develop comprehensive and tailored regulations. This effort should focus on understanding AI’s specific dynamics, impacts, and challenges within the Ugandan context, rather than wholesale adoption or replication of legislation from other jurisdictions, given the divergences in context at continental, regional and national levels.
  4. Tap into the African AI Frameworks for Inspiration. Drawing on regional and international frameworks, such as the African Union’s AI Policy and the European Union’s AI Act, will offer key strategic guidelines and intervention measures to shape a robust and effective AI legislation in Uganda. 
  5. Establish a National Research and Innovation Fund on AI to effectively tap into and harvest the dividends that come with AI. This kind of funding, which requires direct government intervention, is warranted by the high levels of uncertainty of outcomes in technology innovation.
  6. Develop and implement a National Strategy for AI to enhance policy coordination and coherence and offer direction and guidance. This would encompass the national vision for AI in Uganda’s social and economic development, and guide all other initiatives on progressive AI regulation.
  7. Develop and implement a National Citizenry Awareness and Public Education Programme on AI to better prepare citizens to engage with AI responsibly, ensure inclusion and advocate for ethical practices.
  8. Apply a human rights-protective approach to AI, ensuring that AI systems are designed with fairness, transparency, and accountability, and employ diverse and representative datasets to mitigate biases related to ethnicity, gender, and socioeconomic status.
  9. Establish a mechanism that can enforce ethical use of AI by the various stakeholders, including through emphasising transparency and accountability in AI deployment.
  10. Establish cybersecurity protocols to counter the inherent vulnerability to cyber-attacks and other attendant digital security risks that come with AI.
  11. Create a conducive atmosphere for citizen platforms for AI engagement. These platforms can be conduits for sharing best practices, the latest research, and other emerging issues on AI that could benefit the country. An AI ecosystem should thus favour and strategically support such inter-agency, inter-sector and public-private collaboration and formal linkages, to facilitate AI technology transfer from exploration, studies and innovation to actual application.

Read the full brief here.

Social Media’s Role in Hate Speech: A Double-Edged Sword for South Sudan

By Ochaya Jackson & George Gumikiriza |

The lead-up to and aftermath of the now-stalled December 2024 elections in South Sudan has highlighted the role of social media as a powerful tool for communication, civic engagement, and information sharing. Platforms such as Facebook, WhatsApp, X (formerly Twitter), and TikTok have connected people across the world’s youngest nation, enabling dialogue, amplifying marginalised voices, and spreading crucial information. However, alongside these benefits, social media has also become a breeding ground for hate speech, misinformation, and incitement to violence.

The Rise of Hate Speech on Social Media

From June to November 2024, DefyHateNow (recently renamed Digital Rights Frontlines – DRF) monitored incidents of hate speech in South Sudan. The monitoring focused on content created and shared via social media platforms. Of the 255 incidents recorded, Facebook accounted for 89.4%, with WhatsApp, X, and TikTok accounting for the remainder. The monitoring findings further indicate that 50.5% of online content contained misinformation or disinformation, while 39.9% was classified as hate speech.

Facebook is the most widely used social media platform in South Sudan, which explains why it hosts most of the illegal and harmful content. The platform’s popularity partly arises from its “free mode” feature, which allows MTN mobile subscribers in South Sudan to access Facebook and to create and share content even when they do not have an internet data bundle; only viewing or uploading photos and videos requires users to have data.

Social media’s accessibility and rapid reach make it easy for harmful content to spread, fueling ethnic and political tensions. Given South Sudan’s history of conflict, inflammatory online rhetoric can have real-world consequences, inciting violence, deepening divisions, and undermining peacebuilding efforts.

Why Does Hate Speech Spread So Easily?

As part of the project, DefyHateNow convened the country’s first Symposium in commemoration of the International Day for Countering Hate Speech as a platform for collective action to combat hate speech. The engagements identified several factors that contribute to the proliferation of hate speech and disinformation in South Sudan:

Ethnic and Political Divisions – Long-standing ethnic rivalries and political conflicts provide fertile ground for harmful narratives that further divide communities.

Lack of Digital Literacy – Many social media users lack the skills to critically assess the credibility of online content, making them more susceptible to misinformation.

Anonymity and Lack of Accountability – Many harmful posts are made under fake names or anonymous accounts, reducing the fear of repercussions.

Weak Regulatory Frameworks – South Sudan lacks robust policies to hold social media platforms accountable for harmful content.

Algorithmic Amplification – Social media algorithms prioritise engagement, often promoting divisive and inflammatory content because it generates more reactions and shares.

The Positive Side: Social Media for Good

Despite these challenges, social media remains a vital tool for positive change. Platforms have been used for:

Peacebuilding and Dialogue – Initiatives like #defyhatenow and DRF’s online campaigns promote counter-speech and encourage respectful conversations.

Fact-checking and Misinformation Prevention – Programmes like 211Check work to verify online information and educate communities about identifying false narratives.

Civic Engagement – Social media allows citizens to engage with governance, report human rights abuses, and access critical updates on national issues.

Curiosity – Disinformation awareness campaigns raise the level of literacy and critical thinking among online audiences, enabling them to detect and counter disinformation.

To maximise the benefits of social media, DefyHateNow also conducted awareness campaigns through the publication of animations in print media, radio talk shows, and the dissemination of posters across South Sudan’s capital, Juba. The campaign messages reinforced the call for action against hate speech, misinformation and disinformation, and raised awareness about their dangers and how to identify them.

Ahead of the rescheduled elections slated for December 2026, collective effort from tech companies, policymakers, civil society, the media and individual users is required to address the challenges of hate speech and disinformation. By promoting digital literacy, implementing stronger regulations, and encouraging responsible social media use, South Sudan can harness the power of social media platforms for peace and progress.

DefyHateNow’s work was supported by the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) in the context of the Africa Digital Rights Fund (ADRF).

Do you want to be part of the solution? Join Digital Rights Frontlines (DRF) in advocating for safer digital spaces. Stay informed, report harmful content, and contribute to a more inclusive and responsible online community.

For more information, visit www.digitalrights.ngo or contact us at [email protected]