The Collaboration on International ICT Policy for East and Southern Africa (CIPESA), in partnership with Co-Develop, is pleased to announce the inaugural cohort of Fellows selected for the Digital Public Infrastructure (DPI) Journalism Fellowship for Eastern Africa following an open call in April 2025.
This six-month regional fellowship aims to cultivate a new generation of journalists with the knowledge and skills to investigate and report on DPI and Digital Public Goods (DPGs). Fellows will participate in specialised training sessions, benefit from mentorship, and receive financial support to develop and produce impactful stories in diverse formats and languages. The stories will interrogate the development and deployment of DPI and DPGs with a focus on their implications for governance, inclusion, equity, and citizens’ everyday lives.
The fellowship brings together 20 journalists from nine countries (Burundi, DR Congo, Ethiopia, Kenya, Rwanda, Somalia, South Sudan, Tanzania, and Uganda) who work across online, broadcast, and print platforms.
The call in April attracted 214 applications, which were assessed through a rigorous selection process to identify fellows who demonstrated a strong interest and capacity to report on emerging digital public infrastructure issues with clarity, depth, and integrity.
“This fellowship is about more than capacity building. It is about empowering African journalists to shape the public narrative around digital transformation in ways that reflect citizens’ rights, challenges, and aspirations,” said Dr. Wairagala Wakabi, CIPESA’s Executive Director. “We are thrilled to support these pioneering journalists as they lead the charge in demystifying digital infrastructure and holding power to account.”
The launch of this fellowship is timely: the digital transformation agenda of many African countries is evolving, yet media coverage of DPI and DPGs remains limited. The fellowship aims to close that gap by building the capacity of the media to cover DPI and DPGs in ways that create awareness and foster informed public discourse on digital governance.
The fellowship is inspired by a similar Co-Develop-funded initiative implemented by the Media Foundation for West Africa (MFWA), which supported fellows to produce over 100 impactful stories that spurred public debate and influenced policy.
At Co-Develop, we believe that sustainable digital public infrastructure requires more than innovation and technology; it demands informed ecosystems. By supporting journalists across nine East African countries, this fellowship helps create a critical layer of engagement and accountability around Digital Public Infrastructure. We’re proud to invest in a future where DPI is not only built, but also deeply understood, safeguarded, and shaped by those it serves. – Desire Kachenje, Senior Investment Principal, Co-Develop
Follow the Fellows’ Stories
Stay engaged with the work of the DPI Journalism Fellows throughout 2025 using the hashtags #DPIJournalism and #DPIFellows2025. Follow their stories and insights via CIPESA and Co-Develop’s online platforms, and join the conversation on how digital public infrastructure is shaping the future of governance and inclusion in Africa.
Read more about the DPI Journalism Fellows 2025 here.
This EPS saga highlighted implementation gaps and illuminated a systemic failure to promote equitable access and public accountability, and to safeguard fundamental rights, in the rollout of DPI.
When the State Forgets the People
The Uganda EPS, established under section 166 of the Traffic and Road Safety Act, Cap 347, serves as a tech-driven improvement to road safety. Its goal is to reduce road accidents and fatalities by encouraging better driver behaviour and compliance with traffic laws. By allowing offenders to pay fines directly without prosecution, the system aims to resolve minor offences quickly and to ease the burden on the judicial system. The move to an automated system was also intended to eliminate challenges that plagued the manual EPS, notably corruption in the form of deleted fines, selective enforcement, and theft of collected penalties.
At the heart of the EPS was an automated surveillance and enforcement system, which used Closed Circuit Television (CCTV) cameras and license plate recognition to issue real-time traffic fines. This system operated with almost complete opacity. A Russian company, Joint Stock Company Global Security, was reportedly entitled to 80% of fine revenues despite making minimal investment, amid other significant legal and procurement irregularities. There was a notable absence of clear contracts, publicly accessible oversight mechanisms, or effective avenues for appeal. Equally concerning, the collection and storage of extensive amounts of sensitive data lacked transparency regarding who had access to it.
Such an arrangement represented a profound breach of public trust and an infringement upon digital rights, including data privacy and access to information. It illustrated the minimal accountability under which foreign-controlled infrastructure can operate within a nation. This was a data-driven governance mechanism that lacked the corresponding data rights safeguards, subjecting Ugandans to a system they could neither comprehend nor contest.
This is Not an Isolated Incident
The situation in Uganda reflects a widespread trend across the continent. In Kenya, the 2024 Microsoft–G42 data centre agreement – announced as a partnership with the government to build a state-of-the-art green facility aimed at advancing infrastructure, research and development, innovation, and skilling in Artificial Intelligence (AI) – has raised serious concerns about data sovereignty and long-term control over critical digital infrastructure.
In Uganda, the National Digital ID system (Ndaga Muntu) became a case study in how poorly-governed DPI deepens structural exclusion and undermines equitable access to public services. A 2021 report by the Centre for Human Rights and Global Justice found that rigid registration requirements, technical failures, and a lack of recourse mechanisms denied millions of citizens access to healthcare, education, and social protection. Those most affected were the elderly, women, and rural communities. However, a 2025 High Court ruling disregarded evidence and expert opinions about the ID system’s exclusionary effects and human rights implications.
A clear pattern is emerging across the continent: countries are integrating complex, often foreign-managed or poorly localised digital systems into public governance without establishing strong, rights-respecting frameworks for transparency, accountability, and oversight. Instead of empowering citizens, this version of digital transformation risks deepening inequality, centralising control, and undermining public trust in government digital systems.
The State is Struggling to Keep Up
National Action Plans (NAPs) on Business and Human Rights, intended to guide ethical public–private collaboration, have failed to address the unique challenges posed by DPI. Uganda’s NAP barely touches on data governance, algorithmic harms, or surveillance technologies. While Kenya’s NAP mentions the digital economy, it lacks enforceable guardrails for foreign firms managing critical infrastructure. In their current form, these frameworks are insufficiently equipped to respond to the complexity and ethical risks embedded in modern DPI deployments.
Had the Ugandan EPS system been subject to stronger scrutiny under a digitally upgraded NAP, key questions would likely have been raised before implementation:
What redress exists for erroneous or abusive fines?
Who owns the data and where is it stored?
Are the financial terms fair, equitable, and sovereign?
But these questions came too late.
What these failures point to is not just a lack of policy, but a lack of operational mechanisms to design, test, and interrogate DPI before rollout. What is needed is a practical bridge that responds to public needs and enforces human rights standards.
Regulatory Sandboxes: A Proactive Approach to DPI
DPI systems such as Uganda’s EPS should undergo rigorous testing before full-scale deployment. This is the purpose of regulatory sandboxes – platforms that offer a structured, participatory, and transparent testbed for innovations. In such a space, a system’s logic, data flows, human rights implications, and resilience under stress are collectively scrutinised before any harm occurs.
Thus, a regulatory sandbox could have revealed and resolved core failures of Uganda’s EPS before rollout, including the controversial revenue-sharing arrangement with a foreign contractor.
How Regulatory Sandboxes Work
Regulatory sandboxes are useful for testing DPI systems and governance frameworks, such as revenue models, in a transparent manner, enabling stakeholders to examine a model’s fairness and legality. This entails publicly disclosing financial terms to regulators, civil society, and the general public. Before implementation, simulated impact analyses can also highlight possible public backlash or a decline in trust. Sandboxes can further facilitate pre-implementation audits, make vendor selection and contract terms publicly available, and support mock procurements to detect errors. By defining data ownership and access guidelines, creating redress channels for data abuse, and supporting inclusive policy reviews with civil society, regulatory sandboxes make data governance and accountability clearer.
This shift from reactive damage control to proactive governance is what regulatory sandboxes offer. If Uganda had employed a sandbox approach, the EPS system might have served as a model for ethical innovation rather than a cautionary tale of rushed deployment, weak oversight, and lost public trust.
Beyond specific systems like EPS or digital ID, the future of Africa’s digital transformation hinges on how digital public infrastructure is conceived, implemented, and governed. Foundational services, such as digital identity, health information platforms, financial services, surveillance mechanisms, and mobility solutions, are increasingly reliant on data and algorithmic decision-making. However, if these systems are designed and deployed without sufficient citizen participation, independent oversight, legal safeguards, and alignment with the public interest, they risk becoming tools of exclusion, exploitation, and foreign dependency.
Realising the full potential of DPIs as a tool for inclusion, digital sovereignty, and rights-based development demands urgent and deliberate efforts to embed accountability, transparency, and digital rights at every stage of their lifecycle.
The Collaboration on International ICT Policy for East and Southern Africa (CIPESA) is pleased to announce that the 2025 edition of the Forum on Internet Freedom in Africa (FIFAfrica25) will be co-hosted in partnership with the Namibian Ministry of Information and Communication Technology (MICT) and the Namibia Internet Governance Forum (NamIGF).
Set to take place in Windhoek, Namibia, from September 25–27, 2025, this year’s Forum will add another milestone to FIFAfrica’s 12-year history of assembling digital rights defenders, policymakers, technologists, academics, regulators, journalists, and the donor community, all of whom share the vision of advancing internet freedom in Africa.
With its strong commitments to democratic governance, press freedom, and inclusive digital development, Namibia offers fertile ground for rich dialogues on the future of internet freedom in Africa. The country holds a powerful legacy in the global media and information landscape, being the birthplace of the 1991 Windhoek Declaration on promoting independent and pluralistic media. In a digital age where new challenges are emerging – from information integrity and Artificial Intelligence (AI) governance to connectivity gaps and platform accountability – hosting FIFAfrica in Namibia marks a key moment for the movement toward trusted information as a public good, including in the digital age.
“Through the Ministry of Information and Communication Technology, Namibia is proud to co-host FIFAfrica25 as a demonstration of our commitment to advancing technology for inclusive social and economic development. This Forum comes at a critical moment for Africa’s digital future, and we welcome the opportunity to engage with diverse voices from across the continent and beyond in shaping a rights-respecting, secure, and innovative digital landscape,” said the Minister of Information and Communication Technology, Emma Inamutila Theofelus.
This sentiment is shared by the NamIGF Chairperson, Albertine Shipena. “We are honoured to co-host the FIFAfrica25 here in Namibia. This partnership with MICT and CIPESA marks a significant step in advancing digital rights, open governance, and meaningful multistakeholder engagement across the continent. As the NamIGF, we are proud to contribute to shaping a more inclusive and secure internet ecosystem, while spotlighting Namibia’s growing role in regional and global digital conversations.”
The NamIGF was established in September 2017, through a Cabinet decision, as a multistakeholder platform that facilitates public policy discussion on issues pertaining to the internet in Namibia.
Dr. Wairagala Wakabi, the CIPESA Executive Director, noted that FIFAfrica25 presents a timely opportunity to advance progressive digital policy agendas that uphold fundamental rights and promote digital democracy in Africa. “As global debates on internet governance, data sovereignty, and platform accountability intensify, it is essential that Africans inform and shape the frameworks that govern our digital spaces. We are honoured to partner with the Namibian government and NamIGF to convene this critical conversation on the continent,” he said.
Since its inception in 2014, FIFAfrica has grown to become the continent’s leading assembly of actors instrumental in shaping conversations and actions at the intersection of technology with democracy, society and the economy. It has become the stage for concerted efforts to advance digital rights and digital inclusion. These issues, and new emerging themes such as mental health, climate and the environment, and the content economy, will take centre stage at FIFAfrica25, which will feature a mix of plenaries, workshops, exhibitions, and a series of pre-events.
Meanwhile, FIFAfrica will also recognise the International Day for Universal Access to Information (IDUAI), celebrated annually on September 28. The commemoration serves to underscore the fundamental role of access to information in empowering individuals, supporting informed decision-making, fostering innovation, and advancing inclusive and sustainable development – tenets which resonate with the Forum. This year’s celebration is themed, “Ensuring Access to Environmental Information in the Digital Age”.
At the heart of the Forum is a Community of Allies that have, over the years, stood alongside CIPESA in its pursuit of effective and inclusive digital governance in Africa.
Feedback on Session Proposals and Travel Support Applications
All successful session proposals and travel support applicants have been contacted directly. See the list of successful sessions here. Thank you for your patience and for contributing to what promises to be an exciting FIFAfrica25.
Prepare for FIFAfrica25: Travel and Logistics
Everything you need to plan your attendance at the Forum can be found here – visit this page for key logistical details and tips to help you make the most of your experience!
In a July 18, 2025 decision, Uganda’s Personal Data Protection Office (PDPO) found Google LLC in breach of the country’s data protection law and ordered the global tech giant to register with the local data protection office within 30 days.
The decision would place the popular search engine under the ambit of Uganda’s Data Protection and Privacy Act, whose provisions it would have to comply with. In particular, the PDPO has ordered Google to provide – within 30 days – documentary evidence of how it is complying with requirements for transferring the personal data of Ugandan citizens outside of the country’s borders. Google also has to explain the legal basis for making those cross-border data transfers and the accountability measures in place to ensure that such transfers respect Uganda’s laws.
The orders followed a November 2024 complaint by four Ugandans, who argued that as a data collector, controller, and processor, Google had failed to register with the PDPO as required by local laws. They also contended that Google unlawfully transferred their personal data outside Uganda without meeting the legal conditions enshrined in the law, and claimed these actions infringed their data protection and privacy rights and caused them distress.
The PDPO ruled that Google was indeed collecting and processing personal data of the complainants without being registered with the local data regulator, which contravened section 29 of the Data Protection and Privacy Act. Google was also found liable for transferring the complainants’ data across Uganda’s borders without taking the necessary safeguards, in breach of section 19 of the Act.
This section provides that, where a data processor or data controller based in Uganda processes or stores personal data outside Uganda, they must ensure that the country in which the data is processed or stored has adequate measures for protecting the data. Those measures should at least be equivalent to the protection provided for under the Ugandan law. The consent of the data subject should also be obtained for their data to be stored outside Uganda.
In its defence, Google argued that since it was not based in Uganda and had no physical presence in the country, it was not obliged to register with the PDPO, and the rules on cross-border transfers of personal data did not apply to it. However, the regulator rejected this argument, determining that Google is a local data controller since it collects data from users in Uganda and decides how that data is processed.
The regulator further determined that the local data protection law has extra-territorial application, as it states in section 1 that it applies to a person, institution or public body outside Uganda who collects, processes, holds or uses personal data relating to Ugandan citizens. Accordingly, the regulator stated, the law places obligations “not only to entities physically present in Uganda but to any entity handling personal data of Ugandan citizens, including those established abroad, provided they collect or process such data.”
The implication of this decision is that all entities that collect Ugandans’ data, including tech giants such as Meta, TikTok, and X, must register with the Ugandan data regulator. This decision echoes global calls to hold Big Tech more accountable and for African countries to adopt strong laws in line with the African Union (AU) Convention on Cyber Security and Personal Data Protection (the Malabo Convention) and the AU Data Policy Framework.
However, enforcement of these orders remains a challenge. Uganda’s PDPO does not make binding decisions; it issues only declaratory orders. Additionally, the regulator has no power to order compensation for aggrieved parties, and indeed did not do so in the current decision. It can only recommend that the complainants approach a court of competent jurisdiction, in accordance with section 33(1) of the Act.
Conversely, the Office of the Data Protection Commissioner of Kenya, established by section 5 of the Data Protection Act, 2019, and the Personal Data Protection Commission of Tanzania, established by section 6 of the Protection of Personal Information Act, 2022, have powers to issue administrative fines under section 9(1)(f) and section 47 of those laws, respectively.
The dilemma surrounding the Uganda PDPO raises major concerns about its capacity to remedy wrongs by global data collectors, controllers, and processors. Among its declarations in the July 2025 decision was that it would not issue an order for data localisation “at this stage”, but that “Google LLC is reminded that all cross-border transfers of personal data must comply fully with Ugandan law”. This leaves unanswered questions over data sovereignty and respect for individuals’ data rights, given the handicaps faced by data regulators in countries such as Uganda and the practical realities of the global digital economy.
In these circumstances, Uganda’s Data Protection and Privacy Act should be amended to expand the powers of PDPO to impose administrative fines so as to add weight and enforceability to its decisions.
As Artificial Intelligence (AI) reshapes digital ecosystems across the globe, one group remains consistently overlooked in discussions around AI design and governance: children. This gap was keenly highlighted at the Internet Governance Forum (IGF) held in June 2025 in Oslo, Norway, where experts, policymakers, and child-focused organisations called for more inclusive AI systems that protect and empower young users.
Children today are not just passive users of digital technologies; they are among the most active and most vulnerable user groups. In Africa, internet use among youth aged 15 to 24 surged, partly fuelled by the Covid-19 pandemic, deepening their reliance on digital platforms for learning, play, and social interaction. New research by the Digital Rights Alliance Africa (DRAA), a consortium hosted by the Collaboration on International ICT Policy for East and Southern Africa (CIPESA), shows that this rapid connectivity has amplified exposure to risks such as harmful content, data misuse, and algorithmic manipulation that are especially pronounced for children.
The research notes that AI systems have become deeply embedded in the platforms that children engage with daily, including educational software, entertainment platforms, health tools, and social media. Nonetheless, Africa’s emerging AI strategies remain overwhelmingly adult-centric, often ignoring the distinct risks these technologies pose to minors. At the 2025 IGF, the urgency of integrating children’s voices into AI policy frameworks was made clear through a session supported by the LEGO Group, the Walt Disney Company, the Alan Turing Institute, and the Family Online Safety Institute. Their message was simple but powerful: “If AI is to support children’s creativity, learning, and safety, then children must be included in the conversation from the very beginning”.
The forum drew insights from recent global engagements such as the Children’s AI Summit of February 2025 held in the UK and the Paris AI Action Summit 2025. These events demonstrated that while children are excited about AI’s potential to enhance learning and play, they are equally concerned about losing creative autonomy, being manipulated online, and having their privacy compromised. A key outcome of these discussions was the need to develop AI systems that children can trust; systems that are safe by design, transparent, and governed with accountability.
This global momentum offers important lessons for Africa as countries across the continent begin to draft national AI strategies. While many such strategies aim to spur innovation and digital transformation, they often lack specific protections for children. According to DRAA’s 2025 study on child privacy in online spaces, only a handful of African countries have enacted child-specific privacy laws in the digital realm. Although instruments like the African Charter on the Rights and Welfare of the Child recognise the right to privacy, regional frameworks such as the Malabo Convention, and even national data protection laws, rarely offer enforceable safeguards against AI systems that profile or influence children.
Failure to address these gaps will leave African children vulnerable to a host of AI-driven harms ranging from exploitative data collection and algorithmic profiling to exposure to biased or inappropriate content. These harms can deprive children of autonomy and increase their risk of online abuse, particularly when AI-powered systems are deployed in schools, healthcare, or entertainment without adequate oversight.
To counter these risks and ensure AI becomes a tool of empowerment rather than exploitation, African governments, policymakers, and developers must adopt child-centric approaches to AI governance. This could start with mainstreaming children’s rights such as privacy, protection, education, and participation, into AI policies. International instruments like the UN Convention on the Rights of the Child and General Comment No. 25 provide a solid foundation upon which African governments can build desirable policies.
Furthermore, African countries should draw inspiration from emerging practices such as the “Age-Appropriate AI” frameworks discussed at IGF 2025. These practices propose clear standards for limiting AI profiling, nudging, and data collection among minors. Given that only 36 out of 55 African countries currently have data protection laws, few of which contain child-specific provisions, policymakers must work to strengthen these frameworks. Such reforms should require AI tools targeting children to adhere to strict data minimisation, transparency, and parental consent requirements.
Importantly, digital literacy initiatives must evolve beyond basic internet safety to include AI awareness. Equipping children and caregivers with the knowledge to critically engage with AI systems will help them navigate and question the technology they encounter. At the same time, platforms similar to the Children’s AI Summit 2025 should be replicated at national and regional levels to ensure that African children’s lived experiences, hopes, and concerns shape the design and deployment of AI technologies.
Transparency and accountability must remain central to this vision. AI tools that affect children, whether through recommendation systems, automated decision-making, or learning algorithms, should be independently audited and publicly scrutinised. Upholding the values of openness, fairness, and inclusivity within AI systems is essential not only for protecting children’s rights but for cultivating a healthy, rights-respecting digital environment.
As the African continent’s digital infrastructure expands and AI becomes more pervasive, the choices made today will define the digital futures of generations to come. The IGF 2025 stressed that children must be central to these choices, not as an afterthought, but as active contributors to a safer and more equitable AI ecosystem. By elevating children’s voices in AI design and governance, African countries can lay the groundwork for an inclusive digital future that truly serves the best interests of all.