CIPESA Welcomes Namibia Ministry of ICT and the Namibia IGF as Co-Hosts of FIFAfrica25

By FIFAfrica

The Collaboration on International ICT Policy for East and Southern Africa (CIPESA) is pleased to announce that the 2025 edition of the Forum on Internet Freedom in Africa (FIFAfrica25) will be co-hosted in partnership with the Namibian Ministry of Information and Communication Technology (MICT) and the Namibia Internet Governance Forum (NamIGF).

Set to take place in Windhoek, Namibia, from September 25–27, 2025, this year’s Forum will build on FIFAfrica’s 12-year history of assembling digital rights defenders, policymakers, technologists, academics, regulators, journalists, and the donor community, all of whom share the vision of advancing internet freedom in Africa.

With its strong commitments to democratic governance, press freedom, and inclusive digital development, Namibia offers fertile ground for rich dialogue on the future of internet freedom in Africa. The country holds a powerful legacy in the global media and information landscape as the birthplace of the 1991 Windhoek Declaration on promoting independent and pluralistic media. In a digital age where new challenges are emerging – from information integrity and Artificial Intelligence (AI) governance to connectivity gaps and platform accountability – hosting FIFAfrica in Namibia marks a key moment for the movement toward trusted information as a public good.

“Through the Ministry of Information and Communication Technology, Namibia is proud to co-host FIFAfrica25 as a demonstration of our commitment to advancing technology for inclusive social and economic development. This Forum comes at a critical moment for Africa’s digital future, and we welcome the opportunity to engage with diverse voices from across the continent and beyond in shaping a rights-respecting, secure, and innovative digital landscape,” said the Minister of Information and Communication Technology (ICT), Emma Inamutila Theofelus.

This sentiment is shared by the NamIGF Chairperson, Albertine Shipena. “We are honoured to co-host the FIFAfrica25 here in Namibia. This partnership with MICT and CIPESA marks a significant step in advancing digital rights, open governance, and meaningful multistakeholder engagement across the continent. As the NamIGF, we are proud to contribute to shaping a more inclusive and secure internet ecosystem, while spotlighting Namibia’s growing role in regional and global digital conversations.”

The NamIGF was established in September 2017, through a Cabinet decision, as a multistakeholder platform that facilitates public policy discussion on issues pertaining to the internet in Namibia.

Dr. Wairagala Wakabi, the CIPESA Executive Director, noted that FIFAfrica25 presents a timely opportunity to advance progressive digital policy agendas that uphold fundamental rights and promote digital democracy in Africa. “As global debates on internet governance, data sovereignty, and platform accountability intensify, it is essential that Africans inform and shape the frameworks that govern our digital spaces. We are honoured to partner with the Namibian government and NamIGF to convene this critical conversation on the continent,” he said.

Since its inception in 2014, FIFAfrica has grown to become the continent’s leading assembly of actors instrumental in shaping conversations and actions at the intersection of technology with democracy, society and the economy. It has become the stage for concerted efforts to advance digital rights and digital inclusion. These issues, and new emerging themes such as mental health, climate and the environment, and the content economy, will take centre stage at FIFAfrica25, which will feature a mix of plenaries, workshops, exhibitions, and a series of pre-events.

Meanwhile, FIFAfrica will also recognise the International Day for Universal Access to Information (IDUAI), celebrated annually on September 28. The commemoration serves to underscore the fundamental role of access to information in empowering individuals, supporting informed decision-making, fostering innovation, and advancing inclusive and sustainable development – tenets which resonate with the Forum. This year’s celebration is themed, “Ensuring Access to Environmental Information in the Digital Age”.

At the heart of the Forum is a Community of Allies that have, over the years, stood alongside CIPESA in its pursuit of effective and inclusive digital governance in Africa.

Feedback on Session Proposals and Travel Support Applications

All successful session proposals and travel support applicants have been contacted directly. See the list of successful sessions here. Thank you for your patience and for contributing to what promises to be an exciting FIFAfrica25.  

Prepare for FIFAfrica25: Travel and Logistics

Everything you need to plan your attendance at the Forum can be found here – visit this page for key logistical details and tips to help you make the most of your experience!

Ugandan Regulator Finds Google in Breach of Country’s Data Protection Law, Orders Local Registration

By Edrine Wanyama

In a July 18, 2025 decision, Uganda’s Personal Data Protection Office (PDPO) found Google LLC in breach of the country’s data protection law and ordered the global tech giant to register with the local data protection office within 30 days.

The decision would place the popular search engine under the ambit of Uganda’s Data Protection and Privacy Act, whose provisions it would have to comply with. In particular, the PDPO has ordered Google to provide – within 30 days – documentary evidence of how it is complying with requirements for transferring the personal data of Ugandan citizens outside of the country’s borders. Google also has to explain the legal basis for making those cross-border data transfers and the accountability measures in place to ensure that such transfers respect Uganda’s laws.

The orders followed a November 2024 complaint by four Ugandans, who argued that as a data collector, controller, and processor, Google had failed to register with the PDPO as required by local laws. They also contended that Google unlawfully transferred their personal data outside Uganda without meeting the legal conditions enshrined in the law, and claimed these actions infringed their data protection and privacy rights and caused them distress.

The PDPO ruled that Google was indeed collecting and processing personal data of the complainants without being registered with the local data regulator, which contravened section 29 of the Data Protection and Privacy Act. Google was also found liable for transferring the complainants’ data across Uganda’s borders without taking the necessary safeguards, in breach of section 19 of the Act.

This section provides that, where a data processor or data controller based in Uganda processes or stores personal data outside Uganda, they must ensure that the country in which the data is processed or stored has adequate measures for protecting the data. Those measures should be at least equivalent to the protection provided for under Ugandan law. The consent of the data subject should also be obtained for their data to be stored outside Uganda.

In its defence, Google argued that since it was not based in Uganda and had no physical presence in the country, it was not obliged to register with the PDPO, and the rules on cross-border transfers of personal data did not apply to it. However, the regulator rejected this argument, determining that Google is a local data controller since it collects data from users in Uganda and decides how that data is processed.

The regulator further determined that the local data protection law has extra-territorial application, as it states in section 1 that it applies to a person, institution or public body outside Uganda who collects, processes, holds or uses personal data relating to Ugandan citizens. Accordingly, the regulator stated, the law places obligations “not only to entities physically present in Uganda but to any entity handling personal data of Ugandan citizens, including those established abroad, provided they collect or process such data.”

The implication of this decision is that all entities that collect Ugandans’ data, including tech giants such as Meta, TikTok, and X, must register with the Ugandan data regulator. The decision echoes global calls to hold Big Tech more accountable, and for African countries to have strong laws in line with the African Union (AU) Convention on Cyber Security and Personal Data Protection (Malabo Convention) and the AU Data Policy Framework.

However, enforcement of these orders remains a challenge. For instance, Uganda’s PDPO does not issue binding decisions and only makes declaratory orders. Additionally, the regulator does not have powers to award compensation to aggrieved parties, and indeed did not do so in the current decision. It can only recommend that the complainants approach a court of competent jurisdiction, in accordance with section 33(1) of the Act.

Conversely, the Office of the Data Protection Commissioner of Kenya, established by section 5 of the Data Protection Act, 2019, and the Personal Data Protection Commission of Tanzania, established by section 6 of the Personal Data Protection Act, 2022, have powers to issue administrative fines under section 9(1)(f) and section 47 of the respective laws.

The dilemma surrounding Uganda’s PDPO raises major concerns about its capacity to remedy wrongs committed by global data collectors, controllers and processors. Among its declarations in the July 2025 decision was that it would not issue an order for data localisation “at this stage”, but that “Google LLC is reminded that all cross-border transfers of personal data must comply fully with Ugandan law”. This leaves unanswered questions over data sovereignty and respect for individuals’ data rights, given the handicaps faced by data regulators in countries such as Uganda and the practicalities of the global digital economy.

In these circumstances, Uganda’s Data Protection and Privacy Act should be amended to expand the powers of the PDPO to impose administrative fines so as to add weight and enforceability to its decisions.

Elevating Children’s Voices and Rights in AI Design and Online Spaces in Africa

By Patricia Ainembabazi

As Artificial Intelligence (AI) reshapes digital ecosystems across the globe, one group remains consistently overlooked in discussions around AI design and governance: children. This gap was keenly highlighted at the Internet Governance Forum (IGF) held in June 2025 in Oslo, Norway, where experts, policymakers, and child-focused organisations called for more inclusive AI systems that protect and empower young users.

Children today are not just passive users of digital technologies; they are among the most active and most vulnerable user groups. In Africa, internet use among young people aged 15 to 24 grew partly as a result of the Covid-19 pandemic, deepening their reliance on digital platforms for learning, play, and social interaction. New research by the Digital Rights Alliance Africa (DRAA), a consortium hosted by the Collaboration on International ICT Policy for East and Southern Africa (CIPESA), shows that this rapid connectivity has amplified exposure to risks such as harmful content, data misuse, and algorithmic manipulation – risks that are especially pronounced for children.

The research notes that AI systems have become deeply embedded in the platforms that children engage with daily, including educational software, entertainment platforms, health tools, and social media. Nonetheless, Africa’s emerging AI strategies remain overwhelmingly adult-centric, often ignoring the distinct risks these technologies pose to minors. At the 2025 IGF, the urgency of integrating children’s voices into AI policy frameworks was made clear through a session supported by the LEGO Group, the Walt Disney Company, the Alan Turing Institute, and the Family Online Safety Institute. Their message was simple but powerful: “If AI is to support children’s creativity, learning, and safety, then children must be included in the conversation from the very beginning”.

The forum drew insights from recent global engagements such as the Children’s AI Summit of February 2025 held in the UK and the Paris AI Action Summit 2025. These events demonstrated that while children are excited about AI’s potential to enhance learning and play, they are equally concerned about losing creative autonomy, being manipulated online, and having their privacy compromised. A key outcome of these discussions was the need to develop AI systems that children can trust: systems that are safe by design, transparent, and governed with accountability.

This global momentum offers important lessons for Africa as countries across the continent begin to draft national AI strategies. While many such strategies aim to spur innovation and digital transformation, they often lack specific protections for children. According to DRAA’s 2025 study on child privacy in online spaces, only a handful of African countries have enacted child-specific privacy laws in the digital realm. Although instruments like the African Charter on the Rights and Welfare of the Child recognise the right to privacy, regional frameworks such as the Malabo Convention, and even national data protection laws, rarely offer enforceable safeguards against AI systems that profile or influence children.

Failure to address these gaps will leave African children vulnerable to a host of AI-driven harms ranging from exploitative data collection and algorithmic profiling to exposure to biased or inappropriate content. These harms can deprive children of autonomy and increase their risk of online abuse, particularly when AI-powered systems are deployed in schools, healthcare, or entertainment without adequate oversight.

To counter these risks and ensure AI becomes a tool of empowerment rather than exploitation, African governments, policymakers, and developers must adopt child-centric approaches to AI governance. This could start with mainstreaming children’s rights – such as privacy, protection, education, and participation – into AI policies. International instruments like the UN Convention on the Rights of the Child and General Comment No. 25 provide a solid foundation upon which African governments can build desirable policies.

Furthermore, African countries should draw inspiration from emerging practices such as the “Age-Appropriate AI” frameworks discussed at IGF 2025. These practices propose clear standards for limiting AI profiling, nudging, and data collection among minors. Given that only 36 out of 55 African countries currently have data protection laws, few of which contain child-specific provisions, policymakers must work to strengthen these frameworks. Such reforms should require AI tools targeting children to adhere to strict data minimisation, transparency, and parental consent requirements.

Importantly, digital literacy initiatives must evolve beyond basic internet safety to include AI awareness. Equipping children and caregivers with the knowledge to critically engage with AI systems will help them navigate and question the technology they encounter. At the same time, platforms similar to the Children’s AI Summit 2025 should be replicated at national and regional levels to ensure that African children’s lived experiences, hopes, and concerns shape the design and deployment of AI technologies.

Transparency and accountability must remain central to this vision. AI tools that affect children, whether through recommendation systems, automated decision-making, or learning algorithms, should be independently audited and publicly scrutinised. Upholding the values of openness, fairness, and inclusivity within AI systems is essential not only for protecting children’s rights but for cultivating a healthy, rights-respecting digital environment.

As the African continent’s digital infrastructure expands and AI becomes more pervasive, the choices made today will define the digital futures of generations to come. The IGF 2025 stressed that children must be central to these choices, not as an afterthought, but as active contributors to a safer and more equitable AI ecosystem. By elevating children’s voices in AI design and governance, African countries can lay the groundwork for an inclusive digital future that truly serves the best interests of all.

Human Rights in the Digital Context in Rwanda

Universal Periodic Review

The joint stakeholder report by the Association for Progressive Communications (APC) and the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) focuses on key issues relating to human rights in the digital context in Rwanda, including digital connectivity and inclusion, freedom of expression online, surveillance and technology-facilitated gender-based violence, particularly its impact on human and women’s rights defenders.

Context of the human rights situation online in Rwanda

Since its Universal Periodic Review during the third cycle, Rwanda has made progress towards implementing some of the recommendations received. The digital infrastructure in Rwanda has expanded and access to information and communication technologies (ICTs) has improved significantly. However, there was heightened authoritarianism and censorship of online criticism in the lead-up to the 2024 general elections. Violations of user rights, strict censorship, increased surveillance and infrastructure limitations led Freedom House to downgrade Rwanda’s Freedom on the Net rating in 2024 to “Not Free”, with a score of 36/100.

Digital connectivity and inclusion

Rwanda’s internet penetration rate was estimated to be 34.2% at the start of 2025. There is gender imbalance in internet access and use, with about 38.2% of active social media user identities estimated to be female in January 2025. Price and lack of appropriate devices, the main reasons for not accessing ICTs, affect women more due to fundamental gender disparities, particularly in education and income. There is a need for an effective framework to regulate Rwanda’s national ID with a unique identifier number, which is becoming popular as a key to access services and effect transactions electronically.

Freedom of speech and expression online

Rwandan laws still contain provisions which unduly restrict free speech and expression online, including provisions of the 2018 Penal Code relating to dissemination of edited words or images, spreading false information, causing hostile international opinion, humiliating national authorities, and refusing to answer questions by intelligence officers. Meanwhile, troubling incidents of arrests, intimidation, abduction and killings of journalists and human rights defenders for exercising their rights to free speech continue to be reported. Self-censorship is widespread due to social pressure to support the government and fear of reprisals for criticising authorities. Restrictions on disinformation in the 2018 Cybercrime law are characterised by vague definitions and prescribe long periods of imprisonment. In June 2025, Rwanda’s Supreme Court rejected a challenge which argued that the 2018 Cybercrime law violates the constitutional guarantees of free speech.

Online surveillance, transnational repression and right to privacy

The law on Protection of Personal Data enacted by the Rwandan Government in October 2021 lacks a public-interest exception for digital and traditional media outlets, and has very strict data localisation requirements. Mass surveillance is institutionalised within Rwanda, with Law 60/2013 requiring service providers to ensure that their systems are technically capable of supporting interceptions at all times. Credible reports indicate the government has acquired and deployed Pegasus (powerful spyware) against political opponents and human rights defenders, including members of the diaspora.

Technology-facilitated gender-based violence (TFGBV) against women human rights defenders

Rwanda’s 2018 Cybercrime Law aims to address some forms of TFGBV, but falls short in its implementation. There are concerns about the law being misused, for instance, to criminalise TFGBV survivors in cases where content is created and shared without consent. Gendered disinformation in Rwanda has been used to target women politicians and human rights defenders with image-based disinformation, sexualise them and create false narratives, shifting public focus from their main political discourse. Online and offline attacks based on gender identity and sexual orientation are prevalent in Rwanda, including online harassment, government surveillance and posting of images without consent.

Key recommendations to the government of Rwanda

  • Ensure that digital access is inclusive and equitable for all by removing access barriers for marginalised communities, including rural communities, women and persons with disabilities.
  • Repeal provisions which unduly criminalise free speech including articles 157, 164, 194, 233 and 253 of the 2018 Penal Code; amend the 2018 Cybercrime law to ensure that all provisions comply with international human rights standards relating to free speech and expression.
  • Withdraw all cases against individuals facing harassment, intimidation and prosecution from state authorities for legitimate expression of dissent against the government.
  • Refrain from or cease the use of artificial intelligence applications and spyware where they cannot be operated in compliance with international human rights law or where they pose undue risks to the enjoyment of human rights, unless and until adequate safeguards to protect human rights and fundamental freedoms are in place.
  • Guarantee adequate independent oversight mechanisms to ensure state surveillance practices are limited and proportional in accordance with international human rights standards.
  • Enhance measures and policies to prohibit, investigate and prosecute TFGBV in line with international human rights standards.
  • Amend the 2018 Cybercrime law to ensure that restrictions to freedom of expression as a response to TFGBV are necessary and proportionate, not overly broad or vague in terms of what speech is restricted, and do not over-penalise.
  • Provide redress and reparation as an effective, efficient and meaningful way of aiding victims of TFGBV and ensuring that justice is achieved.
  • Develop appropriate and effective accountability mechanisms for social media platforms and other technology companies, focused on ensuring company transparency and remediation, to ensure that hate speech and TFGBV are appropriately addressed on their platforms.

Read the full report here.

Advancing Respect for Human Rights by Businesses in Uganda

CIPESA

In partnership with Enabel, the European Union, and the Uganda Ministry of Gender, Labour, and Social Development, the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) is implementing the Advancing Respect for Human Rights by Businesses in Uganda (ARBHR) project. Launched in November 2024, the project seeks, among other objectives, to reduce human rights abuses connected to business activities in Uganda, particularly those impacting women and children.

The project is being implemented in the Ugandan regions of Busoga (Iganga, Mayuge, Bugiri, and Bugweri), Albertine (Hoima, Kikuube, Masindi, Buliisa, and Kiryandongo) and Kampala Metropolitan (Kampala, Mukono, and Wakiso). In these regions, CIPESA is enhancing awareness of business and human rights concerns through evidence-based advocacy, sensitisation campaigns, reporting and redress mechanisms, as well as public and private sector policy dialogues.

More details about the project can be found here.

Connecting Business to Digital Rights

Many Ugandan businesses, particularly small and medium enterprises (SMEs), lack a comprehensive understanding of digital rights principles and their obligations in upholding them. A significant portion of Uganda’s population lacks access to the internet and modern digital technologies, limiting the reach and impact of digital rights initiatives. 

According to the telecommunications regulator, as of June 2023, Uganda had a total of 34.9 million telephone subscriptions, which translates to a 77% penetration rate. At 27.7 million internet subscriptions, internet penetration stood at 61%. According to a 2018 nation-wide survey by the National Information Technology Authority of Uganda (NITA-U), 76.6% of respondents named high cost as the main limitation to their use of the internet. The same reason was reported in the 2022 survey, which also cited a rural-urban divide (84.9% vs 92.1%) and a gender gap (84.6% female and 89.6% male) in mobile phone ownership.

Businesses often prioritise short-term economic gains over long-term investments in responsible digital practices such as data privacy and user security. The existence of insufficient digital infrastructure, especially in rural areas, hampers the effective implementation and enforcement of digital rights protections. Businesses face increasing cybersecurity threats that compromise data privacy and other digital rights, necessitating robust security measures.

Related reading: See this commentary on the Future of Work in Uganda: Challenges and Prospects in the Context of the Digital Economy.

#BeeraSharp Campaign

The #BeeraSharp (“be smart” in Luganda) campaign is our response to the gaps that Ugandan businesses face when navigating digital rights, online spaces, and digital data. It aims to fill key knowledge gaps in the understanding of business legal obligations and to promote secure and ethical digital practices that build a smarter, safer, and more resilient business ecosystem in Uganda.