Elevating Children’s Voices and Rights in AI Design and Online Spaces in Africa

By Patricia Ainembabazi

As Artificial Intelligence (AI) reshapes digital ecosystems across the globe, one group remains consistently overlooked in discussions around AI design and governance: children. This gap was brought into sharp focus at the Internet Governance Forum (IGF) held in June 2025 in Oslo, Norway, where experts, policymakers, and child-focused organisations called for more inclusive AI systems that protect and empower young users.

Children today are not just passive users of digital technologies; they are among the most active and most vulnerable user groups. In Africa, internet use among youth aged 15 to 24 grew rapidly, partly fuelled by the Covid-19 pandemic, deepening their reliance on digital platforms for learning, play, and social interaction. New research by the Digital Rights Alliance Africa (DRAA), a consortium hosted by the Collaboration on International ICT Policy for East and Southern Africa (CIPESA), shows that this rapid connectivity has amplified exposure to risks such as harmful content, data misuse, and algorithmic manipulation, all of which are especially pronounced for children.

The research notes that AI systems have become deeply embedded in the platforms that children engage with daily, including educational software, entertainment platforms, health tools, and social media. Yet Africa’s emerging AI strategies remain overwhelmingly adult-centric, often ignoring the distinct risks these technologies pose to minors. At the 2025 IGF, the urgency of integrating children’s voices into AI policy frameworks was made clear through a session supported by the LEGO Group, the Walt Disney Company, the Alan Turing Institute, and the Family Online Safety Institute. Their message was simple but powerful: “If AI is to support children’s creativity, learning, and safety, then children must be included in the conversation from the very beginning”.

The forum drew insights from recent global engagements such as the Children’s AI Summit held in the United Kingdom in February 2025 and the 2025 Paris AI Action Summit. These events demonstrated that while children are excited about AI’s potential to enhance learning and play, they are equally concerned about losing creative autonomy, being manipulated online, and having their privacy compromised. A key outcome of these discussions was the need to develop AI systems that children can trust: systems that are safe by design, transparent, and governed with accountability.

This global momentum offers important lessons for Africa as countries across the continent begin to draft national AI strategies. While many such strategies aim to spur innovation and digital transformation, they often lack specific protections for children. According to DRAA’s 2025 study on child privacy in online spaces, only a handful of African countries have enacted child-specific privacy laws in the digital realm. And although instruments like the African Charter on the Rights and Welfare of the Child recognise the right to privacy, regional frameworks such as the Malabo Convention and even national data protection laws rarely offer enforceable safeguards against AI systems that profile or influence children.

Failure to address these gaps will leave African children vulnerable to a host of AI-driven harms ranging from exploitative data collection and algorithmic profiling to exposure to biased or inappropriate content. These harms can deprive children of autonomy and increase their risk of online abuse, particularly when AI-powered systems are deployed in schools, healthcare, or entertainment without adequate oversight.

To counter these risks and ensure AI becomes a tool of empowerment rather than exploitation, African governments, policymakers, and developers must adopt child-centric approaches to AI governance. This could start with mainstreaming children’s rights, such as privacy, protection, education, and participation, into AI policies. International instruments like the UN Convention on the Rights of the Child and General Comment No. 25 provide a solid foundation upon which African governments can build such policies.

Furthermore, African countries should draw inspiration from emerging practices such as the “Age-Appropriate AI” frameworks discussed at IGF 2025, which propose clear standards for limiting AI profiling, nudging, and data collection among minors. Given that only 36 out of 55 African countries currently have data protection laws, few of which contain child-specific provisions, policymakers must step up efforts to strengthen these frameworks. Such reforms should require AI tools targeting children to adhere to strict data minimisation, transparency, and parental consent requirements.

Importantly, digital literacy initiatives must evolve beyond basic internet safety to include AI awareness. Equipping children and caregivers with the knowledge to critically engage with AI systems will help them navigate and question the technology they encounter. At the same time, platforms similar to the Children’s AI Summit 2025 should be replicated at national and regional levels to ensure that African children’s lived experiences, hopes, and concerns shape the design and deployment of AI technologies.

Transparency and accountability must remain central to this vision. AI tools that affect children, whether through recommendation systems, automated decision-making, or learning algorithms, should be independently audited and publicly scrutinised. Upholding the values of openness, fairness, and inclusivity within AI systems is essential not only for protecting children’s rights but for cultivating a healthy, rights-respecting digital environment.

As the African continent’s digital infrastructure expands and AI becomes more pervasive, the choices made today will define the digital futures of generations to come. The IGF 2025 stressed that children must be central to these choices, not as an afterthought, but as active contributors to a safer and more equitable AI ecosystem. By elevating children’s voices in AI design and governance, African countries can lay the groundwork for an inclusive digital future that truly serves the best interests of all.

Advancing Respect for Human Rights by Businesses in Uganda

By CIPESA |

In partnership with Enabel, the European Union, and the Uganda Ministry of Gender, Labour, and Social Development, the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) is implementing the “Advancing Respect for Human Rights by Businesses in Uganda” (ARBHR) project. Launched in November 2024, the project seeks, among other objectives, to reduce human rights abuses connected to business activities in Uganda, particularly those impacting women and children.

With a focus on Uganda, the project is being implemented in the regions of Busoga (Iganga, Mayuge, Bugiri, and Bugweri), Albertine (Hoima, Kikuube, Masindi, Buliisa, and Kiryandongo), and Kampala Metropolitan (Kampala, Mukono, and Wakiso). In these regions, CIPESA is raising awareness of business and human rights concerns through evidence-based advocacy, sensitisation campaigns, reporting and redress mechanisms, as well as public and private sector policy dialogues.

More details about the project can be found here.

Connecting Business to Digital Rights

Many Ugandan businesses, particularly small and medium enterprises (SMEs), lack a comprehensive understanding of digital rights principles and their obligations in upholding them. A significant portion of Uganda’s population lacks access to the internet and modern digital technologies, limiting the reach and impact of digital rights initiatives. 

According to the telecommunications regulator, as of June 2023 Uganda had a total of 34.9 million telephone subscriptions, which translates to a 77% penetration rate, while its 27.7 million internet subscriptions put internet penetration at 61%. In a 2018 nationwide survey by the National Information Technology Authority of Uganda (NITA-U), 76.6% of respondents named high cost as the main limitation to their use of the internet. The same reason was reported in the 2022 survey, which also noted a rural-urban divide (84.9% vs 92.1%) and a gender gap (84.6% female vs 89.6% male) in mobile phone ownership.

Businesses often prioritise short-term economic gains over long-term investments in responsible digital practices such as data privacy and user security. Insufficient digital infrastructure, especially in rural areas, hampers the effective implementation and enforcement of digital rights protections. At the same time, businesses face increasing cybersecurity threats that compromise data privacy and other digital rights, necessitating robust security measures.

Related reading: See this commentary on the Future of Work in Uganda: Challenges and Prospects in the Context of the Digital Economy

#BeeraSharp Campaign

The #BeeraSharp (“be smart” in Luganda) campaign is our response to the gaps Ugandan businesses face when navigating digital rights, online spaces, and digital data. It aims to fill key knowledge gaps on businesses’ legal obligations and to promote secure and ethical digital practices that build a smarter, safer, and more resilient business ecosystem in Uganda.

Reflections From the WSIS+20 Africa Regional Stakeholder Workshop

By Lillian Nalwoga and Patricia Ainembabazi

As the twenty-year review of the World Summit on the Information Society (WSIS+20) approaches, the need for inclusive, well-coordinated, and well-informed African participation has become more urgent than ever. In response, the Collaboration on International ICT Policy for East and Southern Africa (CIPESA), with support from the Civil Society Alliances for Digital Empowerment (CADE) project, the Global Network Initiative (GNI), and Global Partners Digital (GPD), convened a regional stakeholder workshop on May 28, 2025, in Dar es Salaam, Tanzania. Held as a pre-event to the 2025 Africa Internet Governance Forum (AfIGF), the workshop gathered 37 participants for a multi-stakeholder dialogue on WSIS progress, the future of the Internet Governance Forum (IGF), and funding equity in global internet governance processes.

Revisiting WSIS and Its Evolving Landscape

The meeting commenced with an overview of WSIS’s significance in shaping global internet policy since its inception in 2003, highlighting the journey from the Geneva and Tunis phases to the current +20 review (WSIS+20). Participants underscored how WSIS frameworks continue to underpin digital policy efforts, especially in developing regions.

Special attention was given to how the WSIS+20 review intersects with emerging frameworks such as the Global Digital Compact (GDC). Discussions emphasized the importance of strengthening Africa’s position in both processes and cautioned against duplicative or conflicting multilateral efforts. Participants called for a harmonised approach that prioritizes human rights and inclusive development.

Regional Dynamics and Country-Level Perspectives

A key component of the meeting focused on lessons learned from country and regional engagement with WSIS+20. Common challenges identified included low public awareness of the process, inadequate coordination mechanisms at the national level, and limited participation in global negotiations. Participants stressed the need to designate clear national focal points and to disseminate accessible information on WSIS milestones and upcoming consultations. They also urged the African Union Commission (AUC) and regional bodies like the United Nations Economic Commission for Africa (UNECA) to consolidate African positions, reflecting shared concerns around digital access, online rights, and capacity-building for local actors.

Funding Inequities and Digital Diplomacy Imbalances

A prominent theme was the uneven distribution of financial and institutional support across regions and thematic areas. Delegates observed that limited participation in global forums such as the IGF, WSIS, and the GDC often mirrors existing geopolitical and economic disparities, resulting in underrepresentation of African stakeholders due to limited travel support, language barriers, and technical capacity gaps. Participants underscored the urgent need for donor and partner commitments to develop equitable funding models that prioritise grassroots organisations, youth-led initiatives, and actors outside urban areas. They further called for empowering African UN missions in Geneva, New York, and other capitals with the expertise needed to influence global digital policymaking.

Civil Society’s Role in Shaping Future Digital Governance

Participants also recognised the role of civil society organisations (CSOs) as a key stakeholder group in advancing digital governance and policy debates, especially on issues such as digital rights, access, disinformation, cybersecurity, and feminist internet principles. They called for the WSIS+20 review to produce tangible commitments to safeguard online freedoms, protect civic space, and enhance stakeholder inclusiveness.

Participants reiterated the importance of maintaining the IGF’s relevance as a multistakeholder platform and urged for sustainable financing, improved intersessional activities, and stronger linkages to policy outcomes to address the fragmentation increasingly seen in digital governance.

Voices from Parliament and the Legal Sector

Lawmakers and legal experts provided insights into domestic legislative processes and how international norms can be integrated into enforceable national frameworks. Discussions centered on data protection legislation, content regulation, and digital inclusion policies, emphasizing the need for increased legislative scrutiny and cross-border cooperation to foster policy coherence across Africa.

Media and Fact-Checking in the Digital Age

Journalists and fact-checkers reflected on the growing threats to information integrity in digital spaces. They emphasized the vital roles of press freedom, online safety, and accountability, highlighting the importance of partnerships between media outlets and civil society to counter disinformation, especially during elections and crises.

Next Steps and Recommendations

To make WSIS+20 and the GDC processes more inclusive and sustainable, participants proposed several key actions:

  • Strengthen national coordination structures for WSIS+20 and GDC engagement 
  • Develop regional position papers ahead of upcoming UN sessions such as the UN Commission on Science and Technology for Development (CSTD) and the UN General Assembly (UNGA) 
  • Leverage the 2025 AfIGF as a platform for broader African input into WSIS+20 
  • Establish a knowledge-sharing platform for African stakeholders to exchange resources, experiences, and policy insights

Conclusion

The WSIS+20 regional stakeholder workshop underscored the critical need for Africa to take a more assertive role in global digital governance. Amid rising geopolitical tensions, rapid technological change, and the increasing importance of digital tools in daily life, the moment presents both a challenge and an opportunity for Africa to assert its digital future on its own terms.

Registration For FIFAfrica25 Now Open!

By FIFAfrica |

We are excited to announce that registration for the 2025 Forum on Internet Freedom in Africa (FIFAfrica25) is officially OPEN!

Taking place in Windhoek, Namibia, FIFAfrica25 comes at a pivotal time for Africa’s digital future. As governments, civil society, technologists, and the broader digital ecosystem grapple with the evolving dynamics of Artificial Intelligence, platform regulation, surveillance, and internet shutdowns, as well as funding for digital rights and governance efforts, this year’s Forum offers a much-needed space for bold conversations, collaborative thinking, and collective action.

Building on the momentum from CIPESA’s and partners’ recent engagements at the regional and global Internet Governance Forums (IGF), contributions to the World Summit on the Information Society (WSIS)+20 review, and preparations for the upcoming G20 Summit, the Forum will serve as a key bridge between global digital policy conversations and the lived realities, governance priorities, and contexts of the African continent. As digital technologies shape Africa’s political, economic, and social landscape, safeguarding digital rights is essential to building inclusive, participatory, and democratic societies.

Key themes at FIFAfrica25 will include:

  • AI, Digital Governance, and Human Rights
  • Disinformation and Platform Accountability
  • Internet Shutdowns
  • Digital Inclusion
  • Digital Trade in Africa
  • Digital Public Infrastructure (DPI)
  • Digital Safety and Resilience

Since 2014, FIFAfrica has grown into a leading pan-African space for shaping conversations on digital rights, inclusion, and governance. Whether you’re a returning member of the FIFAfrica family or joining us for the first time, we invite you to register now and be part of shaping the digital rights agenda on the continent.

Feedback on Session Proposals and Travel Support Applications

We received an incredible response to the call for session proposals and travel support. While we had anticipated providing feedback on July 4, 2025, we will now do so by July 14, 2025. Thank you for your patience and for contributing to what promises to be an exciting FIFAfrica25.

Prepare for FIFAfrica25: Travel and Logistics

Everything you need to plan your attendance at the Forum is right here – visit this page for key logistical details and tips to help you make the most of your experience!

Democratising Big Tech: Lessons from South Africa’s 2024 Election

By Jean-Andre Deenik | ADRF

South Africa’s seventh democratic elections in May 2024 marked a critical turning point — not just in the political sphere, but in the digital one too. For the first time in our democracy’s history, the information space surrounding an election was shaped more by algorithms, platforms, and private tech corporations than by public broadcasters or community mobilisation.

We have entered an era where the ballot box is not the only battleground for democracy. The online world — fast-moving, largely unregulated, and increasingly dominated by profit-driven platforms — has become central to how citizens access information, express themselves, and participate politically.

At the Legal Resources Centre (LRC), we knew we could not stand by as these forces influenced the lives, choices, and rights of South Africans — particularly those already navigating inequality and exclusion. Between May 2024 and April 2025, with support from the African Digital Rights Fund (ADRF), we implemented the Democratising Big Tech project: an ambitious effort to expose the harms of unregulated digital platforms during elections and advocate for transparency, accountability, and justice in the digital age.

Why This Work Mattered

The stakes were high. In the run-up to the elections, political content flooded platforms like Facebook, YouTube, TikTok, and X (formerly Twitter). Some of it was civic-minded and constructive — but much of it was misleading, inflammatory, and harmful.

Our concern wasn’t theoretical. We had already seen how digital platforms contributed to offline violence during the July 2021 unrest, and how coordinated disinformation campaigns were used to sow fear and confusion. Communities already marginalised — migrants, sexual minorities, women — bore the brunt of online abuse and harassment.

South Africa’s Constitution guarantees freedom of expression, dignity, and access to information. Yet these rights are being routinely undermined by algorithmic systems and opaque moderation policies, most of which are designed and governed far beyond our borders. Our project set out to change that.

Centering People: A Public Education Campaign

The project was rooted in a simple truth: rights mean little if people don’t know they have them — or don’t know when they’re being violated. One of our first goals was to build public awareness around digital harms and the broader human rights implications of tech platforms during the elections.

We launched Legal Resources Radio, a podcast series designed to unpack the real-world impact of technologies like political microtargeting, surveillance, and facial recognition. Our guests — journalists, legal experts, academics, and activists — helped translate technical concepts into grounded, urgent conversations.

Alongside the podcasts, we also engaged audiences on Instagram.

Holding Big Tech to Account

A cornerstone of the project was our collaboration with Global Witness, Mozilla, and the Centre for Intellectual Property and Information Technology Law (CIPIT). Together, we set out to test whether major tech companies (TikTok, YouTube, Facebook, and X) were prepared to protect the integrity of South Africa’s 2024 elections. To do this, we designed and submitted controlled test advertisements that mimicked real-world harmful narratives, including xenophobia, gender-based disinformation, and incitement to violence. These ads were submitted in multiple South African languages to assess whether the platforms’ content moderation systems, both automated and human, could detect and block them. The findings revealed critical gaps in platform preparedness and informed both advocacy and public awareness efforts ahead of the elections.
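For readers curious about the mechanics of such an audit, the sketch below illustrates, in Python, how a controlled test-ad matrix of the kind described above might be organised and tallied. It is a minimal, hypothetical illustration only: the narrative categories, languages, and record-keeping shown here are assumptions for clarity, not the actual tooling or datasets used in the Global Witness, Mozilla, and CIPIT collaboration, and no platform APIs are involved since submissions and reviews were handled manually.

```python
from dataclasses import dataclass
from itertools import product

# Hypothetical categories and languages, for illustration only.
NARRATIVES = ["xenophobia", "gender-based disinformation", "incitement to violence"]
LANGUAGES = ["isiZulu", "isiXhosa", "Afrikaans", "Sepedi", "English"]
PLATFORMS = ["TikTok", "YouTube", "Facebook", "X"]

@dataclass
class TestAd:
    """One controlled test advertisement in the audit matrix."""
    platform: str
    narrative: str
    language: str
    approved: bool | None = None  # recorded by hand after each manual submission

def build_test_matrix() -> list[TestAd]:
    """Create one test ad per platform x narrative x language combination."""
    return [TestAd(p, n, l) for p, n, l in product(PLATFORMS, NARRATIVES, LANGUAGES)]

def summarise(ads: list[TestAd]) -> dict[str, float]:
    """Share of harmful test ads approved per platform (i.e. moderation failures)."""
    summary = {}
    for platform in PLATFORMS:
        reviewed = [a for a in ads if a.platform == platform and a.approved is not None]
        if reviewed:
            summary[platform] = sum(a.approved for a in reviewed) / len(reviewed)
    return summary

if __name__ == "__main__":
    ads = build_test_matrix()
    ads[0].approved = True  # example: one outcome recorded after manual review
    print(summarise(ads))
```

The value of structuring the audit this way is that every language and narrative combination is tested on every platform, so gaps in local-language moderation become visible as approval rates rather than anecdotes.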

The results were alarming.

  • Simulated ads with xenophobic content were approved in multiple South African languages;
  • Gender-based harassment ads directed at women journalists were not removed;
  • False information about voting — including the wrong election date and processes — was accepted by TikTok and YouTube.

These findings confirmed what many civil society organisations have long argued: that Big Tech neglects the Global South, failing to invest in local language moderation, culturally relevant policies, or meaningful community engagement. These failures are not just technical oversights. They endanger lives, and they undermine the legitimacy of our democratic processes.

Building an Evidence Base for Reform

Beyond exposing platform failures, we also produced a shadow human rights impact assessment. This report examined how misinformation, hate speech, and algorithmic discrimination disproportionately affect marginalised communities. It documented how online disinformation isn’t simply digital noise — it often translates into real-world harm, from lost trust in electoral systems to threats of violence and intimidation.

We scrutinised South Africa’s legal and policy frameworks and found them severely lacking. Despite the importance of online information ecosystems, there are no clear laws regulating how tech companies should act in our context. Our report recommends:

  • Legal obligations for platforms to publish election transparency reports;
  • Stronger data protection and algorithmic transparency;
  • Content moderation strategies inclusive of all South African languages and communities;
  • Independent oversight mechanisms and civil society input.

This work is part of a longer-term vision: to ensure that South Africa’s digital future is rights-based, inclusive, and democratic.

Continental Solidarity

In April 2025, we took this work to Lusaka, Zambia, where we presented at the Digital Rights and Inclusion Forum (DRIF) 2025. We shared lessons from South Africa and connected with allies across the continent who are also working to make technology accountable to the people it impacts.

What became clear is that while platforms may ignore us individually, there is power in regional solidarity. From Kenya to Nigeria, Senegal to Zambia, African civil society is uniting around a shared demand: that digital technology must serve the public good — not profit at the cost of people’s rights.

What Comes Next?

South Africa’s 2024 elections have come and gone. But the challenges we exposed remain. The online harms we documented did not begin with the elections, and they will not end with them.

That’s why we see the Democratising Big Tech project not as a one-off intervention, but as the beginning of a sustained push for digital justice. We will continue to build coalitions, push for regulatory reform, and educate the public. We will work with journalists, technologists, and communities to resist surveillance, expose disinformation, and uphold our rights online.

Because the fight for democracy doesn’t end at the polls. It must also be fought — and won — in the digital spaces where power is increasingly wielded, often without scrutiny or consequence.

Final Reflections

At the LRC, we do not believe in technology for technology’s sake. We believe in justice — and that means challenging any system, digital or otherwise, that puts people at risk or threatens their rights. Through this project, we’ve seen what’s possible when civil society speaks with clarity, courage, and conviction.

The algorithms may be powerful. But our Constitution, our communities, and our collective will are stronger.