Join CIPESA’s Critical Research on Biometric Data Privacy in Uganda

In an era where biometric data collection is rapidly expanding across Uganda’s public and private sectors, the need for expert oversight has never been more critical. From national ID systems to voter registration and SIM card verification, these digital transformations are reshaping privacy rights and surveillance capabilities in ways that demand careful scrutiny.

CIPESA is seeking a qualified consultant to peer review groundbreaking research on biometric data collection and its implications for digital rights in Uganda.

This consultancy offers a unique opportunity to:

  • Shape policy recommendations on digital rights and data protection in Uganda
  • Contribute to protecting citizens’ privacy rights in an increasingly digital ecosystem
  • Work with a leading organization at the intersection of technology and human rights
  • Deliver meaningful impact in just 15 days of focused work

The ideal candidate will bring:

  • An advanced degree in law, international human rights law, or a related field
  • Deep understanding of digital rights and data protection frameworks
  • Proven research and analytical capabilities
  • Strong track record in producing clear, actionable reports

Ready to contribute to this vital work? Submit your application by January 27, 2025 (18:00 EAT), including:

  • Cover letter
  • CV
  • Two samples of research work
  • Financial proposal

Download Research Consultant ToRs here.

Send your application to [email protected] with cc to [email protected].

This project is supported by Enabel and the European Union.

What Does Meta’s About-Turn on Content Moderation Bode for Africa?

By CIPESA Writer

Meta’s recent decision to eliminate its third-party fact-checkers, starting in the United States, has sent shockwaves globally, raising significant concerns about free speech and the fight against disinformation and misinformation. The announcement was part of a raft of major policy changes unveiled on January 7, 2025 by Meta CEO Mark Zuckerberg that will affect its platforms Facebook, Instagram and Threads, which are used by three billion people worldwide. The changes include the introduction of a user-generated “Community Notes” model, the elimination of third-party fact-checkers, reduced content restrictions and enforcement, and the personalisation of civic or political content.

While the announcement makes no reference to Africa, the changes will trickle down to the continent. Meta’s decision is particularly concerning for Africa, a region marked by vast linguistic and cultural diversity and limited digital and media information literacy, coupled with growing challenges of hate speech and election-related disinformation, a lack of context-specific content moderation policies, and inadequate investment in local fact-checking initiatives.

Africa’s content moderation context and needs are also quite different from those of Europe or North America due to the predominant use of local languages that are often overlooked by automated fact-checking algorithms and content filters.

Notably, the justifications given by Meta are quite weak, as the new changes appear to undermine its own initiatives to promote free speech, particularly the work of its third-party fact-checking program and the Oversight Board, which it set up to help resolve some of the most difficult questions around freedom of expression online and information integrity. The decision also appears to be politically and economically motivated, as the company seeks to re-align itself with and appease the incoming Trump administration, which has been critical of fact-checking, and to secure its assistance in pushing back against regulation of Meta’s activities outside the U.S.

The company also amended its policy on Hateful Conduct on January 7, 2025, replacing the term “hate speech” with “hateful conduct” and eliminating previous thresholds for taking down hateful content, which will allow more hateful speech against specific groups. Further, while the company is moving its Trust and Safety and Content Moderation teams to Texas, it has yet to set up such robust teams for Africa.

Importance of Fact-Checking

Fact-checking plays a critical role in combating disinformation and misinformation and fostering informed public discourse. By verifying the accuracy of online content, fact-checkers help to identify inauthentic content and counter the spread of false narratives that can incite violence, undermine trust in institutions, or distort democratic processes.

Additionally, it promotes accountability and reduces the virality of misleading content, particularly during sensitive periods, such as elections, political unrest, public health crises, or conflict situations, where accurate and credible information is crucial for decision-making. Moreover, fact-checking fosters media literacy by encouraging audiences to critically evaluate information sources.

Fact-checking organisations such as PolitiFact have criticised the Meta CEO’s assertions that fact-checkers were “too politically biased” and had “destroyed more trust than they had created, especially in the U.S.”, noting that decisions and the power to take down content have rested squarely with Meta, with fact-checkers only providing independent review of posts. The assertions also undermine the work of independent media outlets and civil society organisations, which authoritarian regimes have accused of being corrupt political actors.

However, fact-checking is not without its challenges and downsides. The process can inadvertently suppress free expression, especially in contexts where the line between disinformation and legitimate dissent is blurred. In Africa, where cultural and linguistic diversity is vast and resources for local-language moderation are limited, fact-checking algorithms or teams may misinterpret context, leading to unjust content removal or amplification of bias. Furthermore, fact-checking initiatives can become tools for censorship if not governed transparently, particularly in authoritarian settings.

Despite these challenges, the benefits of fact-checking far outweigh its downsides. Instead of abandoning fact-checking, Meta and other big tech companies should strengthen its implementation by providing sufficient resources to recruit and train fact-checkers and to provide them with psycho-social support services.

Impact of the Decision for Africa
  1. Increase in Disinformation

Africa faces a distinct set of challenges that make effective content moderation and fact-checking particularly crucial. Disinformation and misinformation in Africa have had far-reaching consequences, from disrupting electoral processes and influencing the choice of candidates by unsuspecting voters to jeopardising public health. Disinformation during elections has fueled violence, while health-related misinformation during health crises, such as during the Covid-19 pandemic, endangered lives by undermining public health efforts. False claims about the virus, vaccines, or cures led to vaccine hesitancy, resistance to public health measures like mask mandates, and the proliferation of harmful treatments. This eroded trust in health institutions, slowed down pandemic response efforts, and contributed to preventable illnesses and deaths, disproportionately affecting vulnerable populations.

The absence of fact-checking exacerbates the existing challenges of context insensitivity, as automated systems and under-resourced moderation teams fail to address the nuances of African content. The introduction of user-driven Community Notes, similar to the model used on X, will still require expert input, especially in a region where many governments are authoritarian. Yet media and information literacy and access to credible, reliable information are limited, and Meta’s platforms are a primary means of accessing independent news and information.

Research on the use of Community Notes on X has shown that the model has limited effectiveness in reducing the spread of disinformation, as it “might be too slow to intervene in the early (and most viral) stages of the diffusion”, which are the most critical. The move also undermines efforts by civil society and fact-checking organisations in the region that have been working tirelessly to combat the spread of harmful content online.

  2. Political Manipulation and Increased Malign Influence

Dialing down moderation and oversight may empower political actors who wish to manipulate public opinion through disinformation campaigns, resulting in a surge of such activities. Given that social media has been instrumental in mobilising political movements across Africa, the lack of robust content moderation and fact-checking could hinder democratic processes and amplify extremist views and propaganda. Research has shown an apparent link between disinformation and political instability in Africa.

Unchecked false narratives not only mislead voters, but also distort public discourse and diminish public trust in key governance and electoral institutions. Authoritarian regimes may also use such narratives to undermine dissent. Moreover, the relaxation of content restrictions on sensitive and politically divisive topics like immigration and gender could open the floodgates for targeted hate speech, incitement and discrimination, which could exacerbate gender disinformation and ethnic and political tensions. Likewise, weak oversight may enable foreign and external actors to manipulate elections.

  3. Regulatory and Enforcement Gaps

Meta’s easing of restrictions on the moderation of sensitive topics and its reduced oversight of content could lead to an increase in harmful content on its platforms. Already, various African countries have weak regulatory frameworks for harmful content and thus rely on companies like Meta to self-regulate effectively. Meta’s decision could spur some African governments to introduce new and more repressive laws to restrict certain types of content and hold platforms accountable for their actions. As our research has shown, such laws could be abused to suppress dissent and curtail online freedoms of expression, assembly, and association, as well as access to information, creating an even more precarious environment.

  4. Limited Engagement with Local Actors

Meta’s decision to abandon fact-checking raises critical concerns for Africa, coming after the tech giant’s January 2023 decision to sever ties with its East African content moderation contractor, Sama, based in Nairobi, Kenya, which was responsible for content moderation in the region. The Sama-operated hub announced its exit from content moderation services to focus on data annotation tasks, citing the prevailing economic climate as a reason for streamlining operations. The Nairobi hub also faced legal and ethical challenges, including allegations of poor working conditions, inadequate mental health support for moderators exposed to graphic content, and unfair labour practices. These issues led to lawsuits against both Sama and Meta, intensifying scrutiny of their practices.

Meanwhile, fact-checking partnerships with local organisations have played a crucial role in addressing disinformation, and their elimination erodes trust in Meta’s commitment to advancing information integrity in the region. Meta has fact-checking arrangements with various companies across 119 countries, including 26 in Africa. The partners in Africa include AFP, AFP – Coverage, AFP – Hub, Africa Check, Congo Check, Dubawa, Fatabyyano فتبين, Les Observateurs de France 24 and PesaCheck. In the aftermath of Meta’s decision to sever ties with its East African third-party content moderators, Sama let go of about 200 employees.

Opportunities Amidst Challenges

While Meta’s decision to abandon fact-checking is a concerning development, it also presents an opportunity for African stakeholders to utilise regional instruments, such as the African Charter on Human and Peoples’ Rights and the Declaration of Principles on Freedom of Expression and Access to Information in Africa, to assert thought leadership and demand better practices from platforms. Engaging with Meta’s regional leadership and building coalitions with other civil society actors can amplify advocacy about the continent’s longstanding digital rights and disinformation concerns and demands for more transparency and accountability.

Given the ongoing pushback against the recently announced changes, Meta should be receptive to dialogue and to recommendations to review and contextualise the new proposals. For Africa, Meta must address its shortcomings by urgently investing in and strengthening localised content moderation. It must reinvest in fact-checking partnerships, particularly with African organisations that understand local contexts. These partnerships are essential for addressing misinformation in local languages and underserved regions.

The company must also improve its automated content moderation tools, including by developing tools that can handle African cultures, languages and dialects; hire more qualified moderators with contextual knowledge and provide them with comprehensive training; and expand its partnerships with local stakeholders. Moreover, the company must ensure meaningful transparency and accountability, as many of its transparency and content enforcement reports lack critical information and disaggregated data about its actions in most African countries.

Lastly, both governments and civil society in Africa must invest in digital, media and information literacy which is essential to empower users to critically think about and evaluate online content. Meta should partner with local organisations to promote digital literacy initiatives and develop educational campaigns tailored to different regions and languages. This will help build resilience against misinformation and foster a more informed digital citizenry.

In conclusion, it remains to be seen how the new changes by Meta will be implemented in the U.S., and subsequently in Africa, and how the company will address the gaps left by fact-checkers and mitigate the risks and negative consequences stemming from its decision. Notably, while there is widespread acknowledgement that content moderation systems on social media platforms are broken, efforts to promote and protect rights to free expression and access to information online should be encouraged. However, these efforts should not come at the expense of user trust and safety, and information integrity.

New Toolkit to Guide National Human Rights Institutions in Promoting Digital Rights

Edrine Wanyama

In an increasingly digital world, safeguarding human rights requires innovative tools, robust mechanisms, and strategic collaboration. Recognising this need, the International Center for Not-for-Profit Law (ICNL), the Collaboration on International ICT Policy for East and Southern Africa (CIPESA), and Paradigm Initiative (PIN) have developed a groundbreaking Toolkit to strengthen the ability of National Human Rights Institutions (NHRIs) in Africa to protect and promote human rights in the digital era. 

While emphasising the role of NHRIs in both promoting and protecting digital rights, the Toolkit demystifies these rights by showing their relationship to traditionally recognised rights and demonstrating how digital rights violations can occur.

The digital transformation sweeping across the globe has created new opportunities for citizens to communicate, express themselves, and claim their various rights. However, it has also ushered in unprecedented challenges, including online censorship, surveillance, misinformation, and violations of privacy. These digital threats disproportionately affect marginalised communities, activists, and human rights defenders, making the role of NHRIs more critical than ever.

The Toolkit equips NHRIs with the knowledge, tools, and strategies they need to effectively address these challenges. It emphasises the intersection of human rights with digital technologies and provides actionable insights to promote accountability, transparency, and inclusivity in digital governance.

The Toolkit highlights the various forms of digital rights violations, such as internet shutdowns, throttling, and blocking; content restrictions, including filtering and takedown orders; onerous obligations on intermediaries; restrictive content moderation policies; and widespread, unchecked digital surveillance.

Among the roles that NHRIs should play are providing technical advice to government ministries, legislators, the judiciary, and other stakeholders to shape progressive laws, designing digital literacy curricula, and capacity and awareness building of the relevant institutions and stakeholders. Others are research on the impact of digital technologies, application of regional and international human rights approaches, and oversight over public sector procurement of digital technologies.

How NHRIs Can Protect Digital Rights

In the context of digital rights, NHRIs may:

  • Monitor proposed legislation for its impact on digital rights and submit recommendations on how to ensure human rights compliance.
  • Incorporate digital rights topics, such as online privacy violations and incidents of government-ordered network disruptions, into annual reporting and submissions to UN mandate holders, the Universal Periodic Review (UPR), and other regional and international human rights monitoring processes.
  • Connect with domestic and regional digital rights organisations to coordinate efforts to address digital rights violations.
  • Revise existing intake material to systematically receive complaints of digital rights violations.
  • Ensure internal policies and methodologies for investigating, analysing, and reporting take into consideration the types of information, data, and tools needed to address digital rights violations.
  • When supporting complainants and victims, provide resources and referrals for digital security best practices and capacity building so they can better protect themselves as they seek redress.
  • Investigate digital rights violations and call for the necessary measures to end them and ensure non-recurrence.

The Toolkit also underscores the need for NHRIs to build their internal capacity to report and respond to digital rights violations, to monitor the implementation of laws, to coordinate on digital rights issues with regional and international institutions, and to investigate violations so that they stop and justice is served.

The Toolkit is an important resource that can be utilised to equip various stakeholders with knowledge to respond to emerging digital rights challenges and to identify viable solutions, such as monitoring, documenting and reporting, to enhance the promotion and protection of digital rights. As such, it could go a long way in helping to address common digital rights violations and leveraging resources and partnerships for the protection and promotion of digital rights in Africa.

The AU Disability Protocol Comes Into Force: Implications for Digital Rights for Persons with Disabilities in Africa

By Paul Kimumwe & Michael Aboneka

On this International Day for Persons with Disabilities, CIPESA reflects on the impact of the African Union (AU) Disability Protocol and its implications for digital rights for persons with disabilities in Africa, and calls upon the African Commission to establish a Special Mandate to enhance respect for and protection of the rights of persons with disabilities in Africa.

Six years after its adoption, the Protocol to the African Charter on Human and Peoples’ Rights on the Rights of Persons with Disabilities in Africa came into force in May 2024 after securing the mandatory 15th ratification by the Republic of Congo. The other 14 African Union member states that have ratified the Protocol are Angola, Burundi, Cameroon, Kenya, Mali, Malawi, Mozambique, Namibia, Nigeria, Niger, Rwanda, South Africa, the Sahrawi Arab Democratic Republic, and Uganda. 

For disability rights activists, this was a defining moment as the protocol augments the rights of persons with disabilities to barrier-free access to the physical environment, transportation, information, and other communication technologies and systems. Specifically, under articles 23 and 24 of the protocol, States Parties should take “effective and appropriate measures” to facilitate the full enjoyment by persons with disabilities of the right to freedom of expression and opinion and access to information, including through the use of Information and Communication Technologies (ICT).

The Collaboration on International ICT Policy for East and Southern Africa (CIPESA) has been a longstanding advocate for African governments to urgently ratify the protocol. However, CIPESA has also stated, including in submissions to the African Commission on Human and Peoples’ Rights (ACHPR), that ratifying the protocol would be a major but insufficient step in ensuring that persons with disabilities access and use digital technologies and that there is sufficient disaggregated data to inform programme interventions.

Indeed, article 24(2) requires States Parties to put in place policy, legislative, administrative, and other measures to ensure that persons with disabilities enjoy the right to freedom of expression and access to information on an equal basis, including:

  1. Providing information intended for the general public as well as information required for official interactions with persons with disabilities in accessible formats and technologies appropriate to different kinds of disabilities in a timely manner and without additional cost to persons with disabilities. 
  2. Requiring private entities that provide services to the general public, including through the internet, to provide information and services in accessible and usable formats for persons with disabilities. 
  3. Recognising and promoting the use of sign language. 
  4. Ensuring that persons with visual impairments or with other print disabilities have effective access to published works, including by using information and communication technologies.

The protocol adds to the available digital rights advocacy tools for disability rights actors, including the 2006 United Nations Convention on the Rights of Persons with Disabilities (CRPD), which places significant obligations on States Parties to take appropriate measures to ensure that persons with disabilities have equal and meaningful access to ICT, including the internet. 

The CRPD was the first international human rights treaty requiring the accessibility of digital tools as a prerequisite for persons with disabilities to fully enjoy their fundamental rights without discrimination. It highlights the inherent risks of exclusion of persons with disabilities from participating equally in society by defining ICT accessibility as integral to general accessibility rights and on par with access to the physical environment and transportation.

While there has been some progress in the enactment of disability rights-respecting and ICT-enabling laws for persons with disabilities in Africa, implementation is a challenge. Moreover, the Protocol comes into force when the digital divide and exclusion of persons with disabilities has worsened despite the exponential growth and penetration of digital technologies on the continent. Persons with disabilities have consistently remained disproportionately excluded from the digital society due to factors such as low levels of ICT skills, high illiteracy levels, and high cost of assistive technologies such as screen readers, screen magnification software, text readers, and speech input software.

It is against this background that CIPESA adds its voice to other calls to the African Commission to expedite the establishment of a special mandate at the level of Special Rapporteur for Persons with Disabilities. This elevated position will ensure that the rights of persons with disabilities in Africa are mainstreamed and upheld.

CIPESA recognises that as a regional human rights instrument, the protocol empowers disability rights actors to demand the enactment and full implementation of policies and laws that promote the rights of persons with disabilities, including in accessing and using digital technologies.

For example, disability rights actors, including civil society, activists, and Disabled Persons’ Organisations (DPOs), should develop mechanisms to monitor the status of implementation of the protocol, including ensuring that the states parties submit their statutory reports as required by Article 34 of the protocol. The DPOs should also actively participate in developing shadow reports on the status of implementation of the protocol, especially on access to information and participation in public affairs.

In addition, disability rights organisations should work with policymakers and the executive to ensure that more countries ratify the protocol and domesticate it through national policies, laws, and practices. Both the protocol and the CRPD should become a reference point during any discussions of draft laws and policies that affect persons with disabilities.

For the media, it is important that, through their reporting, they hold governments accountable for failure to ratify or to fully implement the provisions of the protocol.

Member countries can also demand accountability from their peers on the status of implementation of the key provisions of the protocol through the African Peer Review Mechanism (APRM).

Please read more about CIPESA submissions on policy actions governments should take after ratifying the protocol. See also The Disability and ICT Accessibility Framework for Monitoring the Implementation of ICT Accessibility Laws and Policies in Africa.

Civil Society Statement on Kenya’s Telegram Shutdown

Statement

As civil society organizations and stakeholders in the Information, Communication and Technology (ICT) sector committed to Digital Rights and Internet Freedom, we are deeply concerned about the Kenyan government’s recent decision to block access to the Telegram social media platform. 

According to an unverified letter circulating online from the Communications Authority of Kenya (CA) to service providers (Safaricom, Airtel, Telkom Kenya and Jamii Telecommunications) on 31 October 2024, the operators were required to “use all available mechanisms to suspend the operation of Telegram Inc in the country”. The suspension was ordered to prevent cheating during the national examinations period, on weekdays until 22 November 2024. The ongoing disruption has been confirmed by web connectivity tests from OONI and NetBlocks, as well as independent tests by Tatua.

Internet disruptions like these undermine fundamental human rights and freedoms outlined in the International Bill of Rights to which the Kenyan government is a party and the Kenyan Constitution. Likewise, they disrupt economic activity and weaken democratic values by limiting the rights to Access to Information and Freedom of Expression, Assembly and Association.

This action also goes against the principles outlined in the Global Digital Compact (GDC), which emphasizes the importance of a universal, open, and secure internet. The GDC, part of the commitments that governments endorsed in the Pact of the Future, discourages internet shutdowns, noting their harmful impact on human rights, democracy, and economic growth, and calls for transparent and accountable solutions to address issues in the digital space. At a time when global standards are pushing for universal, secure, and open internet access, national policies must align with these principles rather than undermine them.

Kenya’s commitment to internet freedom appears to be on a worrying downward trend. We note with concern that there was an internet disruption on 25 June, less than 6 months ago, during the protests against the Finance Bill, 2024. A similar blocking of the Telegram App was implemented in November 2023. Such repeated actions not only curtail rights but also erode public trust in digital governance.

While we recognize the importance of maintaining exam integrity, we urge the Kenyan government to explore alternative, lawful and rights-respecting measures to tackle this issue. Instead of blocking the application or disrupting the internet, authorities are encouraged to pursue those who leak confidential examination documents and to seal loopholes in examination processes. Such alternatives can be developed through multi-stakeholder consultations that ensure they are human rights-respecting. Disrupting the internet or blocking social media access, as in this case, fails the three-part test under international human rights law of legality, legitimate aim, and necessity and proportionality. A stable, secure and accessible internet should remain a priority, especially given its critical role in supporting the digital economy, education, livelihoods, and civic engagement.

We call on Kenyan authorities and the CA in particular, to immediately retract the letter to service providers, and for service providers to restore access to Telegram and commit to upholding digital rights and internet freedom. We also urge policymakers to consult civil society and other key stakeholders to develop sustainable, rights-based strategies to address digital governance challenges without resorting to internet disruptions. 

Endorsed on November 9, 2024 by:

Afia-Amani Grands-Lacs

African Internet Rights Alliance (AIRA)

Africa Media and Information Technology Initiative (AfriMITI)

Afrika Youth Movement

Article 19 East Africa

Bloggers Association of Kenya (BAKE)

Brain Builders Youth Development Initiative

Centre for Artificial Intelligence Ethics and Governance in Africa (CAIEGA)

Centre for Information Technology and Development (CITAD)

Center for the Study of Organized Hate (CSOH)

Collaboration on International ICT Policy for East and Southern Africa (CIPESA)

Collaborative for Peace

Consortium of Ethiopian Human Rights Organizations (CEHRO-Ethiopia)

FactCheck Africa

Gonline Africa

Human Rights Journalists Network Nigeria

Impact Foundation For Youths Development

Internet Without Borders

Internet Society Kenya Chapter

KICTANet

Kijiji Yeetu

Media Rights Agenda (MRA)

Paradigm Initiative (PIN)

Roots Africa Inc.

Tech & Media Convergency (TMC)

The Internet Governance Tanzania Working Group (IGTWG)

Tribeless Youth (TY)

VANGUARD PRESS BOARD UDUS

Women of Uganda Network (WOUGNET)

West African Digital Rights Defenders Coalition

The SaferNet Initiative

This article was first published by KICTANET on November 09, 2024.