CIPESA and Partners Advocate for Inclusion of Technology-Facilitated Gender-Based Violence in Uganda’s Sexual Offences Bill

By Ainembabazi Patricia |

On February 18, 2025, the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) alongside Pollicy and the Gender Tech Initiative appeared before Members of Uganda’s Parliament to advocate for the inclusion of Technology-Facilitated Gender-Based Violence (TFGBV) in the Sexual Offences Bill 2024.

The rapid evolution of digital technologies has reshaped societal interactions, leading to increased perpetration of online violence. In Uganda, online users increasingly face digital forms of abuse that often mirror or escalate offline sexual offences, yet efforts to combat gender-based violence are met with both legal and practical challenges.

The Sexual Offences Bill aims to address sexual offences by providing for the effectual prevention of sexual violence, enhancing the punishment of sexual offenders, protecting victims during the trial of sexual offences, and providing for the extra-territorial application of the law.

In the presentation to the Committee on Legal and Parliamentary Affairs and the Committee on Gender, Labour, and Social Development, CIPESA and partners emphasised the necessity of closing the policy gap between digital and physical sexual offences in the Bill, to ensure that Uganda’s legal system is responsive to the realities of technological advancement and related violence. We argued that while the Bill is timely and addresses real issues of sexual violence, especially against women, some pertinent aspects have been left out and should be included.

According to the United Nations Population Fund (UNFPA), TFGBV is “an act of violence perpetrated by one or more individuals that is committed, assisted, aggravated, and amplified in part or fully by the use of information and communication technologies or digital media, against a person on the basis of their gender.” It includes cyberstalking, doxing, non-consensual sharing of intimate images, cyberbullying, and other forms of online harassment.

In Uganda, TFGBV is not addressed by existing laws, including the Penal Code Act and the Computer Misuse Act. Adding TFGBV to the Bill provides an opportunity to bridge this legal gap by explicitly incorporating TFGBV as a prosecutable offence.

CIPESA and partners’ recommendations to the Committees were to:

1. Include and Explicitly Define TFGBV

Under Part I (Preliminary), the Bill provides definitions for various terms related to sexual offences, including references to digital and online platforms. However, it does not explicitly define TFGBV or recognise its various manifestations. This omission limits the Bill’s effectiveness in addressing emerging forms of online sexual offences.

We propose the introduction of a new clause under Part I defining TFGBV, to ensure the Bill adequately addresses offences committed via digital means. The definition should align with international standards, such as the UNFPA’s definition of TFGBV, and should ensure consistency with Uganda’s legal and digital policy frameworks, including the Constitution of the Republic of Uganda 1995, the Data Protection and Privacy Act, 2019, the Computer Misuse (Amendment) Act, 2022, the Penal Code Act Cap 120, and the Uganda Communications Act, 2013.

2. Recognising Various Forms of TFGBV

Clause 7 of the Bill penalises indecent communication or the transmission of sexual content without consent. It criminalises the sharing of unsolicited material of a sexual nature, including the unauthorised distribution of nude images or videos. However, the provision does not explicitly mention cyber harassment, online grooming, sextortion, or non-consensual intimate image sharing (commonly known as “revenge porn”). As such, we recommended the expansion of Clause 7 to explicitly recognise and define offences such as cyber harassment, non-consensual intimate image sharing, online grooming, and sextortion. This addition will clarify legal pathways for victims and broaden the scope of protection against digital sexual exploitation.

3. Replacing “Online Platform” with “Technology-Facilitated Gender-Based Violence”

In Clause 1, the Bill defines an “on-line platform” as any computer-based technology that facilitates the sharing of information, ideas, or other forms of expression, encompassing social media sites, websites, and other digital communication tools. Clause 6 addresses the offence of indecent exposure, criminalising the intentional display of one’s sexual organs in public or through electronic means, including online platforms, while Clause 7 pertains to the non-consensual sharing of intimate content. However, these provisions do not comprehensively categorise TFGBV as a distinct form of sexual offence. Accordingly, “Online Platform” should be replaced with “Technology-Facilitated Gender-Based Violence” to ensure the Bill adequately captures all digital gender-based offences, including deepfake pornography, cyberstalking, and sexual exploitation through content generated by artificial intelligence.

4. Criminalising Voyeurism

The Bill does not explicitly criminalise voyeurism, which refers to the act of secretly observing, recording, or distributing images or videos of individuals in private settings without their consent, often for sexual gratification. There is an increasing prevalence of voyeurism through hidden cameras, non-consensual recordings, and live-streamed sexual abuse. Voyeurism should be criminalised, with a clear definition provided under Clause 1 and the scope and penalty defined under Part II of the Bill.

5. Strengthening Accountability for Technology Platforms

The Bill does not impose specific responsibilities on digital platforms and service providers in cases of TFGBV. We argued for the addition of a new clause under Part III (Procedures and Evidential Provisions) mandating digital platforms and service providers to cooperate in investigations related to TFGBV and to provide relevant data and evidence upon request by law enforcement. The provision should also extend to obligations to ensure data protection compliance and to implement proactive measures to detect, remove, and report sexual exploitation content. This provision will enhance accountability and facilitate the prosecution of perpetrators.

6. Aligning Uganda’s Legislation with Regional and International Frameworks

The Bill does not explicitly state its alignment with regional and international human rights instruments addressing sexual violence and digital rights. We recommend the addition of a new clause under Part I (Preliminary) stating that the Bill shall be interpreted in a manner that aligns with the African Commission on Human and Peoples’ Rights (ACHPR) Resolution 522 (2022) and the Convention on the Elimination of All Forms of Discrimination Against Women (CEDAW). This will reinforce Uganda’s commitment to, and application of, international best practices in combating sexual offences.

7. Enhancing Legal Remedies for Survivors

Clause 42 (Settlement in Capital Sexual Offences) prohibits compromise settlements in cases of rape, aggravated rape, and defilement, prescribing a 10-year prison sentence for offenders who attempt to settle such cases outside court. However, the Bill does not provide civil remedies for victims of TFGBV-related crimes, nor does it ensure access to psycho-social support. We recommend the expansion of Clause 42 to include civil remedies such as compensation for victims of TFGBV; psychosocial and legal support to ensure survivors receive necessary rehabilitation; and mandatory reporting obligations for online platforms hosting TFGBV-related content.

The inclusion of TFGBV in the Sexual Offences Bill 2024 will not only strengthen the fight against gender-based violence but also ensure that survivors access justice. The proposed legislative changes will reinforce Uganda’s commitment to upholding digital rights and gender equality in the digital age. The country will also join the ranks of pioneers such as South Africa that have taken legislative steps to criminalise online gender-based violence.

By incorporating the proposed provisions and amendments, the Sexual Offences Bill, 2024 will clearly define online sexual offences, bring perpetrators of online violence to book, and provide protection for survivors of digital sexual offences. It will also contribute to building and strengthening accountability for technology platforms. Once enacted, the law will go a long way in ensuring that Uganda’s legal framework aligns with regional and international human rights standards on the protection of survivors while guaranteeing the effective prosecution of perpetrators of technology-facilitated sexual offences.

Download the Full report here

What Does Meta’s About-Turn on Content Moderation Bode for Africa?

By CIPESA Writer |

Meta’s recent decision to get rid of its third-party fact-checkers, starting in the United States, has sent shockwaves globally, raising significant concerns about free speech and the fight against disinformation and misinformation. The announcement was part of a raft of major policy changes announced on January 7, 2025 by Meta’s CEO Mark Zuckerberg that will affect its platforms Facebook, Instagram and Threads, used by three billion people worldwide. The changes include the introduction of the user-generated “Community Notes” model, the elimination of third-party fact-checkers, reduced content restrictions and enforcement, and the personalisation of civic and political content.

While the announcement makes no reference to Africa, the changes will trickle down to the continent. Meta’s decision is particularly concerning for Africa given the continent’s unique linguistic and cultural diversity, limited digital and media information literacy, growing challenges of hate speech and election-related disinformation, lack of context-specific content moderation policies, and inadequate investment in local fact-checking initiatives.

Africa’s content moderation context and needs are also quite different from those of Europe or North America due to the predominant use of local languages that are often overlooked by automated fact-checking algorithms and content filters.

Notably, the justifications given by Meta are quite weak, as the changes appear to undermine its own initiatives to promote free speech, particularly the work of its third-party fact-checking programme and the Oversight Board, which it set up to help resolve some of the most difficult questions around freedom of expression online and information integrity. The decision also appears to be politically and economically motivated, as the company seeks to re-align itself with and appease the incoming Trump administration, which has been critical of fact-checking, and to get assistance in pushing back against regulation of its activities outside the U.S.

The company also amended its policy on Hateful Conduct on January 7, 2025, replacing the term “hate speech” with “hateful conduct” and eliminating previous thresholds for taking down hateful content, which will allow more hateful speech against specific groups. Further, while the company is moving its Trust and Safety and Content Moderation teams to Texas, it is yet to set up such robust teams for Africa.

Importance of Fact-Checking

Fact-checking plays a critical role in combating disinformation and misinformation and fostering informed public discourse. By verifying the accuracy of online content, fact-checkers help to identify inauthentic content and counter the spread of false narratives that can incite violence, undermine trust in institutions, or distort democratic processes.

Additionally, it promotes accountability and reduces the virality of misleading content, particularly during sensitive periods, such as elections, political unrest, public health crises, or conflict situations, where accurate and credible information is crucial for decision-making. Moreover, fact-checking fosters media literacy by encouraging audiences to critically evaluate information sources.

Fact-checking organisations such as PolitiFact have criticised the assertions by the Meta CEO that fact-checkers were “too politically biased” and had “destroyed more trust than they had created, especially in the U.S.”, noting that decisions and the power to take down content have remained squarely Meta’s responsibility, with fact-checkers only providing independent review of posts. Meta’s assertions also undermine the work of independent media outlets and civil society, who have long been accused by authoritarian regimes of being corrupt political actors.

 However, fact-checking is not without its challenges and downsides. The process can inadvertently suppress free expression, especially in contexts where the line between disinformation and legitimate dissent is blurred. In Africa, where cultural and linguistic diversity is vast, and resources for local-language moderation are limited, fact-checking algorithms or teams may misinterpret context, leading to unjust content removal or amplification of bias. Furthermore, fact-checking initiatives can become tools for censorship if not governed transparently, particularly in authoritarian settings.

Despite these challenges, the benefits of fact-checking far outweigh its downsides. Instead of getting rid of fact-checking, Meta and other big tech companies should strengthen its implementation by providing adequate resources to recruit and train fact-checkers and to provide them with psycho-social support.

Impact of the Decision on Africa
  1. Increase in Disinformation

Africa faces a distinct set of challenges that make effective content moderation and fact-checking particularly crucial. Disinformation and misinformation in Africa have had far-reaching consequences, from disrupting electoral processes and influencing the choice of candidates by unsuspecting voters to jeopardising public health. Disinformation during elections has fueled violence, while health-related misinformation during health crises, such as during the Covid-19 pandemic, endangered lives by undermining public health efforts. False claims about the virus, vaccines, or cures led to vaccine hesitancy, resistance to public health measures like mask mandates, and the proliferation of harmful treatments. This eroded trust in health institutions, slowed down pandemic response efforts, and contributed to preventable illnesses and deaths, disproportionately affecting vulnerable populations.

The absence of fact-checking exacerbates the existing challenge of context insensitivity, as automated systems and under-resourced moderation teams fail to address the nuances of African content. The introduction of user-driven Community Notes, a model similar to that used on X, will still require expert input, especially in a region where many governments are authoritarian, media and information literacy and access to credible and reliable information are limited, and Meta’s platforms are among the primary ways to access independent news and information.

Research on the use of Community Notes on X has shown that the model has limited effectiveness in reducing the spread of disinformation, as it “might be too slow to intervene in the early (and most viral) stages of the diffusion”, which are the most critical. The move also undermines efforts by civil society and fact-checking organisations in the region that have been working tirelessly to combat the spread of harmful content online.

  2. Political Manipulation and Increased Malign Influence

Dialing down on moderation and oversight may empower political actors who wish to manipulate public opinion through disinformation campaigns, resulting in a surge of such activities. Given that social media has been instrumental in mobilising political movements across Africa, the lack of robust content moderation and fact-checking could hinder democratic processes and amplify extremist views and propaganda. Research has shown an apparent link between disinformation and political instability in Africa.

Unchecked false narratives not only mislead voters but also distort public discourse and diminish public trust in key governance and electoral institutions. Authoritarian regimes may also use disinformation to undermine dissent. Moreover, the relaxation of content restrictions on sensitive and politically divisive topics such as immigration and gender could open the floodgates for targeted hate speech, incitement, and discrimination, which could exacerbate gendered disinformation and ethnic and political tensions. Likewise, weak oversight may enable foreign and external actors to manipulate elections.

  3. Regulatory and Enforcement Gaps

Meta’s easing of restrictions on the moderation of sensitive topics and its reduced oversight of content could lead to an increase in harmful content on its platforms. Already, various African countries have weak regulatory frameworks for harmful content and thus rely on companies like Meta to self-regulate effectively. Meta’s decision could spur efforts by some African governments to introduce new and more repressive laws to restrict certain types of content and hold platforms accountable for their actions. As our research has shown, such laws could be abused and employed to suppress dissent and curtail online freedoms such as expression, assembly, and association, as well as access to information, creating an even more precarious environment.

  4. Limited Engagement with Local Actors

Meta’s decision to abandon fact-checking raises critical concerns for Africa, coming after the tech giant’s January 2023 decision to sever ties with its East African content moderation contractor, Sama, based in Nairobi, Kenya, which was responsible for content moderation in the region. The Sama-operated hub announced its exit from content moderation services to focus on data annotation tasks, citing the prevailing economic climate as a reason for streamlining operations. Additionally, the Nairobi hub faced legal and ethical challenges, including allegations of poor working conditions, inadequate mental health support for moderators exposed to graphic content, and unfair labour practices. These issues led to lawsuits against both Sama and Meta, intensifying scrutiny of their practices.

Meanwhile, fact-checking partnerships with local organisations have played a crucial role in addressing disinformation, and their elimination erodes trust in Meta’s commitment to advancing information integrity in the region. Meta has fact-checking arrangements with various companies across 119 countries, including 26 in Africa. The partners in Africa include AFP, AFP – Coverage, AFP – Hub, Africa Check, Congo Check, Dubawa, Fatabyyano فتبين, Les Observateurs de France 24, and PesaCheck. In the aftermath of Meta’s decision to sever ties with its East African third-party content moderators, Sama let go of about 200 employees.

Opportunities Amidst Challenges

While Meta’s decision to abandon fact-checking is a concerning development, it also presents an opportunity for African stakeholders to utilise regional instruments, such as the African Charter on Human and Peoples’ Rights and the Declaration of Principles on Freedom of Expression and Access to Information in Africa, to assert thought leadership and demand better practices from platforms. Engaging with Meta’s regional leadership and building coalitions with other civil society actors can amplify advocacy about the continent’s longstanding digital rights and disinformation concerns and demands for more transparency and accountability.

Given the ongoing pushback against the recently announced changes, Meta should be more receptive to dialogue and recommendations to review and contextualise the new proposals. For Africa, Meta must address its shortcomings by urgently investing in and strengthening localised content moderation. It must reinvest in fact-checking partnerships, particularly with African organisations that understand local contexts. These partnerships are essential for addressing misinformation in local languages and underserved regions.

The company must also improve its automated content moderation tools, including by developing tools that can handle African cultures, languages and dialects; hire more qualified moderators with contextual knowledge and provide them with comprehensive training; and expand its partnerships with local stakeholders. Moreover, the company must ensure meaningful transparency and accountability, as many of its transparency and content enforcement reports lack critical information and disaggregated data about its actions in most African countries.

Lastly, both governments and civil society in Africa must invest in digital, media and information literacy, which is essential to empower users to think critically about and evaluate online content. Meta should partner with local organisations to promote digital literacy initiatives and develop educational campaigns tailored to different regions and languages. This will help build resilience against misinformation and foster a more informed digital citizenry.

In conclusion, it remains to be seen how the new changes by Meta will be implemented in the U.S., and subsequently in Africa, and how the company will address the gaps left by fact-checkers and mitigate the risks and negative consequences stemming from its decision. Notably, while there is widespread acknowledgement that content moderation systems on social media platforms are broken, efforts to promote and protect rights to free expression and access to information online should be encouraged. However, these efforts should not come at the expense of user trust and safety, and information integrity.

Job Opportunity: CIPESA Seeks a Communications Officer

Call for Applications |

Job title: Communications Officer

Deadline for applications: January 10, 2025

Location: Can be remote, or based at the CIPESA office in Kampala, Uganda

Duration: Two (2) years

About CIPESA

The Collaboration on International ICT Policy for East and Southern Africa (CIPESA) works to defend and expand the digital civic space to enable the protection and promotion of human rights and to enhance innovation and sustainable development. With a focus on disparate actors, including the private sector, civil society, media, policymakers, and multinational institutions, our work aims to engender a free, open, and secure internet that advances rights, livelihoods, and democratic governance. CIPESA’s work responds to a shortage of information, research, resources, and actors consistently working at the nexus of technology, human rights, and society.

CIPESA is seeking a Communications Officer to help increase the visibility of our work in defending and expanding the digital civic space.

Job Summary:

This is an opportunity to work with CIPESA’s expanding team and network of collaborators, including the private sector, civil society, media, academia, policymakers, and multinational institutions. The Communications Officer will work collaboratively with the CIPESA team to create, implement, and oversee internal and external communications programmes that effectively engender a free, open, and secure internet that advances rights, livelihoods, and democratic governance.

Key Areas of Accountability Include:
  • Implement the CIPESA Communications Strategy to increase brand awareness and recognition, reach out to external stakeholders, and respond to the needs of targeted audiences.
  • Document and communicate CIPESA’s work through powerful storytelling using various tools and platforms.
  • Manage communication tools to ensure that CIPESA partners and collaborators are kept informed and engaged, and that messages effectively and consistently describe CIPESA and its goals and activities.
  • Monitor and evaluate communication and advocacy activities.
  • Manage the production of CIPESA publications, including research reports, impact reports, policy briefs, annual reports and newsletters.
  • Work with CIPESA staff to draft, edit and refine press releases, talking points, blog posts, speeches, grant applications and other external communications.
  • Engage staff and key stakeholders in promoting CIPESA’s mission. This includes establishing rapport with them and ensuring visibility.
  • Oversee and grow the content of CIPESA’s website and other digital and social media resources.
  • Liaise with various media houses and content publishers for publicity and promotion as required.
  • Conduct media monitoring and outreach to mainstream and technology-focused media.
  • Manage communications-related logistics and support for CIPESA events.
  • Oversee the maintenance of CIPESA’s contact database, events and publications calendar, and other communication tools.
  • Curate thematic news content for CIPESA’s platforms including reporting on the latest trends and developments in technology in the region.
  • Conduct other tasks as appropriate to support CIPESA’s mission.
Qualifications and Experience:

This position is open to candidates with at least five years of relevant experience. An ideal candidate would demonstrate the following:

  • Undergraduate degree and/or equivalent experience in communications, public affairs, advocacy or journalism; and a demonstrated interest/knowledge related to one or more of these fields: human rights law or policy, technology policy, digital rights, social or development studies. Postgraduate qualifications are a distinct advantage.
  • Outstanding verbal and written communication skills — including strong writing and editing skills for different platforms (social media, blogs, campaign tool kits)  and varied external audiences.
  • Experience working with journalists or in the media.
  • Proficiency in creatively using  digital and social media, including multimedia content development and storytelling.
  • Confidence/experience in multi-stakeholder environments and differing cultural settings, and in working with diverse constituencies.
  • Exceptional project and time management, planning and organisational skills, resourcefulness and attention to detail.
  • Fluency in English is required, and proficiency in another language is an advantage.
  • Familiarity with Content Management Systems and creative software suites is an advantage.

Salary and Benefits: Salary will be commensurate with experience. CIPESA provides a benefits package that includes health care, provident fund, statutory pension and paid leave.

Standards of Professional Conduct: CIPESA staff and partners must adhere to the values and principles outlined in the Code of Conduct, and the Safeguarding against Sexual Exploitation and Abuse and Sexual Harassment (SEAH) Policy. In accordance with these, CIPESA operates and enforces policies on Beneficiary Protection from Exploitation and Abuse, Child Safeguarding, Harassment-Free Workplace, Fiscal Integrity, Anti-Retaliation, and several others.

To Apply: Please send your expression of interest, including a cover letter detailing why your skill set would be ideal for this position, along with a CV, the names and contact details of two referees, and 2-3 writing/content samples, in one PDF file to recruitment@cipesa.org with “Application for Communications Officer” in the email subject line.

The AU Disability Protocol Comes Into Force: Implications for Digital Rights for Persons with Disabilities in Africa

By Paul Kimumwe & Michael Aboneka |

On this International Day of Persons with Disabilities, CIPESA reflects on the impact of the African Union (AU) Disability Protocol and its implications for digital rights for persons with disabilities in Africa, and calls upon the African Commission to establish a special mandate to enhance the respect for and protection of the rights of persons with disabilities in Africa.

Six years after its adoption, the Protocol to the African Charter on Human and Peoples’ Rights on the Rights of Persons with Disabilities in Africa came into force in May 2024 after securing the mandatory 15th ratification by the Republic of Congo. The other 14 African Union member states that have ratified the Protocol are Angola, Burundi, Cameroon, Kenya, Mali, Malawi, Mozambique, Namibia, Nigeria, Niger, Rwanda, South Africa, the Sahrawi Arab Democratic Republic, and Uganda. 

For disability rights activists, this was a defining moment as the protocol augments the rights of persons with disabilities to barrier-free access to the physical environment, transportation, information, and other communication technologies and systems. Specifically, under articles 23 and 24 of the protocol, States Parties should take “effective and appropriate measures” to facilitate the full enjoyment by persons with disabilities of the right to freedom of expression and opinion and access to information, including through the use of Information and Communication Technologies (ICT).

The Collaboration on International ICT Policy for East and Southern Africa (CIPESA) has been a longstanding advocate for African governments to urgently ratify the protocol. However, CIPESA has also stated, including in submissions to the African Commission on Human and Peoples’ Rights (ACHPR), that ratifying the protocol is a major but insufficient step towards ensuring that persons with disabilities can access and use digital technologies and that there is sufficient disaggregated data to inform programme interventions.

Indeed, article 24(2) requires States Parties to put in place policy, legislative, administrative, and other measures to ensure that persons with disabilities enjoy the right to freedom of expression and access to information on an equal basis, including:

  1. Providing information intended for the general public as well as information required for official interactions with persons with disabilities in accessible formats and technologies appropriate to different kinds of disabilities in a timely manner and without additional cost to persons with disabilities. 
  2. Requiring private entities that provide services to the general public, including through the internet, to provide information and services in accessible and usable formats for persons with disabilities. 
  3. Recognising and promoting the use of sign language. 
  4. Ensuring that persons with visual impairments or with other print disabilities have effective access to published works, including by using information and communication technologies.

The protocol adds to the available digital rights advocacy tools for disability rights actors, including the 2006 United Nations Convention on the Rights of Persons with Disabilities (CRPD), which places significant obligations on States Parties to take appropriate measures to ensure that persons with disabilities have equal and meaningful access to ICT, including the internet. 

The CRPD was the first international human rights treaty requiring the accessibility of digital tools as a prerequisite for persons with disabilities to fully enjoy their fundamental rights without discrimination. It highlights the inherent risks of exclusion of persons with disabilities from participating equally in society by defining ICT accessibility as integral to general accessibility rights and on par with access to the physical environment and transportation.

While there has been some progress in the enactment of disability rights-respecting and ICT-enabling laws for persons with disabilities in Africa, implementation remains a challenge. Moreover, the Protocol comes into force at a time when the digital divide and the exclusion of persons with disabilities have worsened despite the exponential growth and penetration of digital technologies on the continent. Persons with disabilities have consistently remained disproportionately excluded from the digital society due to factors such as low levels of ICT skills, high illiteracy levels, and the high cost of assistive technologies such as screen readers, screen magnification software, text readers, and speech input software.

It is against this background that CIPESA adds its voice to other calls to the African Commission to expedite the establishment of a special mandate at the level of Special Rapporteur for Persons with Disabilities. This elevated position will ensure that the rights of persons with disabilities in Africa are mainstreamed and upheld.

CIPESA recognises that as a regional human rights instrument, the protocol empowers disability rights actors to demand the enactment and full implementation of policies and laws that promote the rights of persons with disabilities, including in accessing and using digital technologies.

For example, disability rights actors, including civil society, activists, and Disabled Persons’ Organisations (DPOs), should develop mechanisms to monitor the status of implementation of the protocol, including ensuring that states parties submit their statutory reports as required by Article 34 of the protocol. The DPOs should also actively participate in developing shadow reports on the status of implementation of the protocol, especially on access to information and participation in public affairs.

In addition, disability rights organisations should work with policymakers and the executive to ensure that more countries ratify the protocol and domesticate it through national policies, laws, and practices. Both the protocol and the CRPD should become a reference point during any discussions of draft laws and policies that affect persons with disabilities.

For the media, it is important that, through their reporting, they hold governments accountable for failure to ratify or to fully implement the provisions of the protocol.

Member countries can also demand accountability from their peers on the status of implementation of the key provisions of the protocol through the African Peer Review Mechanism (APRM).

Please read more about CIPESA submissions on policy actions governments should take after ratifying the protocol. See also The Disability and ICT Accessibility Framework for Monitoring the Implementation of ICT Accessibility Laws and Policies in Africa.

African Commission Resolution to Bolster Data Governance

By Edrine Wanyama |

The Resolution adopted by the African Commission on Human and Peoples’ Rights (ACHPR) during its 81st Ordinary Session, held from October 17 to November 6, 2024 in Banjul, The Gambia, potentially bolsters data protection and governance on the African continent.

The Resolution calls upon states parties to take all relevant measures to ensure transparent and accountable collection, processing, storage and access to data. It also underscores the importance of ethical principles in data usage, equitable access to data, and addressing biases to prevent structural inequalities while safeguarding privacy and combating discrimination.​

The resolution acknowledges the rapid advancement of technology and the increased dependence on data in governance and socio-economic development, and is in line with the African Union Convention on Cyber Security and Personal Data Protection, African Union’s Data Policy Framework, and the Digital Transformation Strategy for Africa (2020–2030).

Similarly, this timely resolution aligns closely with the vision of the Global Digital Compact (GDC), which calls for inclusive, rights-based governance of digital technologies and artificial intelligence (AI), and the ACHPR’s Resolution 473 on the need to undertake a study on human and peoples’ rights and AI, robotics and other new and emerging technologies. These frameworks emphasise the potential of data and digital technologies while cautioning against risks such as bias, inequities, unwarranted surveillance, and privacy violations.

By embedding human rights principles in digital governance, both the ACHPR’s Resolution 473 and the GDC advocate for responsibly leveraging digital tools to reduce inequalities and protect vulnerable populations. The ACHPR’s focus on equitable data access and capacity-building within African states resonates with the GDC’s call for global collaboration to address disparities in digital infrastructure and skills. Together, these initiatives present a unified agenda to ensure that digital technologies and AI are harnessed for equity, justice, and sustainable development that foster a shared vision for a more inclusive digital age.

The ACHPR Resolution further urges states parties to ensure open access to data held by public and private entities where it is in the public interest, in accordance with prescribed regional and international human rights standards.

The Resolution reinforces the African Union’s Data Policy Framework which, among others, seeks to maximise the benefits of the data-driven economy for African countries. With these common anticipated benefits, harmonised data governance systems will enable secure and free data flows on the continent, contributing to a people-centred approach for individuals, institutions and businesses that is not inward-looking, and enhancing data utility for the accelerated attainment of Agenda 2063 and the Sustainable Development Goals (SDGs).

There is increasing recognition of the need for data governance frameworks that create a safe and trustworthy digital environment, foster intra-Africa digital trade, enable states’ cooperation on data governance, enable the domestication of continental policies, and ensure free and secure data flows across the continent. As such, the Resolution also calls for the establishment of collaborative mechanisms for coordinating data issues, enabling and facilitating competitiveness in the global economy, promoting sustainable data use and benefits to society, and facilitating innovative ways to promote and maximise the benefits of data for the African peoples.

In addition, the Resolution will potentially add to the impetus for Regional Economic Communities (RECs) to adopt harmonised data governance systems, which would accelerate continental initiatives such as the African Continental Free Trade Area (AfCFTA) Agreement, whose growth and benefits depend on secure and free cross-border data flows. For instance, the East African Community (EAC) and the Southern African Development Community (SADC) are set to develop regional data governance policy frameworks with the aim of harmonising data governance in their regions for economic growth and regional integration.

The Resolution echoes sentiments shared in various panels at the September 2024 Forum on Internet Freedom in Africa (#FIFAfrica), which highlighted contemporary issues in data governance in Africa, including the collection, management, and processing of data. The Forum emphasised the role of national and regional actors in policy harmonisation, enabling greater cross-border data flows, maximising the benefits of data for all countries and all citizens, and the need for greater privacy protections for personal data. Among others, speakers at FIFAfrica singled out national parliaments, RECs, civil society organisations, the African Union, and the private sector as having pivotal roles to play in promoting effective data governance.