Sarah Kekeli Akunor

Who is Sarah Kekeli Akunor?

I am a young woman with a visual impairment. I recently graduated from the University of Ghana with a Bachelor's degree in Political Science and Philosophy. I am a member of the Mastercard Foundation's Alumni Network Committee, where I serve as the lead for Inclusion, Gender, and Safeguarding, and I am a Disability Inclusion Facilitator under the Mastercard Foundation's 'We Can Work' programme. I also serve as an interim executive for the newly inaugurated Ghana Youth Federation, where I hold the position of Secretary for Gender Equality and Social Inclusion. I am a passionate advocate for disability rights, digital rights, and inclusion, and I hold a certificate in Disability Leadership in Internet Governance and Digital Rights from the Internet Society (ISOC).

Before I lost my sight, I had a very limited understanding of disability or its challenges. I have a genetic eye condition called Retinitis Pigmentosa, which was only diagnosed when I was 24 years old. I was born with the condition, but I could see with the help of glasses until the condition deteriorated through my teenage years and early twenties. After I lost my sight, I couldn’t read Braille, so I needed to learn how to use a computer. I noticed that leveraging technology made it possible for me to live a more fulfilling life. This served as a basis for my strong passion for advocating for digital rights, not just for myself, but for the countless persons with disabilities who are out there. Through my experience, I witnessed the transformative impact of Assistive Technology in the lives of persons with disabilities.

In Africa, inaccessibility is pervasive, and I noticed that persons with disabilities are left behind even in the digital space. I started by advocating in my own circles, discussing disability inclusion with family and friends. I entered the digital space by volunteering at the Ghana Blind Union's Assistive Technology lab and with the Inclusive Tech Group. Now that I have a platform, my goal is to see the rights of persons with disabilities mainstreamed in all spaces, especially online, as online spaces have become increasingly central to how the world functions.

Across Africa, there are several innovators creating brilliant solutions for various categories of persons with disabilities. In Ghana, for example, a young innovator has developed an app called DeafCanTalk, which enables deaf people to communicate through sign language. The app converts the signs into written or spoken text for the other party to understand.

Other initiatives in Ghana, such as the "Disability Inclusion Hackathon" organised by the Inclusive Tech Group, make it possible for young innovators without disabilities to co-design and co-create disability-relevant tech solutions together with their counterparts with disabilities. One organisation I belong to, The Phoenix Unity Club, is also championing the accessibility of apps and websites in Ghana by advocating strongly for the development of Ghana's own 'Web Content Accessibility Policy.'

In furtherance of these goals, I have held positive engagements with the authorities at the National Information Technology Agency and the Ghana Investment Fund for Electronic Communication. Across Africa, InABLE is doing great work promoting digital rights through its annual Inclusive Africa Conference. The Collaboration on International ICT Policy for East and Southern Africa (CIPESA), with its Forum on Internet Freedom in Africa, the Internet Society, the Internet Governance Forum, and Paradigm Initiative, with its Digital Rights and Inclusion Forum (DRIF), are all actively promoting the realisation of digital rights for persons with disabilities across Africa.

The deployment of various apps, technologies, and AI platforms in Africa is definitely worth commending. However, there is a real risk of widening the digital divide if these technologies and AI platforms are not accessible to persons with disabilities, whether because they are too expensive or because they are designed without consideration for accessibility needs. There are also real data protection risks, particularly for persons with disabilities and those with little or no education.

To stay ahead of the curve, policymakers need to ensure that new technologies in Africa are only deployed for public use after they have been proven to be compliant with diverse accessibility needs. In addition, innovators must be encouraged to co-create with persons with disabilities. Finally, data protection laws must be strengthened and strictly enforced.

There are several things that we should be doing. First, we need to build meaningful collaborations and partnerships among all stakeholders, including policymakers, civil society, digital rights activists, governments, and persons with disabilities from different parts of Africa. We should foster a spirit of true friendship and openness in discussing the common problems of access to assistive devices, online accessibility, and data privacy breaches, and develop solutions that are relevant to our local African context.

Secondly, it is important to promote co-creation and co-design of all innovations in Africa with persons with disabilities. Persons with disabilities, women, youth, and older persons must be involved in the innovation process right from the ideation stage. Once these innovations are complete, these groups of people must be consulted in testing the products before they are released to the market.

In addition, as digital rights advocates, we must not only make passionate arguments for inclusion but also insist that there is a business case for including persons with disabilities and other vulnerable groups in the development of new technologies. After all, there are countless funding and grant opportunities that make accessibility a non-negotiable requirement before these funds are granted. In short, disability inclusion is not a matter of charity but a long-term business strategy.

Why Digital Security Training Is No Longer Optional for Ugandan Journalists

By Byaruhanga Brian |


Ugandan journalists are increasingly facing intertwined physical and digital threats, which intensify during moments of heightened public interest such as elections and protests. These threats are compounded by internet shutdowns, targeted surveillance, account hacking, online harassment, and regulatory censorship that directly undermine their safety and work. A study of the Daily Monitor's experience found that the 2021 general election shutdown constrained news gathering, data-driven reporting, and online distribution, effectively acting as digital censorship. Such practices restrict news gathering, production, and dissemination, and have been documented repeatedly from the 2021 general election through the run-up to the 2026 polls.

Over the years, CIPESA has documented digital rights violations, challenged internet shutdowns, and worked directly with media practitioners to strengthen their ability to operate safely and independently. This work has deepened as the threats to journalism have evolved.

In recent months, CIPESA has conducted extensive journalist safety and digital resilience trainings, reaching more than 200 journalists from diverse media houses and districts across the country: the Acholi subregion (Gulu, Kitgum, Amuru, Lamwo, Agago, Nwoya, Pader, and Omoro), the Ankole subregion (Buhweju, Bushenyi, Ibanda, Isingiro, Kazo, Kiruhura, Mbarara (City & District), Mitooma, Ntungamo, Rubirizi, Rwampara, and Sheema), the Central region (Kampala, Wakiso), the Busoga region (Bugiri, Bugweri, Buyende, Iganga, Jinja, Kaliro, Kamuli, Luuka, Mayuge, Namayingo, and Namutumba), and the Elgon, Bukedi, and Teso subregions (Mbale, Bududa, Bulambuli, Manafwa, Namisindwa, Sironko, Tororo, Busia, Butaleja, Kapchorwa, Soroti, and Katakwi).

The trainings aimed to strengthen the capacity of media actors to mitigate digital threats and push back against the rising online restrictions and censorship that enable digital authoritarianism. They were central to helping journalists and the wider media sector understand the media's role in democratic and electoral processes, ensure legal compliance and navigate common restrictions, build digital and physical security resilience, sharpen their skills in identifying and countering disinformation, and develop newsroom safety frameworks.

The various trainings were tailored to respond to the needs of the journalists, covering media, democracy, and elections; electoral laws and policies; and peace journalism, with attention to transparent reporting and the effects of military presence on journalism in post-conflict settings.

Meanwhile, in Mbale and Jinja, reporters unpacked election-day risks, misinformation circulating on social media, and the legal boundaries that are often used to intimidate them. Across the different regions, newsroom managers, editors and reporters worked through practical exercises on digital hygiene, safer communication, and physical-digital risk intersections.

CIPESA's digital security trainings respond to the real conditions journalists work under. The sessions focus on election-day and post-election reporting, verifying information and claims under pressure, protecting sources, and strengthening everyday digital security through strong passwords, two-factor authentication, and safe device handling. Journalists also develop newsroom safety protocols and examine how peace journalism can help de-escalate tension rather than inflame it during contested political moments.

One of the most important shifts for participants was that safety stopped being treated as an individual burden and started being understood as an organisational responsibility. Through protocol-development sessions, journalists mapped threats, identified vulnerabilities such as predictable routines and weak passwords, and designed "if-then" responses for incidents like account hacking, detention, or device theft. For many journalists, this was the first time safety had been written down rather than improvised.

Beyond the training for journalists, CIPESA hosted several digital security clinics and help desks for human rights defenders and activists. At separate engagements, close to 70 journalists received one-on-one support, including during a digital security clinic at Ukweli Africa held from 15 December 2025 and at the Uganda Media Week. These efforts sought to enhance their digital security practices, and the support provided included checking journalists' devices for vulnerabilities, removing malware, securing accounts, enabling encryption, and adopting secure data management approaches.

“Some journalists who had arrived unsure, even embarrassed, about their digital habits, left lighter, not because the risks had vanished, but because they now understood the tools and how to manage risks.”

These engagements serve as avenues to build the digital resilience of journalists in Uganda, especially as the media faces heightened online threats amidst a shrinking civic space. Trainings that speak the language of lived experience often travel further than any policy alone. In Uganda, where laws can be used to narrow civic space, where the internet can be switched off, and where surveillance blurs the line between public and private, practical digital security becomes a necessity.

By training journalists across Uganda, supporting them through digital security desks, and standing with them during moments like Media Week, CIPESA has helped journalists strengthen their resilience to keep reporting in spite of the challenges and threats they encounter daily.

Inform Africa Expands OSINT Training and DISARM-Based Research With CIPESA

ADRF |

Information integrity work is only as strong as the methods behind it. In Ethiopia’s fast-changing information environment, fact-checkers and researchers are expected to move quickly while maintaining accuracy, transparency, and ethical care. Inform Africa has expanded two practical capabilities to address this reality: advanced OSINT-based fact-checking training and structured disinformation research using the DISARM framework, in collaboration with the Collaboration on International ICT Policy for East and Southern Africa (CIPESA).

This work was advanced with support from the Africa Digital Rights Fund (ADRF), administered by CIPESA. At a time when many civic actors face uncertainty, the fund’s adaptable support helped Inform Africa sustain day-to-day operations and protect continuity, while still investing in verification and research methods designed to endure beyond a single project cycle.

The collaboration with CIPESA was not only administrative. It was anchored in shared priorities around digital rights, information integrity, and capacity building. Through structured coordination and learning exchange, CIPESA provided a partnership channel that strengthened the work’s clarity and relevance, and helped position the outputs as reusable methods that can be applied beyond a single team. The collaboration also reinforced a regional ecosystem approach: improving practice in one context while keeping the methods legible for peer learning, adaptation, and future joint work.

The implementation followed a phased timetable across the project activity period from April through November 2025. Early work focused on scoping and method design, aligning the training and research approaches with practical realities in newsrooms and civil society. Mid-phase work concentrated on developing the OSINT module and applying DISARM as a structured research lens, with iterative refinement as materials matured. The final phase focused on consolidation, documentation discipline, and packaging the outputs to support repeatable use, including onboarding, internal training, and incident review workflows.

A central focus has been an advanced OSINT training module built to move beyond tool familiarity into a complete verification workflow. Verification is treated as a chain of decisions that must be consistent and auditable: how to intake a claim, determine whether it is fact-checkable, plan the evidence, trace sources, verify images and videos, confirm the place and time, and document each step clearly enough for an editor or peer to reproduce the work. The aim is not only to reach accurate conclusions but also to show the route taken, including which evidence was prioritized and how uncertainty was handled.

This documentation discipline is not bureaucracy. It is a trust technology. In high-risk information environments, preserved sources, verification logs, and clear decision trails protect credibility, strengthen editorial oversight, and reduce avoidable errors. The module prioritizes hands-on, production-style assignments that mirror real newsroom constraints and trains participants to avoid overclaiming, communicate uncertainty responsibly, and present evidence in ways that non-expert audiences can follow.
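
To make the idea of an auditable verification chain concrete, here is a minimal sketch in Python of what such a verification log could look like. The class and field names are illustrative assumptions, not Inform Africa's actual tooling or the module's prescribed format; the point is that each decision, the evidence behind it, and its outcome are recorded in an order a peer or editor can retrace.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List


@dataclass
class VerificationStep:
    """One auditable decision in the verification chain (names are illustrative)."""
    stage: str      # e.g. "intake", "fact-checkable?", "source tracing", "geolocation"
    action: str     # what was actually done at this stage
    evidence: str   # pointer to a preserved copy of the material consulted
    outcome: str    # what was concluded, including any remaining uncertainty
    recorded_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


@dataclass
class VerificationLog:
    """A reproducible trail of how a single claim was checked."""
    claim: str
    steps: List[VerificationStep] = field(default_factory=list)

    def record(self, stage: str, action: str, evidence: str, outcome: str) -> None:
        self.steps.append(VerificationStep(stage, action, evidence, outcome))

    def trail(self) -> str:
        # The trail shows the route taken, not just the conclusion reached.
        return "\n".join(
            f"[{s.recorded_at:%Y-%m-%d %H:%M}] {s.stage}: {s.action} -> {s.outcome}"
            for s in self.steps
        )


# Hypothetical usage for an invented claim
log = VerificationLog(claim="Video shows a protest at location X on date Y")
log.record("intake", "Logged claim from a social media post", "archive-ref-placeholder",
           "Claim is specific enough to be fact-checkable")
log.record("geolocation", "Compared landmarks against satellite imagery", "archive-ref-placeholder",
           "Location consistent with the claim; time of day still uncertain")
print(log.trail())
```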

In parallel, Inform Africa has applied the DISARM framework to disinformation research. DISARM provides a shared language for describing influence activity through observable behaviors and techniques, without drifting into assumptions. The priority has been to remain evidence-bound: collecting and preserving artifacts responsibly, maintaining a structured evidence log, reducing harm by avoiding unnecessary reproduction of inflammatory content, and avoiding claims of attribution beyond what the evidence supports. This DISARM-informed approach has improved internal briefs, strengthened consistency, and made incidents easier to compare over time and across partners.
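
As a similarly hedged sketch, the evidence-bound habits described above can be captured in a simple incident record: behaviours are described in the framework's shared language, artifacts are preserved by reference rather than reproduced, and attribution stays empty unless the evidence supports it. The structure and field names below are assumptions for illustration, not the DISARM catalogue itself or Inform Africa's internal schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Artifact:
    """A preserved piece of evidence, referenced rather than re-amplified."""
    description: str   # what the artifact is, summarised without repeating harmful content
    archive_ref: str   # hash, archive URL, or case-file reference to the preserved copy


@dataclass
class IncidentRecord:
    """An evidence-bound description of observed influence activity."""
    incident_id: str
    observed_behaviours: List[str]          # behaviours and techniques, in shared framework terms
    artifacts: List[Artifact] = field(default_factory=list)
    attribution: Optional[str] = None       # left as None unless the evidence warrants a claim

    def add_artifact(self, description: str, archive_ref: str) -> None:
        self.artifacts.append(Artifact(description, archive_ref))


# Hypothetical example: behaviours are recorded, attribution is deliberately withheld
incident = IncidentRecord(
    incident_id="2025-011",
    observed_behaviours=[
        "coordinated reposting of identical text across accounts",
        "reuse of old imagery presented as current",
    ],
)
incident.add_artifact("Screenshot of the repeated post (content summarised, not quoted)",
                      "case-file-ref-placeholder")
assert incident.attribution is None  # no attribution beyond what the evidence supports
```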

Three lessons stand out from this work with CIPESA and ADRF. First, quality scales through workflow, not only through talent. Second, evidence discipline is a strategic choice that protects credibility and reduces harm in both fact-checking and research. Third, shared frameworks reduce friction by improving clarity and consistency across teams. Looking ahead, Inform Africa will integrate the OSINT module into routine training and onboarding and continue to apply DISARM-informed analysis in future incident reviews and deeper studies, reinforcing information integrity as a public good.

This article was first published by Inform Africa on December 15, 2025.

Towards Inclusive AI Policies in Africa’s Digital Transformation

By CIPESA Writer |

On November 13, 2025, the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) took part in the global PILnet summit on Artificial Intelligence (AI) and its impact on the work of Civil Society Organisations (CSOs). Over three days, the summit assembled stakeholders from across the world in Rome, Italy, to deliberate on various topics under the theme, "Amplifying Impact: Pro Bono & Public Interest Law in a Shifting World."

CIPESA contributed to a session titled, “Pro bono for AI: Addressing legal risks and enhancing opportunities for CSOs”. The session focused on AI and its potential impacts on the work and operations of CSOs. CIPESA emphasised the need for a universally acceptable and adaptable framework to guide the increased application of AI in the fast-evolving technological era. Furthermore, CIPESA highlighted its efforts in developing a model policy on AI for CSOs in Africa, which is being undertaken with the support of the Thomson Reuters Foundation through its global pro bono network.

Edrine Wanyama, Programme Manager (Legal) at CIPESA, centred his discussion on ethical and rights-respecting AI adoption and emphasised the need for CSOs to enhance their knowledge and accountability measures while navigating the AI ecosystem.

Mara Puacz, the Head of Impact at Tech To The Rescue, Ana de la Cruz Cubeiro, a Legal Officer at PILnet, and Megan King, a Senior Associate at Norton Rose Fulbright, shared similar sentiments on the benefits of AI, which include expanding advocacy work and initiatives of CSOs.

They noted the increased demand for transparency and accountability in AI development and use, and the need to minimise the harms that marginalised communities face from AI-enabled analysis of datasets that often perpetuate bias, contain data gaps, and offer limited or poorly digitalised language coverage.

The session cited various benefits of AI for CSOs, such as enabling human rights monitoring, documentation, and reporting on various fronts like the Universal Periodic Review, aiding democratic participation, and tracking and documenting trends. Others include facilitating and enhancing environmental protection, for example through monitoring pollution, and providing real-time support to agri-business and the health sector by facilitating pest and disease identification and diagnosis.

However, funding constraints not only affect AI deployment but also capacity building to address the limited skills and expertise in AI deployment. In Africa, the inadequacy of relevant infrastructure, data sovereignty fears amongst states, and the irresponsible use of AI and related technologies present additional challenges.

Meanwhile, between October 23 and 24, 2025, CIPESA joined KTA Advocates and the Centre for Law, Policy and Innovation Initiative (CeLPII) to co-host the 8th Annual Symposium under the theme "Digital Trade, AI and the Creative Economy as Drivers for Digital Transformation".

The symposium explored the role of AI in misinformation and disinformation, as well as its potential to transform Uganda's creative economy and digital trade. CIPESA emphasised the need to make AI central in discussions among relevant stakeholders, including governments, innovators, CSOs, and the private sector, so as to identify strategies, such as policy formulation and adoption, to check the potential excesses of AI.

Conversations at the PILnet summit and the KTA Symposium align with CIPESA's ongoing initiatives across the continent, where countries and regional blocs are developing AI strategies and policies to guide national adoption and application. At the continental level, the African Union (AU) adopted the Continental AI Strategy in 2024, which provides a unified framework for using AI to drive Africa's digital transformation and socio-economic development.

Amongst the key recommendations from the discussions is the need for:

  • Wide adoption of policies guiding the use of AI by civil society organisations, firms, the private sector, and innovators.
  • Nationwide and global participation of individuals and stakeholders, including governments, CSOs, the private sector, and innovators, in AI processes, to ensure that no one is left behind and that participation is inclusive.
  • Awareness creation and continuous education of citizens, CSOs, innovators, firms, and the private sector on the application and value of AI in their work.
  • The adoption of policies and laws that specifically address the application of AI at national, regional and international levels and at organisational and institutional levels to mitigate the potential adverse impacts of AI rollout.

#BeSafeByDesign: A Call To Platforms To Ensure Women’s Online Safety

By CIPESA Writer |

Across Eastern and Southern Africa, activists, journalists, and women human rights defenders (WHRDs) are leveraging online spaces to mobilise for justice, equality, and accountability. However, the growth of online harms such as Technology-Facilitated Gender-Based Violence (TFGBV), disinformation, digital surveillance, and Artificial Intelligence (AI)-driven discrimination and attacks has outpaced the development of robust protections.

Notably, human rights defenders, journalists, and activists face unique and disproportionate digital security threats, including harassment, doxxing, and data breaches, that limit their participation and silence dissent.

It is against this background that the Collaboration on International ICT Policy for East and Southern Africa (CIPESA), in partnership with Irene M. Staehelin Foundation, is implementing a project aimed at combating online harms so as to advance digital rights. Through upskilling, advocacy, research, and movement building, the initiative addresses the growing threats in digital spaces, particularly affecting women journalists and human rights defenders.

The first of the upskilling engagements kicked off in Nairobi, Kenya, at the start of December 2025, bringing together 25 women human rights defenders and activists for a three-day digital resilience skills-share workshop hosted by CIPESA and the Digital Society Africa. Participants came from the Democratic Republic of Congo, Madagascar, Malawi, South Africa, Tanzania, Uganda, Zambia, and Zimbabwe. The workshop coincided with the 16 Days of Activism campaign, which this year is themed "Unite to End Digital Violence against All Women and Girls".

According to the United Nations Population Fund (UNFPA), TFGBV is “an act of violence perpetrated by one or more individuals that is committed, assisted, aggravated, and amplified in part or fully by the use of information and communication technologies or digital media against a person based on their gender.” It includes cyberstalking, doxing, non-consensual sharing of intimate images, cyberbullying, and other forms of online harassment.

Women in Sub-Saharan Africa are 32% less likely than men to use the internet, with the key impediments being literacy and digital skills, affordability, safety, and security. On top of this gender digital divide, more women than men face various forms of digital violence. Accordingly, the African Commission on Human and Peoples’ Rights (ACHPR) Resolution 522 of 2022 has underscored the urgent need for African states to address online violence against women and girls.

Women who advocate for gender equality, feminism, and sexual minority rights face higher levels of online violence. Indeed, women human rights defenders, journalists and politicians are the most affected by TFGBV, and many of them have withdrawn from the digital public sphere due to gendered disinformation, trolling, cyber harassment, and other forms of digital violence. The online trolling of women is growing exponentially and often takes the form of gendered and sexualised attacks and body shaming.

Several specific challenges must be considered when designing interventions to combat TFGBV. These challenges are shaped by legal, social, technological, and cultural factors, which affect both the prevalence of digital harms and violence and the ability to respond effectively. They include weak and inadequate legal frameworks; a lack of awareness about TFGBV among policymakers, law enforcement officers, and the general public; the gender digital divide; and normalised online abuse against women, with victims often blamed rather than supported.

Moreover, there is a shortage of comprehensive response mechanisms and support services for survivors of online harassment, such as digital security helplines, psychosocial support, and legal aid. There is also limited regional and cross-sector collaboration between CSOs, government agencies, and the private sector (including tech companies).

A guiding strand for these efforts will be the #BeSafeByDesign campaign, which highlights the necessity of safe platforms for women as well as the consequences when safety is missing. The #BeSafeByDesign approach shifts the burden of ensuring safety in online spaces away from women and places it on platforms, which must invest in risk assessments, accessible and stronger reporting pathways, proactive detection of abuse, and transparent accountability mechanisms. The initiative will also involve practical upskilling of at-risk women in cybersecurity.