Congo Heads to the Polls: Why it is Crucial to Keep the Internet On

By CIPESA Writer |

The Democratic Republic of Congo (DR Congo) heads to general elections on December 20, 2023 amidst considerable turbulence, including a deluge of disinformation and endemic insecurity in parts of the country. To move towards building a safer and more democratic society, the Congolese government must break away from its past practice of limiting internet access and commit to keeping the internet accessible ahead of, throughout and after the election period.

For more than a decade, network disruptions have been a hallmark of elections in the DR Congo, Africa’s fourth most populous country. The first network disruption, which affected the Short Message Service (SMS), was witnessed in December 2011, following disputed elections. The government claimed the move was necessary to prevent the online spread of fake results prior to the official announcement by the electoral commission.

Three years later in January 2015, the Congolese government again ordered telecommunications companies to block access not only to SMS but also to the internet. This shutdown came against the backdrop of protests against a proposed electoral bill. Whereas banks and government agencies were granted access to the internet four days after the shutdown, the general public did not regain access until three weeks later.

In 2016, the government ordered telecom operators to block access to social media sites in an attempt to thwart mobilisation by protesters against then president Joseph Kabila’s stay in office beyond the two-term limit. In January 2018, internet access and SMS were disrupted ahead of a peaceful protest march organised by the Catholic Church to compel Kabila to step down following the expiry of his final term in office.

The Congolese government again disrupted the internet during the last general elections in December 2018. A senior government official justified the move as necessary to preserve public order after “fictitious results” were circulated on social media. 

Congo faces serious challenges to electoral integrity, including logistical ones such as faulty voters’ cards, poor infrastructure that renders delivery of electoral materials cumbersome, and armed conflict that will likely limit voting in some areas. Moreover, rampant electoral disinformation has fuelled insecurity and hindered many citizens’ ability to access the credible and pluralistic information necessary to make informed and independent choices.

Opposition leaders have accused the electoral commission of lacking independence and questioned its ability to hold free and fair elections. This has fomented hostilities against the electoral body and its officials and raised the possibility of post-election violence. 

Meanwhile, since November 19, 2023, the Committee to Protect Journalists has documented attacks or threats against at least four journalists and the closure of at least one broadcast station.

Today, a disruption to the internet would exacerbate the challenges of disinformation, incite violence, endanger people’s lives, affect press freedom, and have long-lasting harms on Congo’s economy. Shutdowns provide a shield to those who perpetrate human rights violations, deny people access to critical information, and amplify the spread of misinformation as they block access to alternative sources of verification. The resulting lack of transparency would fuel doubts about the integrity of the elections and heighten the prospects for violence.

Internet shutdowns also violate regional and international frameworks, including the African Charter on Human and Peoples’ Rights (ACHPR), the Declaration of Principles on Freedom of Expression and Access to Information in Africa 2019, and the International Covenant on Civil and Political Rights (ICCPR). These frameworks uphold fundamental human rights, including the rights to freedom of opinion and expression, access to information, and assembly, as well as social, economic, cultural and political rights.

Indeed, the United Nations Special Rapporteur on the Rights to Freedom of Peaceful Assembly and of Association has noted that network shutdowns are in clear violation of international law and cannot be justified in any circumstances.

The 2023 election could be a key step in Congo’s move toward a more democratic, tolerant and safe society. How the country handles the election process, including the transparency and integrity of the entire electoral cycle, could determine whether the post-election period will be peaceful and bring meaningful social cohesion to the country.

CIPESA Launches Framework to Assess ICT Accessibility for Persons with Disabilities in Africa

By Frank Kisakye |

In many instances, persons with disabilities are unable to use digital technologies because these technologies lack “digital accessibility”, namely the ability of a website, mobile application, or electronic document to be easily navigated and understood by a wide range of users, including those with visual, auditory, motor or cognitive disabilities. The issue is especially pressing in African countries, where persons with disabilities contend with various forms and layers of exclusion, contradicting both the inclusion promised by technology and the 2030 Agenda for Sustainable Development’s pledge to “leave no one behind.”

Against this background, the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) hosted experts from multiple countries in a webinar to discuss developments pertaining to efforts aimed at enhanced digital inclusion for persons with disabilities. The webinar also served to commemorate the International Day for Persons with Disabilities on December 4, for which CIPESA launched its revised Disability and ICT Accessibility Framework Indicators: A Framework for Monitoring the Implementation of ICT Accessibility Laws and Policies in Africa.

Speaking at the webinar, Dr. Abdul Busuulwa, a lecturer at the Kyambogo University Institute for Special Needs (Uganda) and a CIPESA board member, noted that there are several tools and devices that mitigate barriers to accessibility for users with visual, physical, hearing, intellectual, and psychosocial disabilities. He also highlighted the accessibility guidelines published by the Web Accessibility Initiative of the World Wide Web Consortium, which are aimed at making web content more accessible for people with visual and hearing impairments, learning disabilities, cognitive limitations, limited movement, speech disabilities, photosensitivity, and combinations of these.

However, the spectrum for ensuring digital access for persons with disabilities is expansive. Titilola Olatunde-Fasogbon, a senior associate at Udo Udoma & Belo-Osagie (Nigeria), noted that in addition to the technical aspects of inclusion, such as websites, more deliberate efforts need to be made by stakeholders in the information ecosystem. Olatunde-Fasogbon pointed to the role of media houses and the need for more investment in meeting the needs of persons with disabilities in broadcasting, such as through more sign language interpretation, subtitling, dubbing for non-English speakers, audio-to-text options, and provisions for alternate text for non-text content.
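One of the technical measures mentioned above – alternate text for non-text content – can be checked automatically. The sketch below, using Python’s standard html.parser module, flags images that lack the text alternative the W3C accessibility guidelines call for; the sample markup is invented purely for illustration:

```python
from html.parser import HTMLParser

class AltTextAuditor(HTMLParser):
    """Collects the sources of <img> tags missing a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # WCAG requires a text alternative for non-text content (SC 1.1.1)
            alt = attrs.get("alt") or ""
            if not alt.strip():
                self.missing_alt.append(attrs.get("src", "<no src>"))

# Hypothetical sample page, used only for illustration
sample = """
<img src="logo.png" alt="Station logo">
<img src="chart.png">
"""
auditor = AltTextAuditor()
auditor.feed(sample)
print(auditor.missing_alt)  # images that still need alternate text
```

A real audit would of course cover far more than alt text (captions, contrast, keyboard navigation), but even a small script like this gives content producers a concrete starting point.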

She added that governments also need to ensure that policies and laws align to enhance digital inclusion for excluded communities, and that they should be at the forefront of driving such policies. She further noted that for digital inclusion to be realised for persons with disabilities, civil society organisations must pursue more collaborative efforts with governments, as “governments are obligated to offer the much-needed tax-related policies, tax holidays for commercial companies involved in digital inclusion beyond just passing the laws.”

Collaborative efforts were also emphasised by Dr. Diana Msipa, Manager of the Disability Unit at the University of Pretoria (South Africa), who stressed that digital inclusion efforts should be even more deliberate for persons with disabilities in rural areas, as they face numerous exclusions such as limited availability of infrastructure and public resources, as well as scarce economic opportunities. Dr. Msipa called for more conscious efforts to make it mandatory for governments and organisations to translate adverts, engage sign language interpreters, and use non-complex language, among other measures, to ensure information reaches audiences often excluded.

Diversity, Equity, and Inclusion (DEI) efforts also need to be part of narratives on digital inclusion for persons with disabilities, including through more employment opportunities and leadership positions for persons with disabilities. Dr. Karen Smit, Manager of the Disability Unit at Vodacom (South Africa), stated: “Employers should embrace an inclusive culture so that staff with disabilities can feel they belong. Senior leaders must speak about disability, and ongoing awareness raising must be conducted with management and all staff.”

Dr. Smit noted that solutions for digital inclusion already exist for mainstream devices and called for more stakeholders, including private sector actors, to pursue awareness initiatives and campaigns on their existence. She noted that Safaricom – partly owned by Vodacom – introduced an Interactive Voice Response (IVR) mobile money (M-PESA) solution that enables blind and visually impaired customers to be in control of their M-PESA transactions.

Similar sentiments were expressed in a 2020 CIPESA report – Access Denied: How Telecom Operators in Africa Are Failing Persons With Disabilities – which found that while there are various efforts to increase ICT usage in Africa, there is limited information about what telecom companies are doing to promote digital accessibility.

While there may be efforts aimed at improving digital inclusion, these need to be assessed regularly. CIPESA’s Revised Disability and ICT Accessibility Framework notes that an accessible web also benefits people without disabilities, for example older people with changing abilities, people using a slow or expensive internet connection, and people with “temporary disabilities” such as a broken arm or poor eyesight. The framework implores companies, governments, and organisations to conduct a self-diagnosis of whether they meet the threshold of its five broad indicators: legal and regulatory; accessibility framework for public access; mobile communication accessibility; television and video programming accessibility; and web accessibility.
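Such a self-diagnosis can be operationalised as a simple scorecard. In the sketch below, the five indicator areas are taken from the framework, but the checklist items under each and the scoring scheme are illustrative assumptions, not part of CIPESA’s instrument:

```python
# The five indicator areas named in CIPESA's framework; the yes/no
# checklist items under each are illustrative assumptions.
INDICATORS = {
    "Legal and regulatory": [
        "ICT accessibility law enacted", "Enforcement body designated"],
    "Accessibility framework for public access": [
        "Accessible public access points available"],
    "Mobile communication accessibility": [
        "Accessible handsets available", "Relay services offered"],
    "Television and video programming accessibility": [
        "Sign language interpretation", "Subtitling or captioning"],
    "Web accessibility": [
        "WCAG conformance audited", "Alt text policy in place"],
}

def self_diagnosis(answers):
    """Return the share of checklist items met per indicator (0.0 to 1.0).

    `answers` maps an indicator name to the set of items the organisation
    believes it satisfies; unmentioned indicators score zero.
    """
    scores = {}
    for indicator, items in INDICATORS.items():
        met = answers.get(indicator, set())
        scores[indicator] = sum(item in met for item in items) / len(items)
    return scores

# Example: an organisation that meets only its web accessibility items
scores = self_diagnosis({
    "Web accessibility": {"WCAG conformance audited", "Alt text policy in place"},
})
print(scores["Web accessibility"])  # 1.0
```

The point of such a scorecard is less the numbers themselves than making gaps visible per indicator area, which is what the framework asks organisations to do.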

The five indicators were informed by and crafted around key provisions of national laws, policies, and international human rights instruments on ICT and disability. Other international ICT accessibility standards, such as the Web and Mobile Content Accessibility Guidelines developed by the World Wide Web Consortium, also informed the indicators.

It is anticipated that the framework will be useful for monitoring and measuring public and private stakeholders’ compliance with and implementation of inclusion obligations, and will inform research, advocacy, and capacity building on ICT for persons with disabilities in the region.

In addition, the assessment informs planning for interventions at country and regional levels, since it reveals areas that need further attention. It can also highlight good practices that can inspire countries that may still be early in the journey or struggling with certain aspects of improving disability digital rights.

Panelists encouraged Disability Rights Organisations, policymakers, mobile network operators, researchers, and academia interested in advancing the rights of persons with disabilities to utilise the framework as a key pillar in their digital accessibility and inclusion efforts.

Here is the Disability and ICT Accessibility Framework Indicators: A Framework for Monitoring the Implementation of ICT Accessibility Laws and Policies in Africa

Shifting the Burden: Online Violence Against Women

By Evelyn Lirri |

Across Africa, the use of Information and Communications Technology (ICT) by women and girls remains low. Yet amidst the low access to digital tools, women, particularly those in public and political spaces, such as human rights defenders (HRDs), bloggers, and journalists, continue to be the primary target of various forms of online violence such as cyberstalking, sexual harassment, trolling, body shaming and blackmail.

According to a 2021 global survey by UNESCO, nearly three-quarters of female journalists have experienced online harassment in the course of their work, forcing many to self-censor. Furthermore, a 2020 report by UN Women found that women in politics and the media were more likely to be victims of technology-based violence as a consequence of their work and public profiles.

Over the years, the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) has documented and pursued interventions aimed at addressing the significant obstacles hindering an increase in women’s participation not only in online spaces but also in the political sphere. A concerning and recurring trend is that, oftentimes, responses to violence against women have prioritised an individual’s responsibility for self-protection rather than systematic or policy actions. 

At the Forum on Internet Freedom in Africa 2023 (FIFAfrica23), the National Democratic Institute (NDI), Pollicy, AfricTivistes, the Women of Uganda Network (WOUGNET), Internews and the Solidarity Centre shared lessons learned from their work implementing multi-stakeholder interventions to address online violence against women. During a panel discussion, it was noted that applying multi-stakeholder interventions that include governments, civil society, technology platforms and the media is critical to promoting safe and meaningful participation of women in online spaces. Internews and WOUGNET highlighted the work they have been jointly engaged in through the FemTech project in various African countries, aimed at empowering women human rights defenders to safely participate in digital spaces while promoting equitable access to technology. Through trainings of women human rights defenders, CSOs, policy makers and law enforcers, the project is raising awareness of how women are often impacted by cybercrime legislation.

In Senegal, AfricTivistes, a network organisation made up of journalists, bloggers and HRDs, has spearheaded public advocacy campaigns on responsible use of the internet. The organisation has conducted gender-inclusive training and capacity-building workshops for journalists, bloggers, public officials and political leaders on how to respond to cyber violence. Aisha Dabo, a Programme Coordinator at AfricTivistes, noted that since 2017, over 700 people in 15 African countries have been reached with these trainings. The organisation also conducts media monitoring of online violence on social media platforms. 

Sarah Moulton, NDI’s Deputy Director for Democracy and Technology, highlighted the negative impact that online violence continues to have on women who are actively engaged in politics and political spaces. In Uganda, for instance, a joint report by Pollicy and NDI documented cases of gender-based online violence during the 2021 general elections and found that women and men politicians experienced online violence differently, with women candidates more likely to be trolled and body shamed while men were more likely to experience hate speech. This echoed research by CIPESA which analysed the gender dynamics of politics in online spaces in Uganda, including campaigns for presidential, parliamentary, mayoral, and other local government seats during the same elections. The CIPESA research also explored the legal landscape and, like Pollicy and NDI, found that although Uganda has enacted a number of laws aimed at improving digital access and rights, including the Computer Misuse Act 2011, the Anti-Pornography Act 2014 and the Excise Duty (Amendment) Act 2018, most do not address the gender dynamics of the internet, such as targeted online gender-based violence, affordability, and the lack of digital skills among women.

Like AfricTivistes, NDI has engaged in a number of campaigns to document these various forms of violence and make recommendations to address the problem. In 2022, it released a list of interventions that could be adopted globally by technology platforms, governments, civil society and the media to mitigate the impact of online violence against women in politics and hold perpetrators to account.

“Often, the expectation is that the individual is responsible for addressing the issue or for advocating on behalf of themselves. It really needs to involve a lot of actors,” said Moulton. 

For its part, the Solidarity Centre has been spearheading a global campaign to end gender-based violence and harassment in the world of work. With the advent of Covid-19, a growing number of women shifted online for employment opportunities and access to services and education, among others. It was highlighted that female platform workers, including influencers, content creators and women who run online retail businesses, continue to face various violations such as sexual harassment and cyberbullying.

Panelists called on governments to ratify the International Labour Organisation (ILO) Convention No. 190 on violence and harassment in the world of work. This global treaty recognises the impact of domestic violence in the workplace, and how women are often disproportionately affected.  Currently, the convention has been ratified by 32 countries globally, of which only eight are African.

Journalists attending FIFAfrica23 also shared their encounters with online violence and called for regular digital literacy training to stay safe online. Alongside the need for enhanced digital literacy, participants noted the lack of effective mechanisms for reporting cases. Ultimately, it was noted that efforts to shift the burden of blame away from victims of online violence against women in Africa need to be more actively pursued, alongside more actionable, collaborative and systematic interventions by governments, law enforcement, and platforms.

CIPESA Conducts Digital Rights Training for Ethiopian Human Rights Commission Staff

By CIPESA Writer |

The Collaboration on International ICT Policy for East and Southern Africa (CIPESA) has conducted a digital rights training for staff of the Ethiopian Human Rights Commission (EHRC) in a programme that benefitted 22 experts from various departments of the statutory entity. 

The training was a response to the desire by the commission to build its organisational capacity in understanding and defending digital rights and CIPESA’s vision to grow the ability of African national human rights institutions (NHRIs) to monitor, protect and promote digital freedoms.

Conducted in the Ethiopian capital Addis Ababa on October 11-12, 2023, the programme aimed to build the EHRC staff’s understanding of digital rights issues and the link with traditional rights. Participants went on to brainstorm how the EHRC should strengthen human rights protection in the digital space and through the use of technology.

Dr. Abdi Jibril, the Commissioner for Civil and Political and Socio-Economic Rights at the EHRC, noted that the proliferation of digital technology has contributed positively to human rights protection. It was therefore necessary to maximise the benefits of digital technology and to expand its usage for the promotion and enforcement of human rights.

The importance of growing the capacity of NHRIs was underscored by Line Gamrath Rasmussen, Senior Adviser, Human Rights, Tech and Business at the Danish Institute for Human Rights, and CIPESA Executive Director Dr. Wairagala Wakabi. African NHRIs are not always well versed in the opportunities and challenges which technology presents, creating a need for capacity development and for partnerships with stakeholders such as civil society.

As legislation governing the technology domain is fast-evolving, NHRIs in many countries are playing catch-up. These institutions therefore need to keep abreast of new legislation and its implications for human rights in the digital domain, and to enhance their capacity to document, investigate and report on digital rights.

The NHRIs also need to pay specific attention to vices such as hate speech, disinformation, and technology-facilitated gender-based violence (TFGBV) that are being perpetuated with the aid of technology.

Dr. Daniel Bekele, the Chief Commissioner at EHRC, stated that social media companies and messaging platforms are not doing enough to moderate harmful content in Africa, even as they have invested more resources and effort in content moderation in other regions. He said African countries need to work together to build a strong front against the powerful platforms. He proposed that the African Union (AU), working with relevant governments and other stakeholders, should spearhead the development of regulations which African countries can jointly use in their engagements with the tech giants on issues such as content moderation.

The two-day training discussed the positive and negative effects of digital technology on human rights and how the commission’s work to enforce human rights can be strengthened through the use of digital technology.

Among other topics, the training also addressed the human rights, governance and technology landscape in Sub-Saharan Africa; public and private sector digitalisation and challenges for human rights; the link between online and offline rights; transparency and accountability of the private sector in upholding human rights; and opportunities for NHRIs to advance online rights at national, regional and international levels. It also featured deep dives into key digital rights concerns such as surveillance, online violence against women, disinformation, and network disruptions. 

At the end of the training, the EHRC staff identified key actions the commission could integrate in its annual work plans, such as digital rights monitoring, advocacy for enabling laws to be enacted, and developing tools for follow up on implementation of recommendations on digital rights by treaty bodies and the Human Rights Council. Others were collaborations with local and regional actors including media, fact-checkers, civil society organisations, and platforms; working with the police and other national mechanisms to tackle hate speech and disinformation while protecting human rights; and conducting digital literacy.

Trainers in the programme were drawn from CIPESA, the Centre for the Advancement of Rights and Democracy (CARD), the Danish Institute for Human Rights, the Centre for International Private Enterprise (CIPE), the African Centre for Media Excellence (ACME), Inform Africa, and the Kenya National Cohesion and Integration Commission (NCIC).

Meanwhile, after the aforementioned training, CIPESA teamed up with Ethiopian civil society partners to conduct a training on disinformation and hate speech for journalists, bloggers and digital rights activists. Like many African countries, Ethiopia is grappling with a significant and alarming rise in hate speech and disinformation, particularly on social media platforms. This surge in disinformation is undermining social cohesion, promoting conflict, and leading to a concerning number of threats against journalists and human rights defenders.

The proliferation of disinformation is a threat to citizens’ fundamental rights, as studies have shown that many Ethiopians feel their right to freedom of expression is compromised. The prevalence of disinformation also means that many Ethiopians lack access to impartial and diverse information.

Disinformation has been directly fuelling conflict in several regions of Ethiopia. According to workshop participants and reports, both pro-government and anti-government actors have perpetuated this vice, with severe real-world consequences, including loss of life and large-scale violent events.

Whereas Ethiopia in 2020 enacted legislation to curb hate speech and disinformation, the effectiveness of this law has been called into question. Some critics argue that it has not been effectively implemented and could be used to undermine citizens’ rights.

The training equipped 21 journalists, bloggers and activists with knowledge to navigate this law and with skills to call out and fight disinformation and hate speech. The efforts of the trained journalists, and those which the human rights commission could implement, are expected to boost the fight against online harms and contribute to the advancement of digital rights in Ethiopia.

Social Media 4 Peace: An Initiative to Tackle the Quagmire Of Content Moderation

By Juliet Nanfuka |

Content moderation has emerged as a critical global concern in the digital age. In Africa, coinciding with increasing digital penetration and digitalisation, along with problematic platform responses to harmful content and retrogressive national legislation, the demand for robust and rights-respecting content moderation has reached a new level of urgency. 

Online content platforms and governments are increasingly caught in a contentious struggle over how to address content moderation, especially as online hate speech and disinformation become more pervasive. These vices regularly seep into the real world, undermining human rights, social cohesion, democracy, and peace. They have also corroded public discourse and fragmented societies, with marginalised communities often bearing some of the gravest consequences.

The Social Media 4 Peace (SM4P) project run by the United Nations Educational, Scientific and Cultural Organization (UNESCO) was established to strengthen the resilience of societies to potentially harmful content spread online, in particular hate speech inciting violence, while protecting freedom of expression and enhancing the promotion of peace through digital technologies, notably social media. The project was piloted in four countries – Bosnia and Herzegovina, Colombia, Indonesia, and Kenya.

At the Forum on Internet Freedom in Africa (FIFAfrica) held in Tanzania in September 2023, UNESCO hosted a session where panellists interrogated the role of a multi-stakeholder coalition in addressing gaps in content moderation with a focus on Kenya. The session highlighted the importance of multi-stakeholder cooperation, accountability models, and safety by design to address online harmful content, particularly disinformation and hate speech in Africa. 

In March 2023, as a product of the SM4P, UNESCO in partnership with the National Cohesion and Integration Commission (NCIC) launched the National Coalition on Freedom of Expression and Content Moderation in Kenya. The formation of the coalition, whose membership has grown to include various governmental, civil society and private sector entities, is a testament to the need for content moderation efforts based on broader multi-stakeholder collaboration. 

As such, the coalition provided a learning model for the participants at FIFAfrica, who included legislators, regulators, civil society activists and policy makers, whose countries are grappling with establishing effective and rights-based content moderation mechanisms. The session explored good practices from the SM4P project in Kenya for possible replication of the model coalition across African countries. Discussions largely centred on issues around content moderation challenges and opportunities for addressing these issues in the region.

Online content moderation has presented a new regulatory challenge for governments and technology companies. Striking a balance between safeguarding freedom of expression and curtailing harmful and illegal content has presented a challenge especially as such decisions have largely fallen into the remit of platforms, and state actors have often criticised platforms’ content moderation practices. 

This has resulted in governments and platforms being at loggerheads, as was witnessed during the 2021 Uganda elections. Six days before the election, Meta blocked various accounts and removed content for what it termed “coordinated inauthentic behavior”. Most of the accounts affected were related to pro-ruling party narratives or had links to the ruling party. In response, the Ugandan government blocked social media before blocking access to the entire internet. Nigeria similarly suspended Twitter in June 2021 for deleting a post made from the president’s account.

Social media companies are taking various measures to curb such content, including by using artificial intelligence tools, employing human content moderators, collaborating with fact-checking organisations and trusted partner organisations that identify false and harmful content, and relying on user reports of harmful and illegal content. However, these measures by the platforms are often criticised as inadequate. 

On the other hand, some African governments have also been criticised for enacting laws and issuing directives that undermine citizens’ fundamental rights under the guise of combating harmful content.

Indeed, speaking at the 2023 edition of FIFAfrica, Felicia Anthonio, #KeepItOn Campaign Manager at Access Now, stated that weaknesses in content moderation are often cited by governments that shut down the internet, with disruptions justified as a means to control harmful content amidst concerns over hate speech, disinformation, and incitement to violence. This was reaffirmed by Thobekile Matimbe of Paradigm Initiative, who noted that “content moderation is a delicate test of speech”, stressing that if content moderation is not balanced against freedom of expression and access to information, it can result in violations of fundamental human rights.

Beyond governments and social media companies, other stakeholders need to step up efforts to combat harmful content and advocate for balanced content moderation policies. Speakers at the FIFAfrica session were unanimous that civil society organisations, academic institutions, and media regulators need to enhance digital media literacy and increase collaborative efforts. Further, they stressed the need for these stakeholders to regularly conduct research to underpin evidence-based advocacy, and to support the development of human rights-centred strategies in content moderation.