Senegal Elections: CIPESA and AfricTivistes Engage Key Stakeholders on Content Moderation

By Abdou Aziz Cissé, Laïty Ndiaye and CIPESA Staff Writer |

The postponement of Senegal’s presidential elections in February 2024 escalated political tensions in the West African country. In response, the Ministry of Communication, Telecommunications and Digital Economy suspended access to the mobile internet, first on February 5 as parliament debated the extension of President Macky Sall’s tenure, and again on February 13 amidst civil society-led protests. The ministry claimed that social media platforms were fuelling the dissemination of “several subversive hate messages” that incited violence.

The February 2024 restrictions on mobile internet access were the third instance of network disruptions in a country that had previously kept the internet accessible during pivotal moments, including elections. Last year, Senegal restricted access to the internet and banned TikTok amidst opposition protests.

It is against this backdrop that the Collaboration on International ICT Policy for East and Southern Africa (CIPESA), in partnership with AfricTivistes, organised a workshop on platform accountability and content moderation on February 8, 2024. The workshop, held in the capital Dakar, examined the efficacy and impact of content moderation and discussed opportunities for stakeholder collaboration and common approaches for advancing internet freedom in Senegal.

As of September 2023, Senegal had over 18 million internet subscribers. Participants at the workshop acknowledged that the growth in user numbers had fueled online disinformation and hate speech, which are threatening social cohesion. However, they also raised concerns about disproportionate responses, notably the network disruptions instituted by the government, which undermine freedom of expression, access to information and citizen participation.

According to Ababacar Diop, a Commissioner with the Personal Data Protection Commission (CDP), efforts to curb the spread of harmful and illegal content online had seen the Commission partner with popular social media platforms to explore mechanisms to effectively regulate content. He added, however, that since the platforms are not domiciled in Senegal, the partnership’s effect has been limited. Besides being the authority responsible for personal data protection, the CDP’s mandate includes ensuring that technology does not pose a threat to the rights and lives of citizens. 

The CDP’s engagements with platforms complemented user reporting of harmful and illegal content and the trusted partner programme. However, participants noted that user reports and trusted partner programmes are heavily subjective, with users and partners sometimes flagging content as inappropriate or dangerous “solely based on their opinions”. “Moderation policies by platforms and governments must be alive to differing contexts and opinions,” said Serge Koué, a blogger and Information Technology expert. Moreover, algorithm-based content moderation measures are prone to the same challenge, as they do not understand local contexts and languages, according to Pape Ismaïla Dieng, Communication and Advocacy Officer at AfricTivistes.

The complexities in content moderation were further highlighted with case studies from Nigeria and Uganda. In 2021, then Nigerian President Muhammadu Buhari announced a countrywide ban on Twitter following the deletion of a tweet from his account about the Biafra civil war. Twitter claimed the tweet violated the platform’s policy on “abusive behaviour”, and the platform was blocked from operating in the country following this face-off. In October 2021, the government issued several conditions for lifting the ban: it required Twitter to set up a local office, pay tax locally and cooperate with the Nigerian government to regulate harmful tweets. The platform remained banned in the country until January 2022.

In Uganda, during the 2021 election period, Facebook and Twitter suspended the accounts of various pro-government individuals over what Facebook described as “Coordinated Inauthentic Behaviour (CIB)” aimed at shaping online narratives in favour of the ruling party. The platforms’ actions sparked the ire of President Yoweri Museveni, who responded by stating in a national address that, “If you want to take sides against the (ruling party), then that group will not operate in Uganda,” adding that, “We cannot tolerate this arrogance of anybody coming to decide for us who is good and who is bad.” A day later, on January 11, 2021, access to social media was blocked, and two days later the internet as a whole was blocked as citizens prepared to go to the polls. Facebook remains blocked to date.

The experiences from Nigeria and Uganda highlighted not only the role of public figures as perpetrators of harmful content but also the impact of the unchecked power of governments to censor and restrict access to platforms in direct response to content moderation based on platforms’ Community Standards.

During the workshop, Olivia Tchamba, Meta’s Public Policy Manager for Francophone Africa, stated that the platform was committed to striking a balance between giving users a voice and ensuring the predominance of reliable information. She added that regulation coupled with responsible user behaviour should be the norm. 

CIPESA’s Programme Manager Ashnah Kalemera also reiterated that content moderation requires a multi-stakeholder and multi-faceted approach not only involving platforms and regulators but also users. 

The workshop, which was attended by 25 participants including journalists, social media influencers, human rights defenders and staff of civil society organisations, called on platforms to ensure their terms of use are available in multiple languages, increase transparency in their content moderation processes, and promote awareness and understanding among African users of recourse mechanisms. 

They also emphasised that consolidating efforts, such as those of civil society, the CDP and Meta’s Oversight Board, in line with national laws and international human rights standards, would help create a social media ecosystem that upholds freedom of expression and privacy, among other rights.

Whereas the Senegalese Constitutional Court ruled against the postponement of the elections, a new date for the polls is yet to be determined. The dialogue during the CIPESA and AfricTivistes workshop is critical as tensions continue to simmer online and offline, and it sets the pace for similar engagements in other African countries headed to the polls in 2024, as well as for the push for increased tech accountability.

African Commission Resolution A Boon For Fight Against Unlawful Surveillance

By CIPESA Writer |

The Collaboration on International ICT Policy for East and Southern Africa (CIPESA) welcomes the resolution by the African Commission on Human and Peoples’ Rights, which urges African governments to cease undertaking unlawful communications surveillance.

The resolution is timely, as it comes amidst an unprecedented spike in the scale and nature of state surveillance that is often unlawful, excessive, and inadequately supervised by oversight bodies. As CIPESA research has found, the expansion in state surveillance in various African countries is denying citizens their rights to freedom of expression, association and assembly, and undermining their participation in democratic processes.

The Resolution on the deployment of mass and unlawful targeted communication surveillance and its impact on human rights in Africa, adopted last November at the Commission’s 77th Ordinary Session held in Arusha, Tanzania, expresses concern about the unrestrained acquisition of communication surveillance technologies by states without adequate regulation. It also notes the lack of adequate national frameworks on privacy, communication surveillance, and personal data protection. 

Furthermore, the resolution notes the Commission’s concern about the disproportionate targeting of journalists, human rights defenders, civil society organisations, whistleblowers and opposition political activists by state surveillance.

CIPESA welcomes the resolution, which reflects the findings of our research and the recommendations we have variously made to African governments regarding the conduct of state surveillance. CIPESA has previously called upon stakeholders, including governments, to take all measures that buttress the right to privacy in order to guarantee and enhance free expression, access to information, freedom of association, and freedom of assembly in accordance with international human rights standards.

Notably, the African Commission resolution urges African countries to ensure that all restrictions on privacy and other fundamental freedoms are necessary and proportionate, and in line with international human rights law and standards. It also urges states to consider safeguards such as the requirement for prior authorisation of surveillance by an independent and impartial judicial authority and the need for effective monitoring and regular review by independent oversight mechanisms.

According to CIPESA’s Legal Officer Edrine Wanyama, “The resolution is a step forward to buttressing data rights and privacy on the continent. States should take advantage of the resolution and overhaul regressive surveillance practices while embracing all internationally recognised efforts and standards for strengthening the right to privacy.”

According to the Declaration of Principles on Freedom of Expression and Access to Information in Africa, states should only engage in targeted surveillance in conformity with international human rights law (principle 41), and every individual shall have legal recourse to effective remedies in relation to the violation of their privacy and the unlawful processing of their personal information (principle 42 (7)). In addition, principle 20 requires states to guarantee the safety of journalists and other media practitioners by taking measures that prevent threats and unlawful surveillance.

See related CIPESA resources: Privacy Imperilled: Analysis of Surveillance, Encryption and Data Localisation Laws in Africa; Effects of State Surveillance on Democratic Participation in Africa; Compelled Service Provider Assistance for State Surveillance in Africa: Challenges and Policy Options; Mapping and Analysis of Privacy Laws in Africa.

A Decade of Internet Freedom in Africa: Report Documents Reflections and Insights from Change Makers

CIPESA Writer |

Over the last decade, Africa’s journey to achieve internet freedom has not been without challenges. There have been significant threats to internet freedom, evidenced by the rampant state censorship through internet shutdowns, surveillance, blocking and filtering of websites, and the widespread use of repressive laws to suppress the voices of key actors.

However, amidst all this, there is a community of actors who have dedicated their efforts to advancing digital rights on the continent, with the goal of ensuring that more Africans can enjoy the full benefits of the internet.

As part of our efforts to recount the work of the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) over the years, we are pleased to share this special edition report, A Decade of Digital Rights in Africa: Reflections and Insights from 10 Change Makers, in which we document reflections and insights from ten collaborators who have been instrumental in shaping Africa’s digital rights and internet freedom advocacy landscape over the last ten years.

These changemakers have driven change by advocating for a more free, secure, and open internet in Africa and by working to ensure that no one is left behind.

Read the full report: A Decade of Digital Rights in Africa: Reflections and Insights from 10 Change Makers!

Meet the Changemakers

‘Gbenga Sesan, the Executive Director of Paradigm Initiative, is an eloquent advocate for internet freedom across the continent, leading efforts to push back against repressive laws and promoting digital inclusion while speaking truth to power. He continues to champion the transformative power of technology for social good and to drive positive change in society.

Arthur Gwagwa is a Research Scholar at Utrecht University, Netherlands, and a long-standing advocate for digital rights and justice. His work in the philanthropic sector has been instrumental in supporting various grassroots initiatives to promote internet freedom in Africa. Similarly, his pioneering research work and thought leadership continue to inspire and transform the lives of people in Africa.

Edetaen Ojo, the Executive Director of Media Rights Agenda, is a prominent advocate for advancing media rights and internet freedom. Known for his strategic vision and dedication to media freedom, he pioneered the conceptualisation and development of the African Declaration on Internet Rights and has been a key voice in shaping Internet policy-making in Africa.

Emilar Gandhi, the Head of Stakeholder Engagement and Global Strategic Policy Initiatives at Meta, built a strong foundation in civil society as an advocate for Internet freedom. She is a prominent figure in technology policy in Africa whose expertise and dedication have made her a valuable voice for inclusivity and responsible technology development in the region.

Dr. Grace Githaiga, the CEO and Convenor of Kenya ICT Action Network (KICTANet), has been a leading advocate for media freedom and digital rights in Africa. Her tireless advocacy in shaping internet policy has earned her recognition for her pivotal roles in championing internet freedom, digital inclusion, multistakeholderism, and women’s rights online.

Julie Owono, the Executive Director of Internet Sans Frontières (Internet Without Borders), is a passionate and respected digital rights advocate and thought leader in the global digital community. She is not only a champion for internet freedom in Africa but is also a symbol of hope for many communities standing at the forefront of the battle for internet freedom and connectivity in Africa.

Neema Iyer, the founder of Pollicy, is well known for her advocacy efforts in bringing feminist perspectives into data and technology policy. Her dynamic and multi-faceted approach to solving social challenges exemplifies the potential of data and technology to advance social justice and promote digital inclusion and internet freedom in Africa.

Dr. Tabani Moyo, the Regional Director of the Media Institute of Southern Africa (MISA), is a distinguished media freedom advocate and influential leader guiding a community of changemakers in Southern Africa. He has played an extensive and formidable role in pushing back against restrictive and repressive laws, supporting journalists under threat, empowering young Africans, and shaping internet governance policies.

Temitope Ogundipe, the Founder and Executive Director of TechSocietal, has been a champion for digital rights and inclusion in Africa. She is an advocate for women’s rights online and uses her expertise to contribute to the development of youth and address digital inequalities affecting vulnerable groups across the continent.

Wafa Ben-Hassine, the Principal Responsible Technology at Omidyar Network, is a recognised human rights defender and visionary leader dedicated to promoting human rights and responsible technological development. Her relentless advocacy and valuable contributions to defending digital rights, civil liberties, and technology policy continue to inspire many across the continent.

Join the Report Launch Webinar:

When: January 31, 2024
Time: 14:00-16:00 (Nairobi Time)
Location: Zoom (Register here)
After registering, you will receive a confirmation email containing information about joining the webinar.

Updated: Watch the report launch webinar.

Introducing the Tech Accountability Fund and a Call for Proposals

Announcement |

The Collaboration on International ICT Policy for East and Southern Africa (CIPESA) has partnered with Digital Action to support work on tech accountability in Sub-Saharan Africa in the run-up to and during the “Year of Democracy” in 2024. This support will be channelled through the Tech Accountability Fund, which will be administered under the auspices of the Africa Digital Rights Fund (ADRF).

Numerous African countries, including the Comoros, Senegal, Mauritania, Rwanda, Mozambique, Ghana, Algeria, Botswana, Chad, Guinea Bissau, Mali, Mauritius, Namibia, South Africa, South Sudan and Tunisia, are headed to the polls during 2024. Electoral processes are essential to building democracy and, given growing threats to information integrity and technology use in elections, it is crucial to promote platform accountability around electoral processes. The Fund responds to these key concerns in the Year of Democracy and to the scant resources available to African civil society entities working to counter tech harms.

In 2022, Africa had around 570 million internet users, of whom 384 million (67%) were social media users. These users, most of them young people, are increasingly using social media applications such as WhatsApp, Facebook, Twitter, YouTube, Instagram and TikTok for content creation and entertainment, business, advertising and entrepreneurship, communication and connection, education and learning, and civic engagement and activism. As user numbers increase, reports from social media companies indicate a rise in harmful, illegal or offensive content on the platforms.

In response, social media companies have employed various measures to review, screen, and filter content to ensure it meets their community guidelines or policies and does not adversely affect the user experience on the platforms. The content moderation tools and techniques applied include keyword filtering, machine learning algorithms and human review. 
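As a loose illustration of the simplest of these techniques, keyword filtering amounts to matching posts against a blocklist of terms before escalating them for review. The sketch below is purely hypothetical (the function name, blocklist and flow are illustrative, not any platform’s actual pipeline), and it also demonstrates the technique’s well-known weakness: it cannot see context, slang or local languages.

```python
# Hypothetical blocklist; real platforms maintain far larger,
# language- and context-specific term sets.
BLOCKLIST = {"attack the rally", "burn the office"}

def flag_for_review(post: str) -> bool:
    """Return True if the post contains a blocklisted phrase.

    Keyword matching alone misses context and local languages,
    which is why platforms pair it with machine learning
    classifiers and human review.
    """
    text = post.lower()
    return any(phrase in text for phrase in BLOCKLIST)

print(flag_for_review("They plan to attack the rally tonight"))  # True
print(flag_for_review("Peaceful march planned for Friday"))      # False
```

A post that paraphrases a banned phrase, or expresses it in Wolof or Luganda, sails straight through such a filter, which is the gap the workshop participants flagged when they noted that automated moderation “does not understand local context and languages”.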

Despite these efforts, the adequacy of the measures undertaken by social media platforms and social networking sites in moderating illegal, harmful or offensive content has increasingly been questioned. In Ethiopia, for instance, social media companies have been accused of not doing enough to moderate such content, which has gone on to cause real-world harm, including fuelling killings. Starkly, platforms such as Facebook and Twitter are accused of deploying minuscule resources and measures for content moderation in Africa relative to their investments in the United States and Europe.

Key concerns about content moderation in Africa include platforms’ limited understanding of the continent’s cultural contexts, a lack of cultural sensitivity, labour rights violations, algorithmic bias and discrimination, the non-application of local laws, and a lack of transparency and accountability in content moderation, all of which have an impact on freedom of expression and civic participation.

Call for Proposals
Applications are now open for the Tech Accountability Fund as the eighth edition of the ADRF. Grant sizes will range from USD 5,000 to USD 20,000, subject to demonstrated need. Cost sharing is strongly encouraged. Funding shall be for periods of between six and 12 months.

The Fund is particularly interested in work related to but not limited to:

  • Online gender-based violence, particularly against women politicians and women journalists
  • Network disruptions
  • Content moderation
  • Microtargeting and political advertising
  • Hate speech
  • Electoral disinformation
  • Election-specific harms, e.g. effects on freedom of expression and citizens’ ability to make independent choices and participate in electoral processes

The deadline for applications is February 16, 2024. Read more about the Fund Guidelines here. The application form can be accessed here.

Only shortlisted applicants will be contacted directly. Feedback on unsuccessful applications will be available upon request.