By Abdou Aziz Cissé, Laïty Ndiaye and CIPESA Staff Writer
The postponement of Senegal’s presidential elections in February 2024 escalated political tensions in the West African country. In response, the Ministry of Communication, Telecommunications and Digital Economy suspended access to mobile internet, first on February 5 as parliament debated the extension of President Macky Sall’s tenure, and again on February 13 amidst civil society-led protests. The ministry claimed that social media platforms were fuelling the dissemination of “several subversive hate messages” that incited violence.
The February 2024 restrictions on mobile internet access were the third instance of network disruptions in a country that had previously kept the internet accessible during pivotal moments, including elections. In 2023, Senegal restricted access to the internet and banned TikTok amidst opposition protests.
It is against this backdrop that the Collaboration on International ICT Policy for East and Southern Africa (CIPESA), in partnership with AfricTivistes, organised a workshop on platform accountability and content moderation on February 8, 2024. The workshop, held in the capital Dakar, examined the efficacy and impact of content moderation and discussed opportunities for stakeholder collaboration and common approaches to advancing internet freedom in Senegal.
As of September 2023, Senegal had over 18 million internet subscribers. Participants at the workshop acknowledged that the growth in user numbers had fuelled online disinformation and hate speech, which are threatening social cohesion. However, they also raised concerns about disproportionate responses, notably the network disruptions instituted by the government, which undermine freedom of expression, access to information and citizen participation.
According to Ababacar Diop, a Commissioner with the Personal Data Protection Commission (CDP), efforts to curb the spread of harmful and illegal content online had seen the Commission partner with popular social media platforms to explore mechanisms to effectively regulate content. He added, however, that since the platforms are not domiciled in Senegal, the partnership’s effect has been limited. Besides being the authority responsible for personal data protection, the CDP’s mandate includes ensuring that technology does not pose a threat to the rights and lives of citizens.
The CDP’s engagements with platforms complemented user reporting of harmful and illegal content and the trusted partner programme. However, participants noted that user reports and trusted partner programmes are heavily subjective, with users and partners sometimes flagging content as inappropriate or dangerous “solely based on their opinions”. “Moderation policies by platforms and governments must be alive to differing contexts and opinions,” said Serge Koué, a blogger and Information Technology expert. Algorithm-based content moderation is prone to the same challenge, as automated systems do not understand local context and languages, according to Pape Ismaïla Dieng, Communication and Advocacy Officer at AfricTivistes.
The complexities of content moderation were further highlighted with case studies from Nigeria and Uganda. In 2021, then-Nigerian President Muhammadu Buhari announced a countrywide ban on Twitter after the platform deleted a tweet from his account about the Biafra civil war, which Twitter said violated its policy on “abusive behaviour”. In October 2021, the government issued several conditions for lifting the ban: Twitter was required to set up a local office, pay tax locally and cooperate with the Nigerian government to regulate harmful tweets. The platform remained banned in the country until January 2022.
In Uganda, during the 2021 election period, Facebook and Twitter suspended the accounts of various pro-government individuals over what Facebook described as “Coordinated Inauthentic Behaviour (CIB)” aimed at advancing the ruling party’s online narrative. The platforms’ actions sparked the ire of President Yoweri Museveni, who responded by stating in a national address that, “If you want to take sides against the (ruling party), then that group will not operate in Uganda,” adding that, “We cannot tolerate this arrogance of anybody coming to decide for us who is good and who is bad.” A day later, on January 11, 2021, access to social media was blocked, and two days after that the internet as a whole was blocked as citizens prepared to go to the polls. Facebook remains blocked to date.
The experiences from Nigeria and Uganda highlighted not only the role of public figures as perpetrators of harmful content but also the impact of the unchecked power of governments to censor and restrict access to platforms in direct response to content moderation based on platforms’ Community Standards.
During the workshop, Olivia Tchamba, Meta’s Public Policy Manager for Francophone Africa, stated that the platform was committed to striking a balance between giving users a voice and ensuring the predominance of reliable information. She added that regulation coupled with responsible user behaviour should be the norm.
CIPESA’s Programme Manager Ashnah Kalemera also reiterated that content moderation requires a multi-stakeholder and multi-faceted approach not only involving platforms and regulators but also users.
They also emphasised that consolidating the efforts of actors such as civil society, the CDP and Meta’s Oversight Board, in line with national laws and international human rights standards, would help create a social media ecosystem that upholds freedom of expression and privacy, among other rights.
Whereas Senegal’s Constitutional Council ruled against the postponement of the elections, a new date for the polls is yet to be determined. The dialogue at the CIPESA and AfricTivistes workshop is critical as tensions continue to simmer online and offline. It also sets the pace for similar engagements in other African countries set to go to the polls during 2024, and for the wider push for increased tech accountability.