The Collaboration on International ICT Policy for East and Southern Africa (CIPESA) has partnered with Digital Action to support work on tech accountability in Sub-Saharan Africa in the run up to and during the “Year of Democracy” in 2024. This support will be channelled through the Tech Accountability Fund that will be administered under the auspices of the Africa Digital Rights Fund (ADRF).
Numerous African countries, including the Comoros, Senegal, Mauritania, Rwanda, Mozambique, Ghana, Algeria, Botswana, Chad, Guinea Bissau, Mali, Mauritius, Namibia, South Africa, South Sudan and Tunisia, are headed to the polls during 2024. Electoral processes are essential to building democracy, and given growing threats to information integrity and the use of technology in elections, it is crucial to advance platform accountability around electoral processes. The Fund responds to these key concerns in the Year of Democracy, and to the scant resources available to African civil society organisations working to counter tech harms.
In 2022, Africa had around 570 million internet users, of whom 384 million (67%) were social media users. These users, most of whom are young people, increasingly use social media applications such as WhatsApp, Facebook, Twitter, YouTube, Instagram and TikTok for content creation and entertainment, business, advertising and entrepreneurship, communication and connection, education and learning, and civic engagement and activism. As user numbers grow, reports from social media companies indicate a rise in harmful, illegal or offensive content on the platforms.
In response, social media companies have employed various measures to review, screen, and filter content to ensure it meets their community guidelines or policies and does not adversely affect the user experience on the platforms. The content moderation tools and techniques applied include keyword filtering, machine learning algorithms and human review.
Despite these efforts, the adequacy of the measures undertaken by social media platforms and social networking sites in moderating illegal, harmful or offensive content has increasingly been questioned. In Ethiopia, for instance, social media companies have been accused of not doing enough to moderate such content, which has gone on to cause real-world harm, including fuelling killings. Starkly, platforms such as Facebook and Twitter stand accused of devoting far fewer resources and measures to content moderation in Africa than they invest in the United States and Europe.
Key concerns about content moderation in Africa include platforms' limited understanding of the continent's cultural contexts, lack of cultural sensitivity, labour rights violations, algorithmic bias and discrimination, non-application of local laws, and a lack of transparency and accountability in content moderation, all of which have an impact on freedom of expression and civic participation.
Call for Proposals
Applications are now open for the Tech Accountability Fund as the eighth edition of the ADRF. Grant sizes will range from USD 5,000 to USD 20,000 subject to demonstrated need. Cost sharing is strongly encouraged. Funding shall be for periods between six and 12 months.
The Fund is particularly interested in work related, but not limited, to:
- Online gender-based violence, particularly against women politicians and women journalists
- Network disruptions
- Content moderation
- Microtargeting and political advertising
- Hate speech
- Electoral disinformation
- Election-specific harms, e.g. effects on freedom of expression and citizens’ ability to make independent choices and participate in electoral processes.
Only shortlisted applicants will be contacted directly. Feedback on unsuccessful applications will be available upon request.