What Does Meta’s About-Turn on Content Moderation Bode for Africa?

By CIPESA Writer |

Meta’s recent decision to get rid of its third-party fact-checkers, starting in the United States, has sent shockwaves globally, raising significant concerns about free speech and the fight against disinformation and misinformation. The announcement was part of a raft of major policy changes unveiled on January 7, 2025 by Meta’s CEO Mark Zuckerberg that will affect its platforms Facebook, Instagram and Threads, which are used by three billion people worldwide. The changes include the introduction of the user-generated “Community Notes” model, the elimination of third-party fact-checkers, reduced content restrictions and enforcement, and the personalisation of civic and political content.

While the announcement makes no reference to Africa, the changes will trickle down to the continent. Meta’s decision is particularly concerning for Africa, given the continent’s linguistic and cultural diversity, limited digital and media information literacy, growing challenges of hate speech and election-related disinformation, lack of context-specific content moderation policies, and inadequate investment in local fact-checking initiatives.

Africa’s content moderation context and needs are also quite different from those of Europe or North America due to the predominant use of local languages that are often overlooked by automated fact-checking algorithms and content filters.

Notably, the justifications given by Meta are quite weak, as the new changes appear to undermine its own initiatives to promote free speech, particularly the work of its third-party fact-checking program and the Oversight Board, which it set up to help resolve some of the most difficult questions around freedom of expression online and information integrity. The decision also appears to be politically and economically motivated, as the company seeks to re-align itself with and appease the incoming Trump administration, which has been critical of fact-checking, and to enlist its assistance in pushing back against regulation of Meta’s activities outside the U.S.

The company also amended its Hateful Conduct policy on January 7, 2025, replacing the term “hate speech” with “hateful conduct”, eliminating previous thresholds for taking down hateful content, and allowing more hateful speech against specific groups. Further, whereas the company is moving its Trust and Safety and content moderation teams to Texas, it is yet to set up such robust teams for Africa.

Importance of Fact-Checking

Fact-checking plays a critical role in combating disinformation and misinformation and fostering informed public discourse. By verifying the accuracy of online content, fact-checkers help to identify inauthentic content and counter the spread of false narratives that can incite violence, undermine trust in institutions, or distort democratic processes.

Additionally, fact-checking promotes accountability and reduces the virality of misleading content, particularly during sensitive periods such as elections, political unrest, public health crises, or conflict situations, where accurate and credible information is crucial for decision-making. Moreover, it fosters media literacy by encouraging audiences to critically evaluate information sources.

Fact-checking organisations such as PolitiFact have criticised the Meta CEO’s assertions that fact-checkers were “too politically biased” and had “destroyed more trust than they had created, especially in the U.S.”, noting that the decisions and power to take down content have rested squarely with Meta, with fact-checkers only providing independent review of posts. The assertions also undermine the work of independent media outlets and civil society, who have been accused by authoritarian regimes of being corrupt political actors.

However, fact-checking is not without its challenges and downsides. The process can inadvertently suppress free expression, especially in contexts where the line between disinformation and legitimate dissent is blurred. In Africa, where cultural and linguistic diversity is vast and resources for local-language moderation are limited, fact-checking algorithms or teams may misinterpret context, leading to unjust content removal or the amplification of bias. Furthermore, fact-checking initiatives can become tools for censorship if not governed transparently, particularly in authoritarian settings.

Despite these challenges, the benefits of fact-checking far outweigh its downsides. Instead of getting rid of fact-checking, Meta and other big tech companies should strengthen its implementation by providing sufficient resources to recruit and train fact-checkers and to provide them with psycho-social support.

Impact of the Decision for Africa
  1. Increase in Disinformation

Africa faces a distinct set of challenges that make effective content moderation and fact-checking particularly crucial. Disinformation and misinformation in Africa have had far-reaching consequences, from disrupting electoral processes and swaying unsuspecting voters’ choice of candidates to jeopardising public health. Disinformation during elections has fuelled violence, while health-related misinformation during crises such as the Covid-19 pandemic endangered lives by undermining public health efforts. False claims about the virus, vaccines, or cures led to vaccine hesitancy, resistance to public health measures like mask mandates, and the proliferation of harmful treatments. This eroded trust in health institutions, slowed down pandemic response efforts, and contributed to preventable illnesses and deaths, disproportionately affecting vulnerable populations.

The absence of fact-checking exacerbates the existing challenge of context insensitivity, as automated systems and under-resourced moderation teams fail to address the nuances of African content. The introduction of user-driven Community Notes, a model similar to that used on X, will still require expert input, especially in a region where many governments are authoritarian. Yet media and information literacy and access to credible and reliable information are limited, and Meta’s platforms are a primary means of accessing independent news and information.

Research on the use of Community Notes on X has shown that the model has limited effectiveness in reducing the spread of disinformation, as it “might be too slow to intervene in the early (and most viral) stages of the diffusion”, which are the most critical. The move also undermines efforts by civil society and fact-checking organisations in the region, which have been working tirelessly to combat the spread of harmful content online.

  2. Political Manipulation and Increased Malign Influence

Dialling down moderation and oversight may empower political actors who wish to manipulate public opinion through disinformation campaigns, resulting in a surge of such activities. Given that social media has been instrumental in mobilising political movements across Africa, the lack of robust content moderation and fact-checking could hinder democratic processes and amplify extremist views and propaganda. Research has shown an apparent link between disinformation and political instability in Africa.

Unchecked false narratives not only mislead voters but also distort public discourse and diminish public trust in key governance and electoral institutions. Authoritarian regimes may also exploit them to undermine dissent. Moreover, the relaxation of content restrictions on sensitive and politically divisive topics like immigration and gender could open the floodgates for targeted hate speech, incitement and discrimination, which could exacerbate gendered disinformation and ethnic and political tensions. Likewise, weak oversight may enable foreign or external actors to manipulate elections.

  3. Regulatory and Enforcement Gaps

Meta’s easing of restrictions on the moderation of sensitive topics and its reduced oversight of content could lead to an increase in harmful content on its platforms. Already, various African countries have weak regulatory frameworks for harmful content and thus rely on companies like Meta to self-regulate effectively. Meta’s decision could spur efforts by some African governments to introduce new and more repressive laws to restrict certain types of content and hold platforms accountable for their actions. As our research has shown, such laws could be abused and employed to suppress dissent and curtail online freedoms such as expression, assembly, and association, as well as access to information, creating an even more precarious environment.

  4. Limited Engagement with Local Actors

Meta’s decision to abandon fact-checking raises critical concerns for Africa, coming after the tech giant’s January 2023 decision to sever ties with its East African content moderation contractor, Sama, based in Nairobi, Kenya. The Sama-operated hub announced its exit from content moderation services to focus on data annotation tasks, citing the prevailing economic climate as a reason for streamlining operations. Additionally, the Nairobi hub faced legal and ethical challenges, including allegations of poor working conditions, inadequate mental health support for moderators exposed to graphic content, and unfair labour practices. These issues led to lawsuits against both Sama and Meta, intensifying scrutiny of their practices.

Meanwhile, fact-checking partnerships with local organisations have played a crucial role in addressing disinformation, and their elimination erodes trust in Meta’s commitment to advancing information integrity in the region. Meta has fact-checking arrangements with various organisations across 119 countries, including 26 in Africa. Partners working on African content include AFP, AFP – Coverage, AFP – Hub, Africa Check, Congo Check, Dubawa, Fatabyyano فتبين, Les Observateurs de France 24 and PesaCheck. In the aftermath of Meta’s decision to sever ties with its East African third-party content moderators, Sama let go of about 200 employees.

Opportunities Amidst Challenges

While Meta’s decision to abandon fact-checking is a concerning development, it also presents an opportunity for African stakeholders to utilise regional instruments, such as the African Charter on Human and Peoples’ Rights and the Declaration of Principles on Freedom of Expression and Access to Information in Africa, to assert thought leadership and demand better practices from platforms. Engaging with Meta’s regional leadership and building coalitions with other civil society actors can amplify advocacy about the continent’s longstanding digital rights and disinformation concerns and demands for more transparency and accountability.

Given the ongoing pushback against the recently announced changes, Meta should be more receptive to dialogue and to recommendations to review and contextualise the new proposals. Meta must address its shortcomings by urgently investing in and strengthening localised content moderation in Africa. It must reinvest in fact-checking partnerships, particularly with African organisations that understand local contexts. These partnerships are essential for addressing misinformation in local languages and underserved regions.

The company must also improve its automated content moderation tools, including by developing tools that can handle African cultures, languages and dialects, hire more qualified moderators with contextual knowledge, provide comprehensive training for them, and expand its partnerships with local stakeholders. Moreover, the company must ensure meaningful transparency and accountability, as many of its transparency and content enforcement reports lack critical information and disaggregated data about its actions in most African countries.

Lastly, both governments and civil society in Africa must invest in digital, media and information literacy, which is essential to empowering users to think critically about and evaluate online content. Meta should partner with local organisations to promote digital literacy initiatives and develop educational campaigns tailored to different regions and languages. This will help build resilience against misinformation and foster a more informed digital citizenry.

In conclusion, it remains to be seen how the new changes by Meta will be implemented in the U.S., and subsequently in Africa, and how the company will address the gaps left by fact-checkers and mitigate the risks and negative consequences stemming from its decision. Notably, while there is widespread acknowledgement that content moderation systems on social media platforms are broken, efforts to promote and protect rights to free expression and access to information online should be encouraged. However, these efforts should not come at the expense of user trust and safety, and information integrity.

Human Rights Day: Here’s How African Countries Should Advance Digital Rights

By Edrine Wanyama and Patricia Ainembabazi |

As the world marks Human Rights Day 2024, themed Our Rights, Our Future, Right Now, we are reminded of the urgent need to advance and protect human rights in an increasingly digital world. Today, CIPESA joins the world in commemorating Human Rights Day and reflecting on the immense opportunities that the digital age brings for the realisation of human rights. Indeed, this year’s theme emphasises the need for immediate actions to safeguard rights in the digital sphere for a just and equitable future.

Whereas human rights have traditionally been enjoyed in offline spaces, the digital landscape presents unprecedented opportunities for the enjoyment of a broad range of rights, including access to information, civic participation, and freedom of expression, assembly, and association. However, the potential of digital technology to catalyse the enjoyment of these rights has steadily been threatened by challenges such as internet shutdowns, regressive laws that enable governments to clamp down on the digital civic space, and the digital divide.

The threats to digital rights, democracy, and the rule of law in Africa are numerous. They are often the result of growing authoritarianism and repression, political instability, corruption, the breakdown of public institutions, gender disparities, and growing socio-economic inequalities. Below are key intervention areas to advance digital rights on the continent.

Combat Internet Shutdowns and Internet Censorship

Internet shutdowns are increasingly used as a tool to suppress dissent, stifle freedom of expression, and restrict access to information and freedom of assembly and association. The #KeepItOn coalition documented at least 146 incidents of shutdowns in 37 countries in Africa between January 2016 and June 2023. These disruptions continue despite evidence that they harm individuals’ rights, are counterproductive for democracy, and have long-lasting impacts on national economies and individuals’ livelihoods.

A separate survey of 53 African countries shows that, as of 2023, the majority (44) had restrictions on political media, 34 had implemented social media restrictions, two restricted VPN use and seven restricted the use of messaging and Voice over IP applications.

Governments must commit to keeping the internet open and accessible, while telecom companies must uphold transparency and resist arbitrary shutdown orders. The recent Resolution 580 adopted by the African Commission on Human and Peoples’ Rights (ACHPR) should specifically guide governments in keeping the internet on, even during electoral periods.

Curb Unmitigated Surveillance  

The privacy of individuals while using digital technologies is critical to protecting freedom of expression, the right to privacy, and freedom of assembly and association. Unregulated surveillance practices threaten privacy and freedom of expression across Africa, often targeting journalists, activists, and political opponents. Governments must adopt robust data protection laws, ensure judicial oversight over surveillance, and implement transparency mechanisms to prevent abuse. In many countries, laws governing state surveillance have gaps that allow state institutions to target government critics or political opposition members by conducting surveillance without sufficient judicial, parliamentary, or other independent, transparent and accountable oversight.

Through research and training, CIPESA has highlighted the dangers of mass surveillance and supported the development of data protection frameworks. Our work with National Human Rights Institutions in countries like Ethiopia has strengthened their capacity to monitor and address surveillance abuses. 

Combat Disinformation  

The proliferation of disinformation is detrimental to citizens’ fundamental rights, including freedom of expression, access to information, freedom of assembly and association, and participation, especially in electoral democracy. It also means that many citizens lack access to impartial and diverse information. Disinformation undermines trust, polarises societies, and disrupts democratic processes. Combating it requires collaboration among governments, civil society, and the private sector on fact-checking, media literacy campaigns, and rights-respecting regulation.

Our extensive research on countering disinformation in Africa provides actionable recommendations for addressing this challenge. By partnering with media organisations, platforms, and fact-checking initiatives, CIPESA has promoted factual reporting and fought misinformation, particularly during elections.

Fight Technology-Facilitated Gender-Based Violence (TFGBV)

Online harassment and abuse disproportionately target women and marginalised groups, limiting their ability to engage freely in digital spaces. Governments, intermediaries, and civil society must collaborate to ensure safer online environments and provide support systems for victims. Also, African countries need clear laws against TFGBV, with attendant capacity development for the judiciary and law enforcers to implement those laws.

CIPESA continues to conduct workshops on addressing gender-based violence in digital spaces and supporting organisations working on these issues, equipping key actors with tools to report and counter this vice. Our advocacy efforts have also emphasised platform accountability and comprehensive anti-TFGBV policies. 

De-weaponise the Law

Emerging issues in the digital civic space, such as disinformation, misinformation, false news, national security and public order, have created opportunities for authoritarian governments to weaponise laws in the name of curbing “abuse” by citizens. Unfortunately, these laws are employed as repressive tools to curtail freedom of expression, access to information, and assembly and association online. Indeed, they have been used to gag the spaces within which these freedoms are enjoyed and to silence critics and dissenters. Governments should embark on a clear reform agenda to repeal all draconian legislation and enact progressive laws that align with established regional and international human rights standards.

As part of CIPESA’s efforts to expose civic space wrongs and manipulations through policy briefs and legal analyses, we enjoin partners, collaborators and other tech sector players to amplify voices calling out the misuse of information disorder laws, anti-cybercrime laws and other repressive legislation. Evidence-based advocacy of this kind can underpin successful challenges to unjust laws in courts, regional forums and human rights enforcement mechanisms, galvanising success across the continent.

Arrest the Digital Divide  

The digital divide remains a significant barrier to the enjoyment of rights and to inclusive citizen participation, with rural, underserved communities, and marginalised groups disproportionately affected. This divide excludes millions from accessing opportunities in education, healthcare, and economic participation. Common contributing factors include high internet usage costs, expensive digital devices, inadequate digital infrastructure and low digital literacy. Addressing this gap requires affordable internet, investment in rural connectivity, and digital literacy programmes.

CIPESA’s research sheds light on the main barriers to connectivity and affordability, including the effective use of Universal Service Funds. Promoting inclusive digital access, particularly for marginalised communities, requires collective action from governments and other tech sector players, calculated towards enabling equitable access to, and utilisation of digital tools.

Promote Multistakeholder Engagements  

The complexity of digital rights challenges necessitates continuous collaboration and partnership-building among governments, civil society, and private sector actors. CIPESA has facilitated multistakeholder dialogues that bring together diverse actors to address digital rights concerns, including national dialogues and the annual Forum on Internet Freedom in Africa (FIFAfrica). These engagements have led to actionable commitments from governments, civil society and other tech sector players, and strengthened partnerships for progressive reforms.


Last Word

CIPESA reaffirms its commitment to advancing digital rights for all across Africa. However, the challenges to meaningful enjoyment of digital rights and the advancement of digital democracy are myriad. The solutions lie in concerted efforts by various actors, including governments, the private sector, and civil society, all of whom must act now to protect digital rights for a better human rights future.

Fostering Responsible Business Conduct in Uganda’s Digital Age

By Patricia Ainembabazi |

The sixth edition of the Business and Human Rights Symposium in Uganda marked an essential step in Uganda’s journey to foster responsible and rights-respecting business conduct. Hosted on November 4-5, 2024, the symposium brought together over 200 participants from government, the private sector, academia, and civil society. It offered a platform to reflect on Uganda’s advancements in implementing its National Action Plan on Business and Human Rights and to consider newer frameworks such as the Corporate Sustainability Due Diligence Directive (CSDDD).

As part of the two-day proceedings, the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) hosted a panel discussion on the interplay between digital innovation and the protection of human rights, highlighting both successes and challenges in Uganda’s tech ecosystem. The panel discussed the United Nations (UN) Guiding Principles on Business and Human Rights and how they align with the technology sector.

Source: Business for Social Responsibility  

Highlighting Uganda’s growing technology sector, including increased mobile and internet penetration as well as digitalisation of private and public services, the session also spotlighted pressing concerns, such as internet disruptions, labour rights violations, gender discrimination, and data protection and privacy, which continue to challenge human rights protections in the country’s growing digital economy.

Joel Basoga, Head of Technology Practice at H&G Advocates, stated that it was “essential” for businesses in Uganda to embed respect for human rights as a core performance indicator, guided by the UN Guiding Principles on Business and Human Rights. He added that in a tech-driven business landscape, legal frameworks surrounding digital rights need to be prioritised.

According to Patricia Ainembabazi, a Project Officer at CIPESA, there was limited understanding of business and human rights in the technology sector. Platforms such as the symposium were crucial in building a thematic understanding of digital rights. 

In 2021, Uganda became the first African country to finalise a National Action Plan on Business and Human Rights (NAPBHR), based on the United Nations Guiding Principles on Business and Human Rights. The plan strengthens the government’s duty to protect human rights, enhances the corporate responsibility to respect human rights, and ensures access to remedies for victims of human rights violations and abuses resulting from non-compliance by business entities.

In October 2024, CIPESA joined the first meeting of the Multi-Sectoral Technical Committee on Business and Human Rights, which supports the Uganda labour ministry’s role of coordinating the National Action Plan and provides technical guidance on all business and human rights interventions. At that meeting, CIPESA made the case for mainstreaming digital rights in the implementation of the action plan and also urged stakeholders to leverage innovative technologies to improve the outcomes of the action plan.

Like the plans of other African countries, Uganda’s plan does not provide for digital rights protection, yet digital technologies have become central not only to how many businesses operate, but also to how individuals learn, work, socialise, and participate in community affairs. This increased digitalisation has had an impact on the ability of businesses to respect their human rights obligations.

Objectives of Uganda’s National Action Plan on Business and Human Rights
1. To strengthen institutional capacity, operations and coordination efforts of state and non-state actors for the protection and promotion of human rights in businesses;  
2. To promote human rights compliance and accountability by business actors;  
3. To promote social inclusion and rights of the vulnerable and marginalised individuals and groups in business operations;  
4. To promote meaningful and effective participation and respect for consent by relevant stakeholders in business operations; and  
5. To enhance access to remedy to victims of business-related human rights abuses and violations in business operations.

Speakers called for increased cross-sector collaboration among stakeholders to align national frameworks more closely with the UN Guiding Principles. Opportunities for intervention include a push for robust data protection and privacy safeguards by the private sector; affordability of the internet and related technologies to ensure access to digital spaces; and raising awareness of digital rights roles and responsibilities among consumers and business owners. The symposium called upon stakeholders such as telecommunication companies, Internet Service Providers (ISPs), financial institutions, innovators, and online platform operators to harmonise business goals with digital rights principles.

In line with the implementation of the NAPBHR, CIPESA is part of the newly launched Advancing Respect for Human Rights by Businesses in Uganda project led by Enabel and Uganda’s Ministry of Gender, Labour and Social Development. The project is part of the European Union’s support towards the implementation of Uganda’s National Action Plan on Business and Human Rights and focuses on three thematic areas: labour rights in the agricultural sector, natural resource governance and land, and digital rights and internet governance. The project will work with six civil society organisations to drive advocacy, dialogue, and actions that strengthen Uganda’s Business and Human Rights agenda. Additionally, 50 businesses will receive support to implement human rights due diligence aligned with national and international standards.

ACHPR 81st Ordinary Session: CIPESA and Partners Host Dialogue on Advocacy Against Internet Shutdowns

By Patricia Ainembabazi |

At the 81st Ordinary Session of the African Commission on Human and Peoples’ Rights (ACHPR), held in Banjul from October 17 to November 6, 2024, the Collaboration on International ICT Policy for East and Southern Africa (CIPESA), the International Center for Not-for-Profit Law (ICNL), and the Kenya Human Rights Commission (KHRC) hosted a side event centred on technology and electoral democracy in Africa. Drawing on the insights of various experts, the discussions delved into the impact of internet shutdowns on freedom of expression, access to information and business transactions, and their exclusion of vulnerable communities across the African continent. This is despite the growing role of technology as an enabler of democratic participation and increased transparency and accountability.

In March 2024, the ACHPR adopted Resolution 580, urging African states to refrain from imposing internet shutdowns, particularly during electoral periods. Eight months on, the resolution is yet to gain traction. Over 100 shutdowns have been documented in Africa since 2019, reflecting a worrying escalation in digital rights abuses. The cumulative effect of these shutdowns includes suppression of political discourse, economic losses, and an erosion of public trust in government institutions, ultimately infringing on fundamental rights to free and fair participation in the digital age.

“37% of Africa’s population has experienced internet disruptions in recent years”, Florence Nakazibwe, ICNL.

Thobekile Matimbe of Paradigm Initiative stated that national security as the rationale for internet shutdowns was disproportionate, and that courts, including that of the Economic Community of West African States (ECOWAS), had rejected the security argument in favour of protecting digital rights. Acknowledging disinformation as a growing threat on the continent and one of the common themes in national security justifications, Matimbe recommended targeted policies that balance free expression with countering disinformation, alongside digital literacy programmes to strengthen public trust and promote access to information during electoral periods.

According to Grace Wangechi from the Independent Medico-Legal Unit (IMLU), internet shutdowns also have social impacts. Citing the example of heightened risks and isolation for victims of violence who rely on online resources for support, Wangechi called for interventions pushing back against shutdowns to be centred around the unique needs of vulnerable and marginalised communities.

Martin Mavenjina from the Kenya Human Rights Commission added that documenting the social impact of internet shutdowns through case studies could support strategic litigation against shutdowns. Noting the precedent set by the ECOWAS Court in cases from Togo, Guinea and Nigeria, Mavenjina called for advocacy efforts to ensure such rulings “led to meaningful change”.

Other avenues put forward for pushing back against shutdowns included the development of toolkits for civil society anchored in ACHPR resolutions, and continued research and documentation to inform engagements with policymakers, regulators, and internet service providers. Where opportunities were available, advocates were also called upon to inform consultations and calls for input by national task forces on elections.

Discussions also featured the unpacking of a newly developed toolkit to support National Human Rights Institutions (NHRIs) in monitoring, documenting, and reporting on digital rights violations such as internet shutdowns. The side event builds on the efforts of CIPESA and partners to prioritise and spotlight digital rights issues as part of the ACHPR’s proceedings.

Confronting the Toll of Online Work on Women in Africa

By Ashnah Kalemera |

From domestic work, ride-hailing, content moderation, and delivery services to sex work, technology has revolutionised employment and labour across the world. According to a 2023 World Bank report, job postings in Africa on one of the largest digital labour platforms more than doubled between 2016 and 2020, and this demand on the continent is expected to grow over the coming years.

These new forms of labour and employment have generally advanced inclusion in the workforce and promoted economic empowerment. However, despite initiatives such as SheWorks! that are dedicated to engaging women in online work, and the promise of new skills, flexibility and income, the potential of women’s participation in digital work has not been fully realised. For instance, in South Africa, the share of women in online work (52%) is growing but is still lower than in similar occupations in the workforce at large (61%).

Documented barriers to African women’s participation in online work include the gender digital divide and uneven access to the internet and digital tools. Meanwhile, the wider challenges of digitalisation of work and labour, including the lack of social protections, job insecurity, unequal pay, unfair treatment, discrimination, bias, increased surveillance and lack of autonomy, are exacerbated for women.

In an effort to build narratives and movements on gender and labour online, the recently concluded Forum on Internet Freedom in Africa 2024 (FIFAfrica24) featured discussions on feminist futures of work that highlighted lived experiences and advocacy strategies.

Speaking at the Forum, Abigail Osiki, a Lecturer in the Department of Mercantile and Labour Law at the University of the Western Cape, South Africa, stated that fair and decent work for women online encompasses not only security, fair wages and productivity, but also mental health. The mental health challenges of online employment opportunities for women were said to be compounded by stigmas about certain forms of labour – such as sex work being considered prostitution and virtual assistants being “just secretaries”.

“The worst part about working for OnlyFans for me is the toll it takes on my mental health. For a long time I kind of swallowed my emotions and didn’t care what anyone thought because I was making so much money, but then it got to a point where I felt like I was selling my soul. I would often break down to my younger brother that I feel like I sold my soul.” – an Interview with an OnlyFans Worker.

The “uneven power” within online employment was also pointed out. Highlighting the example of non-disclosure agreements (NDAs), speakers at FIFAfrica24 argued that such agreements perpetuated “master-servant relationships” and “forced labour”, leaving many women with no option but to work “from a point of desperation as opposed to choice”.

A former content moderator for a social media platform narrated her recruitment as a language translator, signing of an NDA, and only finding out the scope of her work upon exposure to graphic content. She recalled the mental health side effects of the job and the inability to disclose the nature of her work to healthcare professionals for “fear of going to jail for 20 years” as stipulated in the NDA. She added that she worked in a foreign country without a work permit for a year, isolated in a hotel for six months and without leave days. Whereas she and colleagues were able to unionise, they had no legal support. The unionisation initially led to salary increments but their employment contracts were later terminated without benefits.

For African women in the informal sector, the situation was said to be even more dire. As part of her address in the opening ceremony, Catherine M’Seteka of the International Domestic Workers Federation (IDWF) argued that limited access to information and digital illiteracy had made it harder for domestic workers to mobilise or report common violations such as forced labour, exploitation and sexual harassment.

According to Siasa Place’s Angela Chukunzira, digitalisation has also had an impact on non-tech-based labour. She cited the example of online reviews in the hospitality sector and their impact on the rights of housekeeping workers – who are usually women.

“Marginalised workers are invisible in policy making,” said M’Seteka as she called for more platforms – both formal and informal – for multi-stakeholder engagement and advocacy on the digital economy.

M’Seteka’s and others’ calls echo recommendations in a policy brief on Labour and Digital Rights in Africa, which emphasised the need to strengthen the legal recognition of online workers to ensure their safety and welfare, alongside efforts to foster innovation and economic growth that overcome inequalities, bias and discrimination.

In envisioning a future of work from a feminist perspective, Osiki stated that advocacy and policy interventions must consider women in the digital workforce as heterogeneous – of different cultures and contexts and involved in different types of work. That way, regulation of uneven power relations and collective bargaining efforts would articulate varied interests to avoid exploitation. Priorities put forward for collective bargaining were equal pay, contract transparency, and protection against harassment and exploitation, alongside career mobility and progression as well as health and safety. “All these [interests] vary for freelancers, domestic workers, location-based service providers and content moderators,” said Osiki.

African states were urged not to politicise technology-based jobs as a solution to the continent’s unemployment and poverty crisis. Rather, they should negotiate partnerships and equitable regulation with a view to increasing tax revenue to enable the provision of social and welfare protections for citizens.

For the wider community – users, activists, media, the legal fraternity and civil society organisations – there were calls for solidarity, such as through strategic litigation and establishing communities of care to support women in digital workspaces. This, together with efforts to promote cultural and language sensitivity around some forms of employment, would go a long way in overcoming derogatory and biased narratives in society.