Shifting the Burden: Online Violence Against Women

By Evelyn Lirri |

Across Africa, the use of Information and Communications Technology (ICT) by women and girls remains low. Yet despite this limited access to digital tools, women, particularly those in public and political spaces such as human rights defenders (HRDs), bloggers, and journalists, continue to be the primary targets of various forms of online violence, including cyberstalking, sexual harassment, trolling, body shaming and blackmail.

 According to a 2021 global survey by UNESCO, nearly three-quarters of female journalists have experienced online harassment in the course of their work, forcing many to self-censor. Furthermore, a 2020 report by UN Women found that women in politics and the media were more likely to be victims of technology-based violence as a consequence of their work and public profiles.

Over the years, the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) has documented and pursued interventions aimed at addressing the significant obstacles hindering an increase in women’s participation not only in online spaces but also in the political sphere. A concerning and recurring trend is that responses to violence against women have often prioritised an individual’s responsibility for self-protection rather than systemic or policy actions.

At the Forum on Internet Freedom in Africa 2023 (FIFAfrica23), the National Democratic Institute (NDI), Pollicy, AfricTivistes, the Women of Uganda Network (WOUGNET), Internews and the Solidarity Centre shared lessons learned from their work implementing multi-stakeholder interventions to address online violence against women. During a panel discussion, it was noted that multi-stakeholder interventions involving governments, civil society, technology platforms and the media are critical to promoting safe and meaningful participation of women in online spaces. Internews and WOUGNET highlighted their joint work under the FemTech project in various African countries, which aims to empower women human rights defenders to participate safely in digital spaces while promoting equitable access to technology. Through training of women human rights defenders, CSOs, policy makers and law enforcement officers, the project is raising awareness of how women are often impacted by cybercrime legislation.

In Senegal, AfricTivistes, a network organisation made up of journalists, bloggers and HRDs, has spearheaded public advocacy campaigns on responsible use of the internet. The organisation has conducted gender-inclusive training and capacity-building workshops for journalists, bloggers, public officials and political leaders on how to respond to cyber violence. Aisha Dabo, a Programme Coordinator at AfricTivistes, noted that since 2017, over 700 people in 15 African countries have been reached with these trainings. The organisation also conducts media monitoring of online violence on social media platforms. 

Sarah Moulton, NDI’s Deputy Director for Democracy and Technology, highlighted the negative impact that online violence continues to have on women who are actively engaged in politics and political spaces. In Uganda, for instance, a joint report by Pollicy and NDI documented cases of gender-based online violence during the 2021 general elections and found that women and men politicians experienced online violence differently, with women candidates more likely to be trolled and body shamed while men were more likely to experience hate speech. This echoed research by CIPESA which analysed the gender dynamics of politics in online spaces in Uganda, including campaigns for presidential, parliamentary, mayoral, and other local government seats during the same elections. The CIPESA research also explored the legal landscape and, like Pollicy and NDI, found that although Uganda has enacted a number of laws aimed at improving digital access and rights, such as the Computer Misuse Act 2011, the Anti-Pornography Act 2014, and the Excise Duty (Amendment) Act 2018, most do not address the gender dynamics of the internet, such as targeted online gender-based violence, affordability, and the lack of digital skills among women.

Like AfricTivistes, NDI has engaged in a number of campaigns to document these various forms of violence and make recommendations to address the problem. In 2022, it released a list of interventions that could be adopted globally by technology platforms, governments, civil society and the media to mitigate the impact of online violence against women in politics and hold perpetrators to account.

“Often, the expectation is that the individual is responsible for addressing the issue or for advocating on behalf of themselves. It really needs to involve a lot of actors,” said Moulton. 

For its part, the Solidarity Centre has been spearheading a global campaign to end gender-based violence and harassment in the world of work. With the advent of Covid-19, a growing number of women shifted online for employment opportunities, access to services and education, among other needs. It was highlighted that female platform workers, including influencers, content creators and women who run online retail businesses, continue to face various violations such as sexual harassment and cyberbullying.

Panelists called on governments to ratify the International Labour Organisation (ILO) Convention No. 190 on violence and harassment in the world of work. This global treaty recognises the impact of domestic violence in the workplace, and how women are often disproportionately affected.  Currently, the convention has been ratified by 32 countries globally, of which only eight are African.

Journalists attending FIFAfrica23 also shared their encounters with online violence and called for regular digital safety and literacy training to stay safe online. Alongside the need for enhanced digital literacy, participants noted the lack of effective mechanisms for reporting cases. Ultimately, it was noted that efforts to shift the burden of blame away from victims of online violence against women in Africa need to be more actively pursued, alongside more actionable, collaborative and systemic interventions by governments, law enforcement, and platforms.

CIPESA Conducts Digital Rights Training for Ethiopian Human Rights Commission Staff

By CIPESA Writer |

The Collaboration on International ICT Policy for East and Southern Africa (CIPESA) has conducted a digital rights training for staff of the Ethiopian Human Rights Commission (EHRC) in a programme that benefitted 22 experts from various departments of the statutory entity. 

The training responded to the commission’s desire to build its organisational capacity in understanding and defending digital rights, and to CIPESA’s vision of growing the ability of African national human rights institutions (NHRIs) to monitor, protect and promote digital freedoms.

Conducted in the Ethiopian capital Addis Ababa on October 11-12, 2023, the programme aimed to build the EHRC staff’s understanding of digital rights issues and the link with traditional rights. Participants went on to brainstorm how the EHRC should strengthen human rights protection in the digital space and through the use of technology.

Dr. Abdi Jibril, the Commissioner for Civil, Political and Socio-Economic Rights at the EHRC, noted that the proliferation of digital technology has contributed positively to human rights protection. It was therefore necessary to maximise the benefits of digital technology and to expand its usage for the promotion and enforcement of human rights.

The importance of growing the capacity of NHRIs was underscored by Line Gamrath Rasmussen, Senior Adviser, Human Rights, Tech and Business at the Danish Institute for Human Rights, and CIPESA Executive Director Dr. Wairagala Wakabi. African NHRIs are not always well versed in the opportunities and challenges which technology presents, which creates a need for capacity development and for partnerships with stakeholders such as civil society.

As legislation governing the technology domain is fast-evolving, NHRIs in many countries are playing catch-up. These institutions therefore need to keep abreast of new legislation and its implications for human rights in the digital domain, and to enhance their capacity to document, investigate and report on digital rights.

The NHRIs also need to pay specific attention to vices such as hate speech, disinformation, and technology-facilitated gender-based violence (TFGBV) that are perpetuated with the aid of technology.

Dr. Daniel Bekele, the Chief Commissioner at EHRC, stated that social media companies and messaging platforms are not doing enough to moderate harmful content in Africa, yet in other regions they have invested more resources and effort in content moderation. He said African countries need to work together to build a strong force against the powerful platforms. He proposed that the African Union (AU), working with relevant governments and other stakeholders, should spearhead the development of regulations which African countries can jointly use in their engagements with the tech giants on issues such as content moderation.

 The two-day training discussed the positive and negative effects of digital technology on human rights and how the commission’s work to enforce human rights can be strengthened through the use of digital technology.

Among other topics, the training also addressed the human rights, governance and technology landscape in Sub-Saharan Africa; public and private sector digitalisation and challenges for human rights; the link between online and offline rights; transparency and accountability of the private sector in upholding human rights; and opportunities for NHRIs to advance online rights at national, regional and international levels. It also featured deep dives into key digital rights concerns such as surveillance, online violence against women, disinformation, and network disruptions. 

At the end of the training, the EHRC staff identified key actions the commission could integrate into its annual work plans, such as digital rights monitoring, advocacy for the enactment of enabling laws, and developing tools for following up on the implementation of recommendations on digital rights by treaty bodies and the Human Rights Council. Others were collaboration with local and regional actors including media, fact-checkers, civil society organisations, and platforms; working with the police and other national mechanisms to tackle hate speech and disinformation while protecting human rights; and conducting digital literacy programmes.

Trainers in the programme were drawn from CIPESA, the Centre for the Advancement of Rights and Democracy (CARD), the Danish Institute for Human Rights, the Centre for International Private Enterprise (CIPE), the African Centre for Media Excellence (ACME), Inform Africa, and the Kenya National Cohesion and Integration Commission (NCIC).

Meanwhile, after the aforementioned training, CIPESA teamed up with Ethiopian civil society partners to conduct a training on disinformation and hate speech for journalists, bloggers and digital rights activists. Like many African countries, Ethiopia is grappling with a significant and alarming rise in hate speech and disinformation, particularly on social media platforms. This surge in disinformation is undermining social cohesion, promoting conflict, and leading to a concerning number of threats against journalists and human rights defenders.

The proliferation of disinformation is a threat to citizens’ fundamental rights, as studies have shown that many Ethiopians feel their right to freedom of expression is compromised. The prevalence of disinformation also means that many Ethiopians lack access to impartial and diverse information.

Disinformation has been directly fueling conflict in several regions of Ethiopia. According to workshop participants and reports, both pro-government and anti-government actors have perpetuated this vice, whose real-world consequences are severe, including the loss of life and large-scale violent events.

Whereas Ethiopia in 2020 enacted legislation to curb hate speech and disinformation, the effectiveness of this law has been called into question. Some critics argue that it has not been effectively implemented and could be used to undermine citizens’ rights.

The training equipped 21 journalists, bloggers and activists with knowledge to navigate this law and with skills to call out and fight disinformation and hate speech. The efforts of the trained journalists, and those which the human rights commission could implement, are expected to boost the fight against online harms and contribute to the advancement of digital rights in Ethiopia.

Social Media 4 Peace: An Initiative to Tackle the Quagmire of Content Moderation

By Juliet Nanfuka |

Content moderation has emerged as a critical global concern in the digital age. In Africa, increasing digital penetration and digitalisation, coupled with problematic platform responses to harmful content and retrogressive national legislation, have pushed the demand for robust and rights-respecting content moderation to a new level of urgency.

Online content platforms and governments are increasingly getting caught in a contentious struggle on how to address content moderation, especially as online hate speech and disinformation become more pervasive. These vices regularly seep into the real world, undermining human rights, social cohesion, democracy, and peace. They have also corroded public discourse and fragmented societies, with marginalised communities often bearing some of the gravest  consequences. 

The Social Media 4 Peace (SM4P) project run by the United Nations Educational, Scientific and Cultural Organization (UNESCO) was established to strengthen the resilience of societies to potentially harmful content spread online, in particular hate speech inciting violence, while protecting freedom of expression and enhancing the promotion of peace through digital technologies, notably social media. The project was piloted in four countries – Bosnia and Herzegovina, Colombia, Indonesia, and Kenya.

At the Forum on Internet Freedom in Africa (FIFAfrica) held in Tanzania in September 2023, UNESCO hosted a session where panellists interrogated the role of a multi-stakeholder coalition in addressing gaps in content moderation with a focus on Kenya. The session highlighted the importance of multi-stakeholder cooperation, accountability models, and safety by design to address online harmful content, particularly disinformation and hate speech in Africa. 

In March 2023, as a product of the SM4P, UNESCO in partnership with the National Cohesion and Integration Commission (NCIC) launched the National Coalition on Freedom of Expression and Content Moderation in Kenya. The formation of the coalition, whose membership has grown to include various governmental, civil society and private sector entities, is a testament to the need for content moderation efforts based on broader multi-stakeholder collaboration. 

As such, the coalition provided a learning model for participants at FIFAfrica, among them legislators, regulators, civil society activists and policy makers from countries grappling with establishing effective and rights-based content moderation mechanisms. The session explored good practices from the SM4P project in Kenya for possible replication of the coalition model across African countries. Discussions largely centred on content moderation challenges and opportunities for addressing them in the region.

Online content moderation has presented a new regulatory challenge for governments and technology companies. Striking a balance between safeguarding freedom of expression and curtailing harmful and illegal content has presented a challenge especially as such decisions have largely fallen into the remit of platforms, and state actors have often criticised platforms’ content moderation practices. 

This has resulted in governments and platforms being at loggerheads, as was witnessed during the 2021 Uganda elections. Six days before the election, Meta blocked various accounts and removed content for what it termed “coordinated inauthentic behavior”. Most of the accounts affected were related to pro-ruling party narratives or had links to the ruling party. In response, the Ugandan government blocked social media before blocking access to the entire internet. Nigeria similarly suspended Twitter in June 2021 for deleting a post made from the president’s account.

Social media companies are taking various measures to curb such content, including by using artificial intelligence tools, employing human content moderators, collaborating with fact-checking organisations and trusted partner organisations that identify false and harmful content, and relying on user reports of harmful and illegal content. However, these measures by the platforms are often criticised as inadequate. 
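The mix of automated and human review described above can be pictured as a simple triage process. The sketch below is purely illustrative: the classifier score, thresholds, function names and keyword heuristic are all invented for the example and do not reflect how any particular platform actually moderates content.

```python
# Illustrative sketch (not any platform's actual system) of combining an
# automated "harmfulness" score with user reports to triage content:
# auto-action clear violations, queue borderline items for human review,
# and leave the rest up. All names and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    text: str
    user_reports: int  # number of times users flagged this post

def classifier_score(post: Post) -> float:
    """Stand-in for an ML model returning a 0-1 'likely harmful' score."""
    flagged_terms = {"kill", "attack"}  # placeholder keyword heuristic
    hits = sum(term in post.text.lower() for term in flagged_terms)
    return min(1.0, 0.4 * hits)

def triage(post: Post, remove_at: float = 0.9, review_at: float = 0.5) -> str:
    score = classifier_score(post)
    # User reports lower the bar for sending content to human review.
    if post.user_reports >= 3:
        review_at = min(review_at, 0.3)
    if score >= remove_at:
        return "auto_remove"          # clear-cut violation, actioned automatically
    if score >= review_at or post.user_reports >= 5:
        return "human_review_queue"   # borderline or heavily reported content
    return "keep_up"

if __name__ == "__main__":
    sample = Post("p1", "They should attack the rally", user_reports=4)
    print(triage(sample))  # -> human_review_queue
```

In practice, the contested questions are precisely where such thresholds sit, who reviews the borderline queue, and in which languages the automated scoring actually works, which is why critics view these measures as inadequate.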

On the other hand, some African governments have also been criticised for enacting laws and issuing directives that undermine citizens’ fundamental rights under the guise of combating harmful content.

Indeed, speaking at the 2023 edition of FIFAfrica, Felicia Anthonio, #KeepItOn Campaign Manager at Access Now, stated that weaknesses in content moderation are often cited by governments that shut down the internet. She noted that governments justify shutting down internet access as a tool to control harmful content amidst concerns of hate speech, disinformation, and incitement to violence. This was reaffirmed by Thobekile Matimbe from Paradigm Initiative, who noted that “content moderation is a delicate test of speech”, stressing that if content moderation is not balanced against freedom of expression and access to information, it will result in violations of fundamental human rights.

Beyond governments and social media companies, other stakeholders need to step up efforts to combat harmful content and advocate for balanced content moderation policies. Speakers at the FIFAfrica session were unanimous that civil society organisations, academic institutions, and media regulators need to enhance digital media literacy and increase collaborative efforts. Further, they stressed the need for these stakeholders to regularly conduct research to underpin evidence-based advocacy, and to support the development of human rights-centred strategies in content moderation.

Effects of Disinformation on the Digital Civic Space Spotlighted at the African Commission

By CIPESA Writer |

The effects of disinformation on the digital civic space were put in the spotlight at the 77th Ordinary Session of the African Commission on Human and Peoples’ Rights held in Arusha, Tanzania, on October 16-18, 2023.

In a panel session titled “Promoting rights-respecting government responses to disinformation in Sub-Saharan Africa”, speakers explored how disinformation affects online rights and freedoms, including freedom of expression, access to information, freedom of assembly and association, and participation, especially in electoral democracy. Speakers at the session, which was part of the Non-Governmental Organisations (NGOs) Forum, were drawn from Global Partners Digital, the Collaboration on International ICT Policy for East and Southern Africa (CIPESA), ARTICLE 19 Senegal/West Africa, PROTEGE QV of Cameroon, and the Centre for Human Rights at the University of Pretoria.

Hlengiwe Dube of the Centre for Human Rights explored the general terrain of disinformation in Africa, including the steadily evolving information disorder. She also highlighted the LEXOTA disinformation tracker, created by a project led by Global Partners Digital with several African partners, which is intended to ensure that limitations and controls on freedom of expression and access to information, as well as assembly and association, are minimised.

Sheetal Kumar, the Head of Engagement and Advocacy at Global Partners Digital, said the tracker is an essential tool for exploring how laws and government actions against disinformation impact freedom of expression across Sub-Saharan Africa.  The tracker is an interactive platform that allows for real time checking and comparison of laws and actions taken in 44 out of 55 African countries in response to disinformation. It provides a reference point for developments and trends.
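As a purely hypothetical illustration of the kind of cross-country comparison such a tracker supports, the sketch below builds a tiny, invented dataset of country entries and filters it by simple criteria; none of the data, field names or functions come from the actual LEXOTA platform or any real API.

```python
# Hypothetical, simplified illustration of country-by-country comparison of
# disinformation laws. The data below is invented for illustration and is
# NOT taken from the actual LEXOTA dataset.
from dataclasses import dataclass

@dataclass
class CountryEntry:
    country: str
    has_disinformation_law: bool
    criminal_penalties: bool
    last_updated: str  # ISO date

entries = [
    CountryEntry("Country A", True, True, "2023-05-01"),
    CountryEntry("Country B", True, False, "2023-08-15"),
    CountryEntry("Country C", False, False, "2022-11-30"),
]

def compare(entries, require_criminal_penalties=False):
    """Return countries whose laws match the selected criteria."""
    return [
        e.country for e in entries
        if e.has_disinformation_law
        and (e.criminal_penalties or not require_criminal_penalties)
    ]

print(compare(entries))                                   # ['Country A', 'Country B']
print(compare(entries, require_criminal_penalties=True))  # ['Country A']
```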

Edrine Wanyama, a Legal Officer at CIPESA, observed that disinformation has been widely employed by governments in Burkina Faso, Cameroon, Ethiopia, Kenya, Nigeria, South Africa, Tanzania, and Uganda as an excuse to enact laws and adopt regulations and policies that curtail the digital civic space. As a result, access to the internet, access to information, freedom of expression, assembly and association, and citizen participation in electoral democracy have been widely limited. Wanyama said that, as noted in the CIPESA research on Disinformation Pathways and Effects: Case Studies from Five African Countries, internet shutdowns during elections, such as in Tanzania and Uganda, were partly justified as a measure against disinformation but led to questions about the credibility of the elections.

While discussing the advocacy initiatives undertaken by the project, Sylvie Siyam, Director at PROTEGE QV, noted that during the Covid-19 pandemic some governments introduced measures to combat disinformation that contravene regional and international human rights standards. She said some of those measures remain in place and continue to be used to curtail freedom of expression, access to information, and assembly and association.

She called for multi-stakeholder engagement, especially involving CSOs, parliaments, and relevant government entities, to pursue progressive policy reforms, as was witnessed with the adoption of the access to information law in Zimbabwe.

Most of the strategies employed by states to combat disinformation largely interfere with civil liberties. Laws and policies are often utilised to limit the space within which key players such as law dons, political dissidents, human rights defenders, journalists and online activists operate. The impact has been widely felt through increased arrests, denial of fair trial rights, denial of participation in electoral democracy, censorship of the press, curtailment of freedom of expression and access to information, and limits on the enjoyment of economic freedoms.

Alfred Bulakali, Deputy Regional Director of ARTICLE 19 Senegal/West Africa, observed that disinformation endangers  civic space given the regressive measures that states often take, such as the enactment and adoption of retrogressive legislation. He called on states to use human rights-based approaches when responding to disinformation as a means to safeguarding civil liberties. Bulakali also stressed the need for capacity building of CSOs to effectively challenge regressive and draconian laws. 

The five partners provided the following joint recommendations for inclusion in the NGOs Statement to the African Commission on Human and Peoples’ Rights (ACHPR) 77th Ordinary Session.

Recommendations for States:

  1. Review and revise disinformation laws to align with international and regional human rights law and standards, eliminating general prohibitions on vague and ambiguous information dissemination. Ensure they have a narrow scope, adequate safeguards, and cannot be weaponised against journalists and human rights defenders. Review punitive measures, repeal laws criminalising sedition and defamation in favour of civil sanctions, and ensure compliance with international human rights laws.
  2. Develop and implement laws that combat disinformation openly, inclusively, and transparently, consulting with stakeholders. Train relevant authorities on regulations without infringing human rights, clearly communicate penalties, and build safeguards against misuse.
  3. Build the capacity of relevant actors to address disinformation in compliance with international standards. This includes addressing disinformation with multi-stakeholder and multi-disciplinary solutions, including media literacy training, empowering fact-checkers, journalists, legislators, and regulators, taking into account vulnerable and marginalised groups, in compliance with international standards. 
  4. Conduct awareness-raising programmes on the information disorder.
  5. Desist from resorting to disproportionate measures that violate human rights like internet shutdowns or website blockages in response to disinformation. 
  6. Enact and enforce access to information laws with proactive disclosure of credible and accurate information.
  7. Create a conducive environment that promotes healthy information ecosystems and ensures that citizens have access to diverse, reliable information sources, either proactively or upon request, in line with international human rights standards on access to information.
  8. Fully enforce decisions and frameworks on the decriminalisation of defamation and press libel, and refrain from using laws to repress speech and the media for information disclosure under vague provisions relating to false news.
  9. Integrate Information and Media Literacy into the curricula of journalism training centres and schools.
  10. Train law enforcement actors on public information disclosure, the protection of freedom of expression in their approach to tackling disinformation and the prevention of public and political propaganda and information manipulation.

Recommendations for Civil Society Organisations:

  1. Monitor, document, and raise awareness of illegitimate detentions or imprisonments related to disinformation charges.
  2. Strengthen advocacy and capacity-building initiatives that support legal reforms towards rights-respecting legislation and policies for tackling disinformation.
  3. Include digital and media literacy in advocacy initiatives.

Recommendations for Regional and International Bodies:

  1. Issue clear guidance on how states should develop and enforce disinformation legislation in a rights-respecting manner, including through open, inclusive, and transparent policy processes and multi-stakeholder consultations.
  2. Denounce the use of disinformation laws for political purposes or to restrict the work of journalists and legitimate actors.
  3. Integrate information disorder as a priority in human rights, rule of law, democracy and governance frameworks under development cooperation (bilateral and multilateral cooperation) and access to information as a tool to achieve accountability on public governance and the Sustainable Development Agenda. 

Additional Recommendations to the African Commission Special Rapporteur on Freedom of Expression and Access to Information and other African Commission Special Mechanisms:

  1. Collaborate with stakeholders to address the information disorder in Africa.
  2. Promote the 2019 Declaration of Principles on Freedom of Expression and Access to Information in Africa for addressing the information disorder.
  3. Continuously monitor and document disinformation trends and expand the normative framework to combat disinformation.
  4. Organise country visits in member countries where disinformation laws and press libels are used to restrict speech and citizen engagement.

Digital Rights Hub of African Civil Society Organisations

By Edrine Wanyama |

Since 2016, the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) has been partnering with the International Center for Not-for-Profit Law (ICNL) to improve African digital civic spaces. 

At the Forum on Internet Freedom in Africa (FIFAfrica) held in September 2023, CIPESA and ICNL convened a digital rights hub in Dar es Salaam, Tanzania, aimed at promoting the digital civic space in Africa. The hub brought together representatives of civil society organisations (CSOs) from 10 African countries.

Across the continent, there is increased demand for democratic rule, yet civic spaces continue to be undermined by state autocracy, which still prevails in at least half of the continent. Rights and freedoms such as assembly, association, access to information and data privacy in the online space continue to be curtailed. This is despite the 2016 UN Human Rights Council resolution that called for the rights people enjoy offline to be protected in equal measure online.

The hub held discussions on the digital civic space, its importance to CSOs, and internet infrastructure governance.

Further insights were drawn from developments in artificial intelligence, surveillance, privacy rights, network disruptions, online content moderation, and the burgeoning concerns about disinformation and its impact on the digital society.

The hub concluded by defining the role of CSOs in protecting the digital civic space through effective advocacy strategies. These include litigation, legal analysis, engagement in the law-making process, capacity building of key stakeholders including parliamentarians, and the use of human rights monitoring mechanisms such as the United Nations Human Rights Council’s Universal Periodic Review and Special Rapporteurs, as well as the monitoring mechanisms of the African Commission on Human and Peoples’ Rights.

The statement emerging from the convening can be accessed here.