Applications are Open for a New Round of Africa Digital Rights Funding!

Announcement |

The Collaboration on International ICT Policy for East and Southern Africa (CIPESA) is calling for proposals to support digital rights work across Africa.

This is the 10th call for proposals under the CIPESA-run Africa Digital Rights Fund (ADRF), an initiative that provides rapid-response and flexible grants to organisations and networks to implement activities that promote digital rights and digital democracy, including advocacy, litigation, research, policy analysis, skills development, and movement building.

The current call is particularly interested in proposals for work related to:

  • Data governance including aspects of data localisation, cross-border data flows, biometric databases, and digital ID.
  • Digital resilience for human rights defenders, other activists and journalists.
  • Censorship and network disruptions.
  • Digital economy.
  • Digital inclusion, including aspects of accessibility for persons with disabilities.
  • Disinformation and related digital harms.
  • Technology-Facilitated Gender-Based Violence (TFGBV).
  • Platform accountability and content moderation.
  • Implications of Artificial Intelligence (AI).
  • Digital Public Infrastructure (DPI).

Grant amounts available range between USD 5,000 and USD 25,000 per applicant, depending on the need and scope of the proposed intervention. Cost-sharing is strongly encouraged, and the grant period should not exceed eight months. Applications will be accepted until November 17, 2025. 

Since its launch in April 2019, the ADRF has provided initiatives across Africa with more than one million US Dollars and contributed to building capacity and traction for digital rights advocacy on the continent.  

Application Guidelines

Geographical Coverage

The ADRF is open to organisations/networks based or operational in Africa and with interventions covering any country on the continent.

Size of Grants

Grant size shall range from USD 5,000 to USD 25,000. Cost sharing is strongly encouraged.

Eligible Activities

The activities that are eligible for funding are those that protect and advance digital rights and digital democracy. These may include but are not limited to research, advocacy, engagement in policy processes, litigation, digital literacy and digital security skills building. 

Duration

The grant funding shall be for a period not exceeding eight months.

Eligibility Requirements

  • The Fund is open to organisations and coalitions working to advance digital rights and digital democracy in Africa. This includes but is not limited to human rights defenders, media, activists, think tanks, legal aid groups, and tech hubs. Entities working on women’s rights, or with youth, refugees, persons with disabilities, and other marginalised groups are strongly encouraged to apply.
  • The initiatives to be funded will preferably have formal registration in an African country, but in some circumstances, organisations and coalitions that do not have formal registration may be considered. Such organisations need to show evidence that they are operational in a particular African country or countries.
  • The activities to be funded must be implemented in, or focused on, an African country or countries.

Ineligible Activities

  • The Fund shall not fund any activity that does not directly advance digital rights or digital democracy.
  • The Fund will not support travel to attend conferences or workshops, except in exceptional circumstances where such travel is directly linked to an activity that is eligible.
  • Costs that have already been incurred are ineligible.
  • The Fund shall not provide scholarships.
  • The Fund shall not support equipment or asset acquisition.

Administration

The Fund is administered by CIPESA. An internal and external panel of experts will make decisions on beneficiaries based on the following criteria:

  • Fit of the proposed intervention within the Fund’s digital rights priorities.
  • Relevance to the given context/country.
  • Commitment and experience of the applicant in advancing digital rights and digital democracy.
  • Potential impact of the intervention on digital rights and digital democracy policies or practices.

The deadline for submissions is Monday, November 17, 2025. The application form can be accessed here.

Democratising Big Tech: Lessons from South Africa’s 2024 Election

By Jean-Andre Deenik | ADRF

South Africa’s seventh democratic elections in May 2024 marked a critical turning point — not just in the political sphere, but in the digital one too. For the first time in our democracy’s history, the information space surrounding an election was shaped more by algorithms, platforms, and private tech corporations than by public broadcasters or community mobilisation.

We have entered an era where the ballot box is not the only battleground for democracy. The online world — fast-moving, largely unregulated, and increasingly dominated by profit-driven platforms — has become central to how citizens access information, express themselves, and participate politically.

At the Legal Resources Centre (LRC), we knew we could not stand by as these forces influenced the lives, choices, and rights of South Africans — particularly those already navigating inequality and exclusion. Between May 2024 and April 2025, with support from the Africa Digital Rights Fund (ADRF), we implemented the Democratising Big Tech project: an ambitious effort to expose the harms of unregulated digital platforms during elections and advocate for transparency, accountability, and justice in the digital age.

Why This Work Mattered

The stakes were high. In the run-up to the elections, political content flooded platforms like Facebook, YouTube, TikTok, and X (formerly Twitter). Some of it was civic-minded and constructive — but much of it was misleading, inflammatory, and harmful.

Our concern wasn’t theoretical. We had already seen how digital platforms contributed to offline violence during the July 2021 unrest, and how coordinated disinformation campaigns were used to sow fear and confusion. Communities already marginalised — migrants, sexual minorities, women — bore the brunt of online abuse and harassment.

South Africa’s Constitution guarantees freedom of expression, dignity, and access to information. Yet these rights are being routinely undermined by algorithmic systems and opaque moderation policies, most of which are designed and governed far beyond our borders. Our project set out to change that.

Centering People: A Public Education Campaign

The project was rooted in a simple truth: rights mean little if people don’t know they have them — or don’t know when they’re being violated. One of our first goals was to build public awareness around digital harms and the broader human rights implications of tech platforms during the elections.

We launched Legal Resources Radio, a podcast series designed to unpack the real-world impact of technologies like political microtargeting, surveillance, and facial recognition. Our guests — journalists, legal experts, academics, and activists — helped translate technical concepts into grounded, urgent conversations.

We spoke to:

Alongside the podcasts, we used Instagram to host

Holding Big Tech to Account

A cornerstone of the project was our collaboration with Global Witness, Mozilla, and the Centre for Intellectual Property and Information Technology Law (CIPIT). Together, we set out to test whether major tech companies (TikTok, YouTube, Facebook, and X) were prepared to protect the integrity of South Africa’s 2024 elections. To do this, we designed and submitted controlled test advertisements that mimicked real-world harmful narratives, including xenophobia, gender-based disinformation, and incitement to violence. These ads were submitted in multiple South African languages to assess whether the platforms’ content moderation systems, both automated and human, could detect and block them. The findings revealed critical gaps in platform preparedness and informed both advocacy and public awareness efforts ahead of the elections.

The results were alarming.

  • Simulated ads with xenophobic content were approved in multiple South African languages;
  • Gender-based harassment ads directed at women journalists were not removed;
  • False information about voting — including the wrong election date and processes — was accepted by TikTok and YouTube.

These findings confirmed what many civil society organisations have long argued: that Big Tech neglects the Global South, failing to invest in local language moderation, culturally relevant policies, or meaningful community engagement. These failures are not just technical oversights. They endanger lives, and they undermine the legitimacy of our democratic processes.

Building an Evidence Base for Reform

Beyond exposing platform failures, we also produced a shadow human rights impact assessment. This report examined how misinformation, hate speech, and algorithmic discrimination disproportionately affect marginalised communities. It documented how online disinformation isn’t simply digital noise — it often translates into real-world harm, from lost trust in electoral systems to threats of violence and intimidation.

We scrutinised South Africa’s legal and policy frameworks and found them severely lacking. Despite the importance of online information ecosystems, there are no clear laws regulating how tech companies should act in our context. Our report recommends:

  • Legal obligations for platforms to publish election transparency reports;
  • Stronger data protection and algorithmic transparency;
  • Content moderation strategies inclusive of all South African languages and communities;
  • Independent oversight mechanisms and civil society input.

This work is part of a longer-term vision: to ensure that South Africa’s digital future is rights-based, inclusive, and democratic.

Continental Solidarity

In April 2025, we took this work to Lusaka, Zambia, where we presented at the Digital Rights and Inclusion Forum (DRIF) 2025. We shared lessons from South Africa and connected with allies across the continent who are also working to make technology accountable to the people it impacts.

What became clear is that while platforms may ignore us individually, there is power in regional solidarity. From Kenya to Nigeria, Senegal to Zambia, African civil society is uniting around a shared demand: that digital technology must serve the public good — not profit at the cost of people’s rights.

What Comes Next?

South Africa’s 2024 elections have come and gone. But the challenges we exposed remain. The online harms we documented did not begin with the elections, and they will not end with them.

That’s why we see the Democratising Big Tech project not as a one-off intervention, but as the beginning of a sustained push for digital justice. We will continue to build coalitions, push for regulatory reform, and educate the public. We will work with journalists, technologists, and communities to resist surveillance, expose disinformation, and uphold our rights online.

Because the fight for democracy doesn’t end at the polls. It must also be fought — and won — in the digital spaces where power is increasingly wielded, often without scrutiny or consequence.

Final Reflections

At the LRC, we do not believe in technology for technology’s sake. We believe in justice — and that means challenging any system, digital or otherwise, that puts people at risk or threatens their rights. Through this project, we’ve seen what’s possible when civil society speaks with clarity, courage, and conviction.

The algorithms may be powerful. But our Constitution, our communities, and our collective will are stronger.

Amplifying African Voices in Global Digital Governance

By CIPESA Writer |

The Collaboration on International ICT Policy for East and Southern Africa (CIPESA) will be participating in the 2025 Internet Governance Forum (IGF) in Norway. The IGF serves as a key global multistakeholder platform that facilitates the discussion of public policy issues pertaining to the Internet. This year, the Forum takes place from June 23-27, 2025 in Lillestrom, Norway, under the overarching theme of Building Our Multistakeholder Digital Future.

CIPESA will contribute expertise across multiple sessions that examine digital rights in the Global South. These include discussions on repressive cyber laws and their impact on civic space, inclusive and harmonised data governance frameworks for Africa, and the barriers to participation in global technical standards development. CIPESA will also join sessions highlighting cross-regional cooperation on data governance, digital inclusion of marginalised groups, and the need for multilingual accessibility in global digital processes. In addition, CIPESA will support booth #57, hosted by the Civil Society Alliances for Digital Empowerment (CADE), of which it is a member. The booth will showcase activities and resources, including the winners of the AI Artivism for Digital Rights Competition, the Youth Voices for Digital Rights programme, and much more. Through these engagements, CIPESA will enhance and amplify African perspectives on platform accountability, digital justice, and rights-based approaches to internet governance.

The insights gathered and shared at the IGF will also inform the upcoming 2025 edition of the Forum on Internet Freedom in Africa (FIFAfrica25) – an event convened annually by CIPESA. The Forum, now in its 12th year, ranks as Africa’s leading platform for shaping digital rights, inclusion, and governance conversations. This year, the Forum will be hosted in Windhoek, Namibia, and will take place on September 24–26, 2025.

Here is where to find CIPESA @ IGF2025:

Monday, June 23 | 16:00-17:00 (CEST) – Workshop Room 3

Day 0 Event #257:  Enhancing Data Governance in the Public Sector  

This session will examine the state of data governance in the public sector of developing countries, emphasizing the importance of inclusive, multi-stakeholder engagement. It highlights how current frameworks often centre government institutions while neglecting interoperability, collaboration, and broader policy cohesion. Using global case studies—particularly from Papua New Guinea—it will spotlight challenges and propose innovations like centralized oversight bodies, interoperable platforms, and adaptive governance. Best practices such as real-time analytics, data partnerships, and capacity building will be explored to support scalable and context-specific governance solutions.

Tuesday, June 24 |  14:45–15:45 (CEST) – Workshop Room 4

Open Forum #56: Shaping Africa’s Digital Future: Multi-Stakeholder Panel on Data Governance

As Africa advances its digital transformation, harmonized data governance is critical to unlocking the continent’s potential for inclusive growth and digital trade. Fragmented national policies and inconsistent cross-border data frameworks create barriers to innovation, privacy, and cybersecurity. This session will convene stakeholders from government, industry, and civil society to explore strategies for regulatory alignment, trusted data flows, and climate-resilient governance models. Aligned with the AU Data Policy Framework, it will highlight best practices to build a unified, rights-respecting digital economy in Africa.

Tuesday, June 24 | 13:30-15:30 (CEST) –  Room  Studio N

Parliamentary session 4: From dialogue to action: Advancing digital cooperation across regions and stakeholder groups

Host: UN, Stortinget (Norwegian Parliament) and Inter-Parliamentary Union (IPU)

Building on the outputs of the 2024 IGF Parliamentary Track and the discussions held so far in 2025, this multi-stakeholder consultation will bring together MPs and key digital players to reflect on how to operationalize concrete, inclusive and collaborative policymaking efforts. All groups will be invited to propose cooperative approaches to building digital governance and identify practical steps for sustaining cooperation beyond the IGF.

Wednesday, June 25 | 17:30-19:00 (CEST) – Workshop Room 4, NOVA Spektrum

Side event: Aspirations for the India AI Impact Summit

Hosts: CIPESA, Centre for Communication Governance at National Law University Delhi (CCG), United Nations Office for Digital and Emerging Technologies (UN ODET).

This closed-door dialogue aims to spark early conversations toward an inclusive and representative Global AI Impact Summit, focusing on the participation of Global Majority experts. It will explore meaningful engagement in Summit working groups, side events, and knowledge sharing, especially building on insights from the Paris Summit. The session is part of a broader effort to host multiple convenings that strengthen diverse stakeholder participation in global AI governance. By addressing foundational questions now, the dialogue seeks to shape intentional, impactful, and inclusive discussions at the upcoming Summit.

Wednesday, June 25 | 09:00-09:45 (CEST) – Workshop Room 4

Networking Session #93: Cyber Laws and Civic Space – Global South–North Advocacy Strategies

Host: CADE

Many governments are enacting cyber laws to address online crime, but these often contain vague provisions that enable repression of journalists, activists, and ordinary citizens. In practice, such laws have facilitated mass surveillance, curtailed privacy, and been weaponised to stifle dissent, particularly under authoritarian regimes. This session brings together Global North and Global South civil society actors to exchange experiences, resources, and strategies for resisting repressive cyber legislation. It will focus on how collaborative advocacy can support legal reform and safeguard digital rights through shared tools, solidarity, and policy influence.

Wednesday, June 25 | 14:15–15:30 (CEST) – Workshop Room 4

Open Forum #7: Advancing Data Governance Together – Across Regions

Hosts: CIPESA, Deutsche Gesellschaft für Internationale Zusammenarbeit (GIZ) GmbH, The Republic of The Gambia

As cross-border data flows grow rapidly, effective data governance is essential for fostering trust, security, and inclusive digital development. However, fragmented national regulations and inconsistent privacy and cybersecurity standards pose challenges to regional and global cooperation. This session brings together stakeholders from Africa, the Eastern Partnership, and the Western Balkans to explore harmonized, interoperable governance models that support responsible data sharing and economic growth. Through collaborative dialogue, the session will identify strategies for aligning data governance with digital rights, innovation, and sustainable development across diverse regional contexts.

Thursday, June 26 | 12:30–13:00 (CEST) – Open Stage

Lightning Talk #90: Tower of Babel Chaos – Tackling the Challenges of Multilingualism for Inclusive Communication

Host: CADE

This interactive session, led by members of the Civil Alliances for Digital Empowerment (CADE), highlights the communication challenges faced in global digital forums due to linguistic, gender, and geographic diversity. Using a flash-mob-style simulation, participants will experience firsthand the difficulties of multistakeholder dialogue when multiple native languages intersect without common understanding. The session underscores that language is often the most significant barrier to meaningful inclusion in global digital governance. It aims to provoke thought on the urgent need for more multilingual and accessible participation in international digital policy spaces.

Thursday, June 26 | 16:00-17:00 (CEST) – Workshop Room 6

WS #214: AI Readiness in Africa in a Shifting Geopolitical Landscape

Host: German Federal Ministry for Economic Cooperation and Development (BMZ), supported by GIZ

AI has vast potential, but without proper governance, it risks deepening inequality and reinforcing Africa’s dependency on global tech powers. Despite growing local engagement, Africa remains underrepresented in global AI development due to limited investment, regulatory gaps, and the dominance of multinational firms, raising concerns about digital exploitation. This session will bring together diverse voices to explore how Africa can build inclusive, locally rooted AI ecosystems that protect rights and serve regional needs.

Friday, June 27 | 09:00–10:00 (CEST) – Workshop Room 2

Open Forum #34: How Technical Standards Shape Connectivity and Inclusion

Host: Freedom Online Coalition

Technical standards are essential to enabling global connectivity, interoperability, and inclusive digital access, but their development often excludes voices from the Global Majority and marginalized communities. This session will examine how open and interoperable standards can bridge the digital divide, focusing on infrastructure such as undersea cables, network protocols, and security frameworks. It will explore barriers to inclusive participation in standard-setting bodies like the ITU, IETF, IEEE, and W3C, and identify strategies for transparency and multistakeholder engagement. By promoting equitable, rights-respecting technical governance, the session aims to support digital inclusion and advance sustainable development goals.

Friday, June 27 | 11:45–12:30 (CEST) – Workshop Room 6

Networking Session #74: Mapping Digital Rights Capacities and Threats

Host: Oxfam

This session will present findings from multi-country research on digital rights capacities and threats, with a focus on historically marginalised groups in the Global South. It will showcase innovative strategies and tools used to build digital literacy and awareness, using poster presentations from Bolivia, Cambodia, Palestine, Somalia, and Vietnam. Participants will engage in a moderated discussion to share practical approaches and collaborate on building a more inclusive, rights-based digital ecosystem. The session will also contribute to a shared online repository of tools, fostering international cooperation and capacity-building through the ReCIPE program led by Oxfam.

Friday, June 27 | 11:45–12:30 (CEST) – Workshop Room 5

Networking Session #200 – Cross-Regional Connections for Information Resilience 

Host: Proboxve

This networking session brings together participants from diverse regions to connect, share experiences, and develop collaborative strategies for safeguarding information integrity in electoral processes while upholding internet freedoms. The session will address critical challenges such as disinformation, censorship, foreign interference, platform manipulation, and civic education, emphasizing the importance of protecting digital rights, especially during elections.

Africa’s Digital Dilemma: Platform Regulation vs Internet Freedom

By Brian Byaruhanga |

Imagine waking up to find Facebook and Instagram inaccessible on your phone – not due to a network disruption, but because the platforms pulled their services out of your country. This scenario now looms over Nigeria, as Meta, the parent company of Facebook and Instagram, may shut down its services there over nearly USD 290 million in regulatory fines. The fines stem from allegations of anti-competitive practices, data privacy violations, and unregulated advertising content contrary to national laws. Nigerian authorities insist the company must comply with national laws, especially those governing user data and competition.

While this standoff centres on Nigeria, it signals a deeper struggle across Africa as governments assert digital sovereignty over global tech platforms. At the same time, millions of citizens rely on these platforms for communication, activism, access to health and education, economic livelihood, and self-expression. Striking a balance between regulation and rights in Africa’s evolving digital landscape has never been more urgent.

Meta versus Nigeria: Not Just One Country’s Battle

The tension between Meta and Nigeria is not new, nor is it unique. Similar dynamics have played out elsewhere on the continent:

  • Uganda (2021–Present): The Ugandan government blocked Facebook after the platform removed accounts linked to state actors during the 2021 elections. The block remains in place, effectively cutting off millions from a critical social media service unless they use Virtual Private Networks (VPNs) to circumvent it.
  • Senegal (2023): TikTok was suspended amid political unrest, with authorities citing the app’s use for spreading misinformation and hate speech.
  • Ethiopia (2022): Facebook and Twitter were accused of amplifying hate speech during internal conflicts, prompting pressure for tighter oversight.
  • South Africa (2025): In a February 2025 report, the Competition Commission found that freedom of expression, plurality and diversity of media in South Africa had been severely infringed upon by platforms including Google and Facebook. 

The Double-Edged Sword of Regulation

Governments have legitimate reasons to demand transparency, data protection, and content moderation. Today, over two-thirds of African countries have legislation to protect personal data, and regulators are becoming more assertive. Nigeria’s Data Protection Commission (NDPC), created by a 2023 law, wasted little time in taking on a behemoth like Meta. Kenya also has an active Office of the Data Protection Commissioner, which has investigated and fined companies for data breaches. 

South Africa’s Information Regulator has been especially bold, issuing an enforcement notice to WhatsApp to comply with privacy standards after finding that the messaging service’s privacy policy in South Africa was different to that in the European Union. These actions send a clear message that privacy is a universal right, and Africans should not have weaker safeguards.

These regulatory institutions aim to ensure that citizens’ data is not exploited and that tech companies operate responsibly. Yet, in practice, digital regulation in Africa often walks a thin line between protecting rights and suppressing them.

While governments deserve scrutiny, platforms like Meta, TikTok, and X are not blameless. They are often slow to respond to harmful content that fuels violence or division. Their algorithms can amplify hate, misinformation, and sensationalism, while opaque data harvesting practices continue to exploit users. For instance, Branch, a San Francisco-based microlending app operating in Kenya and Nigeria, collects extensive personal data such as handset details, SMS logs, GPS data, call records, and contact lists in exchange for small loans, sometimes for as little as USD 2. This exploitative business model capitalises on vulnerable socio-economic conditions, effectively forcing users to trade sensitive personal data for minimal financial relief.

Many African regulators are pushing back by demanding localisation of data, adherence to national laws, and greater responsiveness, but platform threats to exit rather than comply raise concerns of digital neo-colonialism where African countries are expected to accept second-tier treatment or risk exclusion.

Beyond privacy, African regulators are increasingly addressing monopolistic behaviour and unfair practices by Big Tech as part of a broader push for digital sovereignty. Nigeria’s USD 290 million fine against Meta is not just about data protection and privacy, but also fair competition, consumer rights, and the country’s authority to govern its digital space. Countries like Nigeria, South Africa and Kenya are asserting their right to regulate digital platforms within their borders, challenging the long-standing dominance of global tech firms. The actions taken against Meta highlight the growing complexity of balancing national interests with the transnational influence of tech giants. 

While Meta’s threat to exit may signal its discomfort with what it views as restrictive regulation, it also exposes the real struggle governments face in asserting control over digital infrastructure that often operates beyond state jurisdiction. Similarly, in other parts of Africa, there are inquiries and new policies targeting the market power of tech giants. For instance, South Africa’s competition authorities have looked at requiring Google and Facebook to compensate news publishers  (similar to the News Media and Digital Platforms Mandatory Bargaining Code in Australia). These moves reflect a broader global concern that a few platforms have too much control over markets and need checks to ensure fairness.

The Cost of Disruption: Economic and Social Impacts

When platforms go dark, the consequences are swift:

  • Businesses and entrepreneurs lose access to vital marketing and sales tools.
  • Creators and influencers face income loss and audience disconnection.
  • Activists and journalists find their voices limited, especially during politically charged periods.
  • Citizens are excluded from conversations and from accessing information that could help them make critical decisions affecting their livelihoods.
  • Students and educators experience setbacks in remote learning, particularly in under-resourced communities that rely on social media or messaging apps to coordinate learning.
  • Access to public services is disrupted, from health services to government updates and emergency communications.

A 2023 GSMA report showed that more than 50% of small businesses in Sub-Saharan Africa use social media for customer engagement. In countries such as Nigeria, Uganda, Kenya or South Africa, Facebook and Instagram are lifelines. Losing access even temporarily sets back innovation, erodes trust, and impacts livelihoods.

A Call for Continental Solutions

Africa’s digital future must not hinge on the whims of a single government or a foreign tech giant. Both states and companies should be accountable for protecting rights in digital contexts, ensuring that development and digitisation do not trample on dignity and equity. This requires:

  • Harmonised continental policies on data protection, content regulation, and digital trade.
  • Regional norm-setting mechanisms (like the African Union) to enforce accountability for both governments and corporations.
  • Investments in African tech platforms to offer resilient alternatives.
  • Public education on digital rights to empower users against abuse from both state and corporate actors.
  • Pan-African contextualised business and human rights frameworks to ensure that digital governance aligns with both local realities and global human rights standards. This includes the operationalisation of the UN Guiding Principles on Business and Human Rights, following the examples of countries like Kenya, South Africa and Uganda, which have developed national action plans to embed human rights in corporate practice.

The stakes are high in the confrontation between Nigeria and Meta. If mismanaged, this tension could lead to fragmentation, exclusion, and setbacks for internet freedom, with ordinary users across the continent paying the price. To avoid this, the way forward must be grounded in the multistakeholder model of internet governance, in which governments regulate wisely and transparently, tech companies respect local laws and communities, and civil society is actively engaged and vigilant. This will contribute to a future where the internet is open, secure, and inclusive, and where innovation and justice thrive.

CIPESA Joins over 125 Organisations and Academics In Submitting Letter to the UN Ad Hoc Committee on Cybercrime

The Collaboration on International ICT Policy for East and Southern Africa (CIPESA) has joined over 125 organisations and academics who work to protect and advance human rights, online and offline, in submitting a letter to the United Nations Ad Hoc Committee on Cybercrime. The letter stresses that the process through which the Ad Hoc Committee does its work should include robust civil society participation throughout all stages of the development and drafting of a convention, and that any proposed convention should include human rights safeguards applicable to both its substantive and procedural provisions. The first session of the Ad Hoc Committee, originally scheduled to begin on January 17, 2022, has been rescheduled to begin on February 28, 2022, due to the ongoing situation concerning the coronavirus disease. See the full letter below.

————————————————————————————————————————————-

December 22, 2021

H.E. Ms. Faouzia Boumaiza Mebarki

Chairperson

Ad Hoc Committee to Elaborate a Comprehensive International Convention on Countering the Use of Information and Communication Technologies for Criminal Purposes

Your Excellency,

We, the undersigned organizations and academics, work to protect and advance human rights, online and offline. Efforts to address cybercrime are of concern to us, both because cybercrime poses a threat to human rights and livelihoods, and because cybercrime laws, policies, and initiatives are currently being used to undermine people’s rights. We therefore ask that the process through which the Ad Hoc Committee does its work includes robust civil society participation throughout all stages of the development and drafting of a convention, and that any proposed convention include human rights safeguards applicable to both its substantive and procedural provisions.

Background

The proposal to elaborate a comprehensive “international convention on countering the use of information and communications technologies for criminal purposes” is being put forward at the same time that UN human rights mechanisms are raising alarms about the abuse of cybercrime laws around the world. In his 2019 report, the UN special rapporteur on the rights to freedom of peaceful assembly and of association, Clément Nyaletsossi Voule, observed, “A surge in legislation and policies aimed at combating cybercrime has also opened the door to punishing and surveilling activists and protesters in many countries around the world.” In 2019 and once again this year, the UN General Assembly expressed grave concerns that cybercrime legislation is being misused to target human rights defenders or hinder their work and endanger their safety in a manner contrary to international law. This follows years of reporting from non-governmental organizations on the human rights abuses stemming from overbroad cybercrime laws.

When the convention was first proposed, over 40 leading digital rights and human rights organizations and experts, including many signatories of this letter, urged delegations to vote against the resolution, warning that the proposed convention poses a threat to human rights.

In advance of the first session of the Ad Hoc Committee, we reiterate these concerns. If a UN convention on cybercrime is to proceed, the goal should be to combat the use of information and communications technologies for criminal purposes without endangering the fundamental rights of those it seeks to protect, so people can freely enjoy and exercise their rights, online and offline. Any proposed convention should incorporate clear and robust human rights safeguards. A convention without such safeguards or that dilutes States’ human rights obligations would place individuals at risk and make our digital presence even more insecure, each threatening fundamental human rights.

As the Ad Hoc Committee commences its work drafting the convention in the coming months, it is vitally important to apply a human rights-based approach to ensure that the proposed text is not used as a tool to stifle freedom of expression, infringe on privacy and data protection, or endanger individuals and communities at risk.

The important work of combating cybercrime should be consistent with States’ human rights obligations set forth in the Universal Declaration of Human Rights (UDHR), the International Covenant on Civil and Political Rights (ICCPR), and other international human rights instruments and standards. In other words, efforts to combat cybercrime should also protect, not undermine, human rights. We remind States that the same rights that individuals have offline should also be protected online.

Scope of Substantive Criminal Provisions

There is no consensus on how to tackle cybercrime at the global level or a common understanding or definition of what constitutes cybercrime. From a human rights perspective, it is essential to keep the scope of any convention on cybercrime narrow. Just because a crime might involve technology does not mean it needs to be included in the proposed convention. For example, expansive cybercrime laws often simply add penalties due to the use of a computer or device in the commission of an existing offense. The laws are especially problematic when they include content-related crimes. Vaguely worded cybercrime laws purporting to combat misinformation and online support for or glorification of terrorism and extremism, can be misused to imprison bloggers or block entire platforms in a given country. As such, they fail to comply with international freedom of expression standards. Such laws put journalists, activists, researchers, LGBTQ communities, and dissenters in danger, and can have a chilling effect on society more broadly.

Even laws that focus more narrowly on cyber-enabled crimes are used to undermine rights. Laws criminalizing unauthorized access to computer networks or systems have been used to target digital security researchers, whistleblowers, activists, and journalists. Too often, security researchers, who help keep everyone safe, are caught up in vague cybercrime laws and face criminal charges for identifying flaws in security systems. Some States have also interpreted unauthorized access laws so broadly as to effectively criminalize any and all whistleblowing; under these interpretations, any disclosure of information in violation of a corporate or government policy could be treated as “cybercrime.” Any potential convention should explicitly include a malicious intent standard, should not transform corporate or government computer use policies into criminal liability, should provide a clearly articulated and expansive public interest defense, and include clear provisions that allow security researchers to do their work without fear of prosecution.

Human Rights and Procedural Safeguards

Our private and personal information, once locked in a desk drawer, now resides on our digital devices and in the cloud. Police around the world are using an increasingly intrusive set of investigative tools to access digital evidence. Frequently, their investigations cross borders without proper safeguards and bypass the protections in mutual legal assistance treaties. In many contexts, no judicial oversight is involved, and the role of independent data protection regulators is undermined. National laws, including cybercrime legislation, are often inadequate to protect against disproportionate or unnecessary surveillance.

Any potential convention should detail robust procedural and human rights safeguards that govern criminal investigations pursued under such a convention. It should ensure that any interference with the right to privacy complies with the principles of legality, necessity, and proportionality, including by requiring independent judicial authorization of surveillance measures. It should also not forbid States from adopting additional safeguards that limit law enforcement uses of personal data, as such a prohibition would undermine privacy and data protection. Any potential convention should also reaffirm the need for States to adopt and enforce “strong, robust and comprehensive privacy legislation, including on data privacy, that complies with international human rights law in terms of safeguards, oversight and remedies to effectively protect the right to privacy.”

There is a real risk that, in an attempt to entice all States to sign a proposed UN cybercrime convention, bad human rights practices will be accommodated, resulting in a race to the bottom. Therefore, it is essential that any potential convention explicitly reinforces procedural safeguards to protect human rights and resists shortcuts around mutual assistance agreements.

Meaningful Participation

Going forward, we ask the Ad Hoc Committee to actively include civil society organizations in consultations—including those dealing with digital security and groups assisting vulnerable communities and individuals—which did not happen when this process began in 2019 or in the time since.

Accordingly, we request that the Committee:

●  Accredit interested technological and academic experts and nongovernmental groups, including those with relevant expertise in human rights but that do not have consultative status with the Economic and Social Council of the UN, in a timely and transparent manner, and allow participating groups to register multiple representatives to accommodate the remote participation across different time zones.

●  Ensure that modalities for participation recognize the diversity of non-governmental stakeholders, giving each stakeholder group adequate speaking time, since civil society, the private sector, and academia can have divergent views and interests.

●  Ensure effective participation by accredited participants, including the opportunity to receive timely access to documents, provide interpretation services, speak at the Committee’s sessions (in-person and remotely), and submit written opinions and recommendations.

●  Maintain an up-to-date, dedicated webpage with relevant information, such as practical information (details on accreditation, time/location, and remote participation), organizational documents (i.e., agendas, discussions documents, etc.), statements and other interventions by States and other stakeholders, background documents, working documents and draft outputs, and meeting reports.

Countering cybercrime should not come at the expense of the fundamental rights and dignity of those whose lives this proposed Convention will touch. States should ensure that any proposed cybercrime convention is in line with their human rights obligations, and they should oppose any proposed convention that is inconsistent with those obligations.

We would be highly appreciative if you could kindly circulate the present letter to the Ad Hoc Committee Members and publish it on the website of the Ad Hoc Committee.

Signatories,*

  1. Access Now – International
  2. Alternative ASEAN Network on Burma (ALTSEAN) – Burma
  3. Alternatives – Canada
  4. Alternative Informatics Association – Turkey
  5. AqualtuneLab – Brazil
  6. ArmSec Foundation – Armenia
  7. ARTICLE 19 – International
  8. Asociación por los Derechos Civiles (ADC) – Argentina
  9. Asociación Trinidad / Radio Viva – Trinidad
  10. Asociatia Pentru Tehnologie si Internet (ApTI) – Romania
  11. Association for Progressive Communications (APC) – International
  12. Associação Mundial de Rádios Comunitárias (Amarc Brasil) – Brazil
  13. ASEAN Parliamentarians for Human Rights (APHR)  – Southeast Asia
  14. Bangladesh NGOs Network for Radio and Communication (BNNRC) – Bangladesh
  15. BlueLink Information Network  – Bulgaria
  16. Brazilian Institute of Public Law – Brazil
  17. Cambodian Center for Human Rights (CCHR)  – Cambodia
  18. Cambodian Institute for Democracy  –  Cambodia
  19. Cambodia Journalists Alliance Association  –  Cambodia
  20. Casa de Cultura Digital de Porto Alegre – Brazil
  21. Centre for Democracy and Rule of Law – Ukraine
  22. Centre for Free Expression – Canada
  23. Centre for Multilateral Affairs – Uganda
  24. Center for Democracy & Technology – United States
  25. Center for Justice and International Law (CEJIL) – International
  26. Centro de Estudios en Libertad de Expresión y Acceso (CELE) – Argentina
  27. Civil Society Europe
  28. Coalition Direitos na Rede – Brazil
  29. Código Sur – Costa Rica
  30. Collaboration on International ICT Policy for East and Southern Africa (CIPESA) – Africa
  31. CyberHUB-AM – Armenia
  32. Data Privacy Brazil Research Association – Brazil
  33. Dataskydd – Sweden
  34. Derechos Digitales – Latin America
  35. Defending Rights & Dissent – United States
  36. Digital Citizens – Romania
  37. DigitalReach – Southeast Asia
  38. Digital Rights Watch – Australia
  39. Digital Security Lab – Ukraine
  40. Državljan D / Citizen D – Slovenia
  41. Electronic Frontier Foundation (EFF) – International
  42. Electronic Privacy Information Center (EPIC) – United States
  43. Elektronisk Forpost Norge – Norway
  44. Epicenter.works for digital rights – Austria
  45. European Center For Not-For-Profit Law (ECNL) Stichting – Europe
  46. European Civic Forum – Europe
  47. European Digital Rights (EDRi) – Europe
  48. eQuality Project – Canada
  49. Fantsuam Foundation – Nigeria
  50. Free Speech Coalition  – United States
  51. Foundation for Media Alternatives (FMA) – Philippines
  52. Fundación Acceso – Central America
  53. Fundación Ciudadanía y Desarrollo de Ecuador
  54. Fundación CONSTRUIR – Bolivia
  55. Fundacion Datos Protegidos  – Chile
  56. Fundación EsLaRed de Venezuela
  57. Fundación Karisma – Colombia
  58. Fundación OpenlabEC – Ecuador
  59. Fundamedios – Ecuador
  60. Garoa Hacker Clube  –  Brazil
  61. Global Partners Digital – United Kingdom
  62. GreenNet – United Kingdom
  63. GreatFire – China
  64. Hiperderecho – Peru
  65. Homo Digitalis – Greece
  66. Human Rights in China – China
  67. Human Rights Defenders Network – Sierra Leone
  68. Human Rights Watch – International
  69. Igarapé Institute – Brazil
  70. IFEX – International
  71. Institute for Policy Research and Advocacy (ELSAM) – Indonesia
  72. The Influencer Platform – Ukraine
  73. INSM Network for Digital Rights – Iraq
  74. Internews Ukraine
  75. InternetNZ – New Zealand
  76. Instituto Beta: Internet & Democracia (IBIDEM) – Brazil
  77. Instituto Brasileiro de Defesa do Consumidor (IDEC) – Brazil
  78. Instituto Educadigital – Brazil
  79. Instituto Nupef – Brazil
  80. Instituto de Pesquisa em Direito e Tecnologia do Recife (IP.rec) – Brazil
  81. Instituto de Referência em Internet e Sociedade (IRIS) – Brazil
  82. Instituto Panameño de Derecho y Nuevas Tecnologías (IPANDETEC) – Panama
  83. Instituto para la Sociedad de la Información y la Cuarta Revolución Industrial – Peru
  84. International Commission of Jurists – International
  85. The International Federation for Human Rights (FIDH)
  86. IT-Pol – Denmark
  87. JCA-NET – Japan
  88. KICTANet – Kenya
  89. Korean Progressive Network Jinbonet – South Korea
  90. Laboratorio de Datos y Sociedad (Datysoc) – Uruguay
  91. Laboratório de Políticas Públicas e Internet (LAPIN) – Brazil
  92. Latin American Network of Surveillance, Technology and Society Studies (LAVITS)
  93. Lawyers Hub Africa
  94. Legal Initiatives for Vietnam
  95. Ligue des droits de l’Homme (LDH) – France
  96. Masaar – Technology and Law Community – Egypt
  97. Manushya Foundation – Thailand
  98. MINBYUN Lawyers for a Democratic Society – Korea
  99. Open Culture Foundation – Taiwan
  100. Open Media  – Canada
  101. Open Net Association – Korea
  102. OpenNet Africa – Uganda
  103. Panoptykon Foundation – Poland
  104. Paradigm Initiative – Nigeria
  105. Privacy International – International
  106. Radio Viva – Paraguay
  107. Red en Defensa de los Derechos Digitales (R3D) – Mexico
  108. Regional Center for Rights and Liberties  – Egypt
  109. Research ICT Africa
  110. Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic (CIPPIC) – Canada
  111. Share Foundation – Serbia
  112. Social Media Exchange (SMEX) – Lebanon, Arab Region
  113. SocialTIC – Mexico
  114. Southeast Asia Freedom of Expression Network (SAFEnet) – Southeast Asia
  115. Supporters for the Health and Rights of Workers in the Semiconductor Industry (SHARPS) – South Korea
  116. Surveillance Technology Oversight Project (STOP)  – United States
  117. Tecnología, Investigación y Comunidad (TEDIC) – Paraguay
  118. Thai Netizen Network  – Thailand
  119. Unwanted Witness – Uganda
  120. Vrijschrift – Netherlands
  121. West African Human Rights Defenders Network – Togo
  122. World Movement for Democracy – International
  123. 7amleh – The Arab Center for the Advancement of Social Media  – Arab Region

Individual Experts and Academics

  1. Jacqueline Abreu, University of São Paulo
  2. Chan-Mo Chung, Professor, Inha University School of Law
  3. Danilo Doneda, Brazilian Institute of Public Law
  4. David Kaye, Clinical Professor of Law, UC Irvine School of Law, former UN Special Rapporteur on Freedom of Opinion and Expression (2014-2020)
  5. Wolfgang Kleinwächter, Professor Emeritus, University of Aarhus; Member, Global Commission on the Stability of Cyberspace
  6. Douwe Korff, Emeritus Professor of International Law, London Metropolitan University
  7. Fabiano Menke, Federal University of Rio Grande do Sul
  8. Kyung-Sin Park, Professor, Korea University School of Law
  9. Christopher Parsons, Senior Research Associate, Citizen Lab, Munk School of Global Affairs & Public Policy at the University of Toronto
  10. Marietje Schaake, Stanford Cyber Policy Center
  11. Valerie Steeves, J.D., Ph.D., Full Professor, Department of Criminology University of Ottawa