Participant Reflection on #FIFAfrica22: Effective Engagement in the UPR Process for Digital Rights Promotion

By Murungi Judith |

The Collaboration on International ICT Policy for East and Southern Africa (CIPESA) and Small Media held a workshop on the Universal Periodic Review (UPR) process as part of the Forum on Internet Freedom in Africa (FIFAfrica22), which was held in Lusaka, Zambia, from September 26-29, 2022. The workshop is a product of the UPROAR project, which aims to advance digital rights globally by supporting engagement in international advocacy at the UPR.

The 32 participants at the workshop represented a diverse array of backgrounds, including civil society, digital rights activism and advocacy, law, journalism, and academia. A total of 20 countries were represented: Benin, Burundi, Botswana, Cameroon, Democratic Republic of Congo, Ethiopia, Ghana, India, Kenya, Mozambique, Nigeria, Senegal, Sri Lanka, Sudan, Tanzania, Uganda, United Kingdom, United States of America, Zambia, and Zimbabwe.

The workshop entailed an overview of the UPR, its purpose and its processes, alongside in-depth discussions on international and regional normative frameworks on digital rights. Specific attention was drawn to the Universal Declaration of Human Rights as the first normative framework on freedom of expression. The International Covenant on Civil and Political Rights (ICCPR) was also explored through its core tenets of the right to hold opinions without interference (freedom of opinion), the right to seek and receive information (access to information), and the right to impart information (freedom of expression).

It was noted that the right to freedom of expression is not absolute and that the three-part test is key in determining the circumstances which potentially justify limitations. Under Article 19(3) of the ICCPR, any limitation (i) must be provided for in law, (ii) must pursue a legitimate aim, and (iii) must be necessary to achieve that aim.

The three-part test formed the basis of a heated debate on electoral democracy and internet shutdowns in countries like Cameroon and Tanzania, compared with Kenya, where the government did not impose an internet shutdown during the recent elections. From this comparative discussion, participants concluded that governments still take actions that threaten internet freedom, such as the arrest, detention and assassination of journalists, and that it is the responsibility of civil society, activists and human rights defenders to hold governments accountable through increased participation in the UPR process.

The presence in the sessions of Hon. Neema Lugangira, a member of the Tanzanian Parliament and Chairperson of the African Parliamentary Network on Internet Governance, was invaluable and a beacon of hope for bridging the gap between civil society and policymakers in promoting digital rights through the UPR.

The workshop also explored case law on freedom of expression in Africa, including precedents such as Lohé Issa Konaté v. Burkina Faso. Participants deliberated on the importance of critically assessing the law and of framing cases in line with the jurisdiction of the particular court approached, without which matters could be dismissed. This session gave participants a clear understanding of the link between offline and online rights, and of the specific laws that apply to minority and marginalised groups such as children, women, persons with disabilities and other vulnerable communities.

The session on campaign and advocacy planning equipped participants with the tools required to engage partners in carrying out campaigns and executing advocacy strategies through the UPR. It highlighted eye-catching, precise advocacy materials that could be used on social media and other platforms for the UPR at the local level, and led to discussions on the critical role local stakeholders play in leveraging the UPR for digital rights development in their various contexts. The session helped participants understand how to engage with local partners and ensure effective implementation of the recommendations made to their respective countries, including through fact sheets and guidance on how to use them during the UPR process.

Participants engaged in a practical lobbying session in which they appeared before a UN delegate to present the issues affecting digital rights in their respective countries, together with recommendations for reform. This group exercise was beneficial and informative because it gave participants a chance to apply what they had learnt about the UPR process and an opportunity to experience the review process as conducted in Geneva.

Through the UPROAR website, participants were guided on how to leverage research and social media platforms for effective design and branding as part of UPR engagements related to digital rights. The workshop also offered guidance on what stakeholder mapping is and why it matters.

In a subsequent panel entitled ‘Stemming the Tide: Has the Universal Periodic Review Mechanism Contributed to Changes in the Digital Rights Landscape of States Under Review?’, panelists shared experiences from Namibia, the Democratic Republic of Congo, Uganda, Rwanda and others. This gave workshop participants an understanding of how to prepare for stakeholder engagements and how to conduct evidence-based advocacy at the United Nations Human Rights Council.

It was noted that the Covid-19 pandemic led to travel restrictions which made it difficult to travel to Geneva to participate physically in the UPR process. Online participation was a welcome alternative, but unreliable internet access among civil society on the continent during the sessions presented an additional barrier.

Beyond making submissions and engaging during review sessions, participants were urged to also take part in monitoring the implementation of recommendations. Experiences were shared about governments such as Uganda’s, which rejected all the recommendations made on digital rights. In such instances, participants were encouraged not to give up in the face of such responses but to continue their digital rights advocacy, which is itself a notable step in the right direction. They were also encouraged to collaborate with lawmakers and policymakers to ensure that they know about the UPR process and can respond positively to the recommendations given, and to ensure there is in-country pressure from civil society pushing governments to act on the recommendations. It was noted that in Tanzania the acceptance of recommendations increased significantly after civil society and parliamentarians began collaborating.

The UPR sessions at FIFAfrica22 were informative and engaging, led by well-equipped workshop trainers. Experiences shared by those who had participated in Geneva engagements on digital rights stirred an urge for proactive engagement and participation among those coming up for review, such as Botswana.

How the MTN Group Can Improve its Digital Human Rights Policy and Reporting

CIPESA Writer |

These proposals are made to the MTN Group in respect of its Digital Human Rights Policy. They commend the positive elements of the Policy, including its proclamation to respect the rights of users to privacy, communication, and access to and sharing of information in a free and responsible manner. The submission points to areas where the telecoms group can further improve its role in the protection of human rights.

The United Nations Guiding Principles on Business and Human Rights (UNGPs) enjoin corporate entities to act with due diligence to avoid infringing on human rights, and provide ways through which adverse impacts on human rights can be addressed. It is therefore commendable that MTN developed a Digital Human Rights Policy and is open to commentary and suggestions for strengthening its implementation. It is imperative that MTN take proactive and consistent measures to comply with international human rights instruments such as the UNGPs, the leading global framework on business responsibility and accountability for human rights, which were unanimously endorsed by States at the United Nations in 2011.

Some of the Principles that MTN needs to pay close attention to include the following:

Principle 11: Business enterprises should respect human rights. This means that they should avoid infringing on the human rights of others and should address adverse human rights impacts with which they are involved.

Principle 13: The responsibility to respect human rights requires that business enterprises (a) Avoid causing or contributing to adverse human rights impacts through their own activities, and address such impacts when they occur; (b) Seek to prevent or mitigate adverse human rights impacts that are directly linked to their operations, products or services by their business relationships, even if they have not contributed to those impacts.

Principle 15: In order to meet their responsibility to respect human rights, business enterprises should have in place policies and processes appropriate to their size and circumstances, including:

(a) A policy commitment to meet their responsibility to respect human rights;

(b) A human rights due diligence process to identify, prevent, mitigate and account for how they address their impacts on human rights;

(c) Processes to enable the remediation of any adverse human rights impacts they cause or to which they contribute.

Principle 23: In all contexts, business enterprises should:

  1. Comply with all applicable laws and respect internationally recognised human rights, wherever they operate;
  2. Seek ways to honour the principles of internationally recognised human rights when faced with conflicting requirements;
  3. Treat the risk of causing or contributing to gross human rights abuses as a legal compliance issue wherever they operate.

Respect for digital rights is also stipulated in the Declaration of Principles on Freedom of Expression and Access to Information in Africa of 2019 which MTN needs to be cognisant of as part of efforts to ensure that it upholds respect for human rights.

CIPESA Proposals to the MTN Group
The MTN Group is a market leader in various service areas in several countries where it operates. It is also a key employer and taxpayer, and by facilitating the operations of other sectors, MTN is a key contributor to Gross Domestic Product (GDP) and to the health of the respective countries’ economies. It is therefore crucial that the company develops and effects a robust Digital Human Rights Policy. Notably, MTN has trailed other operators, such as Orange, Millicom and Vodafone, in rolling out a digital rights policy and in transparency reporting.

While MTN last year issued its inaugural transparency report as part of its annual reporting, there are areas of concern for which we make the following recommendations:

  1. Provide more granular and disaggregated data about the number and nature of requests MTN receives from government agencies. At present, it is not clear how many of those requests relate to the release of users’ identifying data, how many were on metadata, and how many were on rendering support to communication monitoring and interception. Besides providing such a breakdown, MTN should also explain how many requests, if any, were not adhered to and why. Further, the report should indicate which particular government departments made the requests and whether all their requests were backed by a court order.
  2. Provide more nuanced information in reporting on the Digital Human Rights Policy to enable the contextualisation of country-specific explanations of government requests. In the last report, for instance, it is difficult to comprehend the information on government requests from Uganda. Given that Uganda is one of the countries where MTN has the largest number of subscribers, and given that country’s human rights record, the numbers are inexplicably few (12 in total) compared to Congo Brazzaville (1,600), eSwatini (3,661), Ghana (1,642), Guinea Conakry (6,480), Ivory Coast (4,215), Nigeria (4,751), Rwanda (602), South Africa (15,903), South Sudan (1,748), Sudan (5,105), and Zambia (8,294).
  3. In its transparency reporting on implementation of its Digital Human Rights Policy, MTN should reflect on the role of local laws and regulations in enabling or hampering the realisation of digital human rights. What elements are supportive and which ones are retrogressive? Which grey areas need clarification or call for repeal of laws?
  4. Include in the MTN transparency report a detailed and analytical section on network disruptions, as these are highly controversial and have wide-ranging economic, public service and human rights impacts, yet are becoming endemic in many of the countries where MTN operates. Further, MTN should include information on whether it received (or demanded – as we propose it should) written justifications from regulators, or from the government officials and bodies who issue shutdown orders, citing the specific laws and provisions under which the orders were issued and the situation that warranted invoking the disruption. Additionally, the MTN Group should commit to scrutinising each demand, order or request and challenging those that are not clear, specific, written, valid or compliant with national laws. It should also keep a written record of such demands, orders and requests.
  5. The MTN Policy and reporting should have a section and actions dedicated to the inclusion of marginalised groups, a key area being access and accessibility for persons with disabilities. Research conducted by CIPESA showed that, in the countries where it operates, MTN had not made any deliberate efforts to render its services more accessible to persons with disabilities. Beyond the additional section, MTN should appoint or designate Inclusion and Human Rights Ambassadors, and build the capacity of internal teams to facilitate engagement and compliance with digital accessibility obligations.
  6. MTN should take a proactive stance in making its Digital Human Rights Policy, including country-specific transparency information, well publicised among users, civil society and government officials in the respective countries. This will aid the growth of knowledge about MTN policies, inspire other companies to respect human rights, and draw feedback on how MTN can further improve its human rights policies and practices.
  7. MTN should develop relationships and sustain proactive engagements with civil society, consumer groups and governments on the implementation of its Digital Human Rights Policy. Such engagements should not merely be after-the-fact reviews of reports following publication but should be continuous and feed into the annual reporting. They should also include external experts and stakeholders in the conduct of regular human rights due diligence as envisaged by Principle 15 of the UNGPs, and could extend to raising concerns about national laws, policies and measures which pose a risk to digital rights.
  8. As part of due diligence, MTN should periodically assess the impact of the enforcement of its terms of service, policies and practices to ensure they do not pose risks to individual human rights, and the extent to which they comply with the UNGPs and are consistent with its Digital Human Rights Policy. Such assessments are essential to determining the right course of action when faced with government requests and other potential human rights harms.
  9. MTN should add to its Policy and make public its position on network disruptions and outline a clear policy and the procedures detailing how it handles information requests, interception assistance requests, and disruption orders from governments.
  10. Support initiatives that work to grow access, affordability, and secure use of digital technologies, and speak out about any licensing obligations and government practices that undermine digital rights.
  11. Join key platforms that collaboratively advance a free and open internet and respect for human rights in the telecommunications sector, such as the Global Network Initiative (GNI), endorse the GSMA Principles for Driving Digital Inclusion for Persons with Disabilities, and align with local actors on corporate accountability (such as the Uganda Consortium on Corporate Accountability).
  12. MTN should, at a minimum, provide simple and clear terms of service, promptly notify users of decisions affecting them, and provide accessible redress mechanisms and effective remedies.
  13. MTN should institutionalise its commitment to digital rights by putting in place a governance structure at the country level with oversight at a senior level, train its employees on the policy, and create awareness among its customers to ensure the realisation of the policy.

CIPESA stands ready to continue to engage with MTN on ways to improve and effect its Digital Human Rights Policy. We can be contacted at [email protected].

South Sudan’s Cybercrimes and Computer Misuse Order 2021 Stifles Citizens’ Rights

By Edrine Wanyama |

South Sudan has enacted the Cybercrimes and Computer Misuse Provisional Order 2021, aimed at combating cybercrime. The country has a fast-evolving technology sector, with three mobile operators and 24 licensed internet service providers. Investments in infrastructure development have propelled internet penetration to 16.8% and mobile phone penetration to 23% of the country’s population of 11.3 million people, growth that necessitates a law to curb cybercrime.

The Order is based on article 86(1) of the Transitional Constitution of South Sudan 2011, which provides that when parliament is not in session, the president can issue a provisional order that has the force of law in urgent matters.

The Cybercrimes and Computer Misuse Order makes strides in addressing cybercrime by extending the scope of jurisdiction to cover offences committed in or outside the country against citizens and the South Sudan state. The Order also establishes judicial oversight, especially over the use of forensic tools to collect evidence, with section 10 requiring authorisation by a competent court prior to collecting such evidence. Furthermore, the Order attempts to protect children against child pornography (sections 23 and 24), and provides for the prevention of trafficking in persons (section 30) and drugs (section 31).

However, the Order is largely regressive, undermining citizens’ rights including freedom of expression, access to information, and the right to privacy.

The Order contains overly broad definitions, including of “computer misuse,” “indecent content,” “pornography,” and “publish,” which are so ambiguous and wide in scope that they could be used by the state to target government opponents, dissidents and critics. The definitions largely limit the use of electronic gadgets and curtail the exercise of freedom of expression and access to information.

Article 22 of the Transitional Constitution of South Sudan 2011 guarantees the right to privacy. The country has ratified the International Covenant on Civil and Political Rights (ICCPR), which provides for the right to privacy under article 17, and the African Charter on Human and Peoples’ Rights, whose article 5 provides for the right to respect for one’s dignity, which includes the right to privacy. The Order appears to contravene these instruments by threatening individual privacy.

Despite a commendable provision in section 6 obliging service providers to store information relating to communications, including subscribers’ personal data and traffic data, for 180 days – a period far shorter than in many other countries – personal data remains potentially at risk. The section requires service providers and their agents to put in place technical capabilities that enable law enforcement agencies to monitor compliance with the Order. With no specific data protection law in South Sudan and no commitment to the leading regional instrument, the African Union Convention on Cyber Security and Personal Data Protection, citizens’ privacy is at stake.

The section on offences and penalties lacks specificity on the fines that may be levied on errant individuals or companies. Meanwhile, some of the offences provided for under the Order potentially curtail freedom of expression and the right to information. For instance, the offence of spamming under section 21 could be interpreted to cover all communications through online platforms, including social media platforms like Facebook and WhatsApp. Under the provision, virtually every individual who forwards messages on social media risks prosecution, which has a chilling effect on freedom of expression and the right to information.

The offence of offensive communication under section 25 likewise threatens freedom of expression, media freedom and access to information. A similar provision, section 25 of Uganda’s Computer Misuse Act, 2011, has been widely misused to persecute, prosecute and silence political critics and dissidents, and section 25 of the South Sudan Cybercrimes Order could be deployed in the same manner.

In CIPESA’s analysis of the Order, we call for specific actions that could ensure the prevention of cybercrime while at the same time not hurting online rights and freedoms, including:

  • Deletion of problematic definitions or provisions from the Order.
  • Enactment of a specific data protection law to guarantee the protection of data of individuals.
  • Urgent drafting of rules and regulations to prescribe the procedures for implementing the Order.
  • Ratification of the African Union Convention on Cyber Security and Personal Data Protection.
  • Service providers should not be compelled to disclose their subscribers’ information to law enforcement agencies except on the basis of a court order.
  • Amendment of the Order to emphasise the oversight role of courts during the processes of access, inspection, seizure, collection and preservation of data or tracking of data under section 9.

Read the full analysis here.

Mauritius’ Social Media Regulation Proposal Centres State-Led Censorship

By Daniel Mwesigwa |

In Sub-Saharan Africa, Mauritius leads in many aspects. It is the only country on the continent categorised as a “full democracy” by the Economist Intelligence Unit Democracy Index for 2020. Additionally, it has the second highest per capita income (USD 11,099) and one of the highest internet penetration rates in the region (72.2%).

However, the recently published consultation paper on proposed amendments to the country’s Information and Communications Technology (ICT) law, purportedly aimed at curbing abuse and misuse of social media, could place Mauritius among the ranks of regressive states. The proposed establishment of a National Digital Ethics Committee (NDEC) to determine what content is problematic, alongside a Technical Enforcement Unit to oversee the technical enforcement of NDEC’s measures, has potential surveillance and censorship implications.

The social media regulation proposals by Mauritius come amidst increasing calls by western countries for accountability of technology platforms such as Google and Facebook. Indeed, the consultation paper cites Germany’s Network Enforcement Act (colloquially known as the Facebook Act), which requires social media platforms to remove “illegal content” within 24 hours of notice by users and complaint bodies. Non-compliance penalties are steep, with fines ranging from five million to 50 million euros.

The paper states that, unlike in Germany and other countries like France, the United Kingdom, and Australia, complaints by Mauritian local authorities to social media platforms “remain unattended to or not addressed in a timely manner”. Moreover, it adds, cooperation under the auspices of domestic laws and regulations is only effective in countries where technology companies have local offices, which is not the case in Mauritius. As such, according to the Authority, “the only practical solution in the local context would be the implementation of a regulatory and operational framework which not only provides for a legal solution to the problem of harmful and illegal online content but also provides for the necessary technical enforcement measures required to handle this issue effectively in a fair, expeditious, autonomous and independent manner.”

However, the Authority’s claims of powerlessness appear unfounded. According to Facebook’s transparency reports, Mauritius made two requests in 2017 for the preservation of five user accounts pending receipt of formal legal process, and one request in 2019 for the preservation of two accounts. Similarly, the country has barely made any content takedown requests to Google – only 13 in total since 2009 – and has never made a user information or content takedown request to Twitter. In comparison, South Africa made two requests to Facebook for the preservation of 14 user accounts in 2017 and 16 requests for the preservation of 68 user accounts in 2019. South Africa has also made a total of 33 requests to Google covering 130 items for removal since 2009, and six legal demands to Twitter between 2012 and 2020.

Broad and Ambiguous Definitions

According to section 18(m) of Mauritius’ Information and Communication Technologies Act (2001, amended multiple times including in 2020), the ICT Authority shall “take steps to regulate or curtail the harmful and illegal content on the Internet and other information and communication services”.

Although the consultation paper states that the Authority has previously fulfilled this mandate in the fight against child pornography,  it concedes that it has not fulfilled the part of curtailing illegal content as it is not currently vested with investigative powers under the Act. The consultation paper thus proposes to operationalise section 18(m) through an operational framework that empowers the Authority “to carry out investigations without the need to rely on the request for technical data from social media administrators.”

The amendments to the ICT Act will relate to defining a two-pronged operational framework with the setting up of: i) a National Digital Ethics Committee (NDEC) as the decision making body on illegal and harmful content; and ii) a Technical Enforcement Unit to enforce the technical measures as directed by the NDEC.

However, neither the existing Act nor the consultation paper define what constitutes “illegal content”. Whereas the consultation paper indicates that the Chairperson and members of NDEC would be “independent, and persons of high calibre and good repute” in order to ensure transparency and public confidence in its functions, the selection criteria and appointing Authority are not specified, nor are recourse mechanisms for fair hearing and appeals against the decisions of the proposed entity.

An Authoritarian Approach to Internet Architecture

Through a technical toolset (a proxy server) proposed under section 11, the regulator would be able to identify social media traffic, which would then be automatically decrypted, archived, and analysed. In effect, the toolset would undermine HTTPS in order to inspect internet traffic. This means that information on all social media users – device specifics, content type, location, among others – would be available to the authorities. The regulator expects that once a complaint regarding social media is received, it would be able to block the implicated web page or profile without the intervention of the social media platforms.

Additionally, the Authority expects social media users to accept the installation of a one-time digital certificate on their internet-enabled devices to facilitate the re-encryption of traffic before it is transferred to the social networking sites. In other words, the Authority wants internet users in Mauritius to replace their own padlocks used for their home security with ones given to them by the Authority, to which it has open and unfettered access.
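The interception model described above hinges on a basic property of TLS: a client trusts whatever certificate authorities (CAs) sit in its local trust store. The short Python sketch below, which is purely illustrative (the regulator certificate file name is hypothetical), uses the standard `ssl` module to show where that trust is configured and why adding one authority-controlled root certificate is enough to enable transparent decryption.

```python
import ssl

# TLS trust is decided entirely by the client's set of trusted
# certificate authorities (CAs). The default context below loads
# the operating system's CA store.
ctx = ssl.create_default_context()
ctx.load_default_certs()
print(f"CAs trusted by default: {len(ctx.get_ca_certs())}")

# The Mauritius proposal asks users to install one extra root
# certificate controlled by the regulator. In ssl-module terms,
# that is a one-line change to the trust store (file name below
# is hypothetical):
#
#   ctx.load_verify_locations(cafile="regulator_root_ca.pem")
#
# A proxy holding the matching private key could then mint a
# certificate for any site a user visits, terminate the "secure"
# connection itself, inspect the plaintext, and re-encrypt the
# traffic onward to the real site. The user sees no warning,
# because the minted certificate validates against the installed
# root.
```

A known client-side defence against such a proxy is certificate pinning, where an application checks the server's certificate against an expected fingerprint rather than accepting any chain that validates against the trust store; a proxy-minted certificate would fail that check.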

Meanwhile, Mauritius’ commitments to freedom of expression, data protection and privacy collide with these social media regulation proposals. In particular, Mauritius’ Data Protection Act (2017) requires the informed consent of users, prohibits disproportionate collection of user data, and mandates fair and lawful processing of user data. The Data Protection Act was enacted to align with the European Union’s General Data Protection Regulation (GDPR). In March 2018, Mauritius also ratified the African Union Convention on Cyber Security and Personal Data Protection, although the Convention is yet to come into force for want of sufficient ratifications. Moreover, in September 2020, Mauritius signed and ratified the Council of Europe’s Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data.

Indeed, the Authority is aware of the potential infractions of the proposed technical measures on basic freedoms, stating in the paper that “the proposed statutory framework will undoubtedly interfere with the Mauritian people’s fundamental rights and liberties in particular their rights to privacy and confidentiality and freedom of expression”. Its call for views and suggestions on “an alternative technical toolset of a less intrusive nature” may very well be an open solicitation for more surreptitious ways of monitoring social media data, with fundamental rights still at stake.

Democracy and Local Investment

While Mauritius runs a multiparty system of government, its human rights record has been steadily deteriorating, according to the United States Department of State’s Human Rights Report 2020, and basic freedoms such as freedom of expression are being curtailed through digital taxation and clampdowns on social media dissent. Twitter recently cited stability and democracy as key reasons for opening its first African office in Ghana. Although Mauritius is strategically placed as a regional economic hub and has been positioning itself as a “Cyber Island”, legal frameworks such as the proposed ICT law amendments, set against mixed rankings on democracy alongside high rankings on internet access and ease of doing business, may well undermine the country’s international competitiveness and internet freedom standing.

Accordingly, the Authority would do well to immediately discontinue these plans to employ technical measures to monitor social media and internet traffic as they would amount to multiple breaches of fundamental freedoms. The proposals also run counter to the Data Protection Act which prioritises minimisation of data collected and informed user consent. Moreover, the technical proposal would promote self-censorship and undermine the basic workings of the institutions of democracy.

Further, although social media regulation may be paved with good intentions, such as the need to stamp out inflammatory content, it could be more beneficial to explore alternative options with a range of stakeholders to promote fairer and more transparent content moderation practices in line with international human rights law. Mauritius has already proved that aligning domestic and international laws and practices is feasible by fashioning its data protection law along the lines of the GDPR. Additionally, Mauritius could leverage existing partnerships with other countries in regional economic blocs such as the Common Market for Eastern and Southern Africa (COMESA) to form a coalition of fact-checkers with direct access to social media platforms.

Finally, the Authority could collaborate with technology platforms such as Facebook to support human moderators working in Creole. This could be a necessary step towards enhancing automated content moderation, particularly for “low resource” languages such as Mauritian Creole.

Regulating Freedom of Expression Amidst the Covid-19 Response in South Africa

By Tusi Fokane |

The global infodemic accelerated in part by the Covid-19 pandemic has raised important debates on how best to respond to the proliferation of false and misleading information online. The Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression addressed the critical issue of misinformation, noting that some actions undertaken by various governments to contain the spread of the coronavirus may fail to meet the test of legality, necessity and proportionality. The report cautioned against the introduction of vague and overly-broad laws to combat misinformation, proposing instead that governments provide reliable information to citizens.

Six months after a National State of Disaster was declared in South Africa, the government on September 16, 2020 eased the lockdown, removing “as many of the remaining restrictions on economic and social activity as it is reasonably safe to do.” One notable restriction still in place is the criminalisation of the publication of “any statement through any medium including social media, with the intent to deceive,” pursuant to Regulation 11(5) under the Disaster Management Act, which was issued in March 2020. The offence is punishable with an unspecified fine, imprisonment of up to six months, or both.

The regulations were followed by directives from the Minister of Communications and Digital Technologies compelling communications service providers to “remove Covid-19 related fake news from their platforms immediately after it is identified as such”. Within days of its passing, several individuals were arrested for spreading false information about Covid-19. In one case relating to a Covid-19 interview, the Broadcasting Complaints Commission of South Africa fined two broadcasters South African Rand 10,000 (USD 660).

Whilst various activists initially raised their voices in support of the government’s efforts to halt the spread of the disease, they also cautioned against overly restrictive conditions that limit human rights, including freedom of expression, access to information and public accountability.

Civil Society Reactions to the Regulations on “Fake News”

The debate about the impact of South Africa’s Covid-19 regulations on free speech came into focus when Professor Glenda Gray, a leading academic and member of the Covid-19 Ministerial Advisory Committee, made public comments about the effectiveness of the lockdown restrictions. The Minister of Health declared the academic’s views false and misleading. This prompted leading academics to conclude that “the government has repeatedly stressed that its primary goal in managing the pandemic is to save lives. But it needn’t kill speech to save lives.”

In April 2020, the Right2Know Campaign (R2K) wrote to the National Coronavirus Command Council regarding the “fake news” provisions of the lockdown regulations. Whilst noting the potentially deleterious effects of false information, R2K proposed amendments to the regulations to ensure the protection of the right to freedom of expression. Among these was a proposal to clarify the definition of “fake news” as the “dissemination of false information with the intention to deceive…”

Further, R2K noted that the “criminalisation of speech inevitably has a chilling effect on the right to freedom of expression.” It proposed administrative penalties, rather than criminal sanctions, for disseminating false information. Another key proposal was that the government should make provision for relevant defences that an offender could rely on when faced with a charge of spreading false information.

Other critics, such as the Free Market Foundation (FMF), rejected the fake news regulations outright, calling on the government to rely on existing common law and constitutional provisions rather than attempting to regulate expression through the introduction of additional regulations. The FMF argued that, “there is simply too much information circulating in society for any centralised body to be entrusted with deciding its accuracy. Instead, we must rely on the decentralised gatekeeping network known as ‘the market’ to assist us in judging what is true and what is false.”

Meanwhile, Media Monitoring Africa (MMA) said in a March statement that the regulations were narrowly defined and imposed a high standard on the state to prove “intention to deceive.” The group said the real challenge would be the government’s ability to implement and enforce the fake news regulations.

None of these proposals was taken into account, and the regulations remain in force under the extended state of national disaster, imposing undue restrictions on the right to freedom of expression.

Enforcement of the “Fake News” Regulations

As part of measures to enforce the regulations, the government established a multi-stakeholder monitoring and evaluation platform and a Digital Complaints Committee to monitor and respond to misinformation and fake news related to Covid-19. Then Acting Communications and Digital Technologies Minister, Jackson Mthembu, stated that the platform aims to assess misinformation complaints, take down fake news items, and submit cases to the police for investigation and prosecution.

According to MMA Director William Bird, the task of combating fake news should not be left to the government and platform providers alone. Since 2019, MMA has maintained Real411, an independent digital platform for reporting suspected misinformation. Thandi Smith, MMA’s Head of Programmes, explains that complaints are assessed by a team of three voluntary reviewers with legal, technology, and media expertise. The reviewers then make a recommendation to a five-member secretariat based on a set of assessment criteria.

Upon completion of an investigation, the secretariat recommends a range of actions, which may include issuing a take-down notice, fact-checking verification, and publishing a counter-narrative infographic. Bird said the secretariat reports hate speech cases to the South African Human Rights Commission for further action. Extreme cases of misinformation would be reported to the South African police, but to date no complaints warranting police investigation have been received. Complaints about the media and editorial content are referred to the relevant regulatory authority. Smith noted that there is an appeal process headed by a retired Constitutional Court judge.

Assessing the Effectiveness of Criminalising Misinformation

It may be difficult to assess the effectiveness of fake news regulations on Covid-19 given the rapid spread of information in the digital environment. This raises philosophical and policy issues on whether free expression online should even be regulated at all, and by whom.

Indeed, Ghalib Galant, Deputy National Coordinator and Head of Advocacy for the R2K Campaign, maintains that the challenge with South Africa’s Covid-19 misinformation regulatory framework is that the government’s response was to criminalise behaviour rather than to focus on educating and supporting South Africans to understand the impact of the pandemic. As he puts it, “Government policed people, rather than healing a health pandemic.”

Galant suggested that administrative penalties may be a better deterrent than criminal sanctions. This would ensure the protection of the right to freedom of expression whilst the country debates whether or not new rules are needed for regulating false information, or a “re-imagining of section 16 of the Constitution.” Galant suggests that perhaps this could be within the purview of a statutory institution such as the Information Regulator.

Section 16(1) of South Africa’s Constitution states that “Everyone has the right to freedom of expression, which includes freedom of the press and other media; freedom to receive or impart information or ideas; freedom of artistic creativity; and academic freedom and freedom of scientific research.” Section 16(2) restricts speech related “to propaganda for war; incitement of imminent violence; or advocacy of hatred that is based on race, ethnicity, gender or religion, and that constitutes incitement to cause harm.”

The head of legal, policy and research at the FMF, Martin van Staden, said fake news regulations have not been effective as they are difficult to enforce. From his perspective, any prohibition on freedom of expression beyond Section 16(2) Constitutional limitations would amount to censorship. He stated: “The Constitution is unequivocal about the scope of the right to freedom of expression, and it does not include a provision that only ‘factual’ expression is allowed. This means that misinformation is constitutionally protected expression in South Africa, and must be left alone.”

He recommends that the government should instead provide accurate and reliable information, and develop a strong counter-narrative strategy, which would enable South African citizens to reach their own conclusions on the veracity of any information they receive.

Van Staden cautioned against the state’s “paternalism” and future attempts to introduce legislation aimed at ensuring the truthfulness of information that is disseminated. “The right to freedom of expression is meant to protect the uncomfortable, the unpopular, and the offensive,” he said.

Threats to Freedom of Expression Beyond Covid-19 Regulations

There is uncertainty over whether the National State of Disaster will be extended again beyond December 15, 2020, given concerns about a second wave of Covid-19 infections in the country. Freedom of expression experts have warned that whilst fake news may be decriminalised by a declaration of the end of the State of Disaster, the government may attempt to use impending legislation to further regulate free speech online.

For example, in July 2020, the Minister of Communications and Digital Technologies released a call for comments on the gazetted draft Film and Publications amendment regulations (commonly known as the internet censorship bill), which introduce a requirement for pre-classification of online content with the Film and Publications Board.

Another key piece of legislation in the pipeline is the Prevention and Combating of Hate Crimes and Hate Speech Bill, which lapsed and is currently on hold, pending judgment in the Qwelane hate speech Constitutional Court challenge, which was heard on September 22, 2020.

Qwelane contends that the prohibited grounds listed in section 10(1) of the Promotion of Equality and Protection of Unfair Discrimination Act (Equality Act) are overly broad, go far beyond the limitations set out in section 16(2) of the Constitution, and unjustifiably limit the right to freedom of expression.

The outcome of the Qwelane case will be important in clarifying the limitations on free speech for South Africans, given ongoing debates on the regulation of freedom of expression both online and offline. It is particularly significant for setting clear parameters around free speech and false and misleading information in South Africa, and will help ensure that unprotected speech is narrowly defined and does not unjustifiably limit the Constitutional right to freedom of expression.


Tusi Fokane is a 2020 CIPESA Fellow focusing on the availability and use of digital technologies to combat the spread of Covid-19 in South Africa. She is also studying the country’s readiness for electronic voting to comply with social distancing and other movement restrictions during the upcoming local government elections.