Data Protection in Africa in the Age of Covid-19

By Boel McAteer and Jean-Benoît Falisse |

As the Covid-19 pandemic spread around the world in the early part of 2020, governments and companies invested substantial resources in gathering data about suspected and confirmed cases, and related behaviours. Learning more about how the virus was spreading was a top priority around the world, and with this came new practices of sharing medical records, tracking people’s movements and tracing their contacts. This has created new norms for data governance in many countries, and in this brave new world of disease surveillance, it is more important than ever to understand data protection and privacy, and where these concepts fit in with the new priorities of managing the pandemic.

The Covid Governance research project has gathered information about country-level data protection and Covid-19 practices across the world. Covering over 200 countries and territories, the project’s Data Protection Explorer Tool provides a snapshot of the legal environment surrounding data protection and privacy, and how it is changing in response to Covid-19. Crucially, it focuses on restrictions on data collection, processing and cross-border transfers. It also captures digital monitoring measures in place for Covid-19, such as contact tracing, and who owns that data. This will help form a picture of what has changed within data ethics and surveillance during the pandemic, and what those changes might mean in the long term.

So what are some of the key patterns that we can see so far? A joint statement on Data Protection and Privacy in the Covid-19 Response from a number of United Nations (UN) organisations states that any practices changed in response to Covid-19 should have a legal basis and be rooted in human rights. However, the information collected via the Data Protection Explorer Tool shows that about a third of Africa’s 54 countries did not have comprehensive data protection laws in place or enforced before the pandemic. During the pandemic, constitutional rights have often been rolled back as part of the crisis response.

The Explorer’s data also shows that African countries without specific data protection laws are particularly exposed. Take Namibia, where no comprehensive data protection law is in place; there is, however, a draft bill in the works, and public consultations were conducted in 2020. The absence of a dedicated data protection framework does not mean data protection is non-existent: as in other African countries, provisions in other Namibian laws cover citizens’ personal data in specific sectors of the economy, such as banking and accounting (the Banking Institutions Act, 1998, and its 2010 amendment) or the legal professions (the Legal Practitioners Act, 15 of 1995, as amended).

The right to personal privacy is also enshrined in Namibia’s constitution as a human right, but this right can be limited, including in the interests of health and public safety. This allows the government to legally prioritise public health over other human rights throughout the pandemic. Indeed, when Namibia declared a state of emergency in March 2020, many constitutional freedoms were temporarily suspended: access to education could no longer be guaranteed, for instance, and places of worship (constitutive of religious freedom) were closed.

Covid-19 tracing and surveillance mostly occurred offline, but the University of Namibia (UNAM) successfully launched a mobile app, named “NamCotrace”, that collects substantial personal information such as the geolocation of users. The app is connected to epidemiological data and the national healthcare system in real time. While “privacy by design” is said to be core to the app, Namibia’s prevailing privacy and data protection legislative environment leaves room for arbitrary abuse. Similarly, Nigeria has developed various Covid-19 apps, but with minimal data protection legal safeguards in place, there is ample room for misuse.

The Data Protection Explorer Tool also shows that countries with data protection laws remain vulnerable too. In many instances, the laws have been amended to allow practices during the pandemic that were previously prohibited. In South Africa, for instance, the response to Covid-19 has been governed through the Disaster Management Act of 2002, which allows the National Disaster Management Centre to request from individuals or organs of state any information it “reasonably requires”, and to escalate the matter to parliament in case of failure to comply.

In April 2020, a regulation was introduced to legalise contact tracing in South Africa. This created a tracing database of Covid-19 cases, managed by the National Department of Health, in which personal information is gathered from anyone tested for Covid-19. Information collected and stored in the database includes name, residential address, and ID and passport numbers. Even though the information is collected legally and without consent from the individuals concerned, it would be unlawful to use the data for any purpose other than the one specified in the regulation. Despite these provisions, concerns have been raised that contact tracing enables government surveillance of the population, since the Director General of Health can track the location of anyone suspected of having Covid-19 through phone service providers.

At the other end of the continent, in West Africa’s Burkina Faso, the data protection law prohibits the collection of personal data relating to health, and had not, at the time of writing, been amended. However, since 2019 a digital platform for health surveillance has been in place: One Health, funded by USAID, combines data from three ministries concerned with zoonotic disease control. When the first cases of Covid-19 were detected in the country in 2020, the platform was adapted to include data on the new virus, tracing cases and their contacts. This obviously raises privacy (and legality) concerns.

There are also some inspiring examples. The B’Safe app in Botswana was developed as an alternative to a manual Covid-19 tracing system. Described as privacy-friendly and in line with the country’s data protection (and privacy) law, which pre-dates the pandemic, the app recorded a decent initial adoption rate. However, without an established data protection authority to enforce the law and oversee the app’s roll-out, security vulnerabilities in the app led private citizens to lodge a court case against the country’s Covid-19 task force challenging the app’s safety. The progress of the case remains unclear to date, but it highlights the importance of independent data protection authorities, of which Angola and Senegal offer good examples, and suggests the pandemic could be a decisive push in countries where such authorities are yet to be established.

Where are we heading now? Data protection laws in Africa were rapidly developing in the years leading up to the pandemic, with many new laws influenced by the European Union’s General Data Protection Regulation (GDPR) which was adopted in 2016. The examples above show the many ways in which the data protection environment in Africa is changing with the pandemic.

As the general state of democracy and freedoms is deemed to be worsening since the outbreak of Covid-19, it will be important to continue to monitor developments in data protection and privacy: the pandemic could be the opportunity to speed up the process of establishing much-needed laws and enforcement agencies but it could also lead to them being less protective of citizens (and more permissive for government) than in the pre-Covid-19 world.

The Covid Governance Project is an initiative of the University of Edinburgh. It was developed with support from the Foreign, Commonwealth and Development Office (FCDO), the Global Challenges Research Fund – Scottish Funding Council, and the University of Edinburgh’s Challenge Investment Fund. Explore the Data Protection Explorer Tool.

Mauritius’ Social Media Regulation Proposal Centres State-Led Censorship

By Daniel Mwesigwa |

In Sub-Saharan Africa, Mauritius leads in many aspects. It is the only country on the continent categorised as a “full democracy” by the Economist Intelligence Unit Democracy Index for 2020. Additionally, it has the second highest per capita income (USD 11,099) and one of the highest internet penetration rates in the region (72.2%).

However, the recently published consultation paper on proposed amendments to the country’s Information and Communications Technology (ICT) law, purportedly aimed at curbing abuse and misuse of social media, could place Mauritius among the ranks of regressive states. The proposed establishment of a National Digital Ethics Committee (NDEC) to determine what content is problematic, together with a Technical Enforcement Unit to oversee the technical enforcement of the NDEC’s measures, has potential surveillance and censorship implications.

The social media regulation proposals by Mauritius are made in light of increasing calls by western countries for accountability of technology platforms such as Google and Facebook. Indeed, the consultation paper cites Germany’s Network Enforcement Act (colloquially known as the Facebook Act), which requires social media platforms to remove “illegal content” from their platforms within 24 hours of notice by users and complaint bodies. Non-compliance penalties are steep, with fines ranging from five million to 50 million euros.

The paper states that, unlike in Germany and other countries like France, the United Kingdom, and Australia, complaints by Mauritian local authorities to social media platforms “remain unattended to or not addressed in a timely manner”. Moreover, it adds, cooperation under the auspices of domestic laws and regulations is only effective in countries where technology companies have local offices, which is not the case in Mauritius. As such, according to the Authority, “the only practical solution in the local context would be the implementation of a regulatory and operational framework which not only provides for a legal solution to the problem of harmful and illegal online content but also provides for the necessary technical enforcement measures required to handle this issue effectively in a fair, expeditious, autonomous and independent manner.”

However, the Authority’s claims of powerlessness appear unfounded. According to Facebook’s Transparency report, Mauritius made two requests for preservation of five user accounts pending receipt of formal legal processes in 2017. In 2019, Mauritius made one request to Facebook for preservation of two accounts. Similarly, the country has barely made any requests for content take down to Google, with only a total of 13 since 2009. The country has never made a user information or content takedown request to Twitter. In comparison, South Africa made two requests to Facebook for preservation of 14 user accounts in 2017 and 16 requests for preservation of 68 user accounts in 2019. To Google, South Africa has made a total of 33 requests for 130 items for removal since 2009 while to Twitter, it has made six legal demands between 2012 and 2020.

Broad and Ambiguous Definitions

According to section 18(m) of Mauritius’ Information and Communication Technologies Act (2001, amended multiple times including in 2020), the ICT Authority shall “take steps to regulate or curtail the harmful and illegal content on the Internet and other information and communication services”.

Although the consultation paper states that the Authority has previously fulfilled this mandate in the fight against child pornography, it concedes that it has not fulfilled the part concerning the curtailment of illegal content, as it is not currently vested with investigative powers under the Act. The consultation paper thus proposes to operationalise section 18(m) through an operational framework that empowers the Authority “to carry out investigations without the need to rely on the request for technical data from social media administrators.”

The amendments to the ICT Act would define a two-pronged operational framework through the setting up of: i) a National Digital Ethics Committee (NDEC) as the decision-making body on illegal and harmful content; and ii) a Technical Enforcement Unit to enforce the technical measures as directed by the NDEC.

However, neither the existing Act nor the consultation paper define what constitutes “illegal content”. Whereas the consultation paper indicates that the Chairperson and members of NDEC would be “independent, and persons of high calibre and good repute” in order to ensure transparency and public confidence in its functions, the selection criteria and appointing Authority are not specified, nor are recourse mechanisms for fair hearing and appeals against the decisions of the proposed entity.

An Authoritarian Approach to Internet Architecture

Through a technical toolset (a proxy server) proposed under section 11, the regulator would be able to identify social media traffic, which would then be automatically decrypted, archived, and analysed. The technical toolset would, in effect, undermine HTTPS in order to inspect internet traffic. This means that the information of all social media users, pertaining to device specifics, content type and location, among others, would be available to the authorities. The regulator expects that once a complaint regarding social media is received, it will be able to block the implicated web page or profile without necessarily needing the intervention of social media platforms.

Additionally, the Authority expects social media users to accept the installation of a one-time digital certificate on their internet-enabled devices to facilitate the re-encryption of traffic before it is transferred to the social networking sites. In other words, the Authority wants internet users in Mauritius to replace the padlocks securing their homes with ones supplied by the Authority, to which it holds open and unfettered access.
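The padlock analogy can be made concrete with a toy model of certificate trust. The sketch below is illustrative only: the issuer names and keys are hypothetical, and HMAC tags stand in for the asymmetric signatures used in real public-key infrastructure. It shows why a client rejects proxy-forged certificates until the Authority’s own certificate is installed in the user’s trust store, after which interception becomes invisible.

```python
import hashlib
import hmac

def sign(issuer_key: bytes, subject: str) -> bytes:
    """Toy 'certificate': a tag binding a hostname to an issuer's key.
    (Real PKI uses asymmetric signatures; HMAC keeps the sketch stdlib-only.)"""
    return hmac.new(issuer_key, subject.encode(), hashlib.sha256).digest()

def client_accepts(trust_store: dict, issuer: str, subject: str, cert: bytes) -> bool:
    """A client accepts a certificate only if it verifies under a trusted issuer's key."""
    key = trust_store.get(issuer)
    if key is None:
        return False
    expected = hmac.new(key, subject.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(cert, expected)

ca_key = b"public-ca-secret"         # an ordinary, publicly trusted CA (hypothetical)
authority_key = b"regulator-secret"  # the regulator's root key (hypothetical)

# The user's device initially trusts only the public CA.
trust_store = {"PublicCA": ca_key}

genuine = sign(ca_key, "socialmedia.example")
forged = sign(authority_key, "socialmedia.example")  # issued by the intercepting proxy

print(client_accepts(trust_store, "PublicCA", "socialmedia.example", genuine))   # True
print(client_accepts(trust_store, "Regulator", "socialmedia.example", forged))   # False

# After the user installs the Authority's one-time certificate:
trust_store["Regulator"] = authority_key
print(client_accepts(trust_store, "Regulator", "socialmedia.example", forged))   # True
```

The design point is that trust is decided entirely by what sits in the client’s trust store: once the Authority’s root is installed, certificates it mints for any site validate, so decrypt-and-re-encrypt proxying of all traffic becomes possible without any warning to the user.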

Meanwhile, Mauritius’ commitments to freedom of expression, data protection and privacy potentially collide with these social media regulation proposals. In particular, Mauritius’ Data Protection Act (2017) requires informed consent of users, prohibits disproportionate collection of user data, and mandates fair and lawful processing of user data. The Data Protection Act was enacted to align with the European Union’s General Data Protection Regulation (GDPR). In March 2018, Mauritius also ratified the African Union Convention on Cybersecurity and Personal Data Protection, although the Convention is yet to come into force for lack of sufficient ratifications. Moreover, in September 2020, Mauritius signed and ratified the Council of Europe’s Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data.

Indeed, the Authority is aware that the proposed technical measures would infringe on basic freedoms, stating in the paper that “the proposed statutory framework will undoubtedly interfere with the Mauritian people’s fundamental rights and liberties in particular their rights to privacy and confidentiality and freedom of expression”. Its call for views and suggestions on “an alternative technical toolset of a less intrusive nature” may very well be an open solicitation for more surreptitious ways of monitoring social media data, with fundamental rights still at stake.

Democracy and Local Investment

While Mauritius runs a multiparty system of government, its human rights record has been steadily deteriorating, according to the United States Department of State’s Human Rights Report 2020. Moreover, basic freedoms such as freedom of expression are being curtailed through digital taxation and clampdowns on social media dissent. Recently, Twitter cited stability and democracy as the key reasons for opening its first Africa office in Ghana. Although Mauritius is strategically placed as a regional and economic hub in Africa, and has been positioning itself as a “Cyber Island”, legal frameworks such as the proposed ICT law amendments, together with mixed rankings on democracy alongside high rankings on internet access and ease of doing business, may well undermine the country’s international competitiveness and internet freedom standing.

Accordingly, the Authority would do well to immediately discontinue these plans to employ technical measures to monitor social media and internet traffic as they would amount to multiple breaches of fundamental freedoms. The proposals also run counter to the Data Protection Act which prioritises minimisation of data collected and informed user consent. Moreover, the technical proposal would promote self-censorship and undermine the basic workings of the institutions of democracy.

Further, although social media regulation may be paved with good intentions, such as the need to stamp out inflammatory content, it could be more beneficial to explore alternative options with a range of stakeholders to promote fairer and more transparent content moderation practices in line with international human rights law. Mauritius has already shown that aligning domestic and international laws and practices is possible by fashioning its data protection law along the lines of the GDPR. Additionally, Mauritius could leverage existing partnerships with other countries in regional economic blocs such as the Common Market for Eastern and Southern Africa (COMESA) to form a coalition of fact-checkers with direct access to social media platforms.

Finally, the Authority could collaborate with technology platforms such as Facebook to support Creole-language human moderators. This could be a necessary step towards enhancing content moderation, complementing automated decisions, especially for “low resource” languages such as Mauritian Creole.

Challenges and Prospects of the General Data Protection Regulation (GDPR) in Africa

Policy Brief |
Privacy is a fundamental human right guaranteed by international human rights instruments, including the Universal Declaration of Human Rights (Article 12) and the International Covenant on Civil and Political Rights (Article 17). These provisions have further been embedded in different jurisdictions in national constitutions and in acts of Parliament.
In Africa, regional bodies have invested efforts in ensuring that data protection and privacy are prioritised by Member States. For instance, in 2014 the African Union (AU) adopted the Convention on Cybersecurity and Personal Data Protection. In 2010, the Southern African Development Community (SADC) developed a model law on data protection which it adopted in 2013. Also in 2010, the Economic Community of West African States (ECOWAS) adopted the Supplementary Act A/SA.1/01/10 on Personal Data Protection Within ECOWAS. The East African Community, in 2008, developed a Framework for Cyberlaws. Notwithstanding these efforts, many countries on the continent are still grappling with enacting specific legislation to regulate the collection, control and processing of individuals’ data.
On May 25, 2018, the European Union’s General Data Protection Regulation (GDPR) came into effect. The GDPR is likely to force African countries, especially those with strong trade ties to the EU, to prioritise data privacy and to more decisively meet their duties and obligations to ensure compliance.
See this brief on the Challenges and Prospects of the General Data Protection Regulation (GDPR) in Africa, where we explore the consequences of GDPR for African states and business entities.