Mauritius’ Social Media Regulation Proposal Centres on State-Led Censorship

By Daniel Mwesigwa |

In Sub-Saharan Africa, Mauritius leads in many respects. It is the only country on the continent categorised as a “full democracy” by the Economist Intelligence Unit Democracy Index for 2020. Additionally, it has the region’s second highest per capita income (USD 11,099) and one of its highest internet penetration rates (72.2%).

However, the recently published consultation paper on proposed amendments to the country’s Information and Communications Technology (ICT) law, purportedly aimed at curbing abuse and misuse of social media, could place Mauritius among the ranks of regressive states. The proposal would establish a National Digital Ethics Committee (NDEC) to determine which content is problematic, alongside a Technical Enforcement Unit to oversee the technical enforcement of NDEC’s measures, a combination with clear surveillance and censorship implications.

The social media regulation proposals by Mauritius come amid increasing calls by western countries for the accountability of technology platforms such as Google and Facebook. Indeed, the consultation paper cites Germany’s Network Enforcement Act (colloquially known as the Facebook Act), which requires social media platforms to remove “illegal content” from their platforms within 24 hours of notice by users and complaint bodies. Non-compliance penalties are steep, with fines ranging from five million to 50 million euros.

The paper states that, unlike in Germany and other countries like France, the United Kingdom, and Australia, complaints by Mauritian local authorities to social media platforms “remain unattended to or not addressed in a timely manner”. Moreover, it adds, cooperation under the auspices of domestic laws and regulations is only effective in countries where technology companies have local offices, which is not the case in Mauritius. As such, according to the Authority, “the only practical solution in the local context would be the implementation of a regulatory and operational framework which not only provides for a legal solution to the problem of harmful and illegal online content but also provides for the necessary technical enforcement measures required to handle this issue effectively in a fair, expeditious, autonomous and independent manner.”

However, the Authority’s claims of powerlessness appear unfounded. According to Facebook’s Transparency report, Mauritius made two requests in 2017 for the preservation of five user accounts pending receipt of formal legal processes. In 2019, Mauritius made one request to Facebook for the preservation of two accounts. Similarly, the country has made barely any content takedown requests to Google, with only a total of 13 since 2009, and it has never made a user information or content takedown request to Twitter. In comparison, South Africa made two requests to Facebook for the preservation of 14 user accounts in 2017 and 16 requests for the preservation of 68 user accounts in 2019. South Africa has also made a total of 33 requests to Google for the removal of 130 items since 2009, and six legal demands to Twitter between 2012 and 2020.

Broad and Ambiguous Definitions

According to section 18(m) of Mauritius’ Information and Communication Technologies Act (2001, amended multiple times including in 2020), the ICT Authority shall “take steps to regulate or curtail the harmful and illegal content on the Internet and other information and communication services”.

Although the consultation paper states that the Authority has previously fulfilled this mandate in the fight against child pornography, it concedes that it has not fulfilled the part on curtailing illegal content, as it is not currently vested with investigative powers under the Act. The consultation paper thus proposes to operationalise section 18(m) through an operational framework that empowers the Authority “to carry out investigations without the need to rely on the request for technical data from social media administrators.”

The amendments to the ICT Act will relate to defining a two-pronged operational framework with the setting up of: i) a National Digital Ethics Committee (NDEC) as the decision making body on illegal and harmful content; and ii) a Technical Enforcement Unit to enforce the technical measures as directed by the NDEC.

However, neither the existing Act nor the consultation paper defines what constitutes “illegal content”. Whereas the consultation paper indicates that the Chairperson and members of the NDEC would be “independent, and persons of high calibre and good repute” in order to ensure transparency and public confidence in its functions, the selection criteria and appointing authority are not specified, nor are recourse mechanisms for fair hearing and appeals against the decisions of the proposed entity.

An Authoritarian Approach to Internet Architecture

Through a technical toolset (a proxy server) proposed under section 11, the regulator would be able to identify social media traffic, which would then be automatically decrypted, archived, and analysed. In effect, the toolset would break HTTPS encryption in order to inspect internet traffic. This means that information on all social media users, including device specifics, content type, and location, would be available to the authorities. The regulator expects that once a complaint regarding social media is received, it will be able to block the implicated web page or profile without needing the intervention of social media platforms.

Additionally, the Authority expects social media users to accept the installation of a one-time digital certificate on their internet-enabled devices to facilitate the re-encryption of traffic before it is transferred to the social networking sites. In other words, the Authority wants internet users in Mauritius to replace the padlocks securing their own homes with ones issued by the Authority, to which it has open and unfettered access.
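The padlock analogy can be made concrete with a small sketch. The toy Python model below (purely illustrative, not based on any published design of the Authority’s toolset; all class and function names are hypothetical, and the stream cipher is a simplified stand-in for TLS) shows the general mechanics of certificate-substituting interception: a client encrypts traffic to whichever party it has been told to trust, so a proxy whose certificate is installed on the device can decrypt, archive, and re-encrypt everything in transit, invisibly to both ends.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR data with a SHA-256-based keystream.
    Illustrative only -- a stand-in for a real TLS record layer."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

class Endpoint:
    """A party holding session keys; whoever the client trusts gets the key."""
    def __init__(self, name: str):
        self.name = name
        self.sessions = {}  # session_id -> session key

    def accept_session(self, session_id: str, key: bytes) -> None:
        self.sessions[session_id] = key

    def read(self, session_id: str, ciphertext: bytes) -> bytes:
        return keystream_xor(self.sessions[session_id], ciphertext)

class InterceptingProxy(Endpoint):
    """Models an interception toolset: terminates the client's encryption,
    archives the plaintext, then re-encrypts toward the real server."""
    def __init__(self, name: str, upstream: Endpoint):
        super().__init__(name)
        self.upstream = upstream
        self.archive = []  # decrypted traffic, retained by the intermediary

    def relay(self, session_id: str, ciphertext: bytes):
        plaintext = self.read(session_id, ciphertext)   # decrypt
        self.archive.append(plaintext)                  # inspect and archive
        up_key = secrets.token_bytes(32)                # fresh key to server
        up_id = session_id + ":up"
        self.upstream.accept_session(up_id, up_key)
        return up_id, keystream_xor(up_key, plaintext)  # re-encrypt

# The client encrypts to whichever certificate it has been told to trust.
server = Endpoint("social-media-site")
proxy = InterceptingProxy("regulator-proxy", upstream=server)

session_key = secrets.token_bytes(32)
proxy.accept_session("s1", session_key)  # proxy's certificate is trusted
msg = b"private message"
up_id, relayed = proxy.relay("s1", keystream_xor(session_key, msg))

assert server.read(up_id, relayed) == msg  # server still sees the message
assert proxy.archive == [msg]              # ...and so does the proxy
```

Neither the client nor the server observes anything amiss: the message arrives intact, which is precisely why trusted-certificate substitution is considered so invasive.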

On the other hand, Mauritius’ commitments to freedom of expression, data protection and privacy potentially collide with these social media regulation proposals. In particular, Mauritius’ Data Protection Act (2017) requires informed consent of users, prohibits disproportionate collection of user data, and mandates fair and lawful processing of user data. The Data Protection Act was enacted to align with the European Union’s General Data Protection Regulation (GDPR). In March 2018, Mauritius also ratified the African Union Convention on Cyber Security and Personal Data Protection, although the Convention is yet to enter into force for lack of the requisite number of ratifications. Moreover, in September 2020, Mauritius signed and ratified the Council of Europe’s Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data.

Indeed, the Authority is aware of the potential infractions of the proposed technical measures on basic freedoms — stating in the paper that “the proposed statutory framework will undoubtedly interfere with the Mauritian people’s fundamental rights and liberties in particular their rights to privacy and confidentiality and freedom of expression”. Its seeking views and suggestions of “an alternative technical toolset of a less intrusive nature” may very well be an open solicitation for more surreptitious ways of monitoring social media data, with fundamental rights still at stake.

Democracy and Local Investment

While Mauritius runs a multiparty system of government, its human rights record has been steadily deteriorating, according to the United States Department of State’s Human Rights Report 2020. Moreover, basic freedoms such as freedom of expression are being curtailed through digital taxation and clampdowns on social media dissent. Recently, Twitter cited stability and democracy as the key reasons for opening its first Africa offices in Ghana. Mauritius is strategically placed as a regional and economic hub in Africa and has been positioning itself as a “Cyber Island”. However, legal frameworks such as the proposed ICT law amendments, coupled with mixed rankings on democracy despite high rankings on internet access and ease of doing business, are likely to undermine the country’s international competitiveness and internet freedom standing.

Accordingly, the Authority would do well to immediately discontinue these plans to employ technical measures to monitor social media and internet traffic as they would amount to multiple breaches of fundamental freedoms. The proposals also run counter to the Data Protection Act which prioritises minimisation of data collected and informed user consent. Moreover, the technical proposal would promote self-censorship and undermine the basic workings of the institutions of democracy.

Further, although social media regulation may be paved with good intentions, such as the need to stamp out inflammatory content, it would be more beneficial to explore alternative options with a range of stakeholders to promote fairer and more transparent content moderation practices in line with international human rights law. Mauritius has already shown the value of aligning domestic and international laws and practices by fashioning its data protection law along the lines of the GDPR. Additionally, Mauritius could leverage existing partnerships with other countries of regional economic blocs such as the Common Market for Eastern and Southern Africa (COMESA) to form a coalition of fact-checkers with direct access to social media platforms.

Finally, the Authority could collaborate with technology platforms such as Facebook to support human moderators for the Creole language. This would be a necessary step towards enhancing content moderation, including through automated decisions, especially for “low resource” languages such as Mauritian Creole.

Why Data Rights are Central to Protection of Online Freedom

By CIPESA Staff Writer |

In an increasingly digitised world, safeguarding data rights has become central to protecting individuals’ rights to access and share information, express themselves, and associate using the internet and related platforms.

Advances in technology, alongside growth in mobile subscriptions and increased use of smartphones have pushed individuals online to shop, interact, share and search for information, learn, and work, alongside digitalisation of more sectors of economies and public services. As a result, there is increased collection, processing and sharing of personal data. With many users of Information and Communications Technology (ICT) not aware of the implications of their use of digital technologies and how their rights are compromised, the potential for the data to be manipulated and abused by individuals, private companies and governments is ever-present. 

At the end of 2019, 477 million people in Sub-Saharan Africa were subscribed to mobile services, accounting for 45% of the region’s population. According to the GSMA, the group that represents the interests of mobile operators worldwide, smartphone adoption continues to rise rapidly in the region, reaching 50% of total connections in 2020. Meanwhile, as of 2019, there were 469 million registered mobile money accounts in Sub-Saharan Africa, a figure that was expected to reach half a billion in 2020.

From the provision of eServices, to digital identity (or digital ID), voters registration, drivers’ license applications and issuance, through to mobile phone SIM card registration, public and private service bodies including immigration authorities, law and security enforcement, health service providers, telecom operators, and digital financial service providers are among the big collectors and processors of personal data in Africa. Increasingly, the nature of personal data being collected is expanding, to include biometric data such as facial images or fingerprints.

What is Personal Data?

Personal data refers to information relating to an identified or identifiable natural person, “in particular by reference to an identifier or to one or more factors specific to his/her physical, physiological, mental, economic, cultural or social identity.”

Upholding individuals’ data rights implies that their personal data must be kept private and should not be known, stored, or used by unauthorised parties. Upholding data rights is thus a central pillar of the long-recognised right to privacy, which national laws and international human rights frameworks such as the International Bill of Rights guarantee. Notably, the right to privacy is pivotal in a democratic society, as it both enables and relies on the enjoyment of other rights, such as freedom of expression, information and association.

As businesses, governments, and civil society organisations seek to maximise value of increased data flows, the dangers of cyberthreats, cybercrimes, surveillance, and general data misuse pose threats that require national, regional, and international action to address. At the same time, excessive restrictions on the flow of data between countries can undermine regional economic benefits if no best practices are adopted on how data should flow, be stored, protected, and disposed – Building an Enabling Environment for Inclusive Digital Transformation in Africa.

Poor or missing legal protections for personal data, abuse of existing laws by state agencies including security agencies and by private companies, and poor digital security practices by citizens, are exacerbating the erosion of many African citizens’ data rights. With increased data collection has come increased state surveillance and data privacy breaches. Worryingly, many African states are increasingly using data to undermine citizens’ digital freedoms, such as by conducting real-time monitoring, surveillance of citizens’ social media and intercepting telephone communications. In some instances, this has led to arbitrary arrests and prosecutions of individuals.

Moreover, telecom and internet service providers are required by law to comply with government requests for user information or assistance, including the common requirement to install software that facilitates state surveillance and monitoring of citizens’ communications. Many governments are indeed accessing subscribers’ data from telecom companies with limited oversight and hardly any transparency. Even where service providers are uneasy about regulator directives, the need to continue operating often prevails, and they comply in ways that restrict data rights.

In such countries, digital rights are under threat and, as a result, citizens are losing the appetite to participate in public affairs and often practise self-censorship in their engagements on digital platforms. This undermines the philosophy of a free and open internet that drives innovation, enables the enjoyment of rights, and improves livelihoods.

In many countries, the digital rights situation worsened during the Covid-19 pandemic, as governments suspended respect for several rights, collected large amounts of private data, and conducted surveillance without sufficient oversight, safeguards, or transparency.

The State of Internet Freedom in Africa 2020 Report found that the fight against Covid-19 has had a fundamental impact on digital rights and freedoms including freedom of expression, access to information, privacy, assembly and association. It has also undermined civic participation and, in many countries, deepened the democracy deficit.

In responding to the Covid-19 pandemic, countries across the continent adopted a series of Covid-19 regulations and practices, including deploying surveillance technologies and untested applications, to enable them to conduct lawful collection and processing of personal data for purposes of tracing, contacting, isolating and treating those found to be positive and their contacts. These measures were quickly adopted, and the collection of personal information continues, in some cases without adequate regulation or oversight – State of Internet Freedom in Africa 2020: Resetting Digital Rights Amidst the Covid-19 Fallout.

In several African countries, there are inadequate safeguards and limited oversight to guard against potential violations of digital rights arising out of the implementation of laws, regulations, systems, and practices imposed to fight Covid-19. According to the United Nations, the use of emergency powers and tools of surveillance technology to track the spread of Covid-19 must be non-intrusive, limited in time and purpose, and abide by the strictest protections and international human rights standards governing privacy and personal data.

Concerns over data handling during the fight against Covid-19, and how that harmed digital rights, informed the formation of the #RestoreDataRights movement, which is promoted by a group of African and international civil society, academic and philanthropic partners. Launched at the end of 2020, it is premised on the conviction that our fundamental human rights – including those exercised in cyberspace and over our personal and sensitive data – should be respected and upheld during and after the Covid-19 public health emergency. Furthermore, decision-making processes around how sensitive data are collected, shared and used to tackle the Covid-19 pandemic in Africa should be transparent, inclusive and accountable.

There has also been a proliferation of retrogressive laws, procedures and practices such as the systematic criminalisation of online communication and dissent, the arbitrary arrest, illegal detention, flawed prosecution and excessive punishment of government critics. On a continent where digital authoritarianism is rising, the legitimisation of surveillance, censorship, and breaches in the rule of law during the coronavirus crisis could create a new normal that erodes internet freedom for years to come. 

There is therefore a need to have strong data protection laws; to educate citizens to protect their data and to demand their digital rights; and to have strong, well-resourced and independent data protection authorities. It is also crucial to establish clear and well-publicised complaint mechanisms in cases of data privacy breaches. Meanwhile, private companies should institute stringent measures to protect data privacy and integrate ‘privacy by design’ in any applications they develop, partner with civic actors and public officials to promote digital rights, and be transparent about their data handling practices.

These measures would enable accountable data governance that respects citizens’ data rights and advances wider internet freedoms in Africa. Further, they would enable robust protection of digital rights and data rights, while providing scope for data openness that enables harnessing of data to serve the legitimate public interest.

#WithoutFear: Confronting Online Abuse Against Women In Somalia

By CIPESA Staff Writer |

In commemoration of International Women’s Day (IWD), Digital Shelter has launched the #WithoutFear campaign to raise awareness about the challenges faced by Somali women online.

The campaign features an audiovisual poem in English and Somali by award-winning poet, activist and digital storyteller Zahra Abdihagi Mahamed. The poem was inspired by a December 2020 crowdsourcing survey conducted by Digital Shelter on women’s experiences of online shame, harassment, and abuse. The survey drew 82 respondents, who shared stories ranging from account hacking and blackmail to receiving unsolicited indecent images from men.

The #WithoutFear campaign also features a digital safety and security platform which enables Somali women to download and receive regular reminders about the status of their online and social media accounts via a calendar.

https://twitter.com/DigitalShelter/status/1368996194953920515

Online harassment carries harms similar to those of street harassment, yet, as one respondent in the survey put it, “This kind of thing is not even considered abuse in our society, which is disheartening.” As a result, Somali women’s voices are suppressed and even silenced online, with a third (34%) of the survey’s respondents confirming they now spend less time on social media.

According to Mahamed, “No woman should be put in a situation where she is ridiculed and shamed — especially online, where information travels very far and abuse continues to grow more and more each day. It is mentally and emotionally disturbing.”

The survey highlighted Facebook as “the worst platform to be a girl” in Somalia, with 57% of respondents experiencing abuse on a Facebook-owned platform (Facebook, Messenger, WhatsApp, or Instagram). This echoes sentiments in 2020 research on women’s online experiences conducted as part of the Women At Web initiative, and recalls Goal 5 of the Sustainable Development Goals, which calls for an end to all forms of discrimination against all women and girls.

https://twitter.com/cipesaug/status/1368863933147283462

Abdifatah Hassan Ali, co-founder of Digital Shelter, stated that the campaign is vital because online spaces should be open, safe, and inclusive for all. He added that, “We need Somali women to be able to openly express their views without being challenged and without being harassed.”

The campaign was developed as part of data literacy institutional support in the context of the Africa Digital Rights Fund (ADRF), led by Data4Change with support from the Collaboration on International ICT Policy for East and Southern Africa (CIPESA).

South Africa’s Parliament Rejects Plan to Introduce e-Voting

By Tusi Fokane |

As South Africa prepares to hold local government elections in 2021, parliament’s Portfolio Committee on Home Affairs has rejected two proposals contained in the Electoral Laws Amendment Bill, which could have seen the introduction of electronic voting in the country.

The rejected proposals were contained in clause 14, which suggested that the country’s Independent Electoral Commission (IEC) “may prescribe a different voting method” under the 1998 Electoral Act and clause 21 which sought to make a similar amendment to the Local Government: Municipal Electoral Act, 2000. The electoral body had intended to use these amendments to progressively introduce e-voting.

A report adopted by the Committee on December 1, 2020 notes that the introduction of different voting methods is a policy matter that “cannot be left to the IEC alone to decide” and emphasised that “explicit clarity must be given to the effect that the amendments do not authorise e-voting upon signing of the bill into law.”

The proposals were part of the Electoral Laws Amendment Bill which was introduced in September 2020 to amend legislation governing national, provincial and local government elections, including the forthcoming 2021 local government elections. Local government elections are set to take place between August 4 and November 1, 2021, although the final date is yet to be gazetted by the Minister of Cooperative Governance and Traditional Affairs.

The proposed amendments under the Bill seek to align three key pieces of electoral legislation, namely the Electoral Commission Act, the Electoral Act and the Local Government: Municipal Electoral Act. Besides the proposals related to methods of voting, the other proposed amendments relate to procedures for the registration of parties, the submission of candidate lists by parties, the casting of votes in a district where a voter is not registered, and the protection of voters’ personal data against disclosure pursuant to the Protection of Personal Information Act.

Proposals for electronic voting were first tabled by the IEC back in July 2020, when it indicated that electronic voting considerations were still in the early stages and would first be trialled as a pilot. The Commission stated that electronic voting would help increase efficiencies in the existing system, including the counting and capturing of election results. There is currently no provision for online or postal voting in South Africa, as its prevailing electoral laws provide that voters must vote in person at their voting station.

The decision of the Portfolio Committee on Home Affairs to reject the alternative voting method proposals followed complaints from various stakeholders. Consultations by the Portfolio Committee via the Dear South Africa platform received over 12,000 submissions from the general public and civil society. Many of the submissions were against the adoption of the Electoral Laws Amendment Bill, citing constitutional concerns over the introduction of electronic voting. Members of the public took exception to the powers delegated to the electoral commission to change electoral policy without proper public participation and parliamentary oversight. Some commentators also criticised the short time-frame given for public input – two weeks, from mid- to end-October 2020 – although this was subsequently extended to November 6, 2020.

Submissions also raised concerns about the possibility of electoral fraud, hacking and the rigging of election results. There were also concerns about the costs of an e-voting system, given South Africa’s current fiscal constraints, as well as the exclusion of communities who may not have access to digital technologies. As of January 2020, internet penetration in South Africa was estimated at 62%.

In response to concerns raised by members of the Committee regarding the public submissions, the IEC has argued that the proposed amendments were intended to create a framework for the piloting of electronic voting, as opposed to rolling it out fully in the country.

While the Portfolio Committee acknowledged the beneficial role of technology in enhancing the electoral process, it cautioned against deploying technology without considering the necessary legal and constitutional implications. The Chairperson of the Committee noted that:

The truth of the matter is that technology is upon us and preparation must be started to ensure that we have both the legal framework and the technical experience that will ensure that elections are secure if a decision to vote through e-voting is taken.

In its statement, the Portfolio Committee on Home Affairs requested the IEC to return to Parliament with case studies on the implementation, challenges and successes of electronic voting in other countries.

In the 2009 general elections, the IEC introduced technological solutions to assist with processing of ballots. Four years later in 2013, the electoral body convened a seminar on Electronic Voting and Counting Technologies to assess the feasibility of electronic voting in South Africa. The then Chairperson of the IEC, Advocate Pansy Tlakula, noted that the country had not formally adopted a position on e-voting and that whilst e-voting presented some benefits such as speed and accuracy in vote counting, it would be expensive to monitor and could reduce transparency in the voting process. She also noted that there was no global standard for the verification and auditing of e-voting systems.

Electronic voting was once again put on the national agenda following the outcome of the ruling party’s June 2020 National Working Committee meeting. The African National Congress (ANC) reported that it had discussed “alternative methods of conducting elections, including the use of electronic voting” in light of the Covid-19 pandemic. This was followed by media reports that the IEC was considering launching an e-voting pilot in July, without providing any details on the roll-out. Shortly thereafter, in September, the IEC indicated that it had scrapped its planned pilot due to a lack of budget.

While the matter is on hold pending a detailed report on international case studies, implications, challenges and successes of e-voting, it is important for the IEC to address the issues raised by stakeholders. These include ensuring the security and transparency of the processing and verification of votes, as well as ensuring that rural voters have access to the reliable internet, electricity and networks needed to cast their e-ballots. The costs of financing an e-voting system also require careful consideration.

Another critical prerequisite is ensuring adequate public participation in amendments to laws governing the electoral system. This can be achieved by allowing Parliament to exercise its legislative role and ensuring members of the public are afforded the opportunity to deliberate on and make substantive inputs to proposed changes to electoral policy.

Tusi Fokane is a 2020 CIPESA Fellow focussing on the availability and use of digital technologies to combat the spread of Covid-19 in South Africa. She is also studying the country’s readiness for electronic voting to comply with social distancing and other movement restrictions during the upcoming local government elections.

Advancing Consumer Protection across Africa in the Digital Age

By CIPE Writer |

Consumer protection serves as an avenue for promoting transparency, accountability, and trust in the digital age, helping shield both consumers and small businesses from unfair practices online. According to a report by the International Finance Corporation and Google, “Africa’s internet economy has the potential to reach $180 billion by 2025, accounting for 5.2% of the [Continent’s] gross domestic product (GDP). By 2050, the projected potential contribution could reach $712 billion, 8.5% of the [Continent’s] GDP.” As electronic commerce (eCommerce) grows, consumer protection should be seen as an enabler of the digital economy.

Although the United Nations Guidelines for Consumer Protection offer guidance on the main characteristics of effective consumer protection legislation, “consumer protection is often one of the last areas that developing economies focus on regulating as they create frameworks around eCommerce.”

A LONG WAY TO GO

In Africa, very few countries are adequately addressing consumer protection concerns. Of the 54 African countries, only 25 have laws pertaining to online consumer rights and electronic transactions, while only four have draft laws. In other instances, consumer protection provisions are scattered across different laws.

For example, Uganda enacted laws on electronic transactions, electronic signatures, and computer misuse in 2011, yet gaps still remain in adequately securing online consumer rights. As more African countries develop new legislation and frameworks that seek to govern the digital economy, now is the opportune time for diverse stakeholder groups to engage in policy conversations and ensure that consumer protection is a priority.

In addition to identifying opportunities at a local or national level, governments across Africa should work with one another and various stakeholder groups to address Continent-wide consumer protection concerns. The Center for International Private Enterprise (CIPE) and the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) identified the adoption of the Africa Continental Free Trade Agreement (AfCFTA) as an important development that presented an opportune time to advance dialogue and consensus on how to shape and govern the digital economy on the Continent to promote greater regional cohesion, development, and competitiveness.

WORKING TOGETHER TO PROTECT THE DIGITAL ECONOMY

To identify regional opportunities that can positively shape Africa’s digital transformation, CIPE and CIPESA brought together over 35 stakeholders representing the local private sector, civil society, media organizations, and government at the 2019 edition of the Forum on Internet Freedom in Africa (FIFAfrica) in Addis Ababa, Ethiopia. This regional policy dialogue formed the basis of the Roadmap to Reform: Building an Enabling Environment for Inclusive Digital Transformation in Africa.

The Roadmap advocates for the advancement of strong consumer protection legislation across Africa to “help enhance trust in eCommerce across business-to-consumer (B2C) transactions and business-to-business (B2B) transactions that can arise in disputes around digital payments.” Since the multi-stakeholder conversations surrounding the adoption of the AfCFTA in 2019, key recommendations highlighted in the Roadmap to Reform remain timely, as African Union member states begin to implement the agreement after it came into force on January 1 of this year.

There is a unique opportunity for local business communities, civil society, media organizations, and governments to work together and ensure the agreement is implemented in a way that supports an inclusive enabling environment for the digital economy. To read more about the Roadmap to Reform, please visit: https://cipesa.org/?wpfb_dl=426