Uganda’s Changes to Computer Misuse Law Spark Fears It Will Be Used to Silence Dissidents

By News Writer |

Uganda’s controversial Computer Misuse (Amendment) Bill 2022, which rights groups say will likely be used to silence dissenting voices online, has come into force after the country’s President Yoweri Kaguta Museveni signed it into law yesterday.

The country’s legislators had passed amendments to the 2011 Computer Misuse Act in early September, limiting the writing and sharing of certain content on online platforms and restricting the distribution of children’s details without the consent of their parents or guardians.

The bill was brought before the house to “deter the misuse of online and social media platforms.” A document tabled before the house stated that the move was necessitated by reasoning that “enjoyment of the right to privacy is being affected by the abuse of online and social media platforms through the sharing of unsolicited, false, malicious, hateful and unwarranted information.”

The new law, which also seeks to curb the spread of hate speech online, prescribes several punitive measures, including barring offenders from holding public office for 10 years and imprisonment for any individual who “without authorization, accesses another person’s data or information, voice or video records and shares any information that relates to another person” online.

Rights groups and a section of online communities are worried the law might be abused by regimes, especially the current one, to limit free speech and punish persons that criticize the government. Some have plans to challenge it in court.

The fears expressed by various groups come in the wake of increasing crackdowns on individuals who openly critique the authoritarian regime of Museveni, Uganda’s longest-serving president, who also blocked social media in the run-up to last year’s general election.

Recently, Ugandan TikToker Teddy Nalubowa was remanded in prison for recording and sharing a video celebrating the death of a former security minister who led the troops that killed 50 civilians protesting the arrest of opposition politician Robert Kyagulanyi Ssentamu (Bobi Wine) in 2020. Nalubowa, a member of Ssentamu’s National Unity Platform, was charged with offensive communication in contravention of the Computer Misuse Act 2011, amid public outcry over the harassment and intimidation of dissidents. Ssentamu, a critic of Museveni and the country’s opposition leader, recently said the new amendment targets his ilk.

The Committee to Protect Journalists (CPJ) had earlier called on Museveni not to sign the bill into law, saying it added to the arsenal authorities could use to target critical commentators and punish media houses by criminalizing the work of journalists, especially those undertaking investigations.

The Collaboration for International ICT Policy in East and Southern Africa (CIPESA) had also made recommendations including the deletion of Clause 5, which bars people from sending unsolicited information online, saying that it could be abused and misused by the government.

“In the alternative, a clear definition and scope of the terms ‘unsolicited’ and ‘solicited’ should be provided,” it said.

It also called for the scrapping of punitive measures, and the deletion of clauses on personal information and data, which duplicated the country’s data protection law.

CIPESA also said the law is likely to infringe on individuals’ digital rights, including freedom of expression and access to information, adding that its provisions fail to address the issues brought on by emerging technologies, such as trolling and harassment, which the law purported to tackle in the first place.

This article was first published by the Ghana Business on Oct 15, 2022.

Participant Reflection on #FIFAfrica22: Effective Engagement in the UPR Process for Digital Rights Promotion

By Murungi Judith |

The Collaboration on International ICT Policy for East and Southern Africa (CIPESA) and Small Media held a workshop on the Universal Periodic Review (UPR) process as part of the Forum on Internet Freedom in Africa (FIFAfrica22), held in Lusaka, Zambia from September 26-29, 2022. The workshop is a product of the UPROAR project, which aims to advance digital rights globally by supporting engagement in international advocacy at the UPR.

The 32 participants at the workshop represented a diverse array of backgrounds, including civil society, digital rights activism and advocacy, law, journalism, and academia. A total of 20 countries were represented: Benin, Burundi, Botswana, Cameroon, Democratic Republic of Congo, Ethiopia, Ghana, India, Kenya, Mozambique, Nigeria, Senegal, Sri Lanka, Sudan, Tanzania, Uganda, United Kingdom, United States of America, Zambia, and Zimbabwe.

The workshop entailed an overview of the UPR, its purpose and its processes, as well as in-depth discussions on international and regional normative frameworks on digital rights. Specific attention was drawn to the Universal Declaration of Human Rights as the first normative framework on freedom of expression. The International Covenant on Civil and Political Rights (ICCPR) was also explored under the core tenets of the right to hold opinions without interference (freedom of opinion), the right to seek and receive information (access to information) and the right to impart information (freedom of expression).

It was noted that the right to freedom of expression is not absolute and that the three-part test is key in determining the circumstances which potentially justify limitations. Under Article 19(3) of the ICCPR, a limitation must (i) be provided for in law, (ii) pursue a legitimate aim, and (iii) be necessary for a legitimate purpose.

The three-part test formed the basis of a heated debate on electoral democracy and internet shutdowns in countries like Cameroon and Tanzania, compared with Kenya, where the government did not impose an internet shutdown during the recent elections. From these comparative discussions, participants concluded that government actions such as the arrest, detention and assassination of journalists remain a threat to internet freedom, and that it is the responsibility of civil society, activists and human rights defenders to hold governments accountable through increased participation in the UPR process.

The presence of Hon. Neema Lugangira, a member of the Tanzanian Parliament and Chairperson of the African Parliamentary Network on Internet Governance, was invaluable and a beacon of hope for bridging the gap between civil society and policymakers in promoting digital rights through the UPR.

The workshop also explored case law on freedom of expression in Africa, including precedents such as Lohé Issa Konaté v Burkina Faso. Participants deliberated on the importance of critically assessing the law and framing cases in line with the jurisdiction of the particular court approached, without which matters could be thrown out. The session gave participants a clear understanding of the link between offline and online rights, and of the specific laws that apply to minority and marginalised groups such as children, women, persons with disabilities and other vulnerable communities.

The session on campaign and advocacy planning equipped participants with the tools needed to engage partners in carrying out campaigns and executing advocacy strategies through the UPR. It showcased eye-catching, precise advocacy materials, including fact sheets and how to use them during the UPR process, that can be deployed on social media and other platforms at the local level. It also prompted discussions on the critical role local stakeholders play in leveraging the UPR for digital rights development in their various contexts, and helped participants understand how to engage local partners to ensure effective implementation of the recommendations made to their respective countries.

Participants engaged in a practical lobbying session in which they appeared before a UN delegate and presented the digital rights issues affecting their respective countries, along with recommendations for reform. This group exercise was beneficial and informative because it gave participants a chance to apply what they had learnt about the UPR process and to experience what the review process in Geneva is like.

Through the UPROAR website, participants were guided on how to leverage research and social media platforms for effective design and branding as part of UPR engagements related to digital rights. The workshop also covered stakeholder mapping and its importance.

In a subsequent panel entitled ‘Stemming the Tide: Has the Universal Periodic Review Mechanism Contributed to Changes in the Digital Rights Landscape of States Under Review?’, panelists shared experiences from Namibia, the Democratic Republic of Congo, Uganda, Rwanda and many other countries. This gave workshop participants an understanding of how to prepare for stakeholder engagements and how to conduct evidence-based advocacy at the United Nations Human Rights Council.

It was noted that the Covid-19 pandemic led to travel restrictions which made it difficult to travel to Geneva to participate physically in the UPR process. Online participation was a welcome alternative, but unreliable internet access among civil society on the continent during the sessions presented an additional barrier.

Beyond making submissions and engaging during review sessions, participants were urged to also take part in monitoring the implementation of recommendations. Experiences were shared about governments such as Uganda’s, which rejected all the recommendations made in regard to digital rights. Participants were encouraged not to give up in the face of such responses but to continue their advocacy work, which is itself a notable step in the right direction. They were also encouraged to collaborate with lawmakers and policymakers to ensure they know about the UPR process and can respond positively to the recommendations given, and to maintain in-country pressure from civil society so that governments act on those recommendations. It was noted that in Tanzania the acceptance of recommendations increased significantly after civil society began collaborating with parliamentarians.

The UPR sessions at FIFAfrica22 were informative and intriguing, led by well-equipped workshop trainers. Experiences shared by those who had participated in Geneva engagements on digital rights stirred an urge for proactive engagement and participation among those coming up for review, such as Botswana.

How the MTN Group Can Improve its Digital Human Rights Policy and Reporting

CIPESA Writer |

These proposals are made to the MTN Group in respect of its Digital Human Rights Policy. They commend the positive elements of the Policy, including its proclamation to respect users’ rights to privacy, communication, and accessing and sharing information in a free and responsible manner, and point to areas where the telecoms group can further improve its role in the protection of human rights.

The United Nations Guiding Principles on Business and Human Rights (UNGPs) enjoin corporate entities to act with due diligence to avoid infringements on human rights. They also provide ways through which adverse impacts on human rights can be addressed. It is therefore commendable that MTN developed a Digital Human Rights Policy and is open to commentary and suggestions for strengthening its implementation. It is imperative that MTN takes proactive and consistent measures to comply with international human rights instruments such as the UNGPs, the leading global framework focused on business responsibility and accountability for human rights, which were unanimously endorsed by States at the United Nations in 2011.

Some of the Principles that MTN needs to pay close attention to include the following:

Principle 11: Business enterprises should respect human rights. This means that they should avoid infringing on the human rights of others and should address adverse human rights impacts with which they are involved.

Principle 13: The responsibility to respect human rights requires that business enterprises (a) Avoid causing or contributing to adverse human rights impacts through their own activities, and address such impacts when they occur; (b) Seek to prevent or mitigate adverse human rights impacts that are directly linked to their operations, products or services by their business relationships, even if they have not contributed to those impacts.

Principle 15: In order to meet their responsibility to respect human rights, business enterprises should have in place policies and processes appropriate to their size and circumstances, including:

(a) A policy commitment to meet their responsibility to respect human rights;

(b) A human rights due diligence process to identify, prevent, mitigate and account for how they address their impacts on human rights;

(c) Processes to enable the remediation of any adverse human rights impacts they cause or to which they contribute.

Principle 23: In all contexts, business enterprises should:

  1. Comply with all applicable laws and respect internationally recognised human rights, wherever they operate;
  2. Seek ways to honour the principles of internationally recognised human rights when faced with conflicting requirements;
  3. Treat the risk of causing or contributing to gross human rights abuses as a legal compliance issue wherever they operate.

Respect for digital rights is also stipulated in the Declaration of Principles on Freedom of Expression and Access to Information in Africa of 2019 which MTN needs to be cognisant of as part of efforts to ensure that it upholds respect for human rights.

CIPESA Proposals to the MTN Group
The MTN Group is a market leader in various service areas in several of the countries where it operates. It is also a key employer and taxpayer and, by facilitating the operations of other sectors, a key contributor to Gross Domestic Product (GDP) and to the health of the respective countries’ economies. It is therefore crucial that the company develops and effects a robust Digital Human Rights Policy. Notably, MTN has trailed other operators, such as Orange, Millicom and Vodafone, in rolling out a digital rights policy and in transparency reporting.

While MTN last year issued its inaugural transparency report as part of its annual reporting, there are areas of concern for which we make the following recommendations:

  1. Provide more granular and disaggregated data about the number and nature of requests MTN receives from government agencies. At present, it is not clear how many of those requests relate to the release of users’ identifying data, how many were on metadata, and how many were on rendering support to communication monitoring and interception. Besides providing such a breakdown, MTN should also explain how many requests, if any, were not adhered to and why. Further, the report should indicate which particular government departments made the requests and whether all their requests were backed by a court order.
  2. Provide more nuanced information in reporting on the Digital Human Rights Policy to enable the contextualisation of country-specific explanations of government requests. In the last report, for instance, it is difficult to comprehend the information on government requests from Uganda. Given that Uganda is one of the countries where MTN has the largest number of subscribers, and given that country’s human rights record, the numbers are inexplicably few (12 in total) compared to Congo Brazzaville (1,600), eSwatini (3,661), Ghana (1,642), Guinea Conakry (6,480), Ivory Coast (4,215), Nigeria (4,751), Rwanda (602), South Africa (15,903), South Sudan (1,748), Sudan (5,105), and Zambia (8,294).
  3. In its transparency reporting on implementation of its Digital Human Rights Policy, MTN should reflect on the role of local laws and regulations in enabling or hampering the realisation of digital human rights. What elements are supportive and which ones are retrogressive? Which grey areas need clarification or call for repeal of laws?
  4. Include in the MTN transparency report a detailed and analytical section on network disruptions, as these are highly controversial and have wide-ranging economic, public service and human rights impacts, yet they are becoming endemic in many of the countries where MTN operates. Further, MTN should include information on whether it received (or demanded – as we propose it should) written justifications from regulators (or the government officials and bodies that issue shutdown orders), including citation of the specific laws and provisions under which the orders are issued and the situation that warranted invoking the disruption. Additionally, the MTN Group should commit to scrutinising each demand, order or request and challenging those that are not clear, specific, written, or valid, or that do not comply with national laws. It should also keep a written record of such demands, orders and requests.
  5. The MTN Policy and reporting should have a section and actions dedicated to the inclusion of marginalised groups, a key area being access and accessibility for persons with disabilities. Research conducted by CIPESA showed that, in the countries where it operated, MTN had not made any deliberate efforts to render its services more accessible to persons with disabilities. Beyond the additional section, MTN should appoint or designate Inclusion and Human Rights Ambassadors, and build the capacity of internal teams to facilitate engagement and compliance with digital accessibility obligations.
  6. MTN should take a proactive stance in making its Digital Human Rights Policy, including country-specific transparency information, well publicised among users, civil society and government officials in the respective countries. This will aid the growth of knowledge about MTN policies, inspire other companies to respect human rights, and draw feedback on how MTN can further improve its human rights policies and practices.
  7. MTN should develop relationships and have proactive, sustained engagements with civil society, consumer groups and governments on the implementation of its Digital Human Rights Policy. Such engagements should not merely be after-the-fact reviews of reports following their publication but should be continuous and feed into the annual reporting. They should also include external experts and stakeholders in the conduct of regular human rights due diligence as envisaged by Principle 15 of the UNGPs, and could extend to raising concerns about national laws, policies and measures which pose a risk to digital rights.
  8. As part of due diligence, MTN should periodically assess and examine the impact of its enforcement of its terms and service, policies and practices to ensure they do not pose risks to individual human rights, and the extent to which they comply with the UNGPs and are consistent with its Digital Human Rights Policy. Such assessments are essential to determining the right course of action when faced with government requests and other potential human rights harms.
  9. MTN should add to its Policy and make public its position on network disruptions and outline a clear policy and the procedures detailing how it handles information requests, interception assistance requests, and disruption orders from governments.
  10. Support initiatives that work to grow access, affordability, and secure use of digital technologies, and speak out about any licensing obligations and government practices that undermine digital rights.
  11. Join key platforms that collaboratively advance a free and open internet and respect for human rights in the telecommunications sector, such as the Global Network Initiative (GNI), endorse the GSMA Principles for Driving Digital Inclusion for Persons with Disabilities, and align with local actors on corporate accountability (such as the Uganda Consortium on Corporate Accountability).
  12. MTN should, at a minimum, provide simple and clear terms of service, promptly notify users of decisions affecting them, and provide accessible redress mechanisms and effective remedies.
  13. MTN should institutionalise its commitment to digital rights by putting in place a governance structure at the country level with oversight at a senior level, train its employees on the policy, and create awareness among its customers to ensure the realisation of the policy.

CIPESA stands ready to continue to engage with MTN on ways to improve and effect its Digital Human Rights Policy. We can be contacted at [email protected].

South Sudan’s Cybercrimes and Computer Misuse Order 2021 Stifles Citizens’ Rights

By Edrine Wanyama |

South Sudan has enacted the Cybercrimes and Computer Misuse Provisional Order 2021, aimed at combating cybercrime. The country has a fast-evolving technology sector, with three mobile operators and 24 licensed internet service providers. Investments in infrastructure development have propelled internet penetration to 16.8% and mobile phone penetration to 23% of the country’s population of 11.3 million people, which necessitates a law to curb cybercrime.

The Order is based on article 86(1) of the Transitional Constitution of South Sudan 2011, which provides that when parliament is not in session, the president can issue a provisional order that has the force of law in urgent matters.

The Cybercrimes and Computer Misuse Order makes strides in addressing cybercrime by extending jurisdiction to cover offences committed in or outside the country against citizens and the South Sudan state. The Order also establishes judicial oversight, especially over the use of forensic tools to collect evidence, with section 10 requiring authorisation by a competent court prior to collecting such evidence. Furthermore, the Order attempts to protect children against child pornography (sections 23 and 24), and provides for the prevention of trafficking in persons (section 30) and drugs (section 31).

However, the Order is largely regressive of citizens’ rights including freedom of expression, access to information, and the right to privacy.

The Order gives overly broad definitions including of “computer misuse,” “indecent content,” “pornography,” and “publish” which are so ambiguous and wide in scope that they could be used by the state to target government opponents, dissidents and critics. The definitions largely limit the use of electronic gadgets and curtail the exercise of freedom of expression and access to information.

Article 22 of the Transitional Constitution of South Sudan 2011 guarantees the right to privacy. The country has ratified the International Covenant on Civil and Political Rights (ICCPR), which provides for the right to privacy under article 17, and the African Charter on Human and Peoples’ Rights, whose article 5 provides for the right to respect for one’s dignity, which includes the right to privacy. The Order appears to contravene these instruments by threatening individual privacy.

Section 6 commendably obliges service providers to store information relating to communications, including subscribers’ personal data and traffic data, for 180 days – a shorter period than in many other countries – yet personal data remains potentially at risk. The section requires service providers and their agents to put in place technical capabilities that enable law enforcement agencies to monitor compliance with the Order. With no specific data protection law in South Sudan and no commitment to the leading regional instrument, the African Union Convention on Cyber Security and Personal Data Protection, citizens’ privacy is at stake.

The section on offences and penalties lacks specificity on the fines which may be levied on errant individuals or companies. Moreover, some of the offences provided for under the Order potentially curtail freedom of expression and the right to information. For instance, the offence of spamming under section 21 could be interpreted to cover all communications on online platforms, including social media platforms like Facebook and WhatsApp. Under the provision, virtually all individuals who forward messages on social media risk prosecution, which has a chilling effect on freedom of expression and the right to information.

The offence of offensive communication under section 25 potentially has a chilling effect on freedom of expression, media freedom and access to information. A similar provision under section 25 of the Computer Misuse Act, 2011 of Uganda has been widely misused to persecute, prosecute and silence political critics and dissidents. Section 25 of the South Sudan Cybercrimes Order could be used in a similar manner to target government critics and dissidents. 

In CIPESA’s analysis of the Order, we call for specific actions that could ensure the prevention of cybercrime while at the same time not hurting online rights and freedoms, including:

  • Deletion of problematic definitions or provisions from the Order.
  • Enactment of a specific data protection law to guarantee the protection of data of individuals.
  • Urgent drafting of rules and regulations to prescribe the procedures for implementing the Order.
  • Ratification of the African Union Convention on Cyber Security and Personal Data Protection.
  • Service providers should not be compelled to disclose their subscribers’ information to law enforcement agencies except on the basis of a court order.
  • Amendment of the Order to emphasise the oversight role of courts during the processes of access, inspection, seizure, collection and preservation of data or tracking of data under section 9.

Read the full analysis here.

Mauritius’ Social Media Regulation Proposal Centres State-Led Censorship

By Daniel Mwesigwa |

In Sub-Saharan Africa, Mauritius leads in many aspects. It is the only country on the continent categorised as a “full democracy” by the Economist Intelligence Unit Democracy Index for 2020. Additionally, it has the second highest per capita income (USD 11,099) and one of the highest internet penetration rates in the region (72.2%).

However, the recently published consultation paper on proposed amendments to the country’s Information and Communications Technology (ICT) law, purportedly aimed at curbing abuse and misuse of social media, could place Mauritius among the ranks of regressive states. The proposal to establish a National Digital Ethics Committee (NDEC) to determine which content is problematic, alongside a Technical Enforcement Unit to enforce the NDEC’s measures, has potential surveillance and censorship implications.

The social media regulation proposals by Mauritius come amid increasing calls by western countries for the accountability of technology platforms such as Google and Facebook. Indeed, the consultation paper cites Germany’s Network Enforcement Act (colloquially known as the Facebook Act), which requires social media platforms to remove “illegal content” within 24 hours of notice by users and complaint bodies. Non-compliance penalties are large, with fines ranging between five million and 50 million euros.

The paper states that, unlike in Germany and other countries like France, the United Kingdom, and Australia, complaints by Mauritian local authorities to social media platforms “remain unattended to or not addressed in a timely manner”. Moreover, it adds, cooperation under the auspices of domestic laws and regulations is only effective in countries where technology companies have local offices, which is not the case in Mauritius. As such, according to the Authority, “the only practical solution in the local context would be the implementation of a regulatory and operational framework which not only provides for a legal solution to the problem of harmful and illegal online content but also provides for the necessary technical enforcement measures required to handle this issue effectively in a fair, expeditious, autonomous and independent manner.”

However, the Authority’s claims of powerlessness appear unfounded. According to Facebook’s Transparency report, Mauritius made two requests for preservation of five user accounts pending receipt of formal legal processes in 2017. In 2019, Mauritius made one request to Facebook for preservation of two accounts. Similarly, the country has barely made any content takedown requests to Google – only 13 in total since 2009 – and has never made a user information or content takedown request to Twitter. In comparison, South Africa made two requests to Facebook for preservation of 14 user accounts in 2017 and 16 requests for preservation of 68 user accounts in 2019. To Google, South Africa has made a total of 33 requests for 130 items for removal since 2009, while to Twitter it has made six legal demands between 2012 and 2020.

Broad and Ambiguous Definitions

According to section 18(m) of Mauritius’ Information and Communication Technologies Act (2001, amended multiple times including in 2020), the ICT Authority shall “take steps to regulate or curtail the harmful and illegal content on the Internet and other information and communication services”.

Although the consultation paper states that the Authority has previously fulfilled this mandate in the fight against child pornography, it concedes that it has not fulfilled the part of curtailing illegal content as it is not currently vested with investigative powers under the Act. The consultation paper thus proposes to operationalise section 18(m) through an operational framework that empowers the Authority “to carry out investigations without the need to rely on the request for technical data from social media administrators.”

The amendments to the ICT Act will relate to defining a two-pronged operational framework with the setting up of: i) a National Digital Ethics Committee (NDEC) as the decision making body on illegal and harmful content; and ii) a Technical Enforcement Unit to enforce the technical measures as directed by the NDEC.

However, neither the existing Act nor the consultation paper define what constitutes “illegal content”. Whereas the consultation paper indicates that the Chairperson and members of NDEC would be “independent, and persons of high calibre and good repute” in order to ensure transparency and public confidence in its functions, the selection criteria and appointing Authority are not specified, nor are recourse mechanisms for fair hearing and appeals against the decisions of the proposed entity.

An Authoritarian Approach to Internet Architecture

Through a technical toolset (a proxy server), proposed under section 11, the regulator will be able to identify social media traffic which will then be automatically decrypted, archived, and analysed. For instance, the technical toolset would undermine HTTPS in order to inspect internet traffic. This means that information of all social media users pertaining to device specifics, content type, location, among others, would be available to the authorities. The regulator expects that once a complaint regarding social media is received, they will be able to block the implicated web page or profile without necessarily needing the intervention of social media platforms.

Additionally, the Authority expects social media users to accept the installation of a one-time digital certificate on their internet-enabled devices to facilitate the re-encryption of traffic before it is passed on to the social networking sites. In other words, the Authority wants internet users in Mauritius to replace the padlocks securing their homes with ones issued by the Authority, to which it retains open and unfettered access.
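The padlock analogy can be made concrete with a toy model of certificate trust (all names here are hypothetical and no real cryptography is involved): a client accepts a site’s certificate only if its chain ends at a root it already trusts, so once users install the Authority’s root certificate, the proxy’s re-signed certificates for any site become indistinguishable from genuine ones.

```python
# Toy model of certificate-chain trust (simplified; no real cryptography).
# All certificate and CA names are hypothetical.

trusted_roots = {"GlobalTrust CA"}  # the device's existing trust store

def is_trusted(cert_chain):
    """Accept a site's certificate only if its chain ends at a trusted root."""
    return cert_chain[-1] in trusted_roots

# Normal browsing: the real site's chain terminates at a public root CA.
real_site = ["example.com", "GlobalTrust CA"]
assert is_trusted(real_site)

# The interception proxy forges a certificate for the same site,
# signed by the Authority's own root.
forged = ["example.com", "Authority Root CA"]
assert not is_trusted(forged)  # today, the browser rejects it

# After users install the Authority's certificate, as the paper proposes:
trusted_roots.add("Authority Root CA")
assert is_trusted(forged)  # the forged certificate is now accepted silently
```

Because the forged certificate now validates exactly like a genuine one, the client has no way to tell that its traffic is being decrypted in transit.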

These social media regulation proposals potentially collide with Mauritius’ commitments to freedom of expression, data protection and privacy. In particular, Mauritius’ Data Protection Act (2017), which was enacted to align with the European Union’s General Data Protection Regulation (GDPR), requires the informed consent of users, prohibits disproportionate collection of user data, and mandates fair and lawful processing of user data. In March 2018, Mauritius also ratified the African Union Convention on Cybersecurity and Personal Data Protection, although the Convention is yet to enter into force for lack of the requisite number of ratifications. Moreover, in September 2020, Mauritius signed and ratified the Council of Europe’s Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data.

Indeed, the Authority is aware that the proposed technical measures would infringe on basic freedoms, stating in the paper that “the proposed statutory framework will undoubtedly interfere with the Mauritian people’s fundamental rights and liberties in particular their rights to privacy and confidentiality and freedom of expression”. Its call for views and suggestions on “an alternative technical toolset of a less intrusive nature” may very well be an open solicitation for more surreptitious ways of monitoring social media data, with fundamental rights still at stake.

Democracy and Local Investment

While Mauritius runs a multiparty system of government, its human rights record has been steadily deteriorating, according to the United States Department of State’s Human Rights Report 2020. Basic freedoms such as freedom of expression are being curtailed through digital taxation and clampdowns on social media dissent. Recently, Twitter cited stability and democracy as key reasons for opening its first African offices in Ghana. Although Mauritius is strategically placed as a regional economic hub and has been positioning itself as a “Cyber Island”, legal frameworks such as the proposed ICT law amendments, together with mixed rankings on democracy despite high rankings on internet access and ease of doing business, are likely to undermine the country’s international competitiveness and internet freedom standing.

Accordingly, the Authority would do well to immediately discontinue its plans to employ technical measures to monitor social media and internet traffic, as they would amount to multiple breaches of fundamental freedoms. The proposals also run counter to the Data Protection Act, which prioritises data minimisation and informed user consent. Moreover, the technical proposal would promote self-censorship and undermine the basic workings of democratic institutions.

Further, although social media regulation may be paved with good intentions, such as the need to stamp out inflammatory content, it would be more beneficial to explore alternative options with a range of stakeholders to promote fairer and more transparent content moderation practices in line with international human rights law. Mauritius has already demonstrated the value of aligning domestic law with international practice by fashioning its data protection law along the lines of the GDPR. Additionally, Mauritius could leverage existing partnerships with countries in regional economic blocs such as the Common Market for Eastern and Southern Africa (COMESA) to form a coalition of fact-checkers with direct access to social media platforms.

Finally, the Authority could collaborate with technology platforms such as Facebook to support human moderators for Creole-language content. This would be a necessary step towards enhancing content moderation, including automated moderation, for “low-resource” languages such as Mauritian Creole.