Overview of Intermediary Liability in Senegal

By Astou Diouf

Among its West African counterparts, Senegal is a leader in digitalisation efforts. Its press freedom rankings are high and the country has also recorded positive strides in data protection. Telecommunications sector players include 2018 entrants ARC Telecom, WAW Telecom and Africa Access, alongside the state-owned Sonatel, Free (initially licensed as SENTEL, later rebranded as Tigo), and Expresso Senegal.

However, internet affordability remains a challenge, with the country ranked 25th out of 72 countries assessed under the Affordability Index. As of December 2020, internet penetration in Senegal was estimated at 88.7% and mobile penetration at 114.2%. There are also concerns about repressive controls purportedly aimed at countering cybercrime, misinformation and hate speech.

This article highlights the state of intermediary liability in Senegal through the legal and regulatory environment governing intermediaries’ obligations, including information and data disclosure to law enforcement authorities, content filtering or blocking, and service restrictions.

Legislative and Regulatory Overview

The electronic transactions law and the eCommunications decree are the primary legislation establishing an intermediary liability framework in Senegal. Article 3(1) of Law n° 2008-08 of January 25, 2008 on Electronic Transactions refers to intermediaries as “persons whose activity is to provide the public access to services through information and communication technologies”.

Borrowing from France’s Law n° 2004-575 of June 21, 2004 on Confidence in the Digital Economy, the 2008 law places limited content monitoring obligations on intermediaries, but requires them to put in place mechanisms to remove or prevent access to unlawful content, and to inform users of service restrictions and complaints.

Article 3(2) states that natural or legal persons who provide the public with a service of storing signals, writings, images, sounds or messages “cannot be held liable for the activities or information stored at the request of a recipient of these services if they did not have actual knowledge of their illicit nature or of facts and circumstances showing this nature or if, from the moment they had such knowledge, they acted promptly to remove this data or to make access [to it] impossible”.

However, without a clear definition of what constitutes illicit content, the electronic transactions law leaves room for restricting access to content arbitrarily deemed illegal, yet there are no clear provisions on ways to challenge content takedown decisions.

On the upside, confidentiality of personal information is required under Article 5. Failure to comply with the provisions of the electronic transactions law is an offence under Articles 431-46 to 431-49 of the Penal Code, 2016, punishable with a fine of between 250,000 and 1,000,000 Francs (USD 461-1,845), imprisonment of between six months and one year, or both. 

The 2008 decree on eCommunications considers intermediaries to be neutral parties with no control over content, assuming that they merely provide transmission or storage of information, sometimes temporarily. Accordingly, Article 6 limits the liability of intermediaries when “1) they do not select the recipient of the transmission; 2) they do not initiate the transmission; 3) the activities of transfer and provision of access are aimed exclusively at carrying out the transmission or provision of access; 4) they do not modify the information that is subject to transmission; 5) they execute a decision of a judicial or administrative authority to remove the information or prohibit access to it.” 

While the electronic transactions law and the eCommunications decree limit the liability of intermediaries, other laws place obligations that have implications on users’ rights as detailed below. These include the law on intelligence services, the law amending the Code of Criminal Procedure, the eCommunications Code and the law amending the Penal Code.

Interception of Communications and Information Disclosure

Law n° 2016-33 of December 14, 2016 on Intelligence Services states under Article 10 that, in the interest of national security, intelligence authorities can “use technical, intrusive, surveillance or location procedures to collect information useful for neutralising the threat”. Article 11 requires service providers to cooperate with and assist unspecified “relevant private bodies” with intelligence activities.

Law n° 2016-30 amending Law n° 65-61 of 1965 on the Code of Criminal Procedure also addresses intermediary liability in relation to criminal investigations. Article 90-11 requires intermediaries to cooperate with investigative authorities in collecting or recording “in real time” relevant electronic data and communications. Article 90-14 provides that a public prosecutor must issue a formal request for cooperation to telecommunications operators and service providers. Recording and interception of communications under the criminal code are subject to written authorisation by a judge.

Further, Article 90-17 empowers judges to order intermediaries to decrypt data or provide information on the operation of encrypted systems. Orders are not subject to appeal and their validity is restricted to between two and four months, renewable on a case-by-case basis. The lack of provisions for individuals subject to surveillance to challenge court orders runs counter to the Budapest Convention (to which Senegal is a party), which aims to ensure an appropriate balance between the interests of law enforcement and respect for fundamental human rights.

Article 20 of the eCommunications Code re-emphasises the requirement for service providers to cooperate with government authorities in accordance with the provisions of Article 90-11 of the Code of Criminal Procedure, including through disclosing relevant information and offering technical assistance when asked. 

Service Restrictions

The 2018 eCommunications Code requires service providers to “prevent impending network congestion and mitigate the effects of exceptional or temporary congestion, provided that equivalent categories of traffic are subject to equal treatment” (Article 27). It adds that “the regulatory authority may authorise or impose any traffic management measure it deems useful to preserve competition in the electronic telecommunications sector and ensure fair treatment of similar services.” In applying these provisions, intermediaries can reduce speeds or interrupt the internet at particular times and locations under the pretext of reducing network congestion. The provisions also give the Telecommunications and Postal Regulatory Authority (ARTP) unchecked powers to authorise or impose restrictions on the availability of digital communication networks.

Strict confidentiality and continuity of service requirements are also placed on intermediaries and their staff under Article 167 of the Penal Code, which states that “deletion or opening of correspondence addressed to third parties in bad faith” is an offence punishable by imprisonment of between six days and one year, a fine of 20,000-100,000 Francs (USD 36-185), or both.

Content Restrictions

There are no specific obligations on intermediaries to actively monitor networks and platforms for infringing content. Article 3(5) of the 2008 electronic transactions law states that service providers “are not subject to a general obligation to monitor the information they transmit or store, nor to a general obligation to search for facts or circumstances revealing illicit activities.” However, the provision remains subject to targeted surveillance activity and requests by judicial authorities. In relation to crimes against humanity, incitement to racial hatred and child pornography, Article 3(5) states that intermediaries should set up “easily accessible and visible” systems to allow such content to be brought to their attention. They must also promptly inform authorities of infringing content and inform users of the policies and practices in place to fight illegal content.

Whereas the Constitution of Senegal guarantees free speech, Article 255 of the Penal Code provides that: “The publication, dissemination, disclosure or reproduction, by any means whatsoever, of false news, fabricated, falsified or falsely attributed to third parties” which results in civil disobedience, endangers the public, or discredits public institutions is an offence punishable by imprisonment of one to three years and a fine of 100,000 to 1,500,000 Francs (USD 185 to 2,770). Without a clear definition of what constitutes false news, and considering the requirements to cooperate with law enforcement authorities, intermediaries’ failure to report any infringements may lead to sanctions.

Under Article 431-61 of the Penal Code, conviction for an offence committed via electronic communications attracts additional penalties. These include prohibition from sending electronic communications, and temporary or permanent prohibition of access to the site used to commit the offence or to its host. The article also requires service providers to implement the measures necessary to ensure compliance with the penalties, violation of which is an offence punishable by six months to three years imprisonment and a fine of 500,000 to 2,000,000 Francs (USD 923 to 3,693).
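The USD equivalents quoted across these penal provisions can be sanity-checked with a short sketch. The exchange rate below is an assumption inferred from the article's own conversions (roughly 542 XOF per USD), not an official figure:

```python
# Sanity-check the franc-to-USD fine conversions quoted in the article.
# ASSUMPTION: ~542 XOF per USD, back-calculated from the article's own
# figures (e.g. 250,000 XOF ~ USD 461); not an official exchange rate.
XOF_PER_USD = 542

fines_xof = {
    "electronic transactions offences (Arts. 431-46 to 431-49)": (250_000, 1_000_000),
    "bad-faith interference with correspondence (Art. 167)": (20_000, 100_000),
    "false news (Art. 255)": (100_000, 1_500_000),
    "non-compliance with penalties (Art. 431-61)": (500_000, 2_000_000),
}

for offence, (low, high) in fines_xof.items():
    print(f"{offence}: USD {low / XOF_PER_USD:,.0f} to {high / XOF_PER_USD:,.0f}")
```

Running this reproduces the article's ranges to within rounding, for example 250,000-1,000,000 Francs comes out at roughly USD 461 to 1,845.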

Cases of Intermediary Liability

  1. Several private and public entities collect personal data in Senegal. For instance, there is mandatory SIM card registration linked to the national identity database. However, there have been numerous reports of non-compliance with the data protection law and Commission of Personal Data (CDP) regulations. See, for instance, the quarterly CDP notice.
  2. During riots in early 2021, the government suspended private television channels Sen TV and Walf TV for repeatedly broadcasting images of the unrest that followed the arrest of Senegalese opposition leader Ousmane Sonko. Furthermore, access to social media platforms including YouTube and WhatsApp was restricted.
  3. On June 20, 2019, the online newspaper “Pressafrik” was allegedly inaccessible for hours after it collaborated with the BBC on an investigative report into allegations of corruption implicating the brother of President Macky Sall. According to Publishing Director Ibrahima Lissa Faye, the hack was “sponsored”, given that “60% of Senegalese news sites are with the same host and PressAfrik is the only site to be inaccessible”.
  4. The telecoms regulator ARTP has in the past issued ultimatums to telecommunications operators to improve quality of services.
  5. According to Facebook’s Transparency Report, Senegal made six requests for user data relating to seven accounts in 2020, none of which was complied with. Earlier requests, totalling 21 in the period 2016-2019, were also not complied with.
  6. Since 2009, Senegal has made four content removal requests to Google.
  7. Back in 2016, Senegal is reported to have made the second highest number of subscriber information requests to Orange – 18,653, up from 13,557 the previous year.

Conclusion and Recommendations

The legislative and practice environment for intermediary liability in Senegal lacks clarity on roles and obligations. In some cases, excessive powers over network operations are granted to service providers and the regulator. In others, requirements to cooperate with law enforcement authorities are broad, without specifying avenues of recourse for abuse of users’ rights. While the eTransactions Act and the Decree on eCommunications are clear about intermediaries’ role regarding user content, the Intelligence Services Act, the Penal Code and other instruments contain conflicting provisions on surveillance and interception of communications that are likely to infringe privacy and freedom of expression online.

There is a need for specific legislation determining the liability of intermediaries, including precise specification of the content subject to takedown or blocking, appeal procedures for decisions, and measures for reinstating removed content. In the absence of a specific legal instrument entirely dedicated to intermediary liability, the definitions of intermediary liability, responsibilities and obligations, as well as of unlawful content, should be clear and consistent across all existing legislation.

For their part, intermediaries should provide clear, accessible and understandable terms and conditions for service use, including options for privacy, backup and anonymisation, in accessible formats that promote privacy and data protection. Furthermore, increased transparency from service providers should include advance communication of changes to relevant user policies and service restrictions, as well as publication of detailed reports on cooperation with authorities. Meanwhile, there is a need for increased partnerships and engagement with civil society towards collaborative advocacy to promote business and human rights principles.

Astou Diouf is a CIPESA Fellow, working on the role of internet intermediaries and service providers in the fight against Covid-19 in Senegal, including on issues such as facilitating increased access to the internet, privacy and personal data infringements, and content.

Are Cryptocurrencies the Future of Freedom and Financial Inclusion in Africa?

By Daniel Mwesigwa and Thomas Robertson

Advances in innovation have ushered in new approaches to digital transformation and financial service provision. With the growth in internet connectivity in sub-Saharan Africa, emerging technologies such as blockchain and cryptocurrencies have the potential to advance financial inclusion. 

Blockchain is the technology underpinning cryptocurrencies such as Bitcoin, Ethereum, and Litecoin, among others. The emergence of cryptocurrencies in Africa is particularly exciting due to the opportunities they provide for Africans in cash-based and informal economies to participate in alternative financial infrastructures. Traditional financial infrastructures across Africa are often subject to high levels of volatility and ineffective governance. Blockchain-based financial technology allows for alternative infrastructures that increase monetary stability and enable more efficient governance through a decentralised digital financial system.

Exploring the Digital Currencies Landscape in Africa

According to the World Bank, sub-Saharan Africa has one of the highest remittance rates in the world. In 2019, 3.6% of sub-Saharan Africa’s Gross Domestic Product (GDP) was derived from personal remittances – a figure over three times the global average. However, the region also has the world’s largest unbanked population, with only 42.6% of those above the age of 15 having an account at a financial institution.

With the bulk of remittances on the continent being peer-to-peer transfers, cryptocurrencies have the potential to revolutionise remittances between Africa and the rest of the world. Cryptocurrency-based remittances would mean faster transfers, fewer logistical constraints, and lower transaction costs thanks to the underlying blockchain technology. While remittances cannot themselves be considered a form of financial inclusion, their potential application to digital currency infrastructures could usher in more inclusive financial infrastructures.

Indeed, in August 2020, sub-Saharan Africa traded USD 18.3 million of the USD 95 million total worth of Bitcoin traded globally in one week – the second highest peer-to-peer Bitcoin trading volume in the world after North America (at USD 28.7 million). While it is argued that Bitcoin trading significantly increased in sub-Saharan Africa due to the need to hedge against the volatility of local currencies amid the effects of Covid-19 lockdowns on local economies, Bitcoin.com’s analysis shows that 86.3% of the volume was contributed exclusively by the continent’s leading economies – Nigeria, Kenya, and South Africa.

Contrasted against average weekly mobile money transaction volumes in sub-Saharan Africa of around USD 457 million, Bitcoin’s trading volumes seem dismal. However, it should be noted that since its 2007 debut in Kenya, the M-Pesa mobile money model has been replicated by over 140 mobile money services worldwide. Mobile money itself has positively contributed to financial inclusion on the continent by enabling person-to-person and person-to-business digital transactions, alongside access to savings, credit and investment services via mobile phones. However, it is not without challenges, including high transaction fees and costs associated with interoperability and regulatory gaps. Meanwhile, research shows that women are less likely to use mobile apps to conduct financial transactions due to gender bias in digital financial services (DFS).
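The scale comparisons above can be made concrete with a few lines of arithmetic using only the figures already quoted (USD 18.3 million of African peer-to-peer Bitcoin volume out of USD 95 million globally, 86.3% of it from Nigeria, Kenya and South Africa, and roughly USD 457 million in weekly mobile money transactions):

```python
# Weekly figures quoted in the article (August 2020, USD millions).
GLOBAL_BTC_P2P = 95.0     # total Bitcoin traded peer-to-peer worldwide
SSA_BTC_P2P = 18.3        # sub-Saharan Africa's share of that volume
LEADING_SHARE = 0.863     # Nigeria, Kenya, South Africa (per Bitcoin.com)
SSA_MOBILE_MONEY = 457.0  # average weekly mobile money transaction volume

ssa_share = SSA_BTC_P2P / GLOBAL_BTC_P2P          # region vs. world
leading_volume = LEADING_SHARE * SSA_BTC_P2P       # the three leading economies
btc_vs_momo = SSA_BTC_P2P / SSA_MOBILE_MONEY       # Bitcoin vs. mobile money

print(f"SSA share of global P2P Bitcoin volume: {ssa_share:.1%}")
print(f"Volume from Nigeria, Kenya, South Africa: USD {leading_volume:.1f} million")
print(f"Bitcoin volume as a fraction of mobile money: {btc_vs_momo:.1%}")
```

The output makes the "dismal" comparison explicit: the region accounts for roughly a fifth of global peer-to-peer Bitcoin trade, yet that trade amounts to only about 4% of its weekly mobile money volumes.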

Meanwhile, intercontinental financial flows are largely dominated by foreign currencies because Africa’s aspirations for a single currency are often undermined by national currency variations in stability, convertibility, and control. While it is possible to address these issues by pegging unstable national currencies to more stable international currencies, the solution is fraught with structural deficits, as evidenced by the West African CFA franc (Eco), which is pegged to the Euro.

Digital currencies are thus arguably positioned as more appealing and accessible alternatives to the status quo. They attract comparatively lower transaction fees and carry less of the bureaucratic burden prevalent in existing financial systems, even those between neighbouring countries. Further, unlike mobile money and traditional currency, which are prone to interference by authorities, most digital currencies such as Bitcoin are resistant to external suppression because they are not controlled by central banking authorities. For example, during the #EndSARS campaign against police brutality in Nigeria, authorities ordered banks and financial institutions to block donations to the Feminist Coalition, one of the organisations charged with coordinating the protests. The Coalition turned to Bitcoin and other cryptocurrencies to circumvent the blockade. Meanwhile in Kenya, despite calls against virtual currencies by the Central Bank of Kenya, community-based initiatives for local cryptocurrencies have emerged and been enthusiastically welcomed by domestic users.

Tangible Obstacles to Digital Advancements

In Francophone West Africa, activists are calling for stable, regional currencies independent of European financial institutions that impose economic reliance on the West. Some have speculated that the creation of a regional cryptocurrency based on blockchain would finally emancipate their economic systems from unwanted foreign manipulation. Indeed, the establishment of a legally recognised digital currency in Senegal – the eCFA – demonstrates that the feasibility and a framework for digital currency exist. However, this potential is faced with constraints across the region, such as internet disruptions as well as gaps in cybercrime, data protection and privacy legislation. Nonetheless, the mobilisation of young enterprises around technological innovation, in combination with civil society and government-led innovation in digital economic expansion, holds some promise that blockchain utilisation can contribute to Africa’s socio-economic development on a country or regional needs basis.

Central banks could either support or develop the blockchain and technology infrastructure upon which third parties could participate, or sidestep the burden of technology infrastructure development and maintenance by designing licensing regimes that allow appropriate third parties to issue digital currencies on behalf of their countries. However, to achieve this, countries must have adequate financial and technology policy, including legislation that incentivises cryptocurrency development, ensures cybersecurity and protects user data and privacy. Furthermore, universal access to the internet and digital services, quality of service provision and infrastructure investments would go a long way in promoting adoption of digital financial technology.

Mauritius’ Social Media Regulation Proposal Centres State-Led Censorship

By Daniel Mwesigwa |

In Sub-Saharan Africa, Mauritius leads in many aspects. It is the only country on the continent categorised as a “full democracy” by the Economist Intelligence Unit Democracy Index for 2020. Additionally, it has the second highest per capita income (USD 11,099) and one of the highest internet penetration rates in the region (72.2%).

However, the recently published consultation paper on proposed amendments to the country’s Information and Communications Technology (ICT) law, purportedly aimed at curbing abuse and misuse of social media, could place Mauritius among the ranks of regressive states. The proposed establishment of a National Digital Ethics Committee (NDEC) to determine what content is problematic in addition to a Technical Enforcement Unit to oversee the technical enforcement of NDEC’s measures has potential surveillance and censorship implications.

The social media regulation proposals by Mauritius are made in light of increasing calls for accountability of technology platforms such as Google and Facebook by western countries. Indeed, the consultation paper cites Germany’s Network Enforcement Act (colloquially known as the Facebook Act), which requires social media platforms to remove “illegal content” from their platforms within 24 hours of notice by users and complaint bodies. Non-compliance penalties are steep, with fines ranging from five million to 50 million euros.

The paper states that, unlike in Germany and other countries like France, the United Kingdom, and Australia, complaints by Mauritian local authorities to social media platforms “remain unattended to or not addressed in a timely manner”. Moreover, it adds, cooperation under the auspices of domestic laws and regulations is only effective in countries where technology companies have local offices, which is not the case in Mauritius. As such, according to the Authority, “the only practical solution in the local context would be the implementation of a regulatory and operational framework which not only provides for a legal solution to the problem of harmful and illegal online content but also provides for the necessary technical enforcement measures required to handle this issue effectively in a fair, expeditious, autonomous and independent manner.”

However, the Authority’s claims of powerlessness appear unfounded. According to Facebook’s Transparency report, Mauritius made two requests for preservation of five user accounts pending receipt of formal legal processes in 2017. In 2019, Mauritius made one request to Facebook for preservation of two accounts. Similarly, the country has barely made any requests for content take down to Google, with only a total of 13 since 2009. The country has never made a user information or content takedown request to Twitter. In comparison, South Africa made two requests to Facebook for preservation of 14 user accounts in 2017 and 16 requests for preservation of 68 user accounts in 2019. To Google, South Africa has made a total of 33 requests for 130 items for removal since 2009 while to Twitter, it has made six legal demands between 2012 and 2020.

Broad and Ambiguous Definitions

According to section 18(m) of Mauritius’ Information and Communication Technologies Act (2001, amended multiple times including in 2020), the ICT Authority shall “take steps to regulate or curtail the harmful and illegal content on the Internet and other information and communication services”.

Although the consultation paper states that the Authority has previously fulfilled this mandate in the fight against child pornography, it concedes that it has not fulfilled the part of curtailing illegal content as it is not currently vested with investigative powers under the Act. The consultation paper thus proposes to operationalise section 18(m) through an operational framework that empowers the Authority “to carry out investigations without the need to rely on the request for technical data from social media administrators.”

The amendments to the ICT Act would establish a two-pronged operational framework through the setting up of: i) a National Digital Ethics Committee (NDEC) as the decision-making body on illegal and harmful content; and ii) a Technical Enforcement Unit to enforce the technical measures as directed by the NDEC.

However, neither the existing Act nor the consultation paper define what constitutes “illegal content”. Whereas the consultation paper indicates that the Chairperson and members of NDEC would be “independent, and persons of high calibre and good repute” in order to ensure transparency and public confidence in its functions, the selection criteria and appointing Authority are not specified, nor are recourse mechanisms for fair hearing and appeals against the decisions of the proposed entity.

An Authoritarian Approach to Internet Architecture

Through a technical toolset (a proxy server) proposed under section 11, the regulator would be able to identify social media traffic, which would then be automatically decrypted, archived, and analysed. In effect, the technical toolset would undermine HTTPS in order to inspect internet traffic. This means that information on all social media users – device specifics, content type, and location, among others – would be available to the authorities. The regulator expects that once a complaint regarding social media is received, it would be able to block the implicated web page or profile without necessarily needing the intervention of social media platforms.

Additionally, the Authority expects social media users to accept installation of a one-time digital certificate on their internet-enabled devices to facilitate the re-encryption of traffic before it is transferred to the social networking sites. In other words, the Authority wants internet users in Mauritius to replace their own padlocks used for their home security with ones given to them by the Authority, which it has open and unfettered access to.
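The padlock analogy can be made concrete: TLS clients identify a server's "padlock" by the fingerprint of its certificate, and a proxy that re-encrypts traffic must present its own certificate, whose fingerprint necessarily differs from the genuine one. The sketch below uses placeholder byte strings standing in for real DER-encoded certificates; it illustrates the principle behind certificate pinning, not any specific implementation.

```python
import hashlib

def cert_fingerprint(cert_der: bytes) -> str:
    # SHA-256 fingerprint of a certificate, as used in pinning.
    return hashlib.sha256(cert_der).hexdigest()

# Placeholder byte strings standing in for DER-encoded certificates.
genuine_site_cert = b"-- the site's real certificate --"
proxy_issued_cert = b"-- certificate minted by the interception proxy --"

# The fingerprint a pinning client expects to see.
pinned = cert_fingerprint(genuine_site_cert)

# Once traffic is re-encrypted by the proxy, the client is shown
# the proxy's certificate instead of the site's own.
observed = cert_fingerprint(proxy_issued_cert)

assert observed != pinned  # pinning detects the substitution
```

Installing the Authority's root certificate would silence the browser's usual warning, but it would not remove this mismatch, which is why applications that pin certificates typically break under such interception.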

On the other hand, Mauritius’ commitments to freedom of expression, data protection and privacy potentially collide with these social media regulation proposals. In particular, Mauritius’ Data Protection Act (2017) requires informed consent of users, prohibits disproportionate collection of user data, and mandates fair and lawful processing of user data. The Data Protection Act was enacted to align with the European Union’s General Data Protection Regulation (GDPR). In March 2018, Mauritius also ratified the African Union Convention on Cybersecurity and Personal Data Protection, although the Convention is yet to be enforced due to lack of quorum. Moreover, in September 2020, Mauritius signed and ratified the Council of Europe’s Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data.

Indeed, the Authority is aware of the potential infractions of the proposed technical measures on basic freedoms — stating in the paper that “the proposed statutory framework will undoubtedly interfere with the Mauritian people’s fundamental rights and liberties in particular their rights to privacy and confidentiality and freedom of expression”. Its seeking views and suggestions of “an alternative technical toolset of a less intrusive nature” may very well be an open solicitation for more surreptitious ways of monitoring social media data, with fundamental rights still at stake.

Democracy and Local Investment

While Mauritius runs a multiparty system of government, its human rights record has been steadily deteriorating, according to the United States Department of State’s Human Rights Report 2020. Moreover, basic freedoms such as freedom of expression are being curtailed through digital taxation and clampdown on social media dissent. Recently, Twitter cited stability and democracy as the key reasons for the opening of its first Africa offices in Ghana. Although Mauritius is strategically placed as a regional and economic hub in Africa, and has been positioning itself as a “Cyber Island”, legal frameworks such as the proposed ICT law amendments and mixed rankings on democracy alongside high rankings on internet access and ease of doing business may likely undermine the country’s international competitiveness and internet freedom standing.

Accordingly, the Authority would do well to immediately discontinue these plans to employ technical measures to monitor social media and internet traffic as they would amount to multiple breaches of fundamental freedoms. The proposals also run counter to the Data Protection Act which prioritises minimisation of data collected and informed user consent. Moreover, the technical proposal would promote self-censorship and undermine the basic workings of the institutions of democracy.

Further, although social media regulation could be paved by good intentions such as the need to stamp out inflammatory content, it could be more beneficial to explore alternative options with a range of stakeholders to promote more fair and transparent content moderation practices in line with international human rights law. Mauritius has already proved that aligning domestic and international laws and practices is necessary by fashioning its data protection law along the lines of the GDPR. Additionally, Mauritius could leverage existing partnerships with other countries of regional economic blocs such as The Common Market for Eastern and Southern Africa (COMESA) to form a coalition of fact-checkers that have direct access to social media platforms.

Finally, the Authority could collaborate with technology platforms such as Facebook to support Creole-language human moderators. This could be a necessary step towards enhancing content moderation alongside automated decisions, especially for “low-resource” languages such as Mauritian Creole.

Africa ICS Cybersecurity Conference 2020

Technology and cyberspace are key enablers of Africa’s Agenda 2063 and Kenya’s Vision 2030, coupled with the current Big Four agenda on manufacturing, food security and health, which aims to use technology and innovation to transform Kenya into an industrialised and secure middle-income country.
Click here for more details on the event.