Internet blocking has emerged as an extreme but recurring practice of controlling online communication. Both autocratic and democratic governments in Africa are increasingly resorting to suspensions in response to concerns that online hate speech may spread election-related disinformation or incite violence. Partial or nationwide network disruptions have also occurred at times when no threat seemed imminent, including during peaceful demonstrations and national exams.
Shutting down the internet seems a disproportionate and abusive measure, especially from the perspective of citizens and end users disenfranchised by a power that is at once arrogant and unreliable or incompetent. When leaders long past their time in office, such as Paul Biya of Cameroon or Yoweri Museveni of Uganda, declare their need and right to use coercive measures to ensure peaceful elections or ward off the threat of outside interference, we see aging, despotic men clinging to power. But are all their claims illegitimate, mere cover for maintaining control? What if these and similar arguments came not from them, but from more reputable sources?
What if a respected leader like Thomas Sankara had argued that such a response was necessary? Sankara was a revolutionary and pan-Africanist who led Burkina Faso from 1983 until his assassination in 1987. The Nigerian literary critic Abiola Irele wrote that Sankara was “a leader with the genuine interest of the people” who was “truly leading a revolution” in the real meaning of the word. His stature and integrity were recognized not only by his admirers but also by his opponents, who saw how his leadership style and commitment to socialism inspired others across the continent. A U.S. Embassy telegram praised Burkina Faso’s “example of simplicity, frugality, and integrity,” noting that “the lack of corruption in the government” won high praise.
Examining internet shutdowns through Sankara’s life and thought illuminates an often overlooked aspect of these communication blocks: how these measures respond to the enormous power of for-profit social media companies to intervene in national politics in unprecedented ways while evading responsibility for the consequences.
This imbalance has become apparent through leaks and whistleblower disclosures, which add to a growing body of evidence of Big Tech’s negligence and bias. Facebook whistleblower Frances Haugen called her former employer’s strategy and behavior hypocritical: expanding into new markets under the slogans of “building community” and “bringing the world closer together,” while in practice shying away from responsibility and action when interactions between their platforms and local politics sowed and fueled division and antagonism. Sankara would call this a manifestation of imperialism, a term largely out of fashion but whose core logic accurately describes the behavior of social media companies: acting to benefit the center of power, regardless of the consequences in the peripheries.
The profitability model of social media companies is based on attracting and retaining users’ attention, even if that means amplifying vitriolic and polarizing content. Aware of this dynamic, but struggling to respond to waves of scandal and criticism, companies have invested in systems to weed out hate speech and misinformation. Yet these efforts reflect deep inequalities and have been driven largely by financial incentives and disincentives.
The vast majority of content moderation activity focuses on rich markets such as the United States and the European Union, which are able to force companies to act. There are a few exceptions, such as geopolitical events that are U.S. foreign policy priorities (such as Russia’s intervention in Ukraine) or crises that galvanize global public opinion, such as the genocide against the Rohingya in Myanmar. But in 2020, 87 percent of the resources spent training disinformation-detection algorithms went to English-language content, even though only 9 percent of users are English speakers. For low-resource languages, including many African languages, the investment of resources and time can be measured in fractions of a percent. As a result, as Haugen points out, the most fragile countries use the least safe version of the platform: one with little or no content moderation.
These double standards toward core and peripheral markets are also evident in the way Big Tech companies engage with actors they perceive as powerful and capable. While Facebook has been forced to comply with Germany’s demanding and expensive requirements to remove content that violates national laws, it has largely rejected demands from African leaders and lawmakers. This reflects another imbalance: politicians and legislators in the Global North and Global South have vastly different abilities to understand how Big Tech firms operate, and vastly different expertise and resources with which to engage and challenge them. Many European countries have specialized government agencies and experienced lawyers ready to confront companies that moderate online content. In September, the European Union opened an office in Silicon Valley intended to expand EU regulators’ ability to engage American social media companies, an advantage few African countries can afford.
An example of African countries struggling to deal with rules set and enforced in California occurred just before Uganda’s elections in early 2021. The Uganda Communications Commission ordered Google to remove seventeen YouTube accounts it accused of inciting violence, undermining national security, and causing economic havoc. Google rejected the request, citing the lack of a court order. Human rights lawyer Nicholas Opiyo argued that the Ugandan government’s approach to Google reveals a lack of understanding of how major social media companies operate and how content is evaluated. He noted that the government cannot simply point to some regulation and claim that the company violated it. “Digital companies operate under legitimate court orders,” he told the Observer. “In other words, there must be due process to determine the violation of the law. No digital company would take such a letter seriously. It will be thrown into the trash immediately.”
Meanwhile, in the run-up to the election, Facebook took down a number of pro-government pages for “coordinated inauthentic behavior,” despite allegations that the opposition was using similar tactics. The takedown followed a recommendation by the Digital Forensic Research Lab, a non-governmental organization that had examined the opposition’s claims and concerns. The government called Facebook’s action a biased and unequal application of the rules, arguing that the company was taking sides against it. As Museveni put it, “We cannot tolerate the arrogance of those who come here to decide who is good and who is bad.” During the election period, the internet was shut down and Facebook was banned for more than six months.
These arguments do not attempt to justify or condone internet shutdowns. But treating shutdowns as moves in a dispute rather than simply as abuses by despotic leaders can open up alternative ways of responding to them. Here we see the possibilities offered by a leader like Sankara. Many African leaders have, in the words of Cameroonian historian Achille Mbembe, adopted and fetishized the concept of the nation-state inherited from colonial powers, borrowing terms such as “national interest,” “risks,” “threats,” or “national security,” terms that, he argues, “refer to a philosophy of action and a philosophy of space based entirely on the existence of an enemy in a world of hostility.” But this need not be the case. Rather, Mbembe suggests, African nations should reclaim “our own long-standing traditions of flexible, networked sovereignty.” Mbembe’s conclusions fit well with Sankara’s precepts.
It is precisely within this mentality of an adversarial world that leaders advocate internet shutdowns as legitimate and proportionate responses; reliance on networked sovereignty, by contrast, may render shutdowns unnecessary. Networked sovereignty has its roots in pre-colonial Africa, when long-distance trade was one of the driving forces of cultural and political exchange. It is also strikingly similar to the founding ideas of the internet. In that era, Mbembe notes, networks mattered more than borders, and what counted most was the extent to which flows intersected with other flows.
When decolonization took root, the newly independent African states had to monopolize state functions almost immediately after colonial authorities transferred power to local ruling elites. This led leaders to use the media, including print, radio, and television, as tools of state- and nation-building, creating a type of power that had been unattainable during previous revolutions. In the postcolonial era, media control combined genuine projects of community-building, such as large-scale language and literacy campaigns, with self-serving tactics to maintain the power of the few.
Until recently, African governments seemed able, through coercion, co-optation, or negotiation, to ensure that media outlets adhered to certain national standards (with the exception of some international broadcasters). Social media platforms, however, which are extremely popular and have become powerful tools of activism and contestation, remain beyond the reach of national authorities, undermining this control mechanism.
Sankara’s pan-Africanism and Mbembe’s vision of Africa’s networked sovereignty may provide a stronger and more lasting response to this loss of control and deep inequality. Facebook and Google are betting on the continent’s exponential growth in data use and production by funding two of the largest undersea cables off the coast of Africa. Greater coordination and solidarity among African leaders and collectives of users, companies, and entrepreneurs could therefore force powerful technological actors to sit at the same negotiating table. If regional cooperation bodies or the African Union could come up with joint guidelines for combating violent online speech, they could not only gain leverage over the tech giants but also push back against members who claim that shutdowns are the only means available to stop violent or destabilizing speech.
Carnegie’s Digital Democracy Network is a global group of leading researchers and experts exploring the relationship between technology, politics, democracy and civil society. The network is dedicated to generating original analysis and enabling cross-regional information sharing to fill critical research and policy gaps.