Algocracy

This article is based on the book Algocratie, written by Arthur Grimonpont, appointed rapporteur of the Paris Charter on AI and Journalism. He also supports Reporters Without Borders (RSF) in its advocacy for the regulation of artificial intelligence.

The Extractivism of Attention

In a world where information is ubiquitous, our attention has become a rare and precious resource. As the political scientist Herbert Simon observed as early as 1971, “What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention.” This prediction rings true in the era of social networks and the algorithms that govern our access to information.

Extractivism, the practice of seizing an unexploited resource to turn it into a commodity, now applies to the virtual world with the same voracity as it does to the real world. Our attention has become the raw material on which the advertising empires of the planet’s most powerful companies are built: Alphabet (Google and YouTube) and Meta (Facebook and Instagram). These digital giants derive most of their revenue from advertising, fiercely competing to capture our available brain time.

The Attention Economy

The attention economy is not a new phenomenon; it predates social networks and is the business model of many private newspapers, radio stations, and television channels. However, social networks have an unparalleled ability to know, target, and influence our behaviours to capture our attention. Their main asset lies in their recommendation algorithms, artificial intelligence systems that use our digital footprints to predict which content is most likely to hold our attention at any given moment.
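
To make the mechanism more concrete, here is a minimal sketch of engagement-based ranking. The feature names, weights, and data are invented for illustration; real platforms use learned models over thousands of behavioural signals, but the logic of optimising for predicted attention is the same. Note that nothing in such an objective rewards accuracy or public interest; only predicted attention counts.

```python
# Hypothetical sketch of engagement-based ranking (not any platform's real code).
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    topic: str
    sensationalism: float  # invented feature: 0.0 (sober) to 1.0 (clickbait)

def predicted_engagement(post: Post, topic_affinity: dict[str, float]) -> float:
    """Crude stand-in for a learned model: score a post by the user's past
    interest in its topic, plus a boost for attention-grabbing content."""
    return 0.7 * topic_affinity.get(post.topic, 0.0) + 0.3 * post.sensationalism

def recommend(candidates: list[Post], topic_affinity: dict[str, float], k: int = 3) -> list[Post]:
    """Return the k posts predicted to hold the user's attention longest."""
    return sorted(candidates, key=lambda p: predicted_engagement(p, topic_affinity), reverse=True)[:k]

if __name__ == "__main__":
    history = {"politics": 0.9, "gardening": 0.1}  # share of past watch time per topic
    feed = [Post("a", "politics", 0.8), Post("b", "gardening", 0.1), Post("c", "politics", 0.2)]
    for post in recommend(feed, history, k=2):
        print(post.post_id, round(predicted_engagement(post, history), 2))
```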

These algorithms have enabled social networks to become the primary use of the Web in terms of time spent and the number of users. On average, we spend two and a half hours a day scrolling through platforms like Facebook, YouTube, TikTok, Instagram, and Snapchat. On YouTube alone, over a billion hours of videos are watched daily, three-quarters of which are recommended by algorithms.

The Commodification of Social Life

In concrete terms, a handful of algorithms driven by private interests structure access to information for half of humanity. The leaders of these companies present their platforms as “global village squares,” but the reality is more prosaic: they function like authoritarian regimes led by autocrats. To expand their empires, these leaders organise the commodification of our social, political, and cultural lives, with no regard for our mental health, the right to reliable information, or democracy.

In the West, search engines and social networks provide about two-thirds of the audience for online news sites. Maria Ressa, a Filipino journalist and Nobel Peace Prize laureate in 2021, stated: “Our lives are sucked into a database, organized by artificial intelligence, and then sold to the highest bidder. This ultra-profitable micro-targeting is designed to undermine human will.”

The Consequences of the Attention Economy

The attention economy systematically favours lies, sensationalism, and hate, because this type of content holds attention better. On social networks, false information spreads six times faster than the truth, and algorithms divide and radicalise us. They trap us in echo chambers, digital microcosms where we access only information that confirms our biases. Our shared knowledge and beliefs erode, and with them, the social contract.

Hannah Arendt said: “When everyone lies to you constantly, the result is not that you believe those lies but that no one believes anything anymore. […] And with such a people, you can do whatever you want.”

The Information Bubble

With ever more sophisticated algorithms, digital platforms suggest enticing, entertaining, and captivating content, dubbed “cognitive candy.” Unknowingly, we navigate through social networks in a tailor-made information bubble created from previously collected data. Users find themselves trapped in isolated spheres of information and opinions, reinforcing their respective beliefs.

This phenomenon is based on the large-scale exploitation of our biological appetites and psychological biases, such as confirmation bias, risk aversion, self-enhancement bias, and the mere-exposure effect. Social platforms automate the exploitation of these weaknesses at scale, driving the polarisation, radicalisation, and mass disinformation of society.
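
The self-reinforcing dynamic described above can be shown with a toy simulation (topic names and numbers are invented): the system recommends what resembles past behaviour, each impression strengthens the profile, and exposure narrows over time.

```python
# Toy simulation of the filter-bubble feedback loop; all values are invented.
import random

TOPICS = ["politics", "science", "sport", "culture"]

def pick(profile: dict[str, float]) -> str:
    """Recommend a topic with probability proportional to estimated interest."""
    topics, weights = zip(*profile.items())
    return random.choices(topics, weights=weights, k=1)[0]

def simulate(steps: int = 1000) -> dict[str, float]:
    profile = {t: 1.0 for t in TOPICS}   # start perfectly balanced
    for _ in range(steps):
        shown = pick(profile)
        profile[shown] += 1.0            # every impression reinforces the profile
    total = sum(profile.values())
    return {t: round(w / total, 2) for t, w in profile.items()}

if __name__ == "__main__":
    random.seed(0)
    print(simulate())  # the balanced start typically drifts toward one or two dominant topics
```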

What to Do About Information Chaos?

In the face of information chaos, one recommendation has achieved consensus: media and information literacy education. However, shifting responsibility from the multinational corporations that structure the information market onto the individuals who consume it is not a viable solution. Societal problems cannot be solved through individual virtue. Awareness leads to structural change only if it translates into collective action.

Europe has begun to address this issue through the recent adoption of binding laws such as the Digital Services Act and the AI Act. Although these laws are on target, they do not hit hard enough. Nevertheless, the foundations for democratic regulation of platforms are in place.

A notable project in this area is Tournesol, an open-source initiative aimed at creating a democratic and ethical recommendation system. Tournesol allows users to rate and evaluate online content based on criteria such as reliability, relevance, and quality. By aggregating these evaluations, Tournesol offers an alternative to traditional recommendation algorithms, emphasising transparency and collective participation. This project demonstrates how solutions based on collective intelligence can contribute to more reliable and balanced information.
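
As a rough illustration of the principle (not Tournesol's actual algorithm, which relies on pairwise comparisons and more sophisticated, robust aggregation), collective scoring can be as simple as taking, for each item and criterion, a statistic of contributors' ratings that a single voter cannot skew, such as the median. The data and field names below are invented.

```python
# Simplified illustration of collective, criterion-based scoring;
# this is not Tournesol's actual method.
from collections import defaultdict
from statistics import median

# Each rating: (contributor, item, criterion, score in [-1.0, 1.0])
ratings = [
    ("alice", "video1", "reliability", 0.9),
    ("bob",   "video1", "reliability", 0.7),
    ("carol", "video1", "reliability", -1.0),  # a lone outlier barely moves the median
    ("alice", "video1", "relevance",   0.6),
    ("bob",   "video2", "reliability", -0.4),
]

def aggregate(ratings: list[tuple[str, str, str, float]]) -> dict[str, dict[str, float]]:
    """Return {item: {criterion: median score across contributors}}."""
    by_key: dict[tuple[str, str], list[float]] = defaultdict(list)
    for _, item, criterion, score in ratings:
        by_key[(item, criterion)].append(score)
    result: dict[str, dict[str, float]] = defaultdict(dict)
    for (item, criterion), scores in by_key.items():
        result[item][criterion] = median(scores)
    return dict(result)

if __name__ == "__main__":
    print(aggregate(ratings))
    # video1's reliability comes out at 0.7, the median of (0.9, 0.7, -1.0)
```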

Towards a Democracy of Information

The author proposes a “meta-solution” to this “meta-problem”: stringent regulation imposed on social platforms by the European Union and its member states, followed by significant resources allocated to the development, deployment, and operation of a democratic recommendation system based on the common good.

“Is such a scenario plausible?” the author asks. “Today, the war for attention is costing lives, weakening our democracies, and rendering us incapable of responding intelligently to contemporary threats. Ending this war for attention does not require sacrificing freedom or large-scale military operations, but simply political will.”
