Who makes the rules? Power, governance and authority in digital spaces

When we open a social media app, join a forum or upload a picture or video, it can feel like we are stepping into a public square and freely expressing ourselves to the world. But digital spaces are not “public” in the same way as a town hall or a street. They are privately owned environments, shaped by a mix of platform policies, moderation systems, hidden technical choices and, increasingly, laws that aim to hold platforms accountable. Understanding who makes the rules is a core skill for digital citizenship, especially for young people learning to navigate online life critically and to connect digital realities with civic systems.

Platform rules: The law you agree to (even if you never read it)

Most online platforms govern behaviour through Terms of Service and community guidelines, which users often accept without reading in full. These are not democratic laws; they are contractual rules written by companies that all users must abide by. Platforms can update them quickly, interpret them flexibly and enforce them unevenly, often at massive scale and with little more than a notification.

Platforms thus act as both rule-makers and rule-enforcers. A user may be removed, “shadow-banned” (having their visibility significantly reduced without warning), demonetised or have content taken down based on policies that are meant to be clear and strict but are often culturally dependent, difficult to contest and quite broad (such as “harmful content”). This is why many civil society groups call for minimum standards of fairness and transparency in moderation decisions, including clearer explanations and appeals, and why online users should stay aware of what is allowed or forbidden in the spaces they use and understand their rights and responsibilities.

Moderation: Humans, algorithms and scale

Content moderation is often imagined as a person reviewing posts and making sound decisions based on context. In reality, moderation is usually a hybrid system: automated detection flags content, humans review a subset of cases, and policies, together with workflow tools, shape what is prioritised. At scale, popular platforms rely heavily on automation and triage, which creates predictable problems: false positives (legitimate content removed), false negatives (harmful content left up) and inconsistent enforcement across languages and communities.
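To make the triage idea concrete, here is a minimal, illustrative sketch of such a pipeline. The thresholds, the stand-in scoring function and the queue names are invented for teaching purposes; real platforms use far more complex classifiers and workflows, but the basic shape (score, route, review a fraction) is similar.

```python
from dataclasses import dataclass

# Illustrative thresholds -- invented for this sketch, not real platform values.
AUTO_REMOVE_THRESHOLD = 0.95   # very high confidence: remove automatically
HUMAN_REVIEW_THRESHOLD = 0.60  # uncertain cases go to a human review queue

@dataclass
class Post:
    post_id: str
    text: str

def classifier_score(post: Post) -> float:
    """Stand-in for an automated detection model returning P(violation)."""
    # A real system would call a trained model; here we fake a score.
    return 0.0 if "hello" in post.text.lower() else 0.7

def triage(post: Post) -> str:
    """Route a post based on the automated score (triage, not judgment)."""
    score = classifier_score(post)
    if score >= AUTO_REMOVE_THRESHOLD:
        return "auto_remove"         # where false positives tend to happen
    if score >= HUMAN_REVIEW_THRESHOLD:
        return "human_review_queue"  # only a fraction of content gets here
    return "keep"                    # where false negatives slip through

if __name__ == "__main__":
    for p in [Post("1", "hello everyone"), Post("2", "ambiguous slang phrase")]:
        print(p.post_id, triage(p))
```

Even in this toy version, the governance question is visible: whoever sets the thresholds and trains the scoring model decides how the trade-off between over-removal and under-removal is made.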

Biases and prejudices can strongly influence these processes, particularly when algorithms are poorly designed or trained on biased data, and when human moderators allow personal perceptions or cultural assumptions to affect their decisions. As a result, minority groups are more likely to face stricter moderation, repeated sanctions and more frequent bans, which limit their visibility online and undermine fairness and equal representation.

Moderation decisions influence what is visible and promoted, what becomes “normal” and who feels safe enough to participate in certain spaces. And because moderation is partly automated, rule enforcement can become less like a courtroom and more like a risk-management machine, in which groups adopt specific codes and norms solely to avoid bans or reports, limiting free speech and distorting everyday behaviour and language.

Invisible power structures: What you don’t see shapes what you believe

Even when content is not removed, platforms govern attention through design and ranking systems: recommendation feeds, trending lists, “suggested for you” panels and advertising tools. These choices determine whose voices are amplified and which ideas gain traction. In that sense, authority online is not only about removing content; it is also about organising visibility.
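A toy sketch can show how this “organising of visibility” works in practice. The feature names and weights below are entirely invented; the point is that the same three posts end up in a different order depending on which design priorities a platform encodes.

```python
# Illustrative ranking sketch: features and weights are invented to show how
# design choices, not removals, decide what gets amplified.
posts = [
    {"id": "a", "predicted_engagement": 0.9, "recency": 0.2, "ad_value": 0.8},
    {"id": "b", "predicted_engagement": 0.4, "recency": 0.9, "ad_value": 0.1},
    {"id": "c", "predicted_engagement": 0.6, "recency": 0.6, "ad_value": 0.5},
]

# Two hypothetical platform "designs": same content, different priorities.
engagement_first = {"predicted_engagement": 0.7, "recency": 0.1, "ad_value": 0.2}
chronological_leaning = {"predicted_engagement": 0.1, "recency": 0.8, "ad_value": 0.1}

def rank(posts, weights):
    score = lambda p: sum(weights[k] * p[k] for k in weights)
    return [p["id"] for p in sorted(posts, key=score, reverse=True)]

print(rank(posts, engagement_first))       # ['a', 'c', 'b']
print(rank(posts, chronological_leaning))  # ['b', 'c', 'a']
```

Nothing is removed in either case, yet whose post appears first, and therefore who gets heard, changes entirely with the weights.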

This is why governance debates increasingly focus on observability: the ability for researchers, regulators and the public to understand how moderation and ranking work in practice, not just in theory. The EU’s Digital Services Act (DSA), for example, introduces transparency obligations intended to make platform decision-making more inspectable.

Digital governance: When platform rules meet public law

In recent years, governments have moved from “hands-off” approaches toward stronger regulation of privately owned platforms. In the EU, the Digital Services Act (Regulation (EU) 2022/2065) sets graduated obligations for online intermediaries and higher duties for very large online platforms, including risk assessments, transparency reporting and user-facing redress mechanisms.

A particularly concrete example is the EU-run DSA Transparency Database, which collects the “statements of reasons” platforms must provide for certain content moderation decisions, an attempt to push platforms towards clearer justification and accountability.
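As an illustration of what such a statement of reasons contains, here is a simplified record. The field names below are assumptions chosen for readability and do not mirror the official DSA Transparency Database schema; they only convey the kind of information a justification is expected to carry.

```python
from dataclasses import dataclass, asdict
import json

# Simplified, illustrative "statement of reasons". Field names are invented
# for this example and are NOT the official DSA Transparency Database schema.
@dataclass
class StatementOfReasons:
    platform: str
    decision: str            # e.g. "content_removed", "visibility_restricted"
    policy_ground: str       # which rule or legal basis was applied
    facts: str               # short explanation of what triggered the decision
    automated_detection: bool
    automated_decision: bool
    redress_options: str     # how the user can appeal or contest

example = StatementOfReasons(
    platform="ExamplePlatform",
    decision="visibility_restricted",
    policy_ground="community guideline on harmful content",
    facts="post flagged by automated detection and confirmed by a reviewer",
    automated_detection=True,
    automated_decision=False,
    redress_options="internal appeal; out-of-court dispute settlement",
)

print(json.dumps(asdict(example), indent=2))
```

The pedagogical value is the checklist itself: what was decided, on what ground, whether a machine or a human decided, and what the affected user can do next.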

However, governance is not only about laws. It includes a broader ecosystem: standards, watchdog groups, research access, transparency reporting and public pressure. In 2021, for instance, the OECD analysed transparency reporting as a trust-building practice, while also noting persistent gaps and uneven quality.

What this means for digital citizenship and civic education

For learners, the key insight is this: online spaces are governed by layers of authority.

  1. Platform authority (policies + enforcement systems)
  2. Technical authority (algorithms, design, business incentives)
  3. Public authority (laws, regulators, courts)
  4. Social authority (community norms, peer pressure, collective action)

Digital citizenship education becomes stronger when learners can ask: Who benefits from these rules? Who is protected? Who is silenced? What options exist to appeal, report or challenge decisions? To foster this kind of critical thinking, game-based approaches like the DigiCity project resources, such as video games and escape games in which players become part of complex digital and civic situations, are powerful because they let learners experience governance in a safe and controlled way: navigating rule systems, encountering trade-offs and reflecting on fairness, equity and ethical decision-making.

In short, the rules of digital life are made in boardrooms, coded into systems and increasingly shaped by regulation. Becoming a digital citizen means learning not only how to behave online, but also how online power works and how to respond with agency, respect, ethics and critical thinking.

 

References: