Moderating Our (Dis)Content: Renewing the Regulatory Approach

  • Politics, institutions, and democracy


DOWNLOAD THE NOTE [English version]

Download the note [French version]

 

Key Takeaways 

 

  • Public policy tends to consider only a handful of platforms in current efforts to regulate toxic online content. Policy discussions at present also fail to grasp the idiosyncrasies and interconnectedness of content moderation across different platforms. As a result, the policy response addresses only a piece of the problem.

 

  • Content moderation is not just about removal, but about finding the right balance, positioning, and process, together with policy makers, civil society, and end users themselves. Moderation must be examined in a broader sense, beyond simply the volume of content removed.

 

  • Today, co-regulation remains a bilateral process between the major platform operators and governments. If regulation introduces across-the-board responsibilities and obligations calibrated for the world's largest internet companies, these measures will have disproportionate negative impacts on other actors. Regulatory frameworks must be careful not to further reduce the diversity of platforms available to support a wide range of online expression.

 

  • A new regulatory approach is called for, one that accounts for diverse moderation approaches and protects fundamental rights. We need agile indicators that measure how responsive platforms are to the real, evolving moderation challenges they face.

 

  • Nuance, agility, and broad stakeholder and end user participation are necessary to this new regulatory approach. A central question for many is how to look beyond the concept of the user threshold, i.e. the number of in-country users on a platform. This concept is inapt: the figure alone does not capture the moderation challenges a platform faces. Renaissance Numérique advocates for a more process-oriented assessment of platform moderation performance.

 

  • Inherent to user-generated content-hosting platforms is the notion of the co-creation of value. The substantial contribution of end users must be reflected in platform governance of content moderation. A collaborative approach requires genuine discursive processes with end users, not just the outsourcing of moderation labor. 

 

  • Governance structures are needed to facilitate this participation. This kind of user involvement must be part of a broader behavioral shift on online platforms, reframing the end user as a principal agent.

 

  • Public authorities should reinforce the capacities of all stakeholders to allow for functional collaboration and discursive processes. It is the responsibility of public authorities to establish a general framework that facilitates intra- and inter-sectoral collaboration and knowledge sharing, to work with civil society, researchers, and technical experts to identify effective methods, and to share these methods with all actors and across all platforms.

 

  • Future regulation in this space, in particular the European Digital Services Act, must not simply be shaped for and by the most dominant platform operators. Regulation must aim to address content moderation holistically, across all relevant services.
