Publication 10 September 2025
Disinformation, states and serial narratives, with Paul Charon
Before we start, can you tell us what initially drew you to this topic?
My work is rooted in a long-standing fascination with the power of language and storytelling. This literary passion found its practical application in the analysis of contemporary Chinese propaganda. What particularly intrigues me is the unsettling effectiveness of certain Chinese discourses which, despite often appearing clumsy to Western eyes (heavy repetition, rigid formulas, opaque cultural references), sometimes manage to deeply influence public opinion. This dissonance reveals that narrative effectiveness does not rely solely on rhetorical elegance, but on deeper mechanisms that I seek to identify.
My research maps the entire ecosystem of state narratives: who produces them (think tanks, state media, affiliated influencers), according to what logic, through which channels (from traditional media to digital platforms), and, above all, how these narratives are processed by their audiences – whether through adherence, resistance, or creative reappropriation. The stakes go beyond China: it is about understanding how, in the digital age, states continue to shape collective imaginations despite the proliferation of information sources.
Can you give us a definition of disinformation?
Disinformation resists any simple definition. Beyond the classic distinction between disinformation (intentional) and misinformation (unintentional), I propose understanding it as a complex system of misleading or biased information, designed to reconfigure our mental representations and guide our behavior – this is what I explain in my article "Reading Disinformation as a Serial Narrative".
Paul Charon
Director of the Influence and Intelligence Department at IRSEM and doctor in Political Studies at EHESS
The term "narrative", ubiquitous in debates over disinformation, has become a catchall that obscures more than it clarifies. To see things clearly, narratology offers a much more effective analytical framework by distinguishing three levels:
- Story (diegesis): the raw events, what actually happened. Manipulation here can invent facts.
- Narrative: the narrative structure – who is the hero, what causality is suggested, what emotions are evoked, the play on temporality. This is where the ideological transformation of facts occurs.
- Narration: the enunciative instance – who is telling the story, with what authority, what legitimacy. Usurping a credible source or multiplying voices to create an illusion of consensus – or, conversely, of chaos – are classic tactics.
Contemporary disinformation can attack all three levels, creating complex narrative architectures where manipulation becomes undetectable to those lacking the appropriate analytical tools. This is why the term "narrative" fails to capture these subtleties and should be abandoned. Understanding disinformation requires analyzing the narrative mechanics themselves.
In a world where everything seems increasingly a matter of "perspective", how can we define what constitutes disinformation?
The proliferation of contemporary epistemological relativism indeed makes it difficult to qualify disinformation. If any assertion can be relativized as a "point of view", how can we establish discriminating criteria? We can, however, distinguish between legitimate perspective and informational manipulation by mobilizing three criteria: first, the empirical verifiability of the alleged facts; second, the transparency of sources and information production methods; third, the deceptive intent – the deliberate will to deceive rather than inform. Disinformation is not about interpretive pluralism but about deliberate falsification, systematic concealment, or manipulation of meaning. This approach preserves the space for democratic debate while maintaining epistemological safeguards against strategies of informational subversion.
What do you mean by the seriality of disinformation?
Disinformation operates like a serial fiction – not in the sense of sequential episodes, but according to a logic of "architextuality", a concept theorized by Matthieu Letourneux based on the work of Gérard Genette. The architext is that invisible matrix of references, codes, and conventions that gives meaning to each new narrative before it is even read.
Disinformation masterfully exploits this mechanism. Each new conspiracy theory does not need to be complete or coherent – it relies on an already established architext: "hidden elites", "they're lying to us", "troubling coincidences". These recurring motifs form a narrative grammar that receivers know by heart. A simple clue is enough to activate the entire interpretive schema. This is why the most effective disinformation narratives are often fragmented, full of gaps. These narrative voids are not weaknesses but strengths: they invite the public to fill in the blanks with their own projections, fears, and pre-existing beliefs. The receiver becomes the co-author of their own manipulation.
Can we identify archetypes of architextuality?
Three major architextual matrices currently structure the space of disinformation, each functioning as an autonomous narrative universe with its own codes, obligatory figures, and mechanics:
- The conspiracy architext.
- The migration architext.
- The binary geopolitical architext, which divides the world into a struggle between good and evil.
These architexts function exactly like the Star Wars expanded universe : each new film, series, book, or video game enriches the mythology without ever fundamentally contradicting it. Similarly, each new piece of disinformation adds to the global narrative edifice while drawing from the established repertoire. A tweet, a TikTok video, a blog post – all become fragments of a larger saga that everyone intuitively recognizes. This transmedia logic explains the extraordinary resilience of these architexts. They evolve, integrate new elements, adapt to contexts, but their deep structure remains.
How is the study of disinformation conducted, and what are its main challenges?
Research on disinformation suffers from a fundamental imbalance: we excel at mapping its circulation but remain blind to its real effects. Most studies meticulously trace the flows – which networks, which sites, which vectors of propagation – like epidemiologists tracking a virus. This quantitative approach, appealing in its apparent rigor, masks our profound ignorance of the mechanisms of cognitive influence.
France has developed recognized expertise on state actors – deciphering Russian, Chinese, and more recently American strategies. But two critical blind spots persist: the fine semiotic analysis of content (beyond simple fact-checking) and, above all, the real reception by audiences. The latter faces structural obstacles: the prohibitive cost of longitudinal studies, the need for massive samples, and the methodological difficulty of distinguishing between exposure, comprehension, and adherence.
Laurent Cordonnier's study illustrates this paradox. Among the 4,000 people surveyed, major Russian or Chinese geopolitical narratives show low penetration in French society. While these results are reassuring, we must push our investigation further, because the vocabulary of these narratives insidiously infiltrates our discursive space. The term "Global South", for example, a rhetorical construction by Russia and China to redraw the world order, is now naturalized in French debate, used without awareness of its ideological charge. This is the real blind spot: we poorly measure this gradual lexical contamination. Words are never neutral – they carry conceptual frameworks, presuppositions, and worldviews.
This dissociation between the conscious rejection of narratives and the unconscious absorption of vocabulary reveals the sophistication of contemporary influence strategies. The narrative battle is not necessarily won by massive adherence but by the gradual infiltration of categories of thought.
Why is Chinese disinformation less known in France than Russian disinformation?
France is not a priority target for China, which focuses its efforts on areas of immediate strategic interest – Taiwan first and foremost, where the intensity of influence operations is incomparable to what we observe in Europe. This geographical hierarchy explains a less aggressive Chinese presence than Russia’s on our territory.
More fundamentally, the two powers partly diverge in their tactical approaches. China retains a pronounced taste for traditional influence operations in the physical space – economic networks, institutional partnerships, influence entrepreneurs linked to the United Front – whereas Russia may have shifted more massively to the digital realm. This difference in operational terrain contributes to making Chinese influence less visible, and thus less identified as an immediate threat in the French public debate.
In the context of raising public awareness, should liberal democracies also adopt narratives? And do these narratives necessarily involve fiction?
Liberal democracies face a fundamental narrative asymmetry. While authoritarian regimes deploy coherent and seductive narratives about Western decline, civilizational multipolarity, or the relativity of democratic values, our responses remain defensive and fragmented: one-off fact-checking, media education, support for independent journalists. These tools, while essential, fight symptoms without addressing the root cause.
The battle is fought at the architextual level. Russian and Chinese narratives do not merely contest facts – they propose alternative cosmogonies where Western universalism becomes cultural imperialism, democracy becomes chaos, and authoritarianism becomes stability.
Could you say a few words about what is called "cognitive warfare"?
Cognitive warfare refers to a form of conflict that directly targets the thought and decision-making processes of individuals and populations. It attacks perceptions, beliefs, and cognitive mechanisms themselves. This form of confrontation exploits several fundamental human vulnerabilities, well identified by advances in cognitive science. What makes cognitive warfare particularly insidious is that it often operates below the threshold of consciousness. The targets do not realize they are under attack. It exploits our natural neurological mechanisms – the way our brain processes information, forms memories, and makes rapid decisions.
That said, when we talk about "cognitive warfare", we are dealing with a very fuzzy field. This concept, while widely used, remains difficult to pin down. All states are interested in it in one way or another. To be interested in cognitive warfare is ultimately to be interested in the receiver, in how information is perceived, remembered, and interpreted.
What are the effects of AI on disinformation?
Beyond the content manipulated directly by AI – such as the fake images proliferating on social media – this technology poses other informational problems. Thousands of fake news sites generate manipulated content that ends up being absorbed into the training corpora of AI systems. These systems, unable to distinguish reliable sources from fraudulent sites, ingest this false information and then regurgitate it in the form of apparently credible responses. This circularity produces an exponential amplification effect.
Do short video formats have an impact on narratives?
Short video formats radically transform contemporary narrative architecture. We are witnessing an atomization of narrative: stories no longer unfold in a classic linear fashion but explode into micro-sequences of 15 to 60 seconds, each of which must instantly capture attention while fitting into a larger framework. This fragmentation operates in several ways. First, temporally, with narratives built by accumulating capsules released over days or weeks. Then, in a transmedia way, with the same story dispersed across TikTok, Instagram, and YouTube, each platform adding its own narrative layer. This dissemination creates a dotted narrative experience, in which the receiver must reconstruct the overall coherence themselves.
A narrative no longer exists solely through its initial content but through its variations and reappropriations. Disintermediation allows each user to become a node of propagation and narrative transformation. A disinformation narrative can thus mutate through thousands of iterations, making its original source untraceable. Faced with this new narrative ecology, the development of artificial intelligences capable of tracing these dispersed narrative threads becomes crucial. It is not only about detecting isolated fragments but, above all, about reconstructing the recurrences and progressive mutations that constitute the true narratives of influence in the contemporary digital space.