Can we develop our intuition to counter misinformation?
- Disinformation, the intentional production and distribution of fake news with the aim of causing harm, raises the question of trust in sources.
- These practices create information chaos, threaten democratic life and reduce people's critical faculties in favour of dichotomous thinking.
- Combating disinformation through legal regulation raises the question of the balance between freedom of expression and censorship.
- To understand why and how disinformation spreads, we need to study the concept of epistemic beliefs.
- To avoid falling into the trap, it is important to fight against one's intuition, to trust evidence more than one's own opinion and to look beyond one's socio-political ideologies.
Propaganda is a global strategy put in place by a state, an institution or a community to destabilise a target. Misinformation is the unintentional sharing of fake news or of erroneous or outdated information, through error, lack of vigilance or ignorance of the subject: there is no actual intention to cause harm. Disinformation, on the other hand, is a propaganda tool that works by deliberately creating and sharing false information with the intention of causing harm. This article focuses on disinformation because, beyond the question of whether information is true, this concept raises the issue of veracity, and therefore of trust in information sources. We will argue that combating misinformation means asking three questions about knowledge: what to trust, how to trust and who to trust.
The breeding ground for disinformation: our society’s present-day vulnerabilities
The rise of disinformation on social networks is generating informational chaos that threatens democratic life: saturation by automated advertising fuelled by harvested data, promotion of shocking and conspiracy-oriented content, discrediting of authority figures, and algorithmic logics that lock users into filter bubbles. “For example, 120,000 years’ worth of videos are viewed on YouTube every day. Of these, 70% are watched because the platform’s artificial intelligence recommends them”1. Social networks have also come to be seen as one of the most reliable ways to consult the news2. The misinformation of young people in particular sends out worrying signals: one in four young French people subscribes to creationist theories, 16% think the Earth could well be flat, 20% that Americans never went to the Moon, and 49% that astrology is a science. A large proportion of them believe that an influencer’s popularity is a guarantee of reliability (representative sample aged 18 to 24)3. Trust in science is strong and stable in all European countries except France, where it has fallen by 20 percentage points in 18 months4. This drop in confidence in science is correlated with support for fake news and conspiracy theories5. At the same time, “illectronism” (digital illiteracy, a French portmanteau of “illiteracy” and “electronics”) is creating a new area of exclusion: 14 million French people struggle with digital tools at the very moment when public services are moving wholesale online6.
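To make the “filter bubble” mechanism concrete, here is a deliberately naive sketch in Python. It is not any platform’s actual algorithm; the categories, the explore rate and the recommender rule are all illustrative assumptions. It shows how a recommender that mostly serves whatever a user has already clicked can amplify a single early preference into a near-monoculture of content.

```python
import random
from collections import Counter

CATEGORIES = ["politics", "science", "sports", "conspiracy", "culture"]

def recommend(history, explore_rate=0.05):
    """Toy engagement-maximising rule: almost always serve the
    category the user has clicked most; only rarely explore."""
    if history and random.random() > explore_rate:
        return Counter(history).most_common(1)[0][0]
    return random.choice(CATEGORIES)

# A simulated user whose first click happens to be a conspiracy video.
history = ["conspiracy"]
for _ in range(200):
    history.append(recommend(history))

# The early preference snowballs: the simulated feed is now almost
# exclusively made of the same category.
print(Counter(history))
```

In this toy model it is the feedback loop itself, not any malicious intent, that narrows exposure.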
These vulnerabilities, combined with powerful forces of influence, have damaging effects on our democracies: reduced critical thinking and greater credulity among citizens, an inability to resist the seduction of dubious ideas, selective exposure to information and the prevalence of confirmation bias, dichotomous thinking and a weakened capacity for argument7. Admittedly, these flaws are nothing new (cf. Orson Welles’ 1938 radio hoax “The War of the Worlds”), but the infiltration of supra-national powers, the power of technological tools and the availability of our slumbering brains turn them into a critical risk.
The levers for combating disinformation and misinformation are therefore a priority for our democracies. They fall into two distinct categories: limiting the production and dissemination of fake news, and limiting its impact.
Can we limit the production of misinformation? Regulation and moderation
350,000 messages are posted on X (formerly Twitter) every minute, for 250 million active users. With an estimated 2,000 moderators, that is roughly one moderator for every 125,000 users8. The same disproportion is observed on other social networks. These figures call into question the very possibility of moderating information, a task increasingly delegated to algorithms: black boxes whose transparency is regularly questioned9. Elon Musk, via his company X, filed a suit against California on 8 September 2023, accusing the American state of hindering freedom of expression by forcing platforms to be transparent about content moderation.
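A back-of-the-envelope check, using only the figures quoted above (the numbers are the article’s, not fresh estimates), makes the scale problem explicit:

```python
users = 250_000_000          # active users quoted above
moderators = 2_000           # estimated number of moderators
messages_per_minute = 350_000

print(users / moderators)                # 125000.0 users per moderator
print(messages_per_minute / moderators)  # 175.0 messages per minute per moderator
```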
Legal regulation (ARCOM, DSA) is now being debated, and political institutions are taking up the issue, but the balance between freedom of expression and censorship has yet to be struck. In France, the Autorité de Régulation de la Communication Audiovisuelle et Numérique (ARCOM) acts effectively but remains limited in resources: its 355 employees cover a very wide range of issues (protection of audiences, media education, respect for copyright, information ethics, supervision of online platforms, developments in radio and digital audio, VOD distribution). With the Digital Services Act (DSA), Europe is putting in place a system of accountability for the major platforms from 2024, based on a simple principle: what is illegal offline is illegal online. The aim is to protect Internet users by a number of practical means: making the workings of recommendation algorithms accessible to users (along with the possibility of deactivating them), justifying moderation decisions, setting up an explicit mechanism for reporting content, and allowing appeals. Certain types of targeted advertising will be banned. Penalties for non-compliant platforms are set to match the stated ambitions: up to 6% of global turnover.
The fact remains, however, that given the vulnerabilities described above, the rapid growth in the volume of information exchanged and the difficulty of regulating and moderating platforms, a complementary approach is needed: not just limiting misinformation, but reducing its impact on its targets by strengthening their capacity to resist. But how do we know whether a piece of information is true?
How do we know we know something: epistemic beliefs
Epistemic beliefs relate to the ideas we have about knowledge and the processes by which knowledge is created: what makes us think we know things? What factors contribute to a misperception of knowledge? These questions are central to understanding the spread and impact of misinformation, as well as ways of countering it.
Kelly Garrett and Brian Weeks, of Ohio State University and the University of Michigan, carried out a vast study in the United States in 2017 with the aim of better understanding some of the determining factors in adherence to misinformation and conspiracy theories. First, they measured participants’ opinions on subjects that are controversial in certain conspiracy circles: the claims that the Apollo missions never went to the Moon, that AIDS was intentionally created to harm the homosexual community, that the 9/11 attacks were authorised by the US administration to justify political decisions (military invasion and the curtailing of civil rights), or that JFK, Martin Luther King and Princess Diana were assassinated on the orders of institutions (governments or secret agencies). They also measured participants’ opinions on highly sensitive contemporary social issues where a counter-discourse challenges the current scientific consensus: the role of human activity in global warming, or the claim that certain vaccines cause disorders such as autism.
These data were correlated with other measures of the same participants’ epistemic beliefs (a purely illustrative sketch of this kind of correlational analysis appears after the list below). The results are unequivocal: participants are more likely to subscribe to conspiracy theories, and more suspicious of scientific discourse, the more:
- they trust their intuitions to “feel” the truth of things,
- they believe that facts are not sufficient to call into question what they believe to be true,
- they consider that all truth is relative to a political context.
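As a purely illustrative sketch of the kind of analysis behind such results (synthetic data and hypothetical variable names, not Garrett and Weeks’s actual code or dataset), one can correlate a “faith in intuition” scale with a conspiracy-endorsement scale and read off the Pearson coefficient:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic survey data: 500 hypothetical respondents, each scored
# on two 1-7 Likert-type scales. The positive relationship is
# built in purely for illustration.
n = 500
faith_in_intuition = rng.uniform(1, 7, n)
conspiracy_endorsement = np.clip(
    1 + 0.7 * faith_in_intuition + rng.normal(0, 1, n), 1, 7
)

# Pearson correlation between the two scales: a positive r means
# more faith in intuition goes with more endorsement of conspiracy claims.
r = np.corrcoef(faith_in_intuition, conspiracy_endorsement)[0, 1]
print(f"r = {r:.2f}")
```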
Since this study, a great deal of research has shown the extent to which these three elements constitute vulnerabilities in the fight against disinformation. In the sections that follow, we explain how each of these epistemic beliefs works, in order to identify the psychosocial skills we need to develop to sharpen our critical thinking.
What to trust: the intuition trap
The first important result of Garrett and Weeks’ study concerns the trust placed in our intuition to understand the world around us, together with the strong idea that certain truths are not accessible rationally. Instinct, first impressions and a diffuse “gut” feeling are said to be excellent guides for our judgements and decisions. This epistemic belief is widely promoted today in mainstream publications and personal development methods: “enter the magic of intuition”, “develop your 6th sense”, “manage with intuition”, “the powers of intuition”. Such titles support the idea that a “little je ne sais quoi” allows us to access hidden truths and understand the world directly by “reconnecting” with ourselves and our environment (the cosmos, pseudo-quantum vibrations, etc.). Inspired by the New Age10, these approaches, which often concern health and well-being, readily advocate a return to common sense and to our supposed ability to know things emotionally, without the need for proof, thanks to a “gift”. Yet science has often developed against common sense and first intuitions: a heavy body does not fall faster than a light one, and hot water can, under certain conditions, freeze faster than cold water (the Mpemba effect)…
Admittedly, scientific research does not dispute the role of intuitive knowledge, and numerous works and publications are devoted to it11, many of them in medicine under the banner of “gut feelings”12. But what this cognitive science research says is very different from what we find in personal development books, primarily because it describes intuition as a form of reasoning embedded in a fairly rational process. Empirical research with professionals who have developed intuitive expertise (company directors, doctors, firefighters, chess players, athletes, soldiers) shows that intuition is most effective in experts with a great deal of prior experience: they have had repeated opportunities to form hypotheses from an analysis of their environment, test them in real situations, receive feedback (success or failure), correct, and test again, until they arrive at the implicit, efficient and rapid know-how we call intuition. There is nothing esoteric or “quantum” about it: practice, discipline and feedback13 are what enable rapid decisions when the context demands them. If 82% of Nobel Prize winners acknowledge that their discoveries owe something to intuition14, it is above all because they have accumulated such a wealth of scientific knowledge and methodological experience that they end up aggregating clusters of clues into an insight: “eureka!”
The first psychosocial skill to develop in the fight against misinformation is therefore to distrust one’s own intuitions by resisting oneself15: “to be a scientist is to fight one’s brain”, as Gaston Bachelard put it. It is not a question of suppressing our intuitions, but of taking the time to question them, audit them and validate their basis, carrying out modest, unindulgent metacognitive work on ourselves: on what past experience is my intuition based? Have I had much feedback on the effects of the actions it has prompted? To what extent am I being influenced by my desires, my emotions or my environment? This is all the more difficult because an impression is above all… impressive: what matters most is not so much its content as the mental process of its construction and its consequences for the way we think and act16.
How to trust: the method
The second important result of Garrett and Weeks’ study relates to the importance we attach to consistency between facts and opinions. Put another way: can we maintain a belief in the face of a demonstration that contradicts it? Some of us need factual evidence to form an opinion, distrust appearances, and care about the method used to produce data. Others less so: the study shows that the latter are much more likely to subscribe to false information and conspiracy theories. We remember the “alternative facts” brandished just after Trump’s inauguration, symptomatic of the post-truth era. Such strategies for distorting reality are only possible because they find an audience which, without necessarily being fooled, does not feel the need for consistency between facts and beliefs. On the contrary, the coherence such an audience seeks tends to adjust the facts in favour of its beliefs, a rationalisation effect well known from work on cognitive dissonance. Hugo Mercier and Dan Sperber17 have recently examined this issue in a book defending the thesis that our reason serves us above all… to be right, not only against others, but also against ourselves! Hence the cognitive biases with a self-justifying function: confirmation bias, anchoring, loss aversion, hindsight bias, etc.18.

It is easy to see why combating this is a dauntingly complex task, yet one that is both necessary and possible if we make the effort to teach the scientific method and its components, and not just to students destined for scientific careers. “Alternative facts” call into question the very notion of truth and of knowledge recognised as valid19, and lead to the sordid conclusion that science is just one opinion among others20. This stance undermines the very foundations of our democratic institutions, which is why knowledge of the scientific method has become a common good and a genuine psychosocial skill in the WHO’s sense: “abilities that enable the development not only of individual well-being, but also of constructive social interactions”.
Who to trust: back to basics
The third result of Garrett and Weeks’ study shows that the more individuals believe that facts depend on the political power in place, or on the socio-political context in which they are produced, the more readily they adhere to disinformation and conspiracy theories. This resolutely relativistic epistemic belief is facilitated by the fact that our beliefs also serve to reinforce our identification with the groups to which we belong: we evaluate the information we are exposed to according to our socio-ideological proximity to its source. The underlying problem is therefore one of veracity rather than truth: what is at stake is the moral quality of the author of a piece of information, and therefore the trust we place in him or her. Francis Wolff21 shows that this relativist stance is now a stumbling block in the fight against risks common to humanity as a whole (global warming, economic crises, resource shortages, species extinction, epidemics, terrorism, etc.), because local demands (identity-based, communitarian, nationalist, xenophobic, religiously radical, etc.) hamper our ability to engage in dialogue and find ways of moving forward collectively. So what psychosocial skills do we need in order to know whom we can trust and to build common projects that transcend communitarian divisions? To answer this question, Philippe Breton22 carried out a number of empirical studies in experimental argumentation workshops. His results suggest that we need to develop what he calls a “democratic ability”, currently in very short supply, on which trust is built and which rests on three skills:
- Speaking in front of others: practising overcoming the fear of addressing an unfamiliar group. Scientific research shows that this fear is one of the most widespread among adults (55%23). It hinders the very possibility of establishing the conditions for cooperation.
- Cognitive empathy: practising defending opinions contrary to one’s own. The aim is to learn to identify the quality of the arguments and thus regulate one’s less solid epistemic beliefs. This strategy is part of the psychological inoculation methods24 designed to strengthen mental immunity.
- Combating “consensual palaver”: soft consensus is a way of avoiding debate that gives the illusion of bringing people together. Practising “frank and peaceful conflictuality”25 is not easy, but it provides the democratic vitality we need.
Conclusion
“Il faut voir comme on se parle. Manifeste pour les arts de la parole” (We need to look at how we talk to each other: a manifesto for the arts of speech) is the title of the latest book by Gérald Garutti, founder of the Centre des Arts de la Parole, a “third place” dedicated to restoring the psychosocial skills needed to build a space for shared dialogue and to combat the misinformation undermining our democracies. Such third places, science and discovery centres, and citizens’ experimentation laboratories share the common goal of developing democratic skills as operational know-how: knowing how to argue and counter-argue, how to listen, how to suspend one’s judgement and elicit that of others. They also help us understand how scientific truth is constructed and how this knowledge can be biased: these are the levers of free will and of living together.