Are cognitive biases compatible with the scientific method?
Cognitive biases are well documented in cognitive science research1. These systematic – and therefore predictable – errors are not only a sign of our limited rationality; they also reveal how our judgments and decisions work. As such, cognitive biases are a set of inherent mental processes through which the mind manages large flows of information, compensates for the limits of memory, preserves cognitive economy, makes quick decisions, reaches meaningful explanations, protects the integrity of the self and reassures us about our decisions.
Studying cognitive biases scientifically means building up a rational body of knowledge to better understand our irrationality. To do this, the scientific method relies, on the one hand, on the description of objectifiable facts, which we identify using quantitative methods; and, on the other, on invariant explanatory models (a.k.a. theories) – which must correspond with known facts and can subsequently be used to predict, test, and compare – through which we seek to understand the causality of phenomena, made possible by an experimental method2.
Since the truth is not always easy to find, science is full of controversies. This is why the scientific method is based on the fundamental principle of “dispute”, i.e. debate of the results obtained, between peers, supported by publicly available evidence. It is therefore collective, subject to criticism and replication, open to nuance, conducted over a long period of time and independent of political influence, so that we may converge towards the truth.
However, it should be said that the ingredients of the scientific method and cognitive biases are sometimes (or even, often) antagonistic. Without claiming to be exhaustive, let’s identify some significant stumbling blocks that may help us to better understand certain contemporary issues around mistrust in science.
Confirm vs. deny
Imagine that I have a rule in mind that I ask you to guess. I inform you that the sequence of numbers “2, 4 and 6” respects this rule. To guess it, you can propose other sequences of three numbers, and I will tell you whether or not they conform to my rule. When this experiment is carried out3, participants logically form a hypothesis about the rule (for example, “a sequence of numbers increasing by two each time”) and test it positively, overwhelmingly proposing confirmatory sequences such as “16, 18, 20” and then “23, 25, 27”.
The purpose of these confirmatory proposals is not to test WHETHER the hypothesis is true, but to show THAT it is true. Only sequences that would invalidate the participants’ hypothesis (e.g. here “3, 6, 9”) make it possible to verify WHETHER it is true. This “hypothesis confirmation bias” explains why we spontaneously and carefully avoid looking for arguments that go against our beliefs: the aversion to losing our certainties outweighs the possibility of gaining new knowledge. As someone once said, “Insanity is doing the same thing over and over again and expecting a different result”.
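The asymmetry between confirming and disconfirming tests can be sketched in a few lines of code. This is a toy simulation: the hidden rule used here, “any strictly increasing sequence”, is the one from Wason’s original study, while `participant_hypothesis` stands for a typical first guess.

```python
def hidden_rule(seq):
    """The experimenter's hidden rule: numbers must be strictly increasing."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def participant_hypothesis(seq):
    """A typical first guess: 'numbers increasing by two each time'."""
    return all(b - a == 2 for a, b in zip(seq, seq[1:]))

# Confirmatory tests: sequences chosen BECAUSE they fit the hypothesis.
# Both rules answer "yes", so the participant learns nothing new.
for seq in [(16, 18, 20), (23, 25, 27)]:
    assert participant_hypothesis(seq) and hidden_rule(seq)

# A disconfirming test: a sequence that VIOLATES the hypothesis.
probe = (3, 6, 9)
print(participant_hypothesis(probe))  # False: breaks the "+2" hypothesis
print(hidden_rule(probe))             # True: yet the experimenter says "yes"
# Only this mismatch is informative: the hypothesis must be too narrow.
```

Only the disconfirming probe produces an answer that can contradict the participant’s belief; the confirmatory sequences can never reveal that the hypothesis is too narrow.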
The scientific method, on the other hand, is counter-intuitive, and teaches us to guard against this bias thanks to the double-blind technique, designed to limit self-persuasion, and to a “disconfirmatory” posture: testing hypotheses by multiplying the experiments likely to refute them. A theory thus “resists” the facts until proven otherwise. Nevertheless, the research process is not entirely free from confirmation bias, because positive results are far more readily published, especially in the so-called “social sciences”. Moreover, reproducibility studies are not always popular, especially when they reveal how many research results in the humanities and social sciences cannot be reproduced (Larivée, S., Sénéchal, C., St-Onge, Z. & Sauvé, M.-R. (2019). « Le biais de confirmation en recherche ». Revue de psychoéducation, 48(1), 245–263).
The power of hypothesis confirmation bias lies in the fact that it concerns not only the present but also… the past! Indeed, we tend to overestimate the probability of an event when we know that it has taken place: after the fact, we often behave as if the future had been easy to predict (“that was bound to happen”), and as if uncertainty or the unknown played no part in events. This “retrospective” confirmation bias4 is all the more salient in tragic situations, and may explain criticism of scientists’ or politicians’ intentions once the human toll of a pandemic, a terrorist attack or an economic crisis is known.
The retrospective bias relies on the extraordinary capacity of the human mind for rationalisation, i.e. the justification of events after the fact. We can never resist telling ourselves a good story, even if it means distorting reality5. As a result, the frantic search for causes is preferred to simple correlations, pseudo-certainties to probabilities, the denial of chance to the consideration of hazards, dichotomous thinking to nuance, the overestimation of low probabilities to the neutral observation of facts: precisely the opposite of what the scientific method teaches us.
Hard science vs. Humanities
Can the scientific method be applied to the study of humans by humans? In a vast series of studies in experimental social psychology, Jean-Pierre Deconchy and his team explored a fascinating subject: the way humanity thinks about humanity, and the way humanity thinks about the study of humanity. With the help of ingenious experimental set-ups (collected in Les animaux surnaturés6, published in 2000), the researchers showed how, in the absence of an advanced scientific culture, some of our cognitive filters convince us that our thoughts and behaviours are not based on natural determinants. And that, consequently, by virtue of these same cognitive filters, science would be unfit to understand and explain deep human “nature”.
Thus, humans construct a definition of humanity that separates them from the idea that they are creatures of nature, determined by the same laws as other living beings; that behind this biological form hides another “thing”, a “super-nature”. Hence a defiance of the very idea that science has anything to say about what humanity is.
In this research we find the idea of limited rationality, in the sense that knowledge of humanity would be something other than rational. It is also striking that, at the very time we are making progress in the cognitive sciences and neurosciences, we are witnessing the flourishing of several pseudo-sciences of the human, adding a little extra soul to the “super-nature” studied by Deconchy. These include a revival of shamanism, “energetic” medicine and personal development techniques. They adopt scientific vocabulary that lends them an air of authority (exploiting yet another cognitive bias) – something we have recently seen in fanciful extrapolations borrowing terms from quantum physics to justify alternative medicines or other mysterious phenomena7.
Thinking against oneself
Our brain draws quick and cheap conclusions to do us a favour. Most of the time, they are sufficient and roughly relevant to our immediate needs. But sometimes they do us a disservice and lead us down a path that discredits the very idea of free will. Fighting against oneself, against the natural slope of cognitive biases that weaken our discernment, requires at least minimal training in what the scientific method is – and not only for those destined for a scientific profession. It also requires an understanding of the shortcuts our brain uses to make our lives easier, and sometimes to lull us into an illusion of understanding.
Charities such as “La main à la pâte” (in France) and, more broadly, projects dedicated to scientific outreach, in connection with universities and research organisations, meet a real societal need to reinforce the psycho-social skills not only of schoolchildren but of all citizens. This is the price to pay so that science is not perceived as just another belief, so that doubtful or misleading opinions do not take precedence over the truth, and so that our democracies preserve their emancipatory power.