We need to remind ourselves that this is not normal. The volume of misinformation and lies we encounter has grown exponentially in recent years, thanks in no small part to the rise of social media and other news echo chambers. Yes, misinformation, disinformation, and lies have been around since time immemorial, and yes, we all lie on a daily basis. But it seems to me that we have become habituated to people telling lies.
Indeed, in a telling study, researchers from University College London showed that when people kept telling lies, the brain centre responsible for emotions such as fear and shame (the amygdala) showed reduced activity and became desensitized. People got used to being dishonest and, what is more, as they became desensitized to telling lies, they tended to tell bigger and more harmful lies.
This descent into dishonesty is a little like the proverbial frog in a slowly heated pan, a metaphor that is itself based on misinformation. But the idea is still valid. People are used to others telling white lies and small fibs. But once they encounter many of them, they start to believe more and more of them, and the lies become bigger and bigger. After a while, we hardly react to lies and misinformation that would previously have shocked us, because the path there was a gradual normalisation and repetition of ever more deviant behaviour, and we simply got used to it.
Indeed, another experiment from the same lab showed that if misinformation is repeated just once, it is more likely to be perceived as true and more likely to be shared on social media platforms. The result is that misinformation becomes a self-propagating lie that grows more powerful the more often it is repeated.
In today’s world, where many millions of people use social media, this effect is no longer limited to individual behaviour but shapes entire societies. The chart below is taken from a study of scientists who participated in a well-known honesty experiment.
Correlation between lying on a coin toss experiment and scientific misconduct
Source: Drupp et al. (2024)
Participants are asked to toss a coin four times without showing the results to anyone. At the end, they report how many times tails came up in their coin tosses. For every tails they report, they receive a bonus. Obviously, the incentives are such that participants can overreport the number of tails they tossed to gain a higher bonus, and nobody can prove them wrong. It is a small lie with no downside.
On average, one would expect tails to come up 50% of the time, or twice in four coin tosses. Yet the chart shows that in every country where the experiment was run (except Poland), participants reported more than two tails on average. In fact, countries that rank higher for scientific misconduct also show larger overreporting of tails in the coin toss experiment. Scientists in countries like China, Iran, or Pakistan engage in more scientific misconduct and lie more about their coin tosses. Or rather, they live in societies where lying has become more normalised.
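To make the arithmetic concrete, here is a minimal simulation sketch (not part of the study itself; the function name and participant count are purely illustrative) of what honest reporting would look like, and why a group average above two tails points to overreporting.

```python
import random

def honest_reports(n_participants: int, tosses: int = 4, seed: int = 42) -> list[int]:
    """Simulate honest participants: each tosses a fair coin `tosses` times
    and reports the true number of tails (illustrative assumption, not study data)."""
    rng = random.Random(seed)
    return [sum(rng.random() < 0.5 for _ in range(tosses)) for _ in range(n_participants)]

reports = honest_reports(100_000)
mean_honest = sum(reports) / len(reports)

# With a fair coin, the number of tails in four tosses follows Binomial(4, 0.5),
# so the expected honest report is 4 * 0.5 = 2.
print(f"Mean reported tails under honesty: {mean_honest:.3f} (theoretical value: 2.0)")

# A persistent group average above 2 in a large sample cannot plausibly be
# explained by chance and therefore signals overreporting.
```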
But there is a powerful remedy against this normalisation of deviance. In the same experiment, some participants were first asked in a questionnaire about their role as scientists and the search for truth to which scientists have committed their careers. Once this identity as a scientist had been made salient, the overreporting of tails in the coin toss experiment disappeared. When the scientists thought of themselves as scientists, they stopped lying and became honest again.
This is why I like the scientific method and the process of peer-reviewing scientific research. It is not that peer review discovers scientific errors; most of the time, the reviewer has neither access to the original data nor the time to examine the methodology used to analyse it. But it keeps scientists honest. Knowing that their research will be reviewed elicits their identity as scientists and the ethical requirement to be honest and accurate. And that alone reduces misconduct.
This is also why codes of conduct and ethics training are so valuable. Not because ethics training makes dishonest charlatans honest or enables organisations to discover misconduct, but because it prevents otherwise honest ‘normal’ people from sliding down the slippery slope towards normalised deviance.
Honesty in the coin toss experiment and elicitation of identity as scientist
Source: Drupp et al. (2024)
I have no solution for the bane of our society that is social media, except to abolish it altogether (which is impossible). But in professional settings, we can and should increase ethics training, constantly remind people of their responsibility to be truthful, and publicly name and shame transgressions. I know ethics training, peer reviews, and the like can be annoying, but they serve a vital function that we will only notice once it is gone and it is too late.
I think this excellent post also illustrates why the rise of anonymous posts is harmful. A letter to the editor used to be curated and non-anonymous. Nowadays, folks write BS on FB or via Disqus with no fear of being shamed. What used to have the status of scribblings on the walls of toilets now has a megaphone and world-wide reach.
People lie. We will always lie. The only way to judge the relative merit of information is to look at the incentives and back-skew your read on things. Ethics training falls into the endless mash of mandated bureaucrospeak that we all ignore while we think of all the things we have to get done.