Beliefs can survive potent logical or empirical challenges. They can survive and even be bolstered by evidence that most uncommitted observers would agree logically demands some weakening of such beliefs. They can even survive the total destruction of their original evidential bases.
—Lee Ross and Craig Anderson, Judgment under Uncertainty: Heuristics and Biases
I recently watched an interesting documentary on Netflix called ‘Made You Look: A True Story About Fake Art’. It followed the tale of a New York art dealer and gallery director who had been trading (allegedly unknowingly) in forgeries of modern art for vast sums of money – forgeries which were, in fact, painted by a Chinese maths professor in his shed. Whether, and for how long, the dealer had known they were fake was much debated, but she repeatedly rejected any and all evidence that they were, and kept insisting they were genuine even when this had been comprehensively disproven. For me, the most interesting part was an interview with a psychologist, Maria Konnikova, who has written extensively on why people fall for scams, cults and other such nonsense. She made the point that, even when confronted with irrefutable facts, many people instead (as she phrased it) ‘double down’, sticking to their existing opinion or belief even more fervently, and simply rejecting all evidence to the contrary.
Apparently, this reaction is a response to what is known as ‘cognitive dissonance’ – the psychological stress caused by learning new information which challenges deeply held values / beliefs / passions. The discomfort is triggered when the person’s belief system clashes with new information, and the individual tries to find a way to resolve the contradiction so as to reduce their discomfort / embarrassment – typically by avoiding exposure to any additional information which might not support their preferred outlook. Examples of such behaviour include refusing to read books that might offer a different viewpoint or disclose inconvenient facts, only socialising with people who share their preferred world view (both of which are forms of ‘confirmation bias’), or simply sticking their fingers in their ears and screaming ‘la-la-la – I can’t hear you’.
The magnitude of the reaction is related to how important the matter is to that person (i.e. how ‘invested’ they are in their preferred version being right), and to how much discomfort they can handle in being proven wrong; a very proud, narrow-minded or arrogant person, or someone with a professional / academic reputation at stake, is much less likely simply to accept the newly presented facts, adjust their outlook and beliefs accordingly, and move forwards.
The extent to which many people reject anything that challenges what they want to believe – even after it is proven to be false – is remarkable. As one writer on the subject put it, ‘I knew how easy it was to make people believe a lie, but I didn’t expect the same people, confronted with the lie, would choose it over the truth… No amount of logic can shatter a faith consciously based on a lie’.
Writing about how people cling to ideas which have been disproven, and reject anything that challenges their preferred outlook, the physicist Max Planck noted, ‘the new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it’. In extreme cases, this can take many decades or even centuries; the heliocentric theory of the great Greek astronomer Aristarchus of Samos, for example, had to be rediscovered about 1,800 years later, and, even then, there was still a major struggle before it was accepted by mainstream astronomers.
An article in the Journal of Educational Psychology also noted that students often ‘cling to ideas that form part of their world view even when confronted by information that does not coincide with this view’. For example, students may spend months studying the solar system and do well on the related tests, yet still prefer to believe (erroneously) that the phases of the Moon are produced by Earth’s shadow. The facts they had been taught were simply unable to displace the beliefs they had held before being exposed to that knowledge.
Given that some students of astronomy choose to reject the reality of what causes the phases of the Moon, it is perhaps not surprising that there are so many in South Africa who will – and with much greater vehemence and passion – reject any and all information which challenges the way they prefer to view the Boer War. As previously noted, this is especially true if the person in question is very proud and narrow-minded, or – worse still – considers himself an authority on the subject with a reputation to jealously uphold.
It is unfortunately true that if such a person passionately wants to believe that the nasty old British bullies started the war, and that the Boers were the poor, defenceless, innocent victims, then nothing will ever change their mind. They will simply reject, for example, all the evidence that Kruger had been planning for an attack against the Empire since at least as early as 1887, or dismiss / frantically try to explain away the fact that it was the Boer republics which declared war and invaded Imperial territory – not the other way around. If they are invested in believing that the British army was ‘massing troops on the border’, they will simply ignore that it was actually the Boers who were poised on the border of Natal, and that they outnumbered the scattered Imperial garrison by about 2.5 to 1. If believing that the British ‘wanted to steal our diamonds’ helps them sleep at night, they are certainly not going to take a moment to look at a map and learn that the Kimberley diamond fields were already in British territory, and were – in reality – a much-coveted target of the invading Boers.
Perhaps needless to say, anyone daring to introduce these inconvenient historical realities into the discussion will simply be howled down and abused for shattering such cherished myths and fairy stories.
Indeed, as Konnikova suggested, being presented with new evidence can instead lead to a ‘doubling down’, prompting a particularly closed-minded and invested person to retreat into an even more entrenched (and thus increasingly untenable and risible) position – a phenomenon dubbed the ‘backfire effect’. Planck’s observation also holds true when one considers the Boer War, as there seems little chance that today’s hard core of True Believers will ever be convinced by anything as inconvenient as evidence and historical reality; the sad truth is that they hold their views so passionately, and these views form such an important part of their identity, that it is simply too difficult for them to admit they are hopelessly wrong. Unfortunately, and as Planck also suggested, things will only really change when another generation grows up without having been exposed to endless Apartheid-era propaganda (or whose parents don’t gleefully pass it on to them), and who thus approach the subject with somewhat more open minds, and rather less personal attachment to the myths.
Perhaps Jonathan Swift put it best: ‘You cannot reason a person out of a position he did not reason himself into in the first place’.