A terrific article over at Science-Based Medicine by David Gorski about why our beliefs arm us against acquiring knowledge that runs counter to those beliefs. It is worth the read. But for the condensed version, I have selected a few excerpts and used bold to highlight what I think are some of the important points he raises.
If there’s a trait among humans that seems universal, it appears to be an unquenchable thirst for certainty. It is likely to be a major force that drives people into the arms of religion, even radical religions that have clearly irrational views, such as the idea that flying planes into large buildings and killing thousands of people is a one-way ticket to heaven. However, this craving for certainty isn’t expressed only by religiosity. As anyone who accepts science as the basis of medical therapy knows, there’s a lot of the same psychology going on in medicine as well. This should come as no surprise to those committed to science-based medicine because there is a profound conflict between our human desire for certainty and the uncertainty that is always inherent in so much of our medical knowledge. The reason is that the conclusions of science are always provisional, and those of science-based medicine arguably even more so than many other branches of science.
We see this phenomenon of craving certainty writ large and in bold letters in huge swaths of so-called “alternative” medicine. Indeed, a lot of quackery, if not most of it, involves substituting the certainty of belief for the provisional nature of science in science-based medicine, as well as the uncertainty in our ability to predict treatment outcomes, particularly in serious diseases with variable biology, like several types of cancer.
The simplicity of these concepts at their core makes them stubbornly resistant to evidence. Indeed, when scientific evidence meets a strong belief, the evidence usually loses. In some cases, it does more than just lose; the scientific evidence only hardens the position of believers. We see this very commonly in the anti-vaccine movement, where the more evidence is presented against a vaccine-autism link, seemingly the more deeply anti-vaccine activists dig in their heels to resist, cherry-picking and twisting evidence, launching ad hominem attacks on their foes, and moving the goalposts faster than science can kick the evidence ball through the uprights. The same is true for any number of pseudoscientific beliefs. We see it all the time in quackery, where even failure of the tumor to shrink in response to treatment can lead patients to conclude that the tumor, although still there, still can’t hurt them. 9/11 Truthers, creationists, Holocaust deniers, moon hoaxers — they all engage in the same sort of desperate resistance to science.
(M)ost recommendations of science-based medicine are not “truth” per se; they are simply the best recommendations physicians can currently make based on current scientific evidence. Be that as it may, the problem with the “truth wins” viewpoint is that the “truth” often runs into a buzz saw known as a phenomenon that philosophers call naive realism. This phenomenon, boiled down to its essence, is the belief that whatever one believes, one believes it simply because it’s true. In the service of naive realism, we all construct mental models that help us make sense of the world. When the “truth wins” assumption meets naive realism, guess what usually wins? It ain’t the truth.
Skepticism and science are hard in that they tend to go against some of the most deeply ingrained human traits there are, in particular the need for certainty and an intolerance of ambiguity. Also in play is our tendency to cling to our beliefs, no matter what, as though having to change our beliefs somehow devalues or dishonors us. Skepticism, critical thinking, and science can help us overcome these tendencies, but it’s difficult. Perhaps that’s the most important contribution of the scientific method. It creates a structure that allows us to change our beliefs about the world based on evidence and experimentation without the absolute necessity of taking being proven wrong personally.