It’s not often I come across an article that tackles, in a straightforward and easily understood way, the central theme of Questionable Motives: why should we trust any kind of belief without compelling evidence? Is there some way to differentiate between what we can know to be true and what we simply believe is probably true?
One of the challenges of trying to be scientific, and an honest intellectual, is that judgment is often required in assessing a claim or topic. The problem with relying upon one’s judgment is that it is fraught, even overwhelmed, with personal bias.
That’s a fundamental recognition: that we come predisposed to be biased.
The “default mode” of human behavior (meaning what most people do most of the time) is to construct an elaborate rationalization for what we already believe, and want to believe. The more intelligent we are, the more sophisticated and elaborate our rationalizations become – giving us more confidence in our conclusions, whether or not that confidence is deserved.
The problem, then, is this: if we want to know something, how can we remove, overcome, or compensate for this natural tendency to be biased, so that we can better know what’s true rather than merely what we believe is true?
The solution is to develop a specific intellectual skill set – knowledge of the many and various ways in which we bias our thinking, and the constant application of that knowledge to our own beliefs. In other words, we need to be skeptical, especially of ourselves. And not just skeptical in attitude, but systematically skeptical of the process of our own thought.
But since this is necessarily self-referential (we can bias our assessment of our own biases), it is also necessary to check our beliefs and thinking against other people – people with different perspectives, from different backgrounds, areas of expertise, and cultures.
This approach is intellectually honest: it includes an awareness of the biases inherent in our assertions and assumptions about what we think is true, along with a healthy dose of skepticism. And if we aren’t skeptical?
The opposite of this approach is to be insular, to have a self-contained belief system that feeds on itself but which is completely disconnected from logic and reality. Humans seem to have an unfortunate penchant for falling into such self-contained belief systems, cults being the ultimate expression of this tendency.
Knowledge claims based on such a self-contained belief system are therefore intellectually dishonest.
How we think, then, in large part determines what we think – or, to use more specific language, our epistemology in large part determines our ontology. How we think about issues, and our ability to recognize our biases as we do so, requires that a level of skepticism be applied before we can better trust our knowledge. If we fail to follow an informed epistemological approach and instead treat the assumptions and assertions that inform our beliefs as a sufficient guide to knowledge, then we are quite rightly wide open to the legitimate criticism that our ontology – what we think we know – is suspect. Belief without skepticism, belief without a recognition of our inherent biases, is a sure sign that we are not privy to what is true by means of intellectual honesty, but merely to what we think is true by means of intellectual dishonesty.