
Accepting Uncertainty

  • Writer: Admin
  • Oct 12, 2024
  • 7 min read

We humans are insatiably curious. Our brains actively and passively hunger for information. Yet it’s often this very trait that leaves them prone to error.


Uncertainty is uncomfortable. Our brains try to avoid it at almost any cost. Some even treat it as a character flaw – in politics, for example, opposing sides cleverly weaponise uncertainty as indecisiveness, or ‘flip-flopping’ (how terrible that someone can change their mind based on new information!).


The discomfort is driven by a subconscious longing for the mental fulfilment, control, or closure that certainty offers. To try and resolve that discomfort, we tell ourselves all manner of stories and narratives. But rarely does this lead to truth.


This rejection of uncertainty is one huge reason why religion, superstition, and flawed reasoning such as confirmation bias win out over rationality and a genuine quest for truth.


Together, these things ‘fill in’ those gaps of uncertainty to give the illusion of fulfilment in our brains – and said brains ravenously snap up these illusions wherever they’re available.


We generate stories to tell ourselves and don’t even realise it’s happening.



Uncertainty is Fine, but it’s not Evidence


People too often use uncertainty as a reason to support a claim or belief – rather than as a reason to doubt it and sit humbly on the fence pending further information. They do it for the same psychological reasons as above. This logical error is known as an appeal to ignorance. Examples include:


UFOs (Alien Visitation)

Ghosts

Psychics and Mediums

Astrology

Religion


These thought patterns also tend to shift the burden of proof:


Person 1: “This UFO video proves alien visitation.”

Person 2: “How? Why does it prove that?”

Person 1: “You don’t know what it is either! Prove to me it isn’t aliens!”

This cartoon depicts the same logical error as the above UFO example.

All these examples fail to weigh evidence fairly. Instead, they pretend to do the hard work of navigating our complex lives, while giving us a false, albeit powerful, sense of certainty, 'knowing', or direction.

But this can lead to its own loss of control (being easily misled, poor life judgements) and its own lack of responsibility (it’s just ‘meant to be’; it’s because I’m/they’re a Virgo; having faith instead of taking action), and countless other real problems.


Psychologically, this bias manifests because of that same powerful need for certainty.


However, as I'm about to explain, just because something isn’t definitively disproven doesn’t make it any truer.



Uncertainty Scale


-3______-2______-1______0______1______2______3


Think of a scale from -3 up to +3, where any claim starts at 0: ‘don’t know either way’.


Uncertainty doesn’t move the scale in either direction; the zero point marks uncertainty itself.


-3 is disconfirming evidence enough to have low or no confidence in the claim.


+3 is evidence enough to have high confidence in the claim.


Where you start on the scale can depend on the plausibility of the claim too. Plausibility is the estimated likelihood of a claim given all the evidence for it to date. Simply stated, this could be 'probably true' (in the case of vaping harming lungs) or 'most likely not true' (in the case of the Loch Ness monster).


The Process:


Let’s say that someone claims ‘X’ is true (for this example, whether X is actually true is unimportant – we readers don’t know either way).


Without knowing the answer and before evidence is checked, we start at 0.


The person wants X to be true but presents no evidence to examine. We, the readers, also haven’t been able to give disproving evidence - so does that move the confidence up to +1?


The answer is no. In fact, if the plausibility of the claim before evidence is low, we should probably start at -1; if it’s really implausible (turning into a unicorn, say, or violating the laws of physics), -3 is a fair starting point. Positive evidence can always work it back up the scale, but a lot of it is needed to balance such implausible claims.


The only reason for feeling ‘X’ is true, given the assessment above, is that the claimant began with a desire for it to be true. This is driven by beliefs, superstition, psychology, confirmation bias, faulty logic – anything but evidence. Objectively, the scale is still at 0, or ‘unknown’.


Therefore, in reality, uncertainty should tilt us towards caution about a claim until it is backed up; it does not lend a claim credibility until disproven. This is how we protect ourselves from being fooled, misled, preyed on, etc.


Furthermore, simply 'not knowing' is a valid option – we don’t need to land on a definite side of the fence.
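The process above can be sketched as a toy model. To be clear, the numeric mappings and the `confidence` function are purely illustrative assumptions of mine, not something from the post – the point is only that absence of disproof adds nothing to the score:

```python
# A toy model of the -3..+3 uncertainty scale described above.
# The starting values below are illustrative assumptions, not fixed rules.
PLAUSIBILITY_START = {
    "plausible": 0,          # e.g. "vaping harms lungs" starts near neutral
    "implausible": -1,       # e.g. the Loch Ness monster
    "violates_physics": -3,  # e.g. turning into a unicorn
}

def confidence(plausibility, evidence_for=0, evidence_against=0):
    """Start from the claim's prior plausibility; only actual evidence
    moves the needle. A mere *lack* of disproof contributes nothing."""
    score = PLAUSIBILITY_START[plausibility] + evidence_for - evidence_against
    return max(-3, min(3, score))  # clamp to the -3..+3 scale

# No evidence either way: the scale stays put - uncertainty is not support.
print(confidence("plausible"))      # 0
print(confidence("implausible"))    # -1
# Strong repeated evidence can work even an implausible claim back up.
print(confidence("violates_physics", evidence_for=5))  # 2
```

Note that there is no parameter for ‘how much someone wants the claim to be true’ – desire simply has no place in the calculation.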



Objective Thinking Doesn’t Equal Certainty


In fact, thinking objectively requires uncertainty. Or at least acknowledging when it's there. It’s a question of where the overall balance of evidence lies, and not trusting your first impulse to assume an answer. You have to be comfortable accepting uncertainty to do this. Again, an excellent summary from earlier:


“A balanced argument doesn’t weigh two sides equally. It weighs the strongest evidence more heavily. It’s about recognising your biases, and giving serious consideration to facts that contradict your hopes and beliefs.”


Without a definitive answer, you still have to weigh up both sides of true or false objectively and fairly, and whichever side weighs heavier is your best answer, for now. I say ‘for now’ because new information can change the balance in knowledge - but that doesn't justify assuming it will happen, and holding out for it, ignoring the current balance of evidence in the meantime.


It is more accurate to accept uncertainty if we just don’t know something, rather than attempt to fill in the gaps ourselves. The latter is more often wrong, as our beliefs and biases do not determine what is true.



Science and Uncertainty


Science is no stranger to uncertainty. As cynical as some are about scientists (in the broad sense) and the scientific process, they are really just cynical about some related issue or specific person – not science itself. Science is sometimes seen as dogmatic (see made-up pejorative words like ‘scientism’) or an arrogant know-it-all’s business.


The process of science is not dogmatic; it's pragmatic. It thrives on uncertainty, using it as an anchor and a safeguard against our human flaws. This results in a much fairer, more accurate picture of what is true. Science has to respect the unknown, not simply pick an answer within it.


While some responses to scientific findings (such as the continued disproof of psychic abilities, for example) might be to call them ‘closed-minded’, uncertainty in science requires the exact opposite mindset. Working harmoniously with uncertainty is, by definition, open-minded. This frees us to go wherever objective evidence leads, without making a stand based on what we desire the answer to be.


If there is no good quality evidence for a claim, you can’t be certain of it – it’s that simple. Of course, in the face of a repeated lack of evidence, you can be more confident a claim is not true.


If studies continuously fail to show that CBD has a real effect beyond placebo biases, despite it being marketed for staggering profits, the answer is less and less that it ‘needs more research’ and more that we’re looking at the false/minus side of the uncertainty scale by now. A blatant cash grab, when you weigh up the selling power versus the evidence.



Life's a Jigsaw


Truths do exist. We can really know things, but there’s only one way to do it – and that involves actively confronting our dustbin minds and sifting through the garbage in them piece by piece. Do I actually know this is true? How? Why do I think this? How might I be wrong? How can I check? All of these questions require a degree of uncertainty to even ask them - and being comfortable enough with it to do so.


If you prefer a more sanitary analogy, think of it like a jigsaw puzzle. You don’t have to know the correct piece you seek to know that other pieces are wrong; in fact, you can safely assume all pieces except one are wrong. Since we know the established framework of how jigsaws work (like a scientific theory), there’s no good reason to assume that more than one piece should fit the same gap. (That assumption would start comfortably at -2 or -3 on the scale).


Yet the whole time, you are uncertain of the right answer until discovered and tested. This uncertainty keeps you from assuming each piece you try next is the one, before testing it. This (should) stop us from doubling down when one doesn’t fit flush to our beliefs (denial). It also stops us rationalising why it won’t fit (logical fallacies, special pleading, moving the goal posts), rather than accepting it’s just not true.


You can logically justify not having to try a yellow piece among an all-black part of the puzzle - you can rule out unlikely answers. Importantly, as you sift through the more likely pieces, you still don’t presume to know any one of them is ‘true until proven wrong’ – you just test them, and only become certain when one is proven correct.


Jigsaws are scientific exercises. They are a faithless process, and a simple example of how to weigh reality against uncertainty.


This validly extends to anything based in reality, albeit often more complex. It’s only a matter of the correct approach and the right expertise. The same principles of logic apply.


Uncertainty should not be feared or shunned in favour of a perceived certainty of faith. This is no better than ignorance. 'Uncertain until proven true' is better than 'certain without good enough reason'. The latter is self-deception.


The road to truth can be hard, but uncertainty is a worthy companion.





