Thinking About Our Thinking: Is what you believe really true?
A whole new field has developed in economics in recent years: “behavioral economics.” It studies how unconscious biases affect our economic decisions and lead us to make choices that are not optimal. For example, most people avoid losses more than they seek gains, even when the two are equal: the pain of losing $100 typically outweighs the pleasure of gaining $100.
A similar field of study examines how cognitive biases affect our political thinking. Cognitive biases are shortcuts in our thinking that are often useful, but that can make our judgments irrational or rob them of objectivity. See https://yourbias.is/ for a useful list of 24 such cognitive biases.
Here are the ones that appear to most affect our political
thinking.
Because of “confirmation bias,” we tend to seek out information that confirms our opinions and to ignore or dismiss information that is inconsistent with those beliefs. The algorithms of Facebook and Google tend to accentuate this problem: we are fed “stuff” that is consistent with our previous “likes” or searches, creating the “echo chamber” of political thought. This has made the polarization of America worse. “Belief bias” is related: if a conclusion supports our existing beliefs, we will rationalize almost any argument that leads to it.
A useful antidote to this bias is to apply the scientific method: treat the belief as a hypothesis and test it against disconfirming data, information, and arguments. If it withstands that objective scrutiny, it is more likely to be true. We need to ask ourselves,
“When and how did I get this belief?” The circumstances in which we came to that
belief may no longer apply, and a new understanding may emerge.
Beware of the “backfire effect”: when our core beliefs are challenged, we may come to hold them even more strongly. Related to this is “reactance,” our impulse to do the opposite of what someone is trying to make us do or believe. Dale Carnegie (or was it Ben Franklin?) said, “A man convinced against his will is of the same opinion still.”
Another hazard is “groupthink”: the social dynamics of a group may influence our thinking out of a desire to “fit in.” This is readily observable at political party functions. For example, in a Republican Party meeting, who dares speak out against Trump? In the Obama years, who dared speak against him in Democratic Party meetings?
And “in-group bias” may cause us to unfairly favor those who belong to our own groups, i.e., those most like us. Similarly, the “halo effect” is strong in political parties: if we like someone, that positively influences our judgments of them and of their performance or opinions.
But one of the most dangerous cognitive biases is the “Dunning-Kruger effect.” People with little knowledge tend to think they know more than they actually do, and thus tend to overestimate their abilities. Conversely, the more you know, the less confident you are likely to be about what you know. The problem is that people with little knowledge don’t know what they don’t know; they are not even aware of entire fields of knowledge. It is easy to be overconfident when you have only a simple idea of how things are.
The antidote to this bias is open dialogue with a group of people who hold diverse opinions and who have knowledge of the subject matter pertinent to the decision at hand. Diversity just for the sake of diversity, without knowledge, is not enough. Through open dialogue, much may be learned and better decisions made. Failing to do so is particularly dangerous where decision-making power is concentrated in one or a few hands, such as the U.S. presidency, where the President can unilaterally make sweeping policy changes through executive orders. The same is true in China, Russia, Saudi Arabia, and many other countries with autocratic rulers.
Hopefully, being aware of our biases will make us more open to finding common ground.