We begin the course by reviewing what scientists call “critical thinking skills.” How do you KNOW what you know? This question also goes by a more daunting name: epistemology. We are going to spend today talking about what psychologists refer to as cognitive biases. These are problems with the way we think that, if not accounted for, can result in bad judgments. These biases were almost certainly adaptive at one point; despite their high error rates, they must have produced more overall benefit than loss. But in a modern post-industrial environment, adaptations that served us well on the savannahs of ancient Africa may not serve us well in 21st Century Raleigh.


For the sake of easy comprehension, I’ve broken our cognitive biases down into three main categories. Two deal with information issues and the bad things that can happen if we have too many or too few facts. The third is more difficult to explain. Speed of decision-making is very important for any human in an uncertain environment, and the tradeoff between speed and accuracy must have been tilted in favor of speed for ancestral humans. Which makes sense when you think about it – if you’re faced with a hungry lion or an angry cave bear, a quick decision, even if not fully accurate, is preferable to a slow but precise one. Here’s a comprehensive cognitive bias codex if you are interested.


Crime rates, especially for violent crime, have been declining steadily since the early 1990s. Yet, because the media is a business that needs viewers to make money, and because crime stories attract viewers, the media prioritizes and pushes stories about these crimes, regardless of whether overall rates of violent crime are going up or down. The error, known as the availability heuristic, is assuming that because crime stories are so easy to recall, crime must be commonplace. This bias is related to “missing white girl syndrome”, defined as the media’s undue focus on upper-middle-class white women who disappear, with the disproportionate coverage they receive being compared to cases of missing women of other ethnicities or of missing males.


No one wants to hear good news – we focus on the bad. Which makes sense from an adaptive or evolutionary standpoint: for our ancestors, missing a threat was far more costly than missing out on a windfall.


Ignoring evidence that contradicts our beliefs is called “cherry picking”, and it underpins a lot of the errors people make in their thinking. Psychics thrive on this type of bias – we ignore all the times they were wrong and focus instead on the few times they were correct. Nostradamus wrote page after page of near-gibberish, none of which has any value as prophecy. But we ignore all that, zero in on his “Hister, the Captain of Greater Germany” line, and now Nostradamus is a prophet. Given how common confirmation bias is in everyday life, we’re going to watch a short video that explains it in more detail.


Pattern-finding is perhaps the most uniquely human of all our traits. But sometimes we are too good at it. Seeing meaningful patterns where none exist is called “apophenia”; its visual form, like the face in our mop handle, is called “pareidolia”. Sometimes, as with the mop handle, it is just cute with no consequences. Sometimes, the consequences can be more severe. Gamblers often imagine they see patterns in their wins and losses. They keep playing after a win (the “hot hand”). If they lose all night, they think they are “due” for a win to balance things out. This is known as the “gambler’s fallacy”. The outcome of each game is independent of the others. Yes, over long stretches of time, wins should roughly equal losses in a truly random and fair game (like a coin flip). But each result has no connection to the others. Most conspiracy theories involve some aspect of false pattern detection. Take, for example, the “Monument on Mars”.
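

You can see that independence for yourself with a quick simulation. Here is a minimal Python sketch (the five-heads streak length and the million-flip count are arbitrary choices for illustration): if the gambler’s fallacy were right, a tails would be “due” after a run of heads, and the share of heads on the next flip would dip below 50%. It doesn’t.

```python
import random

random.seed(42)          # fixed seed so the demo is reproducible
FLIPS = 1_000_000
flips = [random.random() < 0.5 for _ in range(FLIPS)]  # True = heads

# Look at every flip that follows five heads in a row. If a tails
# really were "due", heads would come up less than half the time here.
after_streak = [flips[i] for i in range(5, FLIPS) if all(flips[i - 5:i])]

print(f"flips following five straight heads: {len(after_streak)}")
print(f"share that were heads anyway: {sum(after_streak) / len(after_streak):.3f}")  # ~0.500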


Seeing human physical traits in other objects (both animate and, in this case, inanimate) is known as anthropomorphism. It is also a type of cognitive bias.


We now move on to our second section, which discusses biases that occur when we have too little information and are forced to fill in the blanks. When we fill in the blanks regarding people, we tend to rely too much on superficial, surface characteristics like physical beauty. The Halo Effect means we are more inclined to trust good-looking people. Good-looking people are often automatically assumed to be successful (“dress for success”, we are told). Think about politics, especially Presidential politics in America. Almost always, the better-looking candidate wins. There is a negative version of this known as the “horns effect”, which means we are less likely to trust ugly people. We also use stereotypes, beliefs that all members of a group share the same traits or behaviors. “Black people are ALWAYS late” is a stereotype. Is it accurate? Whenever you say ALL members of a group are going to behave a certain way, you are almost always going to be WRONG.


These two biases are two sides of the same coin. For an event we’ve never experienced (like, perhaps, a hurricane coming ashore), we tend to minimize the risk, as we have nothing reliable in our memories to compare it to. Here, watching hurricanes on TV would actually help someone better estimate the risks they face. On the other end, if something extremely unlikely could happen (and we can think about it and visualize it), we tend to overestimate its true likelihood of happening. People who “prep” (for the end of the world) are falling victim to this bias. On a personal level, I saw my father struck by lightning in 1980 (he survived). Ever since then, I have dramatically overestimated my own chances of being struck.


Hyperbolic discounting means we value things now WAY more than we value the same things in the future. Most people, if given a choice between $50 now and the promise of $100 six months from now, will choose the $50 now. Rephrase the question and take the urgency out of it, though, and you get a different result: most people, given a choice between $50 a year from now and $100 two years from now, will take the $100. Think about the last time you put off studying for a test to watch TV or play on social media. You chose the immediate reward over a bigger reward further away in time. Procrastination is this bias in action.


Notice that as we move further into the future, the gap between $50 and $100 stays stable. But in the present moment, $50 NOW is worth more than the promise of $100 in six months. Why would this be adaptive and advantageous in a chaotic, unpredictable environment, but maladaptive in a stable, predictable one?
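

A standard way to model this is the hyperbolic discount function V = A / (1 + kD), where A is the amount, D is the delay, and k measures impatience. A small Python sketch (the value k = 0.25 per month is an illustrative assumption, not a measured one) reproduces the preference reversal described above:

```python
def discounted_value(amount: float, delay_months: float, k: float = 0.25) -> float:
    """Subjective value under hyperbolic discounting: V = A / (1 + k * D)."""
    return amount / (1 + k * delay_months)

# Choice 1: $50 now vs. $100 in six months
print(discounted_value(50, 0), discounted_value(100, 6))    # 50.0 vs 40.0 -> take the $50 now

# Choice 2: the same $50/$100 gap, pushed a year into the future
print(discounted_value(50, 12), discounted_value(100, 24))  # 12.5 vs ~14.3 -> wait for the $100
```

The dollar gap is identical in both choices; only the delay changes, yet the preferred option flips.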


Our final set of biases involves making decisions too fast. In ancient, ancestral times, quick decisions were preferred, but in today’s complex world, the wrong decision can make or break you. As confidence is needed for quick decisions, it should be no surprise that we have a bias that makes us overconfident: the Dunning-Kruger Effect, named after the psychologists who first described it. Dunning-Kruger says ignorance and confidence go hand in hand, and the more experience we gain, the less confident we become.


Most people will give up as experience erodes their confidence. Those who stick with it become true experts. Weirdly, teaching is a profession where non-teachers have a lot of confidence in their ability to run school systems.


If you give people a very complex and important task, they tend to spend the most time on the simplest parts. The example often used is that if you put a group of people in charge of designing a nuclear power plant, they will often spend more time designing the bike shed than the reactor. Hence the name “bike-shedding”. Perhaps you’ve experienced this in group work, if more time was spent picking out the font for a paper than actually writing the paper itself. It makes sense to start with the simplest thing first, since that gets people moving quickly. But spending more time on the easy parts and less on the complex ones is a bad adaptation. As for “rhyme-as-reason”, speakers have known about this trick for a long time. Things that sound good are assumed to be true, just as people who look good are assumed to be honest. A well-known example of “rhyme-as-reason” happened during the O.J. Simpson trial (“If it doesn’t fit, you must acquit”).


I call this “doubling down on the crazy”. Jonestown is an extreme example of belief perseverance. Over 900 people followed their cult leader, Jim Jones, into suicide despite years of mounting evidence of Jones’ sexual assaults, financial fraud, and general criminal behavior. Cults in general tend to see more members join after a failed end-of-the-world prediction. The prediction itself gives the cult lots of media attention and publicity, and when it fails, the true believers still believe – perhaps even more so. You also see this belief perseverance with conspiracy theories. Let’s say my friend thinks 9/11 was an “inside job”. I present him with evidence that it was not. He can just double down and say my evidence must have been planted by the CIA to throw me off the truth.


Critical Thinking Day 2: Resources


Media: Your Logical Fallacy (website)

