We don’t organize, evaluate, and make judgments from scratch. The enormous inflow of information would overwhelm a limited system. Instead, we rely on preset categories, crude rules of thumb that quickly and efficiently interpret new information. These rules are referred to in psychology as cognitive heuristics.
In a 2011 paper, Gigerenzer and Gaissmaier define a heuristic as a strategy that “ignores part of the information, with the goal of making decisions more quickly, frugally, or accurately than more complex methods” (2011, p. 454). Many decisions do not need complete analysis. We need an efficient means to process mass amounts of data and make a reasonably effective decision. Heuristics play a primary role in making efficient decisions that preserve cognitive resources.
Why We Use Cognitive Heuristics
We don’t use cognitive heuristics because we are lazy but because they are necessary. Life throws too much information at us, too quickly, for us to cognitively process and evaluate each crumb intercepted by our senses. Yet information is valuable to our survival and flourishing. So we funnel the data through quick, efficient rules that help us reliably predict the consequences of our behavioral responses. We absorb data, crudely process it (often unconsciously), and react.
Susan David, PhD, a psychologist on the faculty of Harvard Medical School, explains that “life is just a hell of a lot easier when you don’t have to analyze every choice” (2016, Kindle location 399). David continues, “if human beings lacked the predictive ability of heuristics…and needed to consciously process every facial expression, conversation, and piece of information anew, we’d have no time for actually living life” (Kindle location 408). The accuracy of our predictions and the potency of our behavioral reactions depend on the efficiency of our integrated cognitive heuristics.
Unconscious Processing and Cognitive Heuristics
Cognitive heuristics work so fast because they process information largely beneath consciousness. We hear, see, or feel something and make a snap judgment about its deeper meaning and our appropriate response.
In many ways, heuristic judgments are similar to (if not the same as) intuition. They are reasonably reliable when built around abundant and correctly interpreted experience. Unfortunately, many of us form heuristics around a paucity of experience and unreliable sources. We then make life-impacting choices based on a faulty system of judgment. Information processed through faulty heuristics results in erroneous conclusions and motivates harmful or misguided behavioral reactions.
Where do Cognitive Heuristics Originate?
Cognitive shortcuts don’t materialize from nothing. They evolve from experience. We begin integrating societal, family, and personal rules almost immediately. Political views, cultural norms, and extensive lists of “good” and “bad” constantly bombard our senses, feeding bias into our developing neuronal connections.
These learnings not only influence judgment but also create distinct feeling responses. We see something, the information is processed through cognitive heuristics, and we emotionally react. Other heuristics, such as “you feel it, it must be true,” then join the chorus, strengthening the original judgment. Richard Brodie wrote that you “go through different levels of learning heuristics in your life, each building upon the previous in a kind of pyramid. Stepping from one level of the pyramid to the next requires not just learning a different subject, but jumping to a whole new manner of learning, and in fact a whole new way of looking at the world” (Brodie, 2009).
This pyramid process of learning allows for exponential growth, without starting from scratch with each new piece of information. However, when a flawed cognitive heuristic is adopted, each new level built upon the erroneous heuristic is distorted. Reid Hastie and Robyn M. Dawes explain that “these cognitive tools are acquired over a lifetime of experience.” They suggest that “we learn these cognitive tools from trial-and-error experience, as folklore from our family or peers, and through deliberate instruction” (2009, Kindle location 1,919).
Systematic Errors of Cognitive Heuristics
Leonard Mlodinow wrote, “in general, heuristics are useful, but just as our manner of processing optical information sometimes leads to optical illusions, so heuristics sometimes lead to systematic error” (2009, Kindle location 2,907). These systematic errors can be blinding. We wrongly process important information in ways that spoil futures and hurt others. Because heuristics operate beneath consciousness, we have no clue how to recalculate our judgments, abandon the poorly processed information, and correct our faulty conclusions.
Three Common Cognitive Heuristics that Influence Interpretations of New Information
Often, when discussing heuristics, we think of predetermined conclusions: the underlying beliefs that propel judgments. However, a significant force behind many heuristics is the environment surrounding us when information is received, priming new judgments toward particular interpretations. Our focus of attention, what we are thinking about when new information arrives, our mood, and what other people are doing at that moment all contribute to, and corrupt, interpretations.
Powerful heuristics massage information to conform to currently held beliefs. Since we hold conflicting beliefs, information conforms to the belief most prevalent at the moment the new information is encountered. For example, if I just finished writing an article on kindness, I am more likely to give money to the homeless man outside the store. If I just finished paying my bills, I may respond differently.
Depending on our current thoughts and recent experience, we may take completely different meanings away from an event.
Amos Tversky and Daniel Kahneman identified three common cognitive heuristics—anchoring, availability, and representativeness.
The anchoring heuristic is a cognitive bias in which we process new information by relying too heavily on a single piece of evidence (often the first encountered) on a given topic. In anchoring, we use a single (and often unproven) sliver of data to compare and interpret new incoming information. An example of anchoring is when we are shopping for an item, perhaps patio furniture. We check the online price for a set we like on Wayfair. Let’s say the price tag is $799.00. We then discover a visually comparable set on Amazon for $499.00. We assume the Amazon set is a great value because we compared it to the higher-priced item on Wayfair.
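The pull of the anchor can be sketched as simple arithmetic. The $799 and $499 prices come from the patio-furniture example above; the $449 anchor is a hypothetical I’ve added to show that the very same $499 set feels like a bargain or a bad deal depending entirely on which price we happened to see first.

```python
def perceived_savings(price: float, anchor: float) -> float:
    """Fraction 'saved' relative to the first price encountered (the anchor)."""
    return (anchor - price) / anchor

# The same $499 set, judged against two different anchors.
print(f"Anchored to $799: {perceived_savings(499, 799):.0%} perceived savings")
print(f"Anchored to $449: {perceived_savings(499, 449):.0%} perceived savings")
```

Nothing about the set itself changed between the two lines; only the arbitrary reference point did.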
The Danger of Anchoring
In his informative book Think: Why You Should Question Everything, Guy Harrison explains, “this bias leads us to rely so much on one past experience or one piece of information that we ignore or reject new, more, and better information that contradicts it.” Harrison then warns, “don’t drop anchor in the muck of a past error when you could set sail for more sensible lands” (2013, Kindle location 1,182).
From the informative psychology website Decision Lab, the author furthers our understanding of anchoring this way: “when we are setting plans or making estimates about something, we interpret newer information from the reference point of our anchor, instead of seeing it objectively. This can skew our judgment, and prevent us from updating our plans or predictions as much as we should” (The Decision Lab).
The real devastation to effective thinking occurs when we anchor all new information to a falsehood. Perhaps a childhood religious teaching becomes the basis for all future investigations of the origins of life. Our religious anchor quickly places every grand new discovery into a category of evil. Our anchor limits learning. We experienced anchoring bias during the COVID-19 pandemic. Many people began with an anchor regarding the nature of the illness. All new information about vaccines was either vehemently rejected or blindly accepted, depending on their anchoring belief.
When we encounter new information, we typically jump to conclusions, relying on information stored in long-term memory. Something in the present stirs memories of past encounters, pulling information from long-term memory to make sense of the present.
The problem is we don’t pull all associated memories or even the most relevant information. We rely on the information most readily available. Daniel Kahneman describes this form of associative thinking as the availability heuristic. We rely on ease of retrieval to make quick, resourceful judgments (2011, Kindle location 2,030). An example of this is giving more weight to recent or personally relevant events than to significant evidence suggesting a different conclusion. We may conclude that current crime rates are worse than they were in the 1970’s because we read in our news feed about crimes occurring this week. We can’t recall statistics from the 1970’s.
In 1970, gunfire killed 112 police officers. In 2021, 62 police officers were murdered by the same means. Markedly, these numbers are not oddities: 143 officers were killed by gunfire in 1971, compared to 45 officers killed in 2020 (Officer Down Memorial Page).
Giving More Weight to the Present
Yet, because of the availability heuristic, we give more weight to the current year, believing crime is out of control, reaching all-time highs. The good old days of the 1970’s, 80’s, and up to the mid-90’s experienced unprecedented crime rates. Although the population has increased by more than 30% since then, homicide, robbery, and burglary rates are still significantly lower than in those earlier decades (United States Crime Rates).
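The population adjustment matters here, and a few lines of arithmetic make the gap starker. The officer-death counts below are the Officer Down Memorial Page figures quoted above; the U.S. population values are approximate round numbers I am assuming for illustration.

```python
# Officer gunfire deaths (Officer Down Memorial Page, cited above).
deaths = {1970: 112, 1971: 143, 2020: 45, 2021: 62}

# Approximate U.S. population in millions (assumed round figures, for illustration).
population_millions = {1970: 203, 1971: 207, 2020: 331, 2021: 332}

for year in sorted(deaths):
    per_million = deaths[year] / population_millions[year]
    print(f"{year}: {deaths[year]:>3} deaths, ~{per_million:.2f} per million residents")
```

On a per-capita basis the 1970–71 figures are several times the recent ones, the opposite of what ease of retrieval suggests.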
I was a child during the 70’s. The world felt relatively safe. I didn’t watch the news. There was no internet. The only crime I knew of was the relatively few incidents occurring in my neighborhood. The availability heuristic impacts many decisions. When I am researching a particular behavior, I see it everywhere and subsequently believe it is more prevalent than it actually is.
The representativeness heuristic blurs accurate interpretations through comparison. We see something or someone, compare the person or event to a stereotype, and then assign all the ‘known’ characteristics of the comparative model onto the new person or event.
Hastie and Dawes explain that the common tendency is to “make judgments and decisions about category membership based on similarity between our conception of the category and our impression of the to-be-classified object, situation, or event” (2009, Kindle location 2,231).
This significantly relieves cognitive resources of time-consuming evaluations. We judge the unknown by replacing it with the ‘known.’ Many stereotypes are accurate. A mechanic working on a car can assume that the engine of this particular car possesses many of the same characteristics as other cars he or she has worked on in the past. Thaler and Sunstein warn, “again, biases can creep in when similarity and frequency diverge.” They continue, “use of the representativeness heuristic can cause serious misperceptions of patterns in everyday life” (2008, Kindle location 491).
Representativeness and Biases
The representativeness heuristic is responsible for many ugly biases and unjustified treatment. Hastie and Dawes explain its destructive nature when the categories are racial, gender, or religious groupings: “perhaps the most troublesome characteristic of these racial, gender, and religious stereotypes is that they automatically evoke emotional reactions that affect our behavior toward members of the category. Once we’ve classified a person into a category with negative associations, we may not be able to help ourselves from reacting negatively to him or her” (2009, Kindle location 2,282).
Kahneman wrote that “there is some truth to the stereotypes that govern judgments of representativeness, and predictions that follow this heuristic may be accurate.” He then cautions, “in other situations, the stereotypes are false and the representativeness heuristic will mislead.”
Overreliance on any time-saving heuristic can blindly sabotage our lives and cruelly demean others, justifying inhumane and atrocious behaviors. Kahneman warns, “even when the heuristic has some validity, exclusive reliance on it is associated with grave sins against statistical logic” (2011, Kindle location 2,545).
Kahneman names two sins associated with the representativeness heuristic:
1. Excessive Willingness to Predict Unlikely Events
We love to predict. Consistently predicting correctly provides a notable advantage in life. Kahneman warns that prediction must be exercised with caution. Unlikely events can’t be predicted with any notable success. Yet experts consistently comment, publishing predictions based on representativeness heuristics that miss the many complex contributing factors. Occasionally, they are right.
Kahneman writes, “one sin of representativeness is an excessive willingness to predict the occurrence of unlikely (low base-rate) events” (Kindle location 2,549). Accordingly, we should closely examine any prediction of an unlikely event.
We can’t predict the effectiveness or danger of a vaccine from personal experience, from outlandish cries in a social media post, or because our uncle got sick shortly after receiving a second injection. Predicting global crime rates based on our own recent victimization leads to gross errors. Generally, we can’t draw all-inclusive conclusions about large-scale measurements from a personal encounter, an emotional reaction, or outlandish reports on the internet.
2. Insensitivity to Quality of Evidence
Evidence can not only be sparse and insufficient in quantity; it can also be of low quality, allowing undetected contaminants to spoil any conclusions. We may have plenty of data, but the collection may be purposely or ignorantly unrepresentative of the whole. Therefore, any conclusions drawn from the data are subject to error. When our thoughts and behaviors rely on representativeness heuristics, leaning on statistical data that is biased, we incorporate those biases into our conclusions.
We must wisely examine the quality of the evidence we allow to sort and process new experience. Unless we skeptically step back, we unconsciously “process the information available as if it were true.” Kahneman teaches that whenever we “have doubts about the quality of the evidence,” we should “let [our] judgments of probability stay close to the base rate” (Kindle location 2,583).
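Kahneman’s advice to stay close to the base rate can be made concrete with Bayes’ rule. The numbers below are hypothetical, chosen for illustration: a rare condition with a 1% base rate, judged once with weakly diagnostic evidence and once with strongly diagnostic evidence.

```python
def posterior(base_rate: float, hit_rate: float, false_alarm_rate: float) -> float:
    """Bayes' rule: probability the condition holds, given the evidence."""
    p_evidence = base_rate * hit_rate + (1 - base_rate) * false_alarm_rate
    return base_rate * hit_rate / p_evidence

base = 0.01  # hypothetical 1% base rate

# Weak evidence (barely better than chance) should barely move us off the base rate.
weak = posterior(base, hit_rate=0.60, false_alarm_rate=0.50)

# Strong evidence justifies a much larger update.
strong = posterior(base, hit_rate=0.95, false_alarm_rate=0.05)

print(f"weak evidence:   {weak:.3f}")   # stays near the 1% base rate
print(f"strong evidence: {strong:.3f}")
```

When the evidence is barely diagnostic, the mathematically warranted judgment stays close to 1%, which is exactly what the representativeness heuristic tempts us to ignore.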
Mitigating Harmful Cognitive Heuristics
Just because heuristics occur beneath the veil of unconsciousness does not mean we must be hapless victims to their impact. We can mitigate the damage of biased and shortsighted heuristics. When judgments spontaneously seep into our psyche, instead of praising the rush of insight as evidence of our intuitive genius, we can pause, reexamine the evidence, reflect on the source of the information, and seek additional input.
We may find that our emotional tail is wagging our rational dog. Our emotional reaction triumphs, and our rational mind constructs justifications to resolve the incongruence and discomfort of the cognitive dissonance.
Using Cognitive Heuristics to Your Advantage
Cognitive heuristics are a given. They benefit our lives and improve our responses to the continual onslaught of information flowing from our environment. We can harness their power by creating environments that support the changes we wish to make. We can feed our minds with positive mantras, gentle reminders, and uplifting examples. By placing such information at the forefront of our minds, we encourage the representativeness and availability heuristics to produce healthier conclusions. In psychology, we call this priming.
Kyra Bobinet, MD, explains, “priming is the phenomenon in which something in our environment triggers fast-brain shortcuts, or heuristics, stored within our implicit memory. When we are exposed to certain cues, we behave in certain predictable ways” (2019).
Bobinet, K. (2019). Behavior Design at Home. Experience Life. Published 4-19-2016. Accessed 2-2-2022
Brodie, Richard (2009). Virus of the Mind: The New Science of the Meme. Hay House Inc.
David, Susan (2016). Emotional Agility: Get Unstuck, Embrace Change, and Thrive in Work and Life. Avery; First Edition
Gigerenzer, G., & Gaissmaier, W. (2011). Heuristic Decision Making. Annual Review of Psychology, 62, 451–482.
Harrison, Guy P. (2013) Think: Why You Should Question Everything. Prometheus; Illustrated edition
Hastie, Reid; Dawes, Robyn M. (2009). Rational Choice in an Uncertain World: The Psychology of Judgment and Decision Making. SAGE Publications, Inc; Second edition
Kahneman, D. (2011) Thinking, Fast and Slow. Farrar, Straus and Giroux; 1 edition
Mlodinow, Leonard (2009) The Drunkard’s Walk: How Randomness Rules Our Lives. Vintage; Illustrated edition
Thaler, Richard H., Sunstein, Cass R. (2008). Nudge: Improving Decisions About Health, Wealth, and Happiness. Yale University Press; 1st edition