Cognitive Biases

A light bulb with the reflection of its filament.


November 20, 2022

In the great book Thinking, Fast and Slow (2011), Daniel Kahneman explains in detail two different ways of thinking that human beings use. Kahneman refers to them as System 1 and System 2, which he defines as follows:

System 1: Operates automatically and quickly, with little or no effort or sense of voluntary control. Makes instinctual or intuitive decisions – typically based on heuristics.

System 2: Allocates attention to the effortful mental activities that demand it, including complex computations involving logic, math, or statistics. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.

Heuristics are simple rules we use – via System 1 – to form judgments or make decisions. Heuristics are mental shortcuts whereby we simplify a complex situation in order to jump to a quick conclusion.

Most of the time, heuristics work well. We can immediately notice a shadow in the grass, alerting us to the possible presence of a lion. And we can automatically read people’s faces, drive a car on an empty road, do easy math, or understand simple language. (For more on System 1, see the last section of this blog post.)

However, if we face a situation that requires the use of logic, math, or statistics to reach a good judgment or decision, heuristics lead to systematic errors. These errors are cognitive biases.

Let’s examine some of the main cognitive biases:

  • anchoring effect
  • availability bias, vividness bias, recency bias
  • confirmation bias
  • hindsight bias
  • overconfidence
  • narrative fallacy
  • information and overconfidence
  • self-attribution bias

 

ANCHORING EFFECT

anchoring effect: people tend to use any random number as a baseline for estimating an unknown quantity, despite the fact that the unknown quantity is totally unrelated to the random number.

Daniel Kahneman and Amos Tversky ran one experiment in which they spun a wheel of fortune, but they had secretly programmed the wheel so that it would stop only on 10 or 65. After the wheel stopped, participants were asked to estimate the percentage of African countries in the UN. Participants who saw “10” on the wheel guessed 25% on average, while participants who saw “65” guessed 45% on average – a huge difference.
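
A rough way to put a number on this effect: later in the book, Kahneman quantifies anchoring with an “anchoring index” – the difference between the average estimates divided by the difference between the anchors, so that 100% means people simply adopt the anchor and 0% means they ignore it entirely. Applied to the figures above, the index is (45 – 25) / (65 – 10) ≈ 36%: the spread in answers was more than a third of the spread between two completely arbitrary anchors.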

Behavioral finance expert James Montier has run his own experiment on anchoring. People were asked to write down the last four digits of their phone number, and then asked whether the number of doctors in their capital city is higher or lower than that four-digit number. Results: those whose last four digits were greater than 7000 reported, on average, 6762 doctors, while those with numbers below 2000 arrived at an average of 2270 doctors. (James Montier, Behavioural Investing, Wiley 2007, page 120)

Those are just two experiments out of many. The anchoring effect is “one of the most reliable and robust results of experimental psychology” (page 119, Kahneman). Furthermore, Montier observes that the anchoring effect is one reason why people cling to financial forecasts, despite the fact that most financial forecasts are either wrong, useless, or impossible to time.

When faced with the unknown, people will grasp onto almost anything. So it is little wonder that an investor will cling to forecasts, despite their uselessness. (Montier, page 120)

 

AVAILABILITY BIAS, VIVIDNESS BIAS, RECENCY BIAS

availability bias: people tend to overweight evidence that comes easily to mind.

Related to the availability bias are vividness bias and recency bias. People typically overweight facts that are vivid (e.g., plane crashes or shark attacks). People also overweight facts that are recent (partly because they are more vivid).

Note: It’s also natural for people to assume that hard-won evidence or insight must be worth more. But often that’s not true, either.

 

CONFIRMATION BIAS

confirmation bias: people tend to search for, remember, and interpret information in a way that confirms their pre-existing beliefs or hypotheses.

Confirmation bias makes it quite difficult for many people to improve upon or supplant their existing beliefs or hypotheses. This bias also tends to make people overconfident about existing beliefs or hypotheses, since all they can see are supporting data.

We know that our System 1 (intuition) often errs when it comes to forming and testing hypotheses. First of all, System 1 always forms a coherent story (including causality), irrespective of whether any logical connections actually exist among the various things we experience. Furthermore, when System 1 faces a hypothesis, it automatically looks for confirming evidence.

But even System 2 – the logical and mathematical system that humans possess and can develop – by nature uses a positive test strategy:

A deliberate search for confirming evidence, known as positive test strategy, is also how System 2 tests a hypothesis. Contrary to the rules of philosophers of science, who advise testing hypotheses by trying to refute them, people (and scientists, quite often) seek data that are likely to be compatible with the beliefs they currently hold. (page 81, Kahneman)

Thus, the habit of always looking for disconfirming evidence of our hypotheses – especially our “best-loved hypotheses” – is arguably the most important intellectual habit we could develop in the never-ending search for wisdom and knowledge.

Charles Darwin is a wonderful model for people in this regard. Darwin was far from being a genius in terms of IQ. Yet Darwin trained himself always to search for facts and evidence that would contradict his hypotheses. Charlie Munger explains in “The Psychology of Human Misjudgment” (see Poor Charlie’s Almanack: The Wit and Wisdom of Charles T. Munger, expanded 3rd edition):

One of the most successful users of an antidote to first conclusion bias was Charles Darwin. He trained himself, early, to intensively consider any evidence tending to disconfirm any hypothesis of his, more so if he thought his hypothesis was a particularly good one… He provides a great example of psychological insight correctly used to advance some of the finest mental work ever done.

 

HINDSIGHT BIAS

hindsight bias: the tendency, after an event has occurred, to see the event as having been predictable, despite there having been little or no objective basis for predicting the event prior to its occurrence.

Hindsight bias is also called the “knew-it-all-along effect” or “creeping determinism.” (See: http://en.wikipedia.org/wiki/Hindsight_bias)

Kahneman writes about hindsight bias as follows:

Your inability to reconstruct past beliefs will inevitably cause you to underestimate the extent to which you were surprised by past events. Baruch Fischhoff first demonstrated this ‘I-knew-it-all-along’ effect, or hindsight bias, when he was a student in Jerusalem. Together with Ruth Beyth (another of our students), Fischhoff conducted a survey before President Richard Nixon visited China and Russia in 1972. The respondents assigned probabilities to fifteen possible outcomes of Nixon’s diplomatic initiatives. Would Mao Zedong agree to meet with Nixon? Might the United States grant diplomatic recognition to China? After decades of enmity, could the United States and the Soviet Union agree on anything significant?

After Nixon’s return from his travels, Fischhoff and Beyth asked the same people to recall the probability that they had originally assigned to each of the fifteen possible outcomes. The results were clear. If an event had actually occurred, people exaggerated the probability that they had assigned to it earlier. If the possible event had not come to pass, the participants erroneously recalled that they had always considered it unlikely. Further experiments showed that people were driven to overstate the accuracy not only of their original predictions but also of those made by others. Similar results have been found for other events that gripped public attention, such as the O.J. Simpson murder trial and the impeachment of President Bill Clinton. The tendency to revise the history of one’s beliefs in light of what actually happened produces a robust cognitive illusion.(pages 202-3, my emphasis)

Concludes Kahneman:

The sense-making machinery of System 1 makes us see the world as more tidy, simple, predictable, and coherent than it really is. The illusion that one has understood the past feeds the further illusion that one can predict and control the future. These illusions are comforting. They reduce the anxiety we would experience if we allowed ourselves to fully acknowledge the uncertainties of existence. (pages 204-5, my emphasis)

 

OVERCONFIDENCE

Overconfidence is such a widespread cognitive bias among people that Kahneman devotes Part 3 of his book entirely to this topic. Kahneman says in his introduction:

The difficulties of statistical thinking contribute to the main theme of Part 3, which describes a puzzling limitation of our mind: our excessive confidence in what we believe we know, and our apparent inability to acknowledge the full extent of our ignorance and the uncertainty of the world we live in. We are prone to overestimate how much we understand about the world and to underestimate the role of chance in events. Overconfidence is fed by the illusory certainty of hindsight. My views on this topic have been influenced by Nassim Taleb, the author of The Black Swan. (pages 14-5)

Several studies have shown that roughly 90% of drivers rate themselves as above average. For more on overconfidence, see: https://en.wikipedia.org/wiki/Overconfidence_effect

 

NARRATIVE FALLACY

In The Black Swan, Nassim Taleb writes the following about the narrative fallacy:

The narrative fallacy addresses our limited ability to look at sequences of facts without weaving an explanation into them, or, equivalently, forcing a logical link, an arrow of relationship, upon them. Explanations bind facts together. They make them all the more easily remembered; they help them make more sense. Where this propensity can go wrong is when it increases our impression of understanding. (pages 63-4)

The narrative fallacy is central to many of the biases and misjudgments mentioned by Daniel Kahneman and Charlie Munger. The human brain, whether using System 1 (intuition) or System 2 (logic), always looks for or creates logical coherence, even among random data. Often System 1 is right when it assumes causality; thus, System 1 is generally helpful, thanks to evolution. Furthermore, System 2, by searching for underlying causes or coherence, has, through careful application of the scientific method over centuries, developed a highly useful set of scientific laws by which to explain and predict various phenomena.

The trouble comes when the data or phenomena in question are “highly random” – or inherently unpredictable (at least for the time being). In these areas, System 1 makes predictions that are often very wrong. And even System 2 assumes necessary logical connections when there may not be any – at least, none that can be discovered for some time.

Note: The eighteenth-century Scottish philosopher (and psychologist) David Hume was one of the first to clearly recognize the human brain’s insistence on always assuming necessary logical connections in any set of data or phenomena.

 

INFORMATION AND OVERCONFIDENCE

In Behavioural Investing, James Montier describes a study done by Paul Slovic (1973). Eight experienced bookmakers were shown a list of 88 variables found on a typical past-performance chart for a horse. Each bookmaker was asked to rank the pieces of information by importance.

Then the bookmakers were given data for 40 past races and asked to rank the top five horses in each race. Montier:

Each bookmaker was given the past data in increments of the 5, 10, 20, and 40 variables he had selected as most important. Hence each bookmaker predicted the outcome of each race four times – once for each of the information sets. For each prediction the bookmakers were asked to give a degree of confidence ranking in their forecast. (page 136)

RESULTS:

Accuracy was virtually unchanged, regardless of the number of pieces of information the bookmaker was given (5, 10, 20, then 40).

But confidence skyrocketed as the number of pieces of information increased (5, 10, 20, then 40).

This same result has been found in a variety of areas. As people get more information, the accuracy of their judgments or forecasts typically does not change at all, while their confidence in those judgments or forecasts tends to increase dramatically.
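
As a purely illustrative toy (this is not the Slovic study, and the assumptions are mine), the sketch below shows one way this pattern can arise: if the extra pieces of information are largely redundant – they share the same underlying errors – then piling them up adds essentially nothing to accuracy, while the apparent weight of evidence keeps growing. The “felt evidence” measure is just a crude stand-in for confidence.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
signal = 2 * rng.integers(0, 2, n) - 1            # the outcome, coded -1 / +1
shared_error = rng.normal(0, 1, n)                # one common source of error

# 40 largely redundant cues: each is the same weak signal plus the same
# shared error, with only a little idiosyncratic noise of its own.
cues = signal[:, None] + shared_error[:, None] + rng.normal(0, 0.3, (n, 40))

for k in (5, 10, 20, 40):
    evidence = cues[:, :k].sum(axis=1)            # naively add up the first k cues
    accuracy = (np.sign(evidence) == signal).mean()
    felt_evidence = np.abs(evidence).mean()       # crude stand-in for confidence
    print(f"{k:2d} cues: accuracy {accuracy:.2f}, felt evidence {felt_evidence:5.1f}")
```

In this toy setup, accuracy stays at roughly 84% whether 5 or 40 cues are used, while the average “felt evidence” grows about eightfold.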

 

SELF-ATTRIBUTION BIAS

self-attribution bias: people tend to attribute good outcomes to their own skill, while blaming bad outcomes on bad luck.

This ego-protective bias prevents people from recognizing and learning from their mistakes. This bias also contributes to overconfidence.

 

MORE ON SYSTEM 1

When we think about who we are, we use System 2 to define ourselves. But, writes Kahneman, System 1 effortlessly originates impressions and feelings that are the main source of the explicit beliefs and deliberate choices of System 2.

Kahneman lists, “in rough order of complexity,” examples of the automatic activities of System 1:

  • Detect that one object is more distant than another.
  • Orient to the source of a sudden sound.
  • Complete the phrase “Bread and…”
  • Make a “disgust face” when shown a horrible picture.
  • Detect hostility in a voice.
  • Answer 2 + 2 = ?
  • Read words on large billboards.
  • Drive a car on an empty road.
  • Find a strong move in chess (if you are a chess master).
  • Understand simple sentences.
  • Recognize that “a meek and tidy soul with a passion for detail” resembles an occupational stereotype.

Kahneman writes that System 1 and System 2 work quite well generally:

The division of labor between System 1 and System 2 is highly efficient: it minimizes effort and optimizes performance. The arrangement works well most of the time because System 1 is generally very good at what it does: its models of familiar situations are accurate, its short-term predictions are usually accurate as well, and its initial reactions to challenges are swift and generally appropriate.

“Thinking fast” usually works fine. System 1 is remarkably good at what it does, thanks to evolution. Kahneman:

System 1 is designed to jump to conclusions from little evidence.

However, when we face situations that are unavoidably complex, System 1 systematically jumps to the wrong conclusions. In these situations, we have to train ourselves to “think slow” and reason our way to a good decision.

For the curious, here’s the most comprehensive list of cognitive biases I’ve seen: https://en.wikipedia.org/wiki/List_of_cognitive_biases

 

BOOLE MICROCAP FUND

An equal-weighted group of micro caps generally far outperforms an equal-weighted (or cap-weighted) group of larger stocks over time. See the historical chart here: https://boolefund.com/best-performers-microcap-stocks/

This outperformance increases significantly by focusing on cheap micro caps. Performance can be further boosted by isolating cheap microcap companies that show improving fundamentals. We rank microcap stocks based on these and similar criteria.

There are roughly 10-20 positions in the portfolio. The size of each position is determined by its rank. Typically the largest position is 15-20% (at cost), while the average position is 8-10% (at cost). Positions are held for 3 to 5 years unless a stock approaches intrinsic value sooner or an error has been discovered.
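
For readers who like to see the general idea in concrete form, here is a minimal sketch of rank-based position sizing: score stocks on cheapness and improving fundamentals, keep the top-ranked names, and let weight fall with rank. It is purely hypothetical – the DataFrame `df`, the column names (ev_ebit, fcf_yield, fundamental_trend), and the linear weighting scheme are illustrative stand-ins, not the Boole Fund’s actual criteria or model.

```python
# Illustrative sketch only -- not the Boole Fund's actual model. Assumes a
# hypothetical DataFrame with made-up columns standing in for "cheapness"
# and "improving fundamentals."
import pandas as pd

def rank_and_size(df: pd.DataFrame, n_positions: int = 12) -> pd.Series:
    """Rank cheap microcaps and assign larger weights to better ranks."""
    # Composite rank: cheaper (low EV/EBIT, high FCF yield) and improving
    # fundamentals score better. All column names are hypothetical.
    score = (
        df["ev_ebit"].rank(ascending=True)
        + df["fcf_yield"].rank(ascending=False)
        + df["fundamental_trend"].rank(ascending=False)
    )
    top = score.nsmallest(n_positions).index

    # Rank-proportional sizing: weight falls linearly with rank, rescaled to
    # sum to 100%. With 12 names this gives roughly 15% to the top-ranked
    # stock and about 8% to the average one, in the spirit of the ranges
    # described above.
    raw = pd.Series(range(n_positions, 0, -1), index=top, dtype=float)
    return 100 * raw / raw.sum()
```

A real process would also have to handle liquidity, diversification limits, and ongoing valuation checks, none of which are modeled here.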

The mission of the Boole Fund is to outperform the S&P 500 Index by at least 5% per year (net of fees) over 5-year periods. We also aim to outpace the Russell Microcap Index by at least 2% per year (net). The Boole Fund has low fees.

 

If you are interested in finding out more, please e-mail me or leave a comment.

My e-mail: jb@boolefund.com

 

 

 

Disclosures: Past performance is not a guarantee or a reliable indicator of future results. All investments contain risk and may lose value. This material is distributed for informational purposes only. Forecasts, estimates, and certain information contained herein should not be considered as investment advice or a recommendation of any particular security, strategy or investment product. Information contained herein has been obtained from sources believed to be reliable, but not guaranteed. No part of this article may be reproduced in any form, or referred to in any other publication, without express written permission of Boole Capital, LLC.
