Who: Daniel Kahneman
When (read, recorded or researched): March 26, 2019
Summary
This book will change how you make decisions. Once you read it, you won’t stop thinking about its implications. But, be warned, there are no shortcuts. Kahneman is a Nobel Prize-winning psychologist who will tell you “room for error is the only way to navigate a world that is impossible to predict.” You have to read it to believe it.

The Best Points from Thinking Fast & Slow

The Lazy Controller

  • System 1 (intuitive, quick responses) often fools you, and System 2 (which has a supervisory role) doesn’t step in to correct it. People are lazy: if an answer feels right, they generally go with it.
  • Intelligence doesn’t make people immune to biases. An IQ test can’t tell you how susceptible someone is to cognitive errors, or how often their System 1 dominates.

The Associative Machine

  • Concepts, words, and gestures can prime us to act in a certain way without our even knowing it. If we smile and then listen to a joke, we’re more likely to enjoy it. If we read about concepts of old age, we’ll unconsciously move more slowly.
  • Priming happens to us all the time, but we rarely notice it.

Cognitive Ease

  • Familiarity is not easily distinguishable from truth.
  • System 2 is lazy and mental effort is aversive.
  • Familiarity or repetition makes us think more favourably of the concept.
  • Mood affects the performance of our intuition (System 1).
  • When we’re uncomfortable or unhappy we lose touch with our intuition, and we lose creativity. But it also means System 2 has a bigger effect on our performance: we become more vigilant.

Norms, Surprises and Causes

  • Events should have consequences, and consequences need causes. System 1 links causes and effects automatically, often where none exist. Statistical thinking is needed instead, but that requires training System 2.

A Machine for Jumping to Conclusions

  • When we’re faced with uncertainty we jump to conclusions. System 1 bets on an answer, and the bets are guided by experience – the rules of betting are intelligent: recent events and the current context are weighted most heavily.
  • Importantly, a definite choice is made but you don’t know it. You don’t know there could have been other options, or that the situation was ambiguous. System 1 doesn’t keep track of what it rejected.
  • Principle of independent judgement: to derive the most useful information from multiple sources of evidence, keep the sources independent of each other. Don’t let people influence each other with their own biases. In meetings, ask everyone to form their opinion before the discussion.
  • WYSIATI – what you see is all there is. System 1 is ignorant of both quality and quantity of information. The confidence people have in their beliefs depends mostly on the quality of the story they can tell about what they see, even if they see little. We often fail to allow for the fact that evidence which should be important to our judgment is missing. What we see is all there is.

Answering an Easier Question

  • We’re rarely stumped. Our normal state is to have intuitive feelings and opinions about almost everything which comes our way.
  • Generally, that’s because when we can’t quickly find a good answer to a hard question, we substitute an easier question and answer that instead.

The Law of Small Numbers

  • We’re prone to exaggerate the consistency and coherence of what we see – which means we often use too small a sample – our intuition doesn’t do statistics well.
  • System 1 runs ahead of the facts in constructing a rich image on the basis of scraps of evidence.
  • If you follow your intuition, you’ll more often than not err by misclassifying a random event as systematic. We’re far too willing to reject the belief that much of what we see in life is random (the sketch below shows why small samples mislead).
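
A minimal simulation (my own illustration, not from the book) of why small samples mislead: flip a fair coin in samples of different sizes and see how often the observed proportion looks ‘systematic’ by chance alone.

```python
import random

random.seed(0)

def lopsided_rate(sample_size, trials=10_000, threshold=0.7):
    """Share of samples whose heads-rate looks 'systematic'
    (>= threshold or <= 1 - threshold) even though the coin is fair."""
    lopsided = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        rate = heads / sample_size
        if rate >= threshold or rate <= 1 - threshold:
            lopsided += 1
    return lopsided / trials

for n in (10, 50, 200):
    print(f"n = {n:>3}: {lopsided_rate(n):.1%} of samples look lopsided")
# Small samples routinely produce 'patterns' that are pure chance;
# larger samples almost never do.
```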

Anchors

  • Anchoring effects are everywhere, and they make us more suggestible than we’d want to be. And there are many people who are willing and able to exploit our gullibility.
  • People who’ve been exposed to a random or absurd anchor confidently deny that the obviously useless information could have influenced them – and they’re wrong.
  • Lesson: assume any number that’s on the table has had an anchoring effect on you, and if the stakes are high, mobilise yourself (your System 2) to combat the effect.

Availability, Emotion and Risk

  • The emotional tail wags the rational dog – our affect heuristic simplifies our lives by creating a world that’s much tidier than reality.
  • The world in our heads is not a precise replica of reality – availability and other biases distort what we worry about, a problem made worse by ‘news’ and the media. For example:
  • We exaggerate minor threats. We deal with small risks poorly – think of terrorism or shark attacks. We either ignore them or give them far too much weight. That can lead policymakers to misallocate resources as they react to the public’s overreaction to small threats – toxic waste, for example.

Tom W’s Speciality

  • Frowning increases the vigilance of System 2 and reduces both overconfidence and reliance on intuition.
  • We often overrule statistical facts in favour of representativeness (stereotypes for example) when making predictions.
  • Representativeness gives us a story which might sound coherent. It makes us substitute plausibility for probability, to our detriment.

Linda: Less is More

  • Human nature and biases matter more than training or qualifications.
  • Plausibility trumps probability, which means adding extra detail to a scenario makes it more persuasive, even if it also makes it less probable.

Regression to the Mean

  • We often get caught attaching causal explanations to the inevitable fluctuations of a random process.
  • Like a sportsman who has a great year and gets lots of media coverage. If they do poorly the next year, it’s called a jinx, a commentator’s curse, or the result of the new pressure they’re under. But it needs no such reason: it’s a mathematically inevitable consequence of the fact that luck played a part in their success. The luckier you were, the more you’ll regress to the mean.
  • Regression to the mean features everywhere, but we keep dressing it up in plausible stories – it’s not interesting to say the business did less well this year simply because it did unusually well last year (the simulation below makes the point).
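
A minimal simulation (my own illustration, not the book’s) of the point above: performance is constant skill plus random luck, so whoever tops the table in year one tends to fall back toward the average in year two, no jinx required.

```python
import random

random.seed(1)

athletes = [{"skill": random.gauss(0, 1)} for _ in range(1_000)]
for a in athletes:
    a["year1"] = a["skill"] + random.gauss(0, 1)  # skill + luck
    a["year2"] = a["skill"] + random.gauss(0, 1)  # same skill, fresh luck

top50 = sorted(athletes, key=lambda a: a["year1"], reverse=True)[:50]
mean = lambda xs: sum(xs) / len(xs)
print("Top 50, year 1 average:", round(mean([a["year1"] for a in top50]), 2))
print("Same 50, year 2 average:", round(mean([a["year2"] for a in top50]), 2))
# Year-2 scores drop back toward the mean because part of the year-1
# result was luck, and luck does not repeat.
```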

The Illusion of Understanding

  • Hindsight bias (‘I knew it all along’) means we assess the quality of a decision by its outcome rather than by the process. So we blame people for things that turned out badly and give them too little credit for things that turned out well. And when outcomes are bad, we blame people for not seeing it coming, forgetting that no one knew what was going to happen beforehand.
  • The worse the consequence the greater the hindsight bias.
  • Hindsight bias generally breeds risk aversion, but it also brings undeserved rewards to irresponsible risk takers.
  • The sense-making machinery of System 1 makes us see the world as more tidy, simple, predictable and coherent than it really is. The illusion that you understand the past feeds the illusion that you can predict and control the future. These illusions are comforting – they reduce the anxiety we would feel if we fully acknowledged the uncertainties of existence.

The Illusion of Validity

  • Why, even though maths says otherwise, do investors believe they can beat the market?

Intuitions vs Formulas

  • Experts are inferior to algorithms. A good example is the formula that predicts the future price of a wine vintage from weather conditions alone – and beats the experts.
  • Why are experts poor at forecasting?
  • They try to be clever – thinking outside the box and considering complex combinations of features when making their predictions. Complexity more often than not reduces validity. Experts even override well-founded formulas because they think they have better information. They don’t. Only disregard the formula on the basis of the “broken leg rule” – i.e. rarely, and only when the new facts are decisive.
  • Humans are very inconsistent when making summary judgements of complex information. System 1 is hugely context dependent – think of all the priming that happens without our knowing it.
  • People don’t accept that algorithms generally make better decisions than they do, and this prejudice worsens as decisions become more consequential. The cause of a mistake matters to people – they can’t accept an autonomous car killing someone, but they can accept a human-driven car doing so. The difference in emotional intensity is translated into a moral preference.
  • Much of this research was done by Paul Meehl, who called his own work “my disturbing little book”. Simple statistical rules are better than intuitive “clinical” judgements (see the scoring sketch after this list).
  • Tips for hiring:
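
A minimal sketch of the kind of simple additive rule Meehl’s research favours over holistic judgement – the traits, scale and candidates here are illustrative assumptions, not the book’s list:

```python
# Score each candidate 1-5 on a few pre-chosen traits, add the scores,
# and commit to the highest total, resisting the urge to override the
# sum with an overall intuitive impression.
TRAITS = ("technical skill", "reliability", "communication")  # illustrative

def total_score(ratings: dict) -> int:
    return sum(ratings[t] for t in TRAITS)

candidates = {
    "A": {"technical skill": 4, "reliability": 5, "communication": 3},
    "B": {"technical skill": 5, "reliability": 2, "communication": 4},
}
scores = {name: total_score(r) for name, r in candidates.items()}
print(scores, "-> hire", max(scores, key=scores.get))
```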

Expert Intuition: When can we trust it?

  • The confidence people have in their intuitions is a poor guide to whether those intuitions are right.
  • So when can you trust your intuition? When both of these conditions are satisfied, intuitions are likely to be skilled.
  • You’re in an environment that’s sufficiently regular to be predictable.
  • You have an opportunity to learn these regularities through prolonged practice.
  • Intuition can’t be trusted in the absence of stable regularities in the environment.
  • Whether professionals have a chance to develop intuitive expertise depends on the quality and speed of feedback, as well as on sufficient opportunity to practice. For example, it’s easy to learn when to brake while driving – you get instant feedback whenever you do it. Choosing investments takes a long time to prove whether you’re right or wrong, the signals can be mixed and there’s no room for practice.

The Outside View

  • Overly optimistic forecasts of the outcomes of projects are found everywhere. They:
  • Are unrealistically close to best-case scenarios.
  • Could be improved by consulting the statistics of similar cases (the outside view). We don’t normally do this because when we have information about a specific case, we rarely feel the need for the statistics of the class it belongs to.

The Engine of Capitalism

  • The factors behind why entrepreneurs believe they can make it when most fail:
  • Professors at Duke University surveyed CFOs of large companies, asking them to estimate the return of the S&P over the next year; 11,600 forecasts were collected. The correlation between the forecasts and the actual returns was slightly less than zero. Other questions in the survey revealed the CFOs were also grossly overconfident in their ability to predict the market’s return.
  • Optimism has the benefit of making you resilient in the face of setbacks – hence it’s the engine of capitalism. It gives you a delusional sense of significance that won’t wilt in the face of many small failures and rare successes.
  • Overconfident optimism is difficult to tame. It’s a story you tell yourself, which can’t be corrected with facts, but a pre-mortem is useful: imagine, a year from now, that the plan has failed, and write a brief history of that failure.

Bernoulli’s Errors

  • This is why real life differs from the life described by economic theory:
  • When weighing expected values against what we actually prefer, we’re risk averse, so we’ll pay a premium for a certain outcome (see the worked example below).
  • Bernoulli’s theory is wrong in that it doesn’t account for someone’s history or past experience when they decide today. For example, if you don’t have much to start with, you’re likely to make different choices from someone who already has a lot of money.
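
A worked illustration of the certainty premium above (the amounts and the log-utility curve are my own illustrative assumptions in Bernoulli’s spirit, not figures from the book): a 50/50 gamble has a higher expected value than a sure £450, yet a diminishing utility curve prefers the sure thing.

```python
import math

def expected_value(outcomes):            # outcomes: [(probability, amount), ...]
    return sum(p * x for p, x in outcomes)

def expected_utility(outcomes, u=math.log):
    return sum(p * u(x) for p, x in outcomes)

gamble = [(0.5, 1_000), (0.5, 10)]       # 50/50 between £1,000 and £10
sure   = [(1.0, 450)]                    # a certain £450

print(expected_value(gamble), expected_value(sure))        # 505.0 vs 450.0
print(expected_utility(gamble) < expected_utility(sure))   # True: the concave
# (diminishing) utility curve prefers the certain £450 - in effect we give up
# £55 of expected value to buy certainty.
```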

Prospect Theory

  • Three cognitive features at the centre of prospect theory. They play an essential role when judging financial outcomes and are common to lots of automatic processes of perception.
  • Evaluation is relative to a neutral reference point. Outcomes above this point are felt as gains; outcomes below it, as losses. The reference point is often your expectation, and it can also be the status quo. The same effect appears with three bowls of water – one hot, one cold, one at room temperature. Put one hand in the hot bowl and one in the cold bowl, then after a minute move both into the room-temperature bowl: although both hands are now at the same temperature, one feels like it’s warming up and the other like it’s cooling down.
  • Diminishing sensitivity applies to sensory dimensions and to the evaluation of wealth alike: the subjective difference between £900 and £1,000 feels smaller than the difference between £100 and £200.
  • Loss aversion: losses loom larger than gains, by a factor of roughly 1.5 to 2.5. It’s rooted in evolution – organisms that treat threats as more urgent than opportunities stand a better chance of surviving. (A small value-function sketch follows this list.)
  • A flaw in prospect theory is that it can’t account for disappointment and regret.
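
A minimal sketch of the three features as a single value function. The functional form and parameters (α = β = 0.88, λ = 2.25) are Tversky and Kahneman’s published estimates rather than anything spelled out in these notes:

```python
def value(x, ref=0, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value of an outcome x relative to a reference point."""
    change = x - ref                   # 1) evaluation is relative to a reference point
    if change >= 0:
        return change ** alpha         # 2) diminishing sensitivity to gains...
    return -lam * (-change) ** beta    # ...and losses, 3) scaled up by loss aversion

print(value(1000) - value(900))   # ~39: a small subjective step near £1,000
print(value(200) - value(100))    # ~48: a bigger subjective step near £100
print(value(100), value(-100))    # ~57.5 vs ~-129: the loss hurts ~2.25x as much
```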

The Endowment Effect

  • People place a higher value on items they already own. For example, tickets to a sought-after sports final: someone may only have been willing to pay £400 to get them, but would demand far more (£2,000+) to sell them. The difference is explained by the pain of giving up something we intended to use or enjoy.
  • It doesn’t apply to objects held for exchange – like a £5 note (we’d happily swap it for coins) or a merchant’s shoes (happily traded for other shoes). With those we’re more rational.

Bad Events

  • Bad is stronger than good. Negativity trumps positivity. This is an extension of loss aversion.
  • A few examples: a single cockroach can ruin a bowl of cherries, but a single cherry does nothing for a bowl of cockroaches. Researchers studied professional golfers to see whether the success rate differed between putts for a birdie (a gain) and putts to avoid a bogey (a loss); the success rate was higher when avoiding bogeys.
  • Animals, too, fight harder to prevent losses than achieve gains. When a territory holder is challenged by a rival, it’s almost always the owner who wins the contest. Defence > offence.

The Fourfold Pattern

  • Possibility effect = we considerably overweight small risks and unlikely events, like a 2% chance of winning a prize.
  • Certainty effect = outcomes that are almost certain are underweighted relative to their actual probability. We pay a high price to move something from 98% to 100%.
  • The pattern of preferences (gains vs losses, high vs low probability):
  • High-probability gain (top left): risk averse – we accept an unfavourable settlement to lock in the gain.
  • High-probability loss (top right): risk seeking – we gamble in the hope of avoiding the loss; this is where many unfortunate human situations unfold.
  • Low-probability gain (bottom left): risk seeking – lottery tickets.
  • Low-probability loss (bottom right): risk averse – insurance.
  • Consistent overweighting of improbable outcomes – a feature of intuitive decision making – is costly in the long run (the weighting sketch below shows the effect).
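
A hedged sketch of the over- and underweighting behind the fourfold pattern, using the probability-weighting function from Tversky and Kahneman’s later work (γ ≈ 0.61 for gains). The notes above don’t give this formula, so treat it as an illustration:

```python
def decision_weight(p, gamma=0.61):
    """Tversky-Kahneman weighting: w(p) = p^g / (p^g + (1-p)^g)^(1/g)."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in (0.01, 0.02, 0.50, 0.98, 0.99):
    print(f"p = {p:.2f}  ->  decision weight ~ {decision_weight(p):.3f}")
# A 1% chance behaves like ~5.5% (possibility effect: lotteries, insurance),
# while a 99% chance behaves like ~91% (certainty effect: we pay dearly
# to close that last 1%).
```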

Rare Events

  • When it comes to rare probabilities, our mind is not made to get things right.

Keeping Score

  • The sunk-cost fallacy: we throw good money after bad rather than accept the humiliation of closing the account on a costly failure.
  • We experience a bigger sense of regret for things we do than for things we don’t do. Even if the outcome is the same. This leads us to favour convention, status quo etc.
  • Kahneman’s hindsight avoidance policy. Either be really thorough or completely casual when making decisions with long-term consequences.
  • People tend to anticipate more regret than they will actually experience.

Two Selves

  • What we experience and what we remember are two different things. For example, we might experience 40 minutes of pure bliss listening to an orchestra, but if there is a harrowing sound at the end, we will remember it as having ruined the whole experience.
  • When we remember, we place most weight on the most intense moment (the peak) and the end – the peak-end rule. The duration of the experience barely matters (duration neglect), for good or bad. (See the sketch after this list.)
  • The experiencing self has no voice. The remembering self does, but it’s sometimes wrong – yet it’s the one that keeps score and decides what we do in future.
  • Tastes and decisions are shaped by memory, but our memories can be wrong – which is bad news for believers in rational-agent theory.
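
A minimal sketch of the peak-end rule versus moment-by-moment experience – the ‘pain ratings’ are made-up numbers to illustrate the point, not data from the book:

```python
def experienced(pain):      # what the experiencing self lives through
    return sum(pain) / len(pain)

def remembered(pain):       # peak-end rule: average of the worst moment and the end
    return (max(pain) + pain[-1]) / 2

short_sharp_end   = [2, 4, 7, 8]             # short procedure, ends at its worst
longer_gentle_end = [2, 4, 7, 8, 5, 3, 1]    # same peak, extra milder minutes

print(experienced(short_sharp_end), remembered(short_sharp_end))      # 5.25  8.0
print(experienced(longer_gentle_end), remembered(longer_gentle_end))  # ~4.3  4.5
# The longer episode contains strictly more total discomfort, yet the
# remembering self rates it as far less bad: only the peak and the end count.
```
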
Life as a Story

  • The remembering self works by creating stories and keeping them for future reference.
  • Odd as it seems, we identify with our remembering selves rather than our experiencing selves – even though the experiencing self is the one doing the living.
  • Think about this: at the end of a holiday, all your pictures and videos will be deleted, and you’ll swallow a pill that erases all memory of the trip. How would that affect where you choose to go, or how much you’d pay?

Experienced Well-Being

  • Only a slight exaggeration to say that happiness is the experience of spending time with people you love and who love you.

Thinking about life

  • Focusing illusion: “nothing in life is as important as you think it is when you’re thinking about it.”

Conclusion

  • Kahneman cringes when his work is credited with showing that human choices are irrational. His research only showed that humans are not well described by the rational-agent model.
  • System 1 registers the cognitive ease with which it processes information, but it doesn’t generate a warning signal when it becomes unreliable. Intuitive answers come to mind quickly and confidently, whether they originate from skills or from heuristics. There’s no simple way for System 2 to distinguish a skilled response from a heuristic one. Its only recourse is to slow down and attempt to construct an answer on its own, which it’s reluctant to do because it is indolent. Many suggestions of System 1 are casually endorsed with minimal checking. That’s how System 1 gets its bad reputation as the source of errors and biases.