
Thinking, Fast and Slow by Daniel Kahneman [Actionable Summary]

This is a comprehensive summary of the book Thinking, Fast and Slow by Daniel Kahneman, covering the key ideas and proposing practical ways to apply what’s mentioned in the text. Written by book fanatic and online librarian Ivaylo Durmonski.

Deluxe printable: Download the interactive sheet for taking notes.

The Book In Three Or More Sentences:

After spending years studying and documenting biases, Daniel Kahneman and his associate Amos Tversky, who sadly passed away before this book was published, created this masterpiece: a book that teaches us valuable things about how our minds are designed to work. In particular, the types of errors our brains are prone to make when making decisions. The text will help you understand why our first reaction to a complex problem is usually not that good, and why, if we take the time to think, we will find better solutions. Thinking, Fast and Slow is full of examples of how we delude ourselves. By understanding these flaws, we can prevent ourselves from doing stupid things in the future.

The Core Idea:

As the title hints, there are two systems that turn the cogs in our heads. System 1 is fast, automatic, reckless, often irresponsible, but extremely effective in keeping us alive. System 2 is the opposite. It’s slow, deliberate, and boring, but extremely reliable when complex calculations require our attention or difficult problems arise. By showcasing the various flaws embedded in the way we think, Daniel Kahneman wants to enhance our intelligence and improve our mental stamina.

Reason To Read:

Realize that you are blind to your own blindness, often acting impulsively instead of rationally. Understand the common thinking flaws we all possess and learn how to avoid them.

Highlights:

  • System 1 is fast and reckless. Responsible for handling dangerous situations. System 2 is slow and deliberate. Responsible for preventing you from entering dangerous situations in the first place.
  • Our default response – our intuition – is often wrong. Sadly, we are blissfully unaware of this fact. Learning about our inherent errors will make us more flexible.
  • Transforming complicated tasks into nearly effortless activities can happen when we practice and proactively seek feedback from our peers.

Think Workbook #005:

This book summary comes with a Think Workbook included (previously available only for members – now free).

Inside, I explore some of the best concepts from the great work produced by Daniel Kahneman – along with guided writing exercises.

Download the workbook by clicking here.

8 Key Lessons from Thinking Fast and Slow:

Lesson #1: There Are Two Modes of Thinking Governing Our Actions

How would you describe yourself?

Most people identify as conscious, rational beings who have clear rules and know what they want. People who carefully consider all the options before buying something, summon logic when it’s needed, and rarely step out of line unless there is a real need to do so.

Based on the findings in the book, you’re most probably wrong about yourself.

In most situations, we are doing things based on how we feel, not on what’s right and reasonable.

And how we feel is the domain of System 1, while System 2 is about adding a dose of rationality to what we’re doing.

A large portion of the book – as we can sense from the title – is dedicated to explaining what fast thinking is and what slow thinking is. But the underlying goal of the author is to present something else. He wants to showcase that although we are the most sophisticated creatures on the planet, possessing the most complex brains among all living things, we are mostly controlled by our fast thinking – that is, we rely heavily on intuition, not reason. As mentioned in the book, “Although System 2 believes itself to be where the action is, the automatic System 1 is the hero of the book.”

To understand the two modes of thinking, let’s see the main characteristics of our two systems:

  • Fast Thinking (System 1): This system is responsible for keeping you alive. System 1 doesn’t rely on logic. It’s mostly powered by our basic instincts – eat, survive, feel safe, feel good, reproduce. System 1 responds automatically based on our ingrained needs and habits. For example, jump when a car is approaching, sense when someone is angry, smoke when you’re nervous, read and understand simple sentences, find the answer to 2 + 2. You can imagine System 1 as a dumb caveman.
  • Slow Thinking (System 2): Our second system needs time to boot up. It requires effortful concentration. We use System 2 to answer difficult questions. This is where things like comparing different options, using reason and rationality, and stopping yourself before you say something stupid come into play. For example, System 2 will focus on a particular task, stop while walking to consider the possible options, fill out a tax form, or think about ourselves from a different perspective. You can imagine System 2 as a cigarette-smoking philosopher.

Everything we encounter first goes through System 1 for filtering. And a lot of times, we never let it pass to System 2 for examination. For example, instead of paying attention to a problem presented to us (use slow thinking), we might answer with the first thing that comes to our mind (use fast thinking).

As you can sense, this approach often leads to undesirable outcomes – incorrectly blaming someone or saying something stupid. That’s why, a lot of times, deliberately waiting so we can activate System 2 is required to reach the best answer to a question.

“I describe System 1 as effortlessly originating impressions and feelings that are the main sources of the explicit beliefs and deliberate choices of System 2. The automatic operations of System 1 generate surprisingly complex patterns of ideas, but only the slower System 2 can construct thoughts in an orderly series of steps.” Daniel Kahneman

Lesson #2: Become An Expert To Enhance Your Fast Thinking

There are two ways to ensure that the first answer you give to a difficult question (imagine that you have to answer 17 x 25) will be correct.

You can enhance System 1, or you can simply allow yourself more time to think – invite System 2.

A skillful mathematician who consumes formulas for breakfast will probably solve the multiplication problem above with ease – in seconds. This, however, doesn’t mean that he has a computer chip implanted in his head. It simply happens when you master a specific field. Your slow thinking becomes your fast thinking. In other words, you become an expert.

To get a better understanding of this concept, let’s read a passage from the book: “We have all heard such stories of expert intuition: the chess master who walks past a street game and announces “White mates in three” without stopping, or the physician who makes a complex diagnosis after a single glance at a patient.”

The accurate diagnosis of a doctor, or of a car mechanic without even looking under the hood, involves no magic. It’s based on experience. Just as chess masters can move quickly on the board because they’ve practiced thousands of moves, we, too, can increase our operating memory and boost our intuitive judgments.

How?

Well, it takes time. And it requires the following simple realization: You can’t turn off System 1. Your intuition will always try to dominate reason. This means that you should often mistrust your first impressions – the first automatic answer generated by your brain. At least until you’ve devoted a large portion of your life to solving a specific set of problems. When that point is reached, you will, seemingly instantaneously, give correct answers to even difficult problems.

“The psychology of accurate intuition involves no magic. Perhaps the best short statement of it is by the great Herbert Simon, who studied chess masters and showed that after thousands of hours of practice they come to see the pieces on the board differently from the rest of us. You can feel Simon’s impatience with the mythologizing of expert intuition when he writes: ‘The situation has provided a cue; this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition.'” Daniel Kahneman

Lesson #3: We Miscalculate Whether Events Are Good or Bad

Nature has put mankind under the power of two masters: pain and pleasure. And according to the research in the book, when we’re exposed to a short period of pain that stops abruptly at its peak, we’re likely to remember the incident as more painful than a longer episode in which the pain slowly tapers off toward the end.

Or, in other words, duration doesn’t count when we experience pain or pleasure. The peak (the best or worst moment) and how the experience ends are what actually matter to the brain.

Let me try to explain this better:

There are two selves hiding inside us: The experiencing self and the remembering self.

  • The experiencing self asks: “Does it hurt now, and how much?”
  • The remembering self, on the other hand, asks the following: “How was the overall experience?”

We rely on both of them when we make decisions, with one caveat: The remembering self has greater power over our future decisions.

Even though the average length of fixing a tooth is less than 5 minutes, we remember a visit to the dentist as one of the worst things in our lives. Why? Because, for a short period of time, we experience an intense burst of targeted pain.

This short additional example will give you a better perspective on how we remember things: A man was listening to a long symphony on a record, but there was a scratch near the end of the disc, and the result was a shocking sound. When interviewed, the man mentioned that the bad ending destroyed the whole experience. But if we look at this objectively, we can conclude that the experience was not destroyed, only the memory of it. The listener judges the whole experience as bad because it ended badly. However, in reality, only the ending was bad. His assessment ignores the preceding musical bliss and remembers only the bad moment.
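The mismatch between the two selves can be sketched as a toy calculation. The pain ratings below are invented for illustration, and averaging the peak with the final moment is a simplification of the peak-end rule:

```python
# Minute-by-minute pain ratings (0 = no pain, 10 = worst imaginable).
# Both profiles are made up; the long one adds a milder tail.
short_trial = [8, 8, 8]          # ends abruptly at peak pain
long_trial = [8, 8, 8, 5, 3]     # same start, pain tapers off at the end

def total_pain(ratings):
    """What the experiencing self endures: pain summed over duration."""
    return sum(ratings)

def remembered_pain(ratings):
    """Peak-end rule (simplified): memory averages the worst moment
    and the final moment, largely ignoring duration."""
    return (max(ratings) + ratings[-1]) / 2

print(total_pain(short_trial), total_pain(long_trial))            # 24 vs 32
print(remembered_pain(short_trial), remembered_pain(long_trial))  # 8.0 vs 5.5
```

Even though the longer trial contains strictly more total pain, the remembering self rates it as less unpleasant because it ends gently.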

Our remembering self convinces us that certain situations, experiences, even people, are bad only because we had one bad moment with them. However, this is often not true, and by convincing our minds that something is bad before we’re 100% certain of it, we might miss out on pleasurable future experiences.

“Odd as it may seem, I am my remembering self, and the experiencing self, who does my living, is like a stranger to me.” Daniel Kahneman

Lesson #4: We Skip Deep Thinking and Default to Laziness

There is a good reason people get addicted to their smartphones. Use Uber instead of a regular cab, default to rest, and watch mind-numbing movies when they should instead be exercising and thinking about bigger problems.

The mind wants to stay away from effort. And not just occasionally. Throughout our days, we continuously avoid effortful tasks – tasks that require a lot of brainpower.

Not just because it’s easier to chill and do nothing. But because the mind is never at rest.

As Daniel Kahneman writes: “Whenever you are conscious, and perhaps even when you are not, multiple computations are going on in your brain, which maintain and update current answers to some key questions: Is anything new going on? Is there a threat? Are things going well? Should my attention be redirected? Is more effort needed for this task?”

By default, we choose the easy path. Not necessarily because we are lazy (although, often, it’s exactly because we’re lazy), but because we want to reach a state of cognitive ease. The brain deliberately saves energy when possible to allocate more resources to tasks that are considered really important – such as those responsible for our survival. You can consider your brain and its guiding principles an independent unit, focused on its own selfish goals.

In pursuing cognitive ease, the brain hopes for the following:

The goals of the brain (Cognitive Ease): 

  • Repeated experiences that feel familiar.
  • Clear rules that feel achievable.
  • Ready-made ideas that don’t require a lot of preparation.
  • Tasks that feel effortless.

Sadly, the goals of the brain often clash with the modern world.

While a couple of hundred years ago it was totally fine to eat as many cholesterol-heavy meals as possible, now this habit will cause complications.

In short, to advance in our current reality, we often need to go against our natural instincts. To pursue cognitive strain.

The goals of high-achievers (Cognitive Strain): 

  • New experiences that can lead to more opportunities.
  • Rules and tasks that are beyond your comfort zone.
  • Half-baked ideas that require creative thinking.
  • Tasks that require effort.

When you go for the strained activities, your brain will do everything possible to resist and get back to the tasks that require less effort and feel more comfortable. Opposing these built-in desires will make you more durable, creative, and antifragile.

“Easy is a sign that things are going well—no threats, no major news, no need to redirect attention or mobilize effort. Strained indicates that a problem exists, which will require increased mobilization of System 2.” Daniel Kahneman

Lesson #5: What You See Is All There Is (WYSIATI)

Jumping to quick conclusions without a lot of supporting evidence is what we do all the time. That’s partly the reason we buy things we have just realized exist. Things we usually don’t need.

To picture this, consider the following scenario: You browse online searching for a new job. Suddenly, a site offering to teach you copywriting appears seemingly out of nowhere. The sales page explains that you don’t need to get a new job. You can create your own job by learning copywriting skills. All of a sudden, your quest to get a new job is put on hold. You are now going to “become a freelancer and write copy for other brands!”

This tendency to be quickly persuaded by only what is noticeable is frequently mentioned in the book. The author describes it by using the following strange abbreviation: WYSIATI, which stands for “what you see is all there is.”

The WYSIATI “phenomenon” can be observed everywhere.

If you are interviewing someone, you see and hear what the person is presenting, and for the mind, that’s all there is. Someone presents himself as knowledgeable and passionate about working for you? Then you don’t see anything else – not the flaws that certainly exist. You see him as indeed knowledgeable and passionate.

Here’s another example: You are about to purchase new accounting software. You visit the website and read the sales page. There are, of course, only benefits mentioned there. Good user reviews and various ways this product can help you save time and (probably) even become astonishingly rich. You don’t see any disadvantages unless you involve System 2. Unless you question the flawless presentation and deliberately search for bad reviews.

So, basically, not only do we fail to see past what is visible, we also tend to satisfy ourselves with the partial information available. Understanding this flaw in the way we think can make you a more conscious consumer, motivate you to ask difficult questions and test things before you do them, and teach you to background-check information.

“The statement that “the odds of survival one month after surgery are 90%” is more reassuring than the equivalent statement that “mortality within one month of surgery is 10%.” Similarly, cold cuts described as ‘90% fat-free’ are more attractive than when they are described as ‘10% fat.'” Daniel Kahneman

Lesson #6: Specific Conditions Are Not More Probable Than General Ones

There is a problem in the book. A problem that made the name Linda extremely famous in scholarly circles. I’m not talking about a flaw in the book itself. Rather, a flaw in the way we think about probabilities.

The conjunction fallacy (also known as the Linda problem) is an often-cited example of how we fail to think correctly about probable events. In short, we think that specific situations are more likely to occur, when in fact the opposite is true.

To understand this better, let’s take a look at the famous Linda problem directly from the book:

“Linda is thirty-one years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.

Which is more probable?

Linda is a bank teller. Linda is a bank teller and is active in the feminist movement.”

When this problem was presented to different groups of people – undergraduates, then doctors, and then people actively involved in studying decision theory – the majority of participants said that Linda being a feminist bank teller is more likely.

But is it?

Sure, based on the initial context, our mind quickly creates a hard-to-avoid story. We convince ourselves that Linda is indeed a bank teller participating in feminist movements after hours. But is this more probable?

Well, it isn’t.

To show the difference, the author later shares other examples that we’ll likely answer correctly. Like this one:

“Which alternative is more probable? Mark has hair. Mark has blond hair.”

Of course, the answer is the first one. It’s clear that Mark having hair is more probable than Mark having blond hair – simply because having blond hair is a subset of having hair. Again, when events have more specific characteristics, they become less likely to occur.

But why do we say that Linda is a feminist bank teller?

Daniel Kahneman explains that most people get this problem wrong because our mind relies on something called representativeness. Our mind matches the description to a stereotype, and the more representative – and more detailed – outcome feels more likely, while in reality it’s the opposite.

If you’re asking how this can be helpful in your real life, let me add a quick commentary.

For example, someone is more likely to become a business owner than a successful business owner. Also, it’s more likely for a person to become an actor than an Oscar-winning actor. One general event is more likely to occur than two specific events. Still, that doesn’t stop the mind from creating stories. We don’t want to simply become business owners, we want to be successful business owners.

While this can be helpful when we are approaching tasks – after all, no one starts a business hoping to fail – this fallacy can blur our thinking. We can register our brand convinced that we’ll succeed, though success is never guaranteed.

“The twist comes in the judgments of likelihood, because there is a logical relation between the two scenarios. Think in terms of Venn diagrams. The set of feminist bank tellers is wholly included in the set of bank tellers, as every feminist bank teller is a bank teller. Therefore the probability that Linda is a feminist bank teller must be lower than the probability of her being a bank teller. When you specify a possible event in greater detail you can only lower its probability. The problem therefore sets up a conflict between the intuition of representativeness and the logic of probability.” Daniel Kahneman
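The set logic in the quote above can be made concrete with a toy head count. All numbers here are invented for illustration; only the inequality matters:

```python
# Hypothetical population of 1,000 people.
population = 1_000
bank_tellers = 50        # everyone in this group is a bank teller
feminist_tellers = 15    # the subset of tellers also active in the movement

p_teller = bank_tellers / population               # 0.05
p_feminist_teller = feminist_tellers / population  # 0.015

# Every feminist bank teller is also a bank teller, so adding the extra
# condition can only shrink the set - the conjunction is never more probable.
assert p_feminist_teller <= p_teller
print(p_teller, p_feminist_teller)
```

However you change the invented counts, the subset can never be larger than the set that contains it, which is exactly why the detailed story about Linda must be the less probable one.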

Lesson #7: The More Luck Is Involved, The Less We Can Learn

We often confuse luck with skill. This is easily noticeable when people are deciding which stock to pick.

A whole chapter in the book is dedicated to explaining that stock-picking is not a skill; it’s closer to a dice-rolling contest.

The author demonstrates this by taking a closer look at how stocks selected by a big investment firm performed over a long period of time. His conclusion is that the performance is mostly a matter of luck – amateur investors often outperformed senior ones. Of course, when he shared his findings with the firm, they quickly dismissed his argument.

We are hungry for stories that highlight success or failure. We don’t want to attribute our success to luck. That would mean that what we did was average, insignificant, irrelevant. And if we attach these qualities to our work, we might start believing that we are insignificant – something we surely don’t want to accept. Internally, we want to feel that we made it thanks to our knowledge and skills.

How is this quest affecting our decision-making skills?

We fail to see luck in the situations we observe. Instead, we sort all events into two buckets: success and failure.

We try to mimic successful people, and we try to avoid what unsuccessful people are doing.

However, by not including the role of luck in the equation, we may miss an important factor, or we may follow steps that are impossible to emulate.

Instead of seeing the situation objectively, we believe the few events that happened are important, rather than taking into account all the events that failed to happen.

Take any rich person nowadays. We can read an article highlighting their greatness and try to do the same. Sadly, this is only part of the story. The story we form in our heads about success excludes all events that failed. We construct a short movie with a happy ending highlighting only the good things.

The reality is quite different. The success we see is only part of the story. It’s more like a drama with a lot of ups and downs. And, occasionally, lucky events that are impossible to imitate.

“Leaders who have been lucky are never punished for having taken too much risk. Instead, they are believed to have had the flair and foresight to anticipate success, and the sensible people who doubted them are seen in hindsight as mediocre, timid, and weak. A few lucky gambles can crown a reckless leader with a halo of prescience and boldness.” Daniel Kahneman

Lesson #8: Create An Environment for Learning

The best way to learn new skills quickly is by doing the following: Practice regularly and find an environment where you get fast feedback on your actions.

For example, here’s what it takes to become a chess master: “Studies of chess masters have shown that at least 10,000 hours of dedicated practice (about 6 years of playing chess 5 hours a day) are required to attain the highest levels of performance. During those hours of intense concentration, a serious chess player becomes familiar with thousands of configurations, each consisting of an arrangement of related pieces that can threaten or defend each other.”

To create expert intuition, you need to continuously expose yourself to the information you want to master. In other words, to practice. But this is not enough. You also want to get regular feedback on what you’re doing. 

The more you practice, the more combinations you learn. Just as readers become better the more they read (because they learn more words), a person who wants to become better at chess starts to read chess boards at a glance. But practicing without asking for feedback is a sin.

If the delay between your action and the outcome is huge, you won’t have enough time to regroup. Worse, you may get stuck doing the wrong things. As the author writes, “intuitive expertise depends essentially on the quality and speed of feedback, as well as on sufficient opportunity to practice.”

But this is also not enough.

You also need to develop a skill that is quite rare in the field of acquiring new knowledge: knowing the limits of your knowledge.

Acknowledging that you have holes in your reasoning will prompt you to learn more things. Dive into different areas that will fortify your knowledge. The opposite can be devastating for your career. 

Thinking that you know everything will only shut the door on learning new strategies and sharpening your brain. 

Don’t become a pseudo-expert focused solely on one thing. You need a collection of skills to make it in our disorganized and often chaotic world.

“Expertise is not a single skill; it is a collection of skills, and the same professional may be highly expert in some of the tasks in her domain while remaining a novice in others.” Daniel Kahneman

Actionable Notes:

  • Answer the difficult question, not the easy one: When facing a difficult question, our mind finds and answers an easier question instead. A lot of times, we don’t even notice the substitution or realize how important the original question really is. For example, when wanting to invest, we ask ourselves: Should I invest in this stock? The possible answers are usually yes and no. The important questions are quite different, however. You must ask yourself: Do I like the company? Do I think that what the company is doing will lead to substantial growth in the long run? Is the recent price based on real numbers, or on a trend that will soon pass? To be less wrong, find the difficult questions and answer them.
  • Frame what you sell/do better: Based on what was discussed above about the WYSIATI rule, we can use this tendency to our advantage. Since we neglect what is not visible and favor what is displayed, we can reframe the products we are selling, or ourselves, in a better light. If you run a newsletter, instead of saying, “the unsubscribe ratio of my newsletter for the past year is only 7%,” you can frame it as follows: “93% of all subscribers stay subscribed for over a year.” Or, if you’re applying for a content marketing position, focus primarily on what you’ve done – and are doing right now – in relation to content marketing. Exclude what is not related to the industry.
  • Beware of the planning fallacy: When we make plans, we focus on overly optimistic forecasts and ignore the possibility that what we do is going to fail. We don’t take the relevant statistics into account. After all, when you start a new diet, for example, you don’t begin by thinking about how you’ll fail – that would be discouraging. You start with the idea that you’ll become thin and muscular. However, this way of thinking about our projects prevents us from considering the problems that will surely occur. And when we exclude the problems from the planning, we won’t have a plan when challenges arise. The author calls this the planning fallacy. To overcome it, adopt an outside view. Ask people for their opinion. Also, investigate the results of other people in a situation similar to yours. This will help you spot possible problems and find ways to prevent them early in the planning phase.
  • Determine the base rates: System 1 makes us believe that rare events are more likely to occur – what we explained in lesson 6 about Linda being a feminist bank teller. Thanks to this, we convince ourselves that we can succeed at our job and get promoted to senior manager. But that’s not the right approach. The author explains that if we want to make a better prediction, we should determine the base rates first. For example, ask questions like these: How many people in your organization are promoted to senior manager each year? What is their background? This establishes the baseline and helps you see things in an unbiased way. Given the information so far, I believe you’ll agree that it’s more likely to become a senior manager in general than a senior manager at organization X. Or, in other words, you can get the promotion, but it doesn’t necessarily mean it will be in this organization.
  • The cost of optimism: We all know that optimism can be a good thing. Only through hopefulness and confidence in the future success of our product will we continue to put in the work. Sadly, optimism can also be costly. Daniel Kahneman shares a study in which around 50% of founders who were told that their idea was not going to work continued development anyway. They didn’t pay attention to the comments and believed their idea could succeed despite the negative feedback. This is common. People who are primarily optimistic genuinely believe that everything they touch can turn into gold. Of course, this is not the case. To avoid paying the cost of optimism, we need an unbiased view. Consider the baseline and seek the opinion of outsiders – people who can judge what we’re doing, or planning to do, without emotional attachment. And finally, when comments do arrive, try to consider them objectively.
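The base-rate advice in the notes above can be sketched numerically. The figures below are hypothetical, and treating each year as independent is a simplifying assumption:

```python
# Hypothetical numbers for one organization (invented for illustration).
senior_promotions_per_year = 2
eligible_employees = 120

# The base rate: the unbiased starting point for any prediction.
base_rate = senior_promotions_per_year / eligible_employees
print(f"Chance of promotion in a given year: {base_rate:.1%}")  # ~1.7%

# Assuming (simplistically) that years are independent, the chance of
# being promoted at least once over five years is still modest:
years = 5
p_at_least_once = 1 - (1 - base_rate) ** years
print(f"Chance over {years} years: {p_at_least_once:.1%}")      # ~8.1%
```

Only after establishing this kind of baseline does it make sense to adjust the estimate for specifics such as background or performance.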

Commentary And My Personal Takeaway

I first read Thinking, Fast and Slow in 2018. Back then, I really hated the book. Well, not really hated, but I found it way too complicated and full of repetitive language that wasn’t practically useful.

The problem wasn’t the book. I was. Simply put, I wasn’t ready for what the author was sharing.

Now, 3 years after my first reading, I have the following to say: Thinking, Fast and Slow is dry. Boring at times. A real snoozer at others. Yet it’s an extremely popular and important book – one of the top psychology books – about how our mind tries to sabotage our existence. Do yourself a favor and read it. When things start to feel boring and repetitive in a chapter, move on to the next one.

The fascinating thing about this book is that it perfectly describes our natural fallacies. And once we know that we are prone to make mistakes, we can outmaneuver our commonly wrong intuition and find a different and better solution to the problem we are facing.

Thinking, Fast and Slow is like a gigantic list of our default thinking patterns – how we usually react to situations. The most interesting, and at the same time saddest, insight is that we go with our intuition even when we know there is no reasonable ground to do so. For example, we continue to invest time and money in dead relationships, products, and jobs – i.e., the sunk cost fallacy.

Our native way of thinking is simply deeply embedded in our psyche. That’s why it’s so hard to avoid going with the flow even when you know that what you’re doing is not going to work.

I do believe that by reading the book, you can spot errors in your decision-making process and fix things before it’s too late.

The key takeaway:

To become less wrong in life, expose yourself to more problems. Ask for feedback as much as possible, and always question your first intuition. To make your default responses better in general, you need to gain more experience. And you gain more experience by practicing and getting fast feedback.

Notable Quotes:

“If you care about being thought credible and intelligent, do not use complex language where simpler language will do.” Daniel Kahneman

“A reliable way of making people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth.” Daniel Kahneman

“We can be blind to the obvious, and we are also blind to our blindness.” Daniel Kahneman
