Thinking, Fast and Slow by Daniel Kahneman

Ref: Daniel Kahneman (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux.

_____________________________________________________________________________

Summary

  • A general “law of least effort” applies to cognitive as well as physical exertion. The law asserts that if there are several ways of achieving the same goal, people will eventually gravitate to the least demanding course of action. In the economy of action, effort is a cost, and the acquisition of skill is driven by the balance of benefits and costs. Laziness is built deep into our nature.

  • For some of our most important beliefs we have no evidence at all, except that people we love and trust hold these beliefs. Considering how little we know, the confidence we have in our beliefs is preposterous—and it is also essential.

  • Facts that challenge such basic assumptions—and thereby threaten people’s livelihood and self-esteem—are simply not absorbed. The mind does not digest them.

  • People can maintain an unshakable faith in any proposition, however absurd, when they are sustained by a community of like-minded believers.

  • Most of us view the world as more benign than it really is, our own attributes as more favorable than they truly are, and the goals we adopt as more achievable than they are likely to be.

  • It is much easier, and far more enjoyable, to identify and label the mistakes of others than to recognize our own.

  • Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.

____________________________________________________________________________

System 1 & 2

  • System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.

  • S1 & S2 are both active whenever we are awake. S1 runs automatically and S2 is normally in a comfortable low-effort mode, in which only a fraction of its capacity is engaged. S1 continuously generates suggestions for S2: impressions, intuitions, intentions, and feelings. If endorsed by S2, impressions and intuitions turn into beliefs, and impulses turn into voluntary actions. When all goes smoothly, which is most of the time, S2 adopts the suggestions of S1 with little or no modification. You generally believe your impressions and act on your desires, and that is fine—usually.

  • When S2 is otherwise engaged, we will believe almost anything. S1 is gullible and biased to believe, S2 is in charge of doubting and unbelieving, but S2 is sometimes busy, and often lazy. Indeed, there is evidence that people are more likely to be influenced by empty persuasive messages, such as commercials, when they are tired and depleted. Cognitive load is not the only cause of weakened self-control. A few drinks have the same effect, as does a sleepless night.

  • S1 generates impressions, feelings, and inclinations; when endorsed by S2 these become beliefs, attitudes, and intentions.

System 1 (S1): The fast-thinking, impulsive, and intuitive brain; automatic, quick, effortless. S1 generates impressions, feelings, and inclinations and takes over in times of emergency, assigning total priority to self-protective actions.

  • S1 provides the impressions that often turn into your beliefs, and is the source of the impulses that often become your choices and your actions.

  • S1 has more influence on behavior when S2 is busy.

  • Psychologists believe that all of us live much of our life guided by the impressions of S1—and we often do not know the source of these impressions.

  • Mood affects the operation of S1: when we are uncomfortable and unhappy, we lose touch with our intuition.

  • Intuitive predictions tend to be overconfident and overly extreme.

 

System 2 (S2): The slow-thinking, reasoning brain; allocates attention to the effortful mental activities that demand it, including complex computations.

  • Cognitive strain, whatever its source, mobilizes S2, which is more likely to reject the intuitive answer suggested by S1. When S1 runs into difficulty, it calls on S2 to support more detailed and specific processing that may solve the problem of the moment. S2 is mobilized when a question arises for which S1 does not offer an answer.

  • S2 is activated when an event is detected that violates the model of the world that S1 maintains.

_____________________________________________________________________________

Brain

  • The nervous system consumes more glucose than most other parts of the body, and effortful mental activity appears to be especially expensive in the currency of glucose. When you are actively involved in difficult cognitive reasoning or engaged in a task that requires self-control, your blood glucose level drops. The effect is analogous to a runner who draws down glucose stored in her muscles during a sprint.

    • A bold implication of this idea is that the effects of ego depletion could be undone by ingesting glucose.

  • Associative Memory: The brain's vast network of nodes (ideas) in which each node is linked to many others. There are different types of links: causes are linked to their effects (virus → cold); things to their properties (lime → green); things to the categories to which they belong (banana → fruit).

____________________________________________________________________________

Psychology

  • Psychology should inform the design of risk policies that combine the experts’ knowledge with the public’s emotions and intuitions.

  • Rewards for improved performance work better than punishment of mistakes.

  • The explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity, and intentions than to luck; and focus on a few striking events that happened rather than on the countless events that failed to happen.

  • Optimists are normally cheerful and happy, and therefore popular; they are resilient in adapting to failures and hardships, their chances of clinical depression are reduced, their immune system is stronger, they take better care of their health, they feel healthier than others and are in fact likely to live longer.

  • An optimistic attitude is largely inherited, and it is part of a general disposition for well-being, which may also include a preference for seeing the bright side of everything.

  • Merely reminding people of a time when they had power increases their apparent trust in their own intuition.

  • A basic limitation in the ability of our mind to deal with small risks: we either ignore them altogether or give them far too much weight—nothing in between.

______________________________________________________________________________

Flow

  • People who experience flow describe it as “a state of effortless concentration so deep that they lose their sense of time, of themselves, of their problems,” and their descriptions of the joy of that state are so compelling that Csikszentmihalyi has called it an “optimal experience.”

  • Flow neatly separates the two forms of effort: concentration on the task and the deliberate control of attention.

  • Attention can be moved away from an unwanted focus, primarily by focusing intently on another target.

  • Too much concern about how well one is doing in a task sometimes disrupts performance by loading short-term memory with pointless anxious thoughts.

  • As you become skilled in a task, its demand for energy diminishes. Studies of the brain have shown that the pattern of activity associated with an action changes as skill increases, with fewer brain regions involved.

_____________________________________________________________________________

Biases

  • Anchoring Effect: Occurs when people consider a particular value for an unknown quantity before estimating that quantity; the estimate then stays biased toward the value considered.

    • In negotiations, you should assume that any number that is on the table has had an anchoring effect on you, and if the stakes are high you should mobilize yourself (your S2) to combat the effect.

    • “Our aim in the negotiation is to get them anchored on this number.”

    • “Let’s make it clear that if that is their proposal, the negotiations are over. We do not want to start there.”

  • Availability Bias: The tendency to judge the frequency or importance of events by the ease with which examples come to mind.

    • “Because of the coincidence of two planes crashing last month, she now prefers to take the train. That’s silly. The risk hasn’t really changed; it is an availability bias.”

    • The importance of an idea is often judged by the fluency (and emotional charge) with which that idea comes to mind.

  • Baseline Prediction: The prediction you make about a case if you know nothing except the category to which it belongs.

    • Aim for a prediction that is intermediate between the baseline and your intuitive response. In the default case of no useful evidence, stay with the baseline.

    • “Our intuitive prediction is very favorable, but it is probably too high. Let’s take into account the strength of our evidence and regress the prediction toward the mean.”

    • “Whenever we can replace human judgment by a formula, we should at least consider it.”

    • “The situation has provided a cue; this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition.”
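The "regress toward the mean" advice above reduces to a one-line formula: move from the baseline toward the intuitive estimate in proportion to the estimated correlation between the evidence and the outcome. A minimal sketch; the GPA numbers and the 0.3 correlation are illustrative assumptions, not from the book:

```python
def regressive_prediction(baseline, intuitive, correlation):
    """Correct an overconfident intuitive prediction: move from the
    baseline toward the intuition in proportion to the estimated
    correlation between evidence and outcome (0 = evidence useless,
    so stay at the baseline; 1 = evidence perfect, keep the intuition)."""
    return baseline + correlation * (intuitive - baseline)

# Predicting a student's GPA: class average 3.0, a glowing interview
# suggests 3.8, but interviews correlate with GPA at only ~0.3.
print(round(regressive_prediction(3.0, 3.8, 0.3), 2))  # 3.24
```

With no useful evidence (correlation 0) the formula returns the baseline, matching the default case above.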

  • Confirmation Bias: People search for evidence that supports their pre-existing beliefs rather than evidence that could refute them.

    • Even compelling causal statistics will not change long-held beliefs or beliefs rooted in personal experience.

    • When people believe a conclusion is true, they are also very likely to believe arguments that appear to support it, even when these arguments are unsound. If S1 is involved, the conclusion comes first and the arguments follow.

  • Duration Neglect: The tendency to largely ignore the duration of an experience when evaluating it in retrospect; the peak and the ending dominate the judgment. Duration neglect is normal in a story, and the ending often defines its character. The same core features appear in the rules of narratives and in the memories of colonoscopies, vacations, and films.

    • “The length to which he was willing to go for a one-night encounter is a sign of total duration neglect.”

  • Exposure Effect: Familiarity breeds liking.

  • Framing Effect: Different ways of presenting the same information often evoke different emotions and different decisions.

  • Halo Effect: The tendency to like (or dislike) everything about a person, including things you have not observed, based on a quickly formed overall impression, often from first appearance.

  • Hindsight Bias: Leads observers to assess the quality of a decision not by whether the process was sound but by whether its outcome was good or bad.

    • We cannot suppress the powerful intuition that what makes sense in hindsight today was predictable yesterday. The illusion that we understand the past fosters overconfidence in our ability to predict the future.

    • “The mistake appears obvious, but it is just hindsight. You could not have known in advance.”

  • Law of Small Numbers: A fallacy in which people generalize from small samples, which they wrongly treat as representative of the whole population.

    • The exaggerated faith in small samples is only one example of a more general illusion—we pay more attention to the content of messages than to information about their reliability, and as a result end up with a view of the world around us that is simpler and more coherent than the data justify.

  • Outcome Bias: The tendency to judge a decision by its eventual outcome rather than by the quality of the decision process at the time it was made, making it nearly impossible to evaluate decisions properly.

    • “She has no evidence for saying that the firm is badly managed. All she knows is that its stock has gone down. This is an outcome bias, part hindsight and part halo effect.”

  • Overconfidence Bias: People tend to trust their capability to make the right decisions and overestimate their capabilities as decision-makers. Overconfidence is a manifestation of WYSIATI: when we estimate a quantity, we rely on information that comes to mind and construct a coherent story in which the estimate makes sense.

    • The unrecognized limits of professional skill help explain why experts are often overconfident.

    • Confidence is valued over uncertainty and there is a prevailing censure against disclosing uncertainty. Experts who acknowledge the full extent of their ignorance may expect to be replaced by more confident competitors, who are better able to gain the trust of clients.

    • “She is confident in her decision, but subjective confidence is a poor index of the accuracy of a judgment.”

    • Optimism is highly valued, socially and in the market; people and firms reward the providers of dangerously misleading information more than they reward truth tellers.

    • Those who know more forecast very slightly better than those who know less. But those with the most knowledge are often less reliable. The reason is that the person who acquires more knowledge develops an enhanced illusion of her skill and becomes unrealistically overconfident. We reach the point of diminishing marginal predictive returns for knowledge disconcertingly quickly.

  • Planning Fallacy: Describes plans and forecasts that are unrealistically close to best-case scenarios. People tend to make decisions based on delusional optimism rather than on a rational weighting of gains, losses, and probabilities. They overestimate benefits and underestimate costs, spinning scenarios of success while overlooking the potential for mistakes and miscalculations. As a result, they pursue initiatives that are unlikely to come in on budget or on time or to deliver the expected returns—or even to be completed. In this view, people often (but not always) take on risky projects because they are overly optimistic about the odds they face.

    • We focus on a goal, anchor on a plan, and neglect relevant base rates, exposing ourselves to the planning fallacy.

    • Reference Class Forecasting: A remedy for the planning fallacy. An outside view is implemented by using a large database of plans and outcomes for hundreds of projects all over the world, which provides statistical information about the likely overruns of cost and time, and about the likely underperformance of projects of different types.

    • The Premortem: Imagine that you (we, the institution, etc.) are a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Please take 5 to 10 minutes to write a brief history of that disaster.

  • Loss Aversion: We are driven more strongly to avoid losses than to achieve gains; the disadvantages of a change loom larger than its advantages. People tend to be risk averse in the domain of gains and risk seeking in the domain of losses. (The related possibility effect is the tendency to overweight small risks and to pay far more than expected value to eliminate them altogether.)

    • “These negotiations are going nowhere because both sides find it difficult to make concessions, even when they can get something in return. Losses loom larger than gains.”

    • Loss aversion ratio: roughly 1.5–2.5 (losses weigh about twice as heavily as equivalent gains).

    • Loss aversion creates an asymmetry that makes agreements difficult to reach. The concessions you make to me are my gains, but they are your losses; they cause you much more pain than they give me pleasure.

    • “This reform will not pass. Those who stand to lose will fight harder than those who stand to gain.”
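The asymmetry above can be sketched with a prospect-theory-style value function: concave for gains, convex and steeper for losses. The parameters λ = 2.0 and α = 0.88 are the commonly cited Tversky–Kahneman (1992) estimates, used here only as illustrative defaults within the 1.5–2.5 range quoted above:

```python
def prospect_value(x, lam=2.0, alpha=0.88):
    """Prospect-theory value function: diminishing sensitivity in
    both directions, with losses scaled up by the loss-aversion
    ratio lam."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

# A $100 loss hurts about twice as much as a $100 gain pleases.
gain, loss = prospect_value(100), prospect_value(-100)
print(abs(loss) / gain)  # 2.0
```

The ratio of pain to pleasure equals λ for symmetric amounts, which is why symmetric concessions in a negotiation feel like a net loss to both sides.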

  • Repetition Bias: Tendency of people to associate frequent repetition with truth, because familiarity is not easily distinguished from truth.

  • Stereotypes: Representations people hold in memory of a “normal” member of a social group or category.

    • Neglecting valid stereotypes inevitably results in suboptimal judgments. Resistance to stereotyping is a laudable moral position, but the simplistic idea that the resistance is costless is wrong.

  • Sunk Cost Fallacy: Fear of acknowledging a loss keeps us looking backward at events we cannot change: throwing more money or time at an investment or strategy in the hope of justifying money already spent, opening the way for larger losses. The sunk-cost fallacy keeps people too long in poor jobs, unhappy marriages, and unpromising research projects.

    • “We discovered an excellent dish at that restaurant and we never try anything else, to avoid regret.”

    • “We are making an additional investment because we do not want to admit failure.”

  • What You See Is All There Is (WYSIATI): The tendency to accept the default S1 view of the world.  

    • A mind that follows WYSIATI will achieve high confidence much too easily by ignoring what it does not know.

_____________________________________________________________________________

Judicial System

  • Authors of one study plotted the proportion of approved parole requests against the time since the judges' last food break. The proportion spikes after each meal, when about 65% of requests are granted. During the two hours or so until the judges' next break, the approval rate drops steadily, to about zero just before the meal.

_____________________________________________________________________________

Politics

  • The effect of facial competence on voting is about three times larger for information-poor and TV-prone voters than for others who are better informed and watch less television.

  • Policy is ultimately about people, what they want and what is best for them.

  • We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact.

  • Risk regulation and government intervention to reduce risks should be guided by rational weighting of costs and benefits; the natural units for this analysis are the number of lives saved (or perhaps the number of life-years saved, which gives more weight to saving the young) and the dollar cost to the economy.

  • The existing system of regulation in the US displays a very poor setting of priorities, which reflects reaction to public pressures more than careful objective analysis.

  • Because adherence to SOPs is difficult to second-guess, decision makers who expect to have their decisions scrutinized with hindsight are driven to bureaucratic solutions—and to an extreme reluctance to take risks.

_____________________________________________________________________________

Business

  • If you are serious about hiring the best possible person for the job, this is what you should do. First, select a few traits that are prerequisites for success in this position (technical proficiency, engaging personality, reliability, and so on). Don’t overdo it—six dimensions is a good number. The traits you choose should be as independent as possible from each other, and you should feel that you can assess them reliably by asking a few factual questions. Next, make a list of those questions for each trait and think about how you will score it, say on a 1–5 scale. You should have an idea of what you will call “very weak” or “very strong.”

  • Collect the information on one trait at a time, scoring each before you move on to the next one. Do not skip around. To evaluate each candidate, add up the six scores. Firmly resolve that you will hire the candidate whose final score is the highest, even if there is another one whom you like better.
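The hiring procedure in the two bullets above is mechanical enough to write down directly. A minimal sketch; the trait names and scores are illustrative assumptions, not from the book:

```python
# Six prerequisite traits, each scored 1-5 from factual questions,
# one trait at a time (illustrative names, not Kahneman's).
TRAITS = ["technical proficiency", "engaging personality", "reliability",
          "communication", "judgment", "diligence"]

def total_score(scores):
    """Sum the six trait scores; insist every trait was scored."""
    assert set(scores) == set(TRAITS), "score each trait before moving on"
    assert all(1 <= s <= 5 for s in scores.values()), "use a 1-5 scale"
    return sum(scores.values())

def hire(candidates):
    """Pick the highest total score, even if you like someone else better."""
    return max(candidates, key=lambda name: total_score(candidates[name]))
```

The point of the mechanical sum is precisely that it removes the global "whom do I like" judgment from the final decision.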

  • The outcome of a start-up depends as much on the achievements of its competitors and on changes in the market as on its own efforts…The chances that a small business will survive for five years in the United States are about 35%.

  • The financial benefits of self-employment are mediocre: given the same qualifications, people achieve higher average returns by selling their skills to employers than by setting out on their own.

  • As in many other games, moving first is an advantage in single-issue negotiations—for example, when price is the only issue to be settled between a buyer and a seller. The initial anchor has a powerful effect.

  • My advice to students was that if you think the other side has made an outrageous proposal, you should not come back with an equally outrageous counteroffer, creating a gap that will be difficult to bridge in further negotiations. Instead you should make a scene, storm out or threaten to do so, and make it clear—to yourself as well as to the other side—that you will not continue the negotiation with that number on the table.

_____________________________________________________________________________

Finance

  • A rational person will invest a large sum in an enterprise that is most likely to fail if the rewards of success are large enough, without deluding herself about the chances of success.

  • On average, the most active traders have the poorest results, while the investors who trade the least earned the highest returns.

  • Individual investors predictably flock to companies that draw their attention because they are in the news. Professional investors are more selective in responding to news.

  • Closely following daily fluctuations is a losing proposition, because the pain of the frequent small losses exceeds the pleasure of the equally frequent small gains. Once a quarter is enough, and may be more than enough for individual investors. In addition to improving the emotional quality of life, the deliberate avoidance of exposure to short-term outcomes improves the quality of both decisions and outcomes. The typical short-term reaction to bad news is increased loss aversion. Investors who get aggregated feedback receive such news much less often and are likely to be less risk averse and to end up richer.

_____________________________________________________________________________

Education

  • If you care to be thought credible and intelligent, do not use complex language where simpler language will do.

  • In an article titled “Consequences of Erudite Vernacular Utilized Irrespective of Necessity: Problems with Using Long Words Needlessly,” Danny Oppenheimer showed that couching familiar ideas in pretentious language is taken as a sign of poor intelligence and low credibility.

_____________________________________________________________________________

Statistics

  • Statistics produce many observations that appear to beg for causal explanations but do not lend themselves to such explanations. Many facts of the world are due to chance, including accidents of sampling. Causal explanations of chance events are inevitably wrong.

  • Simple equally weighted formulas based on existing statistics or on common sense are often very good predictors of significant outcomes. An algorithm constructed on the back of an envelope is often good enough to compete with an optimally weighted formula, and certainly good enough to outdo expert judgment. This logic can be applied in many domains, ranging from the selection of stocks by portfolio managers to the choices of medical treatments by doctors or patients.

  • The aversion to algorithms making decisions that affect humans is rooted in the strong preference that many people have for the natural over the synthetic or artificial.

  • To maximize predictive accuracy, final decisions should be left to formulas, especially in low-validity environments.

  • Several studies have shown that human decision makers are inferior to a prediction formula even when they are given the score suggested by the formula!

  • Multiple Regression: Finds the optimal formula for putting together a weighted combination of predictors.
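The "back of the envelope" alternative above can be sketched as an equal-weight composite of standardized predictors, the approach Robyn Dawes called an improper linear model. The helper names are mine:

```python
import statistics

def standardize(xs):
    """Rescale a predictor column to mean 0, standard deviation 1."""
    mu, sd = statistics.mean(xs), statistics.pstdev(xs)
    return [(x - mu) / sd for x in xs]

def equal_weight_scores(rows):
    """Equal-weight composite: standardize each predictor column,
    then add them up with unit weights -- no regression fit needed."""
    cols = [standardize(list(col)) for col in zip(*rows)]
    return [sum(zs) for zs in zip(*cols)]

# Two illustrative predictors per case; a higher composite ranks higher.
print(equal_weight_scores([(1, 10), (2, 20), (3, 30)]))
```

Because no weights are estimated from the sample, the formula cannot overfit, which is much of why it competes with optimally weighted regression in low-validity environments.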

_____________________________________________________________________________

Media

  • Our expectations about the frequency of events are distorted by the prevalence and emotional intensity of the messages to which we are exposed.

  • Frequently mentioned topics populate the mind even as others slip away from awareness. In turn, what the media choose to report corresponds to their view of what is currently on the public’s mind. It is no accident that authoritarian regimes exert substantial pressure on independent media.

  • Availability Cascade: A self-sustaining chain of events, which may start from media reports of a relatively minor event and lead up to public panic and large-scale government action. On some occasions, a media story about a risk catches the attention of a segment of the public, which becomes aroused and worried. This emotional reaction becomes a story in itself, prompting additional coverage in the media, which in turn produces greater concern and involvement.

    • The amount of concern is not adequately sensitive to the probability of harm; you are imagining the numerator—the tragic story you saw on the news—and not thinking about the denominator.

_____________________________________________________________________________

Relationships

  • "The long-term success of a relationship depends far more on avoiding the negative than on seeking the positive." (John Gottman)

_____________________________________________________________________________

Misc Quotes

“You will more often than not err by misclassifying a random event as systematic.”

Hedgehogs “know one big thing” and have a theory about the world; they account for particular events within a coherent framework, bristle with impatience toward those who don’t see things their way, and are confident in their forecasts. They are also especially reluctant to admit error. For hedgehogs, a failed prediction is almost always “off only on timing” or “very nearly right.” They are opinionated and clear, which is exactly what television producers love to see on programs. Two hedgehogs on different sides of an issue, each attacking the idiotic ideas of the adversary, make for a good show. Foxes, by contrast, are complex thinkers. They don’t believe that one big thing drives the march of history (for example, they are unlikely to accept the view that Ronald Reagan single-handedly ended the cold war by standing tall against the Soviet Union). Instead, the foxes recognize that reality emerges from the interactions of many different agents and forces, including blind luck, often producing large and unpredictable outcomes. It was the foxes who scored best in Tetlock’s study, although their performance was still very poor. But they are less likely than hedgehogs to be invited to participate in television debates (from Isaiah Berlin’s essay on Tolstoy’s “The Hedgehog and the Fox”).

“An unbiased appreciation of uncertainty is a cornerstone of rationality—but it is not what people and organizations want.”

“I believe that someone who lacks a delusional sense of significance will wilt in the face of repeated experiences of multiple small failures and rare successes.”

“The proper way to elicit information from a group is not by starting with a public discussion but by confidentially collecting each person’s judgment.”

_____________________________________________________________________________

Terminology

  • Adaptation Level: A neutral reference point against which evaluations are made.

  • Bayesian Reasoning: (Named for Rev. Thomas Bayes) Anchor your judgment of the probability of an outcome on a plausible base rate and question the diagnosticity of your evidence.

  • Engaged: People who avoid the sin of intellectual sloth; they are more alert, more intellectually active, less willing to be satisfied with superficially attractive answers, more skeptical about their intuitions.

  • Ideomotor Effect: The priming phenomenon by which an idea influences the corresponding action.

  • Nudge (Thaler & Sunstein): The basic manual for applying behavioral economics to policy.
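The Bayesian Reasoning entry above can be made concrete with the odds form of Bayes' rule: posterior odds = prior odds × likelihood ratio, where the likelihood ratio is the "diagnosticity" of the evidence. A minimal sketch with illustrative numbers:

```python
def posterior(base_rate, p_evidence_if_true, p_evidence_if_false):
    """Update a base rate by the diagnosticity of the evidence,
    using the odds form of Bayes' rule."""
    prior_odds = base_rate / (1 - base_rate)
    likelihood_ratio = p_evidence_if_true / p_evidence_if_false
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# A 5% base rate and evidence only twice as likely under the
# hypothesis moves us to about 9.5%, nowhere near certainty.
print(round(posterior(0.05, 0.6, 0.3), 3))  # 0.095
```

Intuitive judgments typically do the opposite: they react to the vividness of the evidence and neglect the base rate entirely.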

_____________________________________________________________________________

Chronology

  • 1748: An Enquiry Concerning Human Understanding is published by Scottish philosopher David Hume; in it he reduces the principles of association to three: resemblance, contiguity in time and place, and causality.

______________________________________________________________________________