Table of Contents:
- a. Association
- b. The Priming Effect
- a. Context
- b. Causality
- a. WYSIATI
- b. Substitution
- a. Stereotypes
- b. First Impressions
- c. The Halo Effect
- a. WYSIATI and Confidence
- b. WYSIATI and Estimations
- c. The Two Selves
- d. WYSIATI and Optimism
- a. Causal Errors
  - i. Mistaking Stats for Causes
  - ii. Mistaking Luck for Causes
- b. Statistical Illiteracy
- a. Avoiding Losses
- b. Cutting Our Losses
- c. Risk Aversion
The adage ‘you are what you eat’ is no doubt literally true, but when it comes to getting at the heart of what we are, it is surely more accurate to say ‘you are what you think’; for our identity emerges out of the life of the mind, and our decisions and actions (including what we eat) are determined by our thoughts. An exploration of how we think therefore cuts to the core of what we are, and offers a clear path to a better understanding of ourselves and why we behave as we do. In addition, while many of us are fairly happy with how our minds work, few of us would claim that we could not stand to improve in at least some respects; an exploration of how we think thus also promises to point the way toward fruitful self-improvement (which stands to help us in both our personal and professional lives). While thinking about thinking was traditionally a speculative practice (embarked upon by philosophers and economists), it has recently received a more empirical treatment through the disciplines of psychology and neuroscience. It is from the latter angle that the Nobel Prize-winning psychologist Daniel Kahneman approaches the subject in his new book Thinking, Fast and Slow.
As the title suggests, Kahneman breaks thinking down into two modes or systems. Slow thinking is the system we normally think of as thought in the strictest sense. It is deliberate and conscious, and we naturally feel as though we are in control of it (Kahneman refers to it as System 2). System 2 is in play when we actively consider what to have for dinner tonight, choose which stocks to buy, or perform a mathematical calculation. System 1, by contrast, is automatic and unconscious, and hums along continuously in the background. It constantly surveys the environment and processes the incoming stimuli with razor speed.
System 1 is informed by natural drives and instincts, but it is also capable of learning, which it does by way of association (that is, connecting novel stimuli with known stimuli according to shared characteristics, contiguity in time and place, or causality). The system is designed to give us an impression of our environment as quickly as possible, thus allowing us to respond to it immediately, which is especially important in times of danger. In order to do so, System 1 relies on general rules and guidelines (called heuristics). These heuristics are primarily geared to help us in the moment, and are tilted toward protecting us from danger; in this respect they are mostly very useful. Still, mistakes can be made. What's more, the system was designed to work in the environment in which we evolved, which is quite different from our current one, and this adds to its errors.
Over and above this, the impressions that System 1 forms are also fed up to System 2. Indeed, whenever System 1 senses something out of the ordinary or dangerous, System 2 is automatically mobilized to help out with the situation. And even when System 2 is not mobilized by danger specifically, it is constantly being fed suggestions by System 1. Now, while the impressions of System 1 are fairly effective at protecting us from moment to moment, they are much less effective in long-term planning, and are therefore much more problematic there. Of course, System 2 is capable of overriding the impressions of System 1, and of avoiding the errors. However, as Kahneman points out, System 2 is often completely unaware that it is being influenced (and misled) by System 1, and is therefore not naturally well equipped to catch the errors. Much of the book is spent exploring the activities and biases of System 1, in order to make us more aware of how this system works and how it influences (and often misleads) System 2.
This is only half the battle, though, for while System 2 may be naturally poorly equipped to catch the errors of System 1, it is also often poorly equipped to correct them. Indeed, Kahneman argues that System 2 is simply not the paragon of rationality it is often assumed to be in economics, and could use a good deal of help in this regard. The most glaring deficiency of System 2, according to Kahneman, is that it is naturally very poor with probabilities and statistics. Fortunately, System 2 can be trained to improve here, and this is another major concern of the book.
What follows is a full executive summary of Daniel Kahneman’s Thinking, Fast and Slow.
We think of ourselves as the executive in control of our minds and bodies: a decision-maker with distinct beliefs who weighs alternative options, deliberates, and comes to choices based on our better judgment, choices that ultimately govern our behavior. This kind of thinking is what Kahneman refers to as System 2: “when we think of ourselves, we identify with System 2, the conscious, reasoning self that has beliefs, makes choices and decides what to think about and what to do” (loc. 402). According to Kahneman, though, System 2 is actually much more influenced than we tend to think by a second mode of thought that the author refers to as System 1 (loc. 402) (the terms System 1 and System 2 were coined by the psychologists Keith Stanovich and Richard West [loc. 396]).
Unlike System 2, System 1 is automatic and unconscious, and therefore often goes unnoticed. As Kahneman explains, “System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control” (loc. 399). System 1 is constantly monitoring the outside environment (as well as the inner mind) and forming quick and dirty impressions from this information (loc. 1640). Most crucially, System 1 is probing for information that is particularly important for the biological imperatives of survival and reproduction—meaning it is looking out for opportunities, and also (and especially) dangers. As Kahneman explains, “System 1 has been shaped by evolution to provide a continuous assessment of the main problems that an organism must solve to survive: How are things going? Is there a threat or a major opportunity? Should I approach or avoid?… situations are constantly evaluated as good or bad, requiring escape or permitting approach” (loc. 1651).
If nothing of note is detected, System 1 remains calm, and at relative ease, and goes on with business as usual. However, should something of importance come up, System 1 becomes strained and mobilizes System 2 to help out with the situation (loc. 474): “when System 1 runs into difficulty, it calls on System 2 to support more detailed and specific processing that may solve the problem of the moment… You can… feel a surge of conscious attention whenever you are surprised. System 2 is activated when an event is detected that violates the model of the world that System 1 maintains” (loc. 474).
For the average person living in the modern world, true emergencies do not come up very often, and most situations do not call for an immediate reaction. However, in the environment in which we evolved (and in which System 1 evolved) this was far from the case, and we have retained this system from that time. As Kahneman explains, when it comes to the survival questions mentioned above, “the questions are perhaps less urgent for a human in a city environment than for a gazelle on the savannah, but we have inherited the neural mechanisms that evolved to provide ongoing assessments of threat level, and they have not been turned off” (loc. 1650). Therefore, while we may not need the quick and dirty impressions that System 1 provides as much as we did in the environment in which we evolved, our brains continue to churn out these impressions just as frequently as ever.
Now, only so much information is available at any given moment, and yet System 1 is expected to continually come up with as accurate an impression as possible, as quickly as possible. To do this, System 1 must necessarily take shortcuts and make educated guesses. These shortcuts and educated guesses will occasionally be mistaken, and it is therefore best for System 1 to err on the side of caution whenever possible. We will now take a closer look at the inner workings of System 1.
It was mentioned above that System 1 calls upon System 2 when the model of the world that it maintains is violated in some way. The model of the world that System 1 maintains is formed out of innate faculties, and its content is also partially innate. For instance, “we are born prepared to perceive the world around us, recognize objects, orient attention, avoid losses, and fear spiders” (loc. 417).