Daniel Kahneman: Thinking, Fast and Slow
The death of Nobel Prize-winning psychologist Daniel Kahneman on March 29, at age 90, is an appropriate occasion for an appreciative reflection on how the work of Kahneman, his longtime collaborator Amos Tversky, and other psychologists, which Kahneman summarized in his brilliant 2011 book Thinking, Fast and Slow, can help us understand disinformation and misinformation and counter them.
The book’s title is a reference to the two main ways in which the brain processes information: through the fast, almost effortless, automatic process of intuitive judgements, which are often correct but can also be disastrously wrong, and the slow, labored, deliberate process of logical thought, which takes a great deal more effort but yields superior results when intuition is misplaced.
Kahneman illustrated fast, intuitive thinking by showing a picture of a woman’s angry face and dissecting how the mind processes it.
In Thinking, Fast and Slow, he wrote:
Your experience as you look at the woman’s face seamlessly combines what we normally call seeing and intuitive thinking. As surely and quickly as you saw that the young woman’s hair is dark, you knew she is angry. … You sensed that this woman is about to say some very unkind words, probably in a loud and strident voice. A premonition of what she was going to do next came to mind automatically and effortlessly. … It was an instance of fast thinking. (pp. 19-20)
As an example of slow thinking, Kahneman discussed how the mind reacts to the problem: what is 17 times 24?
He wrote:
A precise solution did not come to mind. … You experienced slow thinking as you proceeded through a sequence of steps. You first retrieved from memory the cognitive program for multiplication that you learned in school, and then you implemented it. Carrying out the computation was a strain. … The process was mental work: deliberate, effortful, and orderly – a prototype of slow thinking. … the answer … is 408, by the way…. (p. 20)
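To spell out the computation Kahneman references (my own decomposition, not an example from the book), the school algorithm runs roughly like this:

17 × 24 = 17 × (20 + 4) = (17 × 20) + (17 × 4) = 340 + 68 = 408

Each intermediate product has to be computed and held in memory, which is what makes the process feel like the “strain” Kahneman describes.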
(Unfortunately, I cannot summarize the many insights in Kahneman’s book as well as I could have some 12 or so years ago. I found the book so insightful that I read it four times, marking up the most relevant passages with extensive notes each time. In the early 2010s, I was moving offices frequently. I packed up almost all my books and materials, making an exception for Kahneman’s book because I always wanted to have it nearby to consult. After several office moves, it was lost, and with it my extensive marginal notes. I have not had the energy to read it all over again.)
Kahneman likened “fast thinking” to “jumping to conclusions,” which can be very useful for quickly evaluating familiar situations but can also lead to serious errors in judgement. He wrote:
Jumping to conclusions is efficient if the conclusions are likely to be correct and the costs of an occasional mistake acceptable, and if the jump saves much time and effort. Jumping to conclusions is risky when the situation is unfamiliar, the stakes are high, and there is no time to collect more information. (p. 79)
In another passage, he wrote that fast, intuitive thinking “is highly adept in one form of thinking—it automatically and effortlessly identifies causal connections between events, sometimes even when the connection is spurious.” (p. 110)
The “Linda Problem”
One example of the human tendency to prefer the ease of fast intuitive thinking to the difficulty of slow, logical thinking is illustrated in what Kahneman and his longtime co-researcher Amos Tversky called the “Linda problem.” Kahneman wrote:
Amos and I made up the Linda problem to provide conclusive evidence of the role of heuristics [mental shortcuts, or “rules of thumb,” that allow people to make quick decisions] in judgement and their incompatibility with logic. This is how we described Linda:
Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice and also participated in anti-nuclear demonstrations.
Kahneman and Tversky then asked people to rank the likelihood that Linda fit each of eight descriptions, the two key ones being:
Linda is a bank teller.
Linda is a bank teller and is active in the feminist movement.
Kahneman wrote:
Everyone agrees that Linda fits the idea of a “feminist bank teller” better than she fits the stereotype of bank tellers. The stereotypical bank teller is not a feminist activist, and adding that detail to the description makes for a more coherent story.
The twist comes in the judgments of likelihood, because there is a logical relation between the two scenarios. Think in terms of Venn diagrams. The set of feminist bank tellers is wholly included in the set of bank tellers, as every feminist bank teller is a bank teller. Therefore, the probability that Linda is a feminist bank teller must be lower than the probability of her being a bank teller. ... The problem therefore sets up a conflict between the intuition of representativeness and the logic of probability.
… we found that 89% of the undergraduates in our sample violated the logic of probability. We were convinced that statistically sophisticated respondents would do better, so we administered the same questionnaire to doctoral students in the decision-science program of the Stanford Graduate School of Business, all of whom had taken several advanced courses in probability, statistics, and decision theory. We were surprised again: 85% of these respondents also ranked “feminist bank teller” as more likely than “bank teller.” (pp. 156-158)
Kahneman continued, “we had pitted logic against representativeness and representativeness had won.”
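The logic Kahneman describes can be stated compactly: the set of feminist bank tellers is contained in the set of bank tellers, so the probability of the conjunction can never exceed the probability of either part. A minimal Python sketch of my own makes the point concrete; the probability values are invented purely for illustration and are not from Kahneman and Tversky’s study:

```python
# Conjunction rule: a joint event can never be more probable than
# either of its components, whatever numbers we plug in.
p_bank_teller = 0.05            # hypothetical P(Linda is a bank teller)
p_feminist_given_teller = 0.90  # hypothetical P(feminist | bank teller)

# The joint probability is the product of the two,
# so it is at most p_bank_teller.
p_feminist_bank_teller = p_bank_teller * p_feminist_given_teller

print(f"P(bank teller)          = {p_bank_teller:.3f}")          # 0.050
print(f"P(feminist bank teller) = {p_feminist_bank_teller:.3f}")  # 0.045
assert p_feminist_bank_teller <= p_bank_teller
```

However representative “feminist bank teller” feels, the arithmetic always ranks it as less probable than “bank teller,” which is exactly the logic that most respondents’ intuition overrode.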
Kahneman and Tversky’s insight has great relevance to understanding why some misinformation and disinformation claims are often believed (due to defects that can occur in intuitive fast thinking) and how to counter this (by engaging slow, logical thinking).
Exaggerated Fears about Depleted Uranium
One very relevant example involves exaggerated fears about depleted uranium.
In the 1991 Gulf War, U.S. and allied forces used depleted uranium munitions widely for the first time in combat. Months later, on November 10, 1991, an article in a London newspaper, The Independent on Sunday, cited a report by the British Atomic Energy Authority which estimated that 40 tons of depleted uranium remained on the Gulf War battlefield – an amount, the report claimed, that was enough to cause “500,000 potential deaths.”
The 500,000 figure bore no relationship to reality, as the scientists who wrote the report recognized and the newspaper acknowledged. For half a million people to die, all the DU rounds estimated to have been fired would need to somehow have been fully recovered, “pulverized into dust and 500,000 people would have to line up in the desert and inhale equal amounts.” This was obviously a totally unrealistic scenario, but it made a striking, sensational claim of the type that newspapers love. The calculation also took no account of the fact that, in the desert, DU rounds fired from aircraft that miss their target typically bury themselves several meters underground.
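As a back-of-the-envelope check of my own on the quoted scenario (assuming metric tonnes, and not the report’s actual dose model), dividing the estimated 40 tons of depleted uranium equally among 500,000 people works out to:

40,000,000 g ÷ 500,000 people = 80 g of uranium dust per person

No real-world exposure comes anywhere near each person inhaling 80 grams of pulverized metal.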
Saddam’s disinformation and propaganda specialists noticed the stir that the article created with its sensational and misleading claim, and they quickly made depleted uranium a centerpiece of their disinformation campaigns. Saddam’s regime even held a conference on depleted uranium in Baghdad in 1994.
In a 1993 article in The Bulletin of the Atomic Scientists, “The Desert Glows—with Propaganda,” defense analyst William Arkin offered some much-needed perspective, writing:
Most of the depleted uranium is in uninhabited areas, where the dangers of contact, ingestion, or inhalation are small. … An individual … who might handle or collect a heavy slug would be exposed to low-level radiation, and would have to carry the bullet constantly for a month to have a one percent chance of incurring a cancer from this exposure.
Arkin spoke with the Director-General of Iraq’s Basra Hospital, who told him there had been “no increase in kidney disease (a sign of heavy-metal poisoning) after the war.” Scientists at the physics department of Basra University told him they had found no increase in radiation over background levels since the war.
Nevertheless, Arkin noted that depleted uranium’s “nuclear moniker” had led to “overblown stories.”
Associations with Depleted Uranium
The exaggerated fears about depleted uranium are exactly what Kahneman would have predicted. When people hear the word “uranium,” this sets off a whole chain of alarming associations in their minds. For many, it immediately and subconsciously conjures up thoughts of atomic bombs, Hiroshima, radioactive fallout, radiation sickness, cancer, and birth defects. This subconscious chain of associations is swift, powerful, vivid, and extremely disturbing. This is fast thinking at work.
But, counterintuitively, there is ample evidence that depleted uranium poses little actual health risk. A Canadian government factsheet notes that it is “about 40% less radioactive than natural uranium. For comparison, it is 10 million times less radioactive gram for gram than Americium-241 which is found in domestic smoke detectors.”
When countering exaggerated fears about DU in the 2000s, I used some of the following facts in an article on depleted uranium I wrote for the State Department’s “Identifying Misinformation” website:
In March 2001, a World Health Organization report stated: “no increase of leukemia or other cancers has been established following exposure to uranium or depleted uranium.”
A March 2001 European Commission report concluded, “taking into account the pathways and realistic scenarios of human exposure, radiological exposure to DU could not cause a detectable effect on human health."
A January 2001 NATO report said, "based on the data today, no link has been established between depleted uranium and any forms of cancer."
In 1999, a RAND Corporation study on depleted uranium concluded that “no evidence is documented in the literature of cancer or any other negative health effect related to the radiation received from exposure to natural uranium, whether inhaled or ingested, even in very high doses.”
I found that pointing out these facts had little to no effect on people unless I first drew their attention to the fact that the word “uranium” has many powerful, negative, subconscious associations, which come to mind automatically and involuntarily. Only when I pointed out the power of these negative associations, which result from fast, intuitive thinking, and encouraged people to reflect on this subconscious process was I able to create a cognitive opening. That opening allowed people to consider the facts listed above using slow, logical thinking; otherwise, those facts were dismissed because they did not fit the fear-filled associations that are automatically prompted when people hear the word “uranium.”
Kahneman noted:
most of the work of associative thinking is silent, hidden from our conscious selves. The notion that we have limited access to the workings of our minds is difficult to accept because, naturally, it is alien to our experience, but it is true: you know far less about yourself than you feel you know. (Thinking, Fast and Slow, p. 52).
Kahneman said that slow, logical thinking “normally has the last word” when making decisions—but only if it is engaged. If it is not, fast, intuitive thinking—like the very alarming, subconscious associations most people have with the word “uranium”—makes up people’s minds quickly and, in this case, leads them to dismiss facts that don’t fit the negative associations that immediately flood their subconscious minds. One limitation of fast thinking is that, as Kahneman wrote, “it cannot be turned off” (both quotations from page 25). Thus, if we recognize that fast thinking may be leading us to a false conclusion, slow thinking can override it, but only if we recognize there is a problem. Otherwise, fast thinking rules.
Exaggerated fears about depleted uranium are common, largely because of the very powerful alarm bells the term sets off in people’s minds. Compare your emotional, gut reaction to the phrase “depleted uranium” with your reaction to “depleted tungsten.” There is no such substance as depleted tungsten, but tungsten and depleted uranium are both heavy metals with similar health effects, yet few people have a panicked reaction to the word “tungsten.”
Broader Implications of Kahneman’s and Tversky’s Insights
Kahneman wrote, “System 2 [slow, logical thinking] believes that it is in charge and that it knows the reasons for its choices. … your subjective experience consists largely of the story that your System 2 tells itself about what is going on.” (pp. 56, 57) But, as we have learned, System 1 (fast, intuitive, automatic thinking) is actually in charge much of the time, without our logical mind being aware of this.
As noted above, I believe Kahneman and Tversky’s insights have great relevance for understanding why some misinformation and disinformation claims are widely believed (due to defects inherent in intuitive, fast, automatic thinking) and how to counter this (by engaging slow, logical, effortful thinking).
The topics characteristic of dis/misinformation often play on very powerful negative associations, as I noted in the Global Engagement Center’s Counter-Disinformation Dispatch #13, on “Exploiting Primal Fears,” in which I stated, “One way to make a disinformation story go viral is to shape it around what people fear most.” I gave several examples:
Fear of “the other”
Alleged violations of taboos
Fears about children being victimized
Depleted uranium
Alleged biological weapons
In these cases and others, primal fears activate “hot buttons” in people’s minds, setting off a cascade of fear-based associations, which can easily overwhelm the logical part of our minds.
The problem with a countering approach that relies exclusively or predominantly on fact-checking is that it may not take account of the fact that powerful negative associations can bypass logic, as shown in Kahneman and Tversky’s “Linda problem.” We mistakenly assume that our logical self is always engaged; in fact, powerful associations can easily overwhelm it, and slow, logical thinking may not have a chance to intervene unless it is specifically prompted, which is often not easy to do.
Relevance for Conspiracy Theories
Such “cognitive illusions,” as Kahneman calls them, may be at the root of many conspiracy theories. We are all familiar with how Western intelligence agencies are often cast as the suspected villains when unexpected catastrophes occur. I remember how the Soviets took advantage of this strong negative associative tendency in 1987. In August 1986, there was a “limnic eruption” at Lake Nyos in Cameroon. A limnic eruption is “a very rare type of natural disaster in which dissolved carbon dioxide suddenly erupts from deep lake waters, forming a gas cloud capable of asphyxiating wildlife, livestock, and humans.” At Lake Nyos,
the eruption triggered the sudden release of about 100,000–300,000 tons … of carbon dioxide. The gas cloud initially rose at nearly 100 kilometers per hour (62 mph) and then, being heavier than air, descended onto nearby villages, suffocating people and livestock within 25 kilometers (16 mi) of the lake.
This disaster killed 1,746 people and 3,500 livestock.
In 1987, the Soviets falsely claimed that the CIA had caused the Lake Nyos disaster. This was an example of what might be called “no brainer” disinformation—blame any unexplained, mysterious disaster on the CIA (or the Pentagon). Unfortunately, as Kahneman and Tversky’s findings about the power of associations would predict, such nonsensical claims can be widely believed.
Child Organ Trafficking Allegations
From 1987 to 1996, no false story kept me busier than the horrifying claim that Americans (or others) were adopting or kidnapping children from Latin America (or elsewhere) in order to murder them and transplant their organs to save their own, natural-born children. I debunked these false claims in a U.S. Information Agency paper, “The ‘Baby Parts’ Myth: The Anatomy of a Rumor,” last updated in May 1996.
I plan to discuss this debunking effort and the lessons that can be learned from it in more detail in a future post, but one key to successful debunking was to engage the logical mind by framing the phony story as an urban legend, a false but widely believed story, before introducing factual evidence about why clandestine organ transplants are a logical impossibility. This framing cued the logical part of the mind to pay special attention.
I owe a great debt of gratitude to French folklorist Veronique Campion-Vincent, who studied organ theft rumors extensively and, in a conversation in Washington in the early 1990s, convinced me to view the false child organ trafficking allegations as an urban legend. She authored the book Organ Theft Legends.
Daniel Kahneman laid out the intellectual framework that undergirds effective debunking efforts in Thinking, Fast and Slow. I believe practical workers engaged in debunking would benefit from reading Kahneman’s book closely and reflecting on ways to prompt people to engage their slow, logical thinking mode. As Kahneman showed, that engagement does not happen automatically, yet it is the best way to counter many of the cognitive mistakes that fast but careless intuitive thinking can make.


