The Life-Changing Science of Detecting Bullshit by Petrocelli

Ref: John Petrocelli (2021). The Life-Changing Science of Detecting Bullshit. St. Martin’s Press.

___________________________________________________________________________

Summary

  • What this book is all about: getting you to recognize that it is to your distinct advantage to seek evidence and truth rather than permitting people to continue bullshitting you, leading you to buy into belief systems whose centers do not hold. By the end of this book, if you can honestly say that you ask yourself and others important critical-thinking questions that you wouldn’t have asked otherwise, then I will have accomplished my primary goal.

  • We live in a world that pays more and more attention to fake news, social media opinion pieces, and intriguing but unsupported theories, and less attention to science, skepticism, and good old-fashioned critical thinking. I will guide us through how and why bullshit occurs, which is useful at a time when evidence-based reasoning and rational judgment are failing to keep up with the bullshit generated in this era of mass and rapid communication.

  • It is my hope that you are motivated to lock arms with allies interested in evidence and truth and in joining the great struggle against bullshit. The truth is, a world with less bullshit won’t happen without a collective effort. We must make calling bullshit part of our communication culture. It is the responsibility of each of us to search for the truth, discern fact from fiction, and communicate what we know to be true, not just what we want or hope to be true. Fighting bullshit on an individual level contributes to the collective struggle to rid our culture of bullshit. 

___________________________________________________________________________

Intro: What is Bullshit?

  • Bullshit: Foolish, deceitful, or boastful language; something worthless, deceptive, or insincere; insolent talk or behavior. 

  • Bullshitting: To speak foolishly or insolently; to engage in idle conversation; to attempt to mislead or deceive by talking nonsense; intentionally or unintentionally, consciously or unconsciously, communicating with little to no regard or concern for truth, genuine evidence, and/or established knowledge; using rhetorical strategies designed to disregard truth, evidence, and/or established knowledge, such as exaggerating or embellishing one’s knowledge, competence, or skills in a particular area or talking about things one knows nothing about in order to impress, fit in with, influence, or persuade others.

    • The only way to determine if the used-car dealer was bullshitting or lying to you is to discover his level of concern with the truth. If the dealer knew the truth, but communicated something other than the truth, then he lied to you, but if he didn’t care at all about the truth, he was bullshitting. 

  • Bullshitter: A relatively careless thinker who plays fast and loose with ideas and information.

  • Scientific Process: First, scientists observe things and develop theories and testable explanations for what they see. These are called hypotheses. Scientists are concerned with genuine evidence relevant to their hypotheses. Genuine evidence is information that reasonably indicates whether a claim, belief, or proposition is valid. Scientists make predictions based on their hypotheses and test them with fair experiments that are designed to put their hypotheses to the most stringent tests possible. Scientists don’t just seek evidence to confirm their hypotheses; rather, they bend over backward to seek evidence that might refute their hypotheses. When a hypothesis has survived very stringent tests of this type, carried out by the proposer and by other, independent scientists, then, and only then, can we draw the tentative conclusion that the hypothesis is probably approximately true. The results are then replicated through additional experiments. It is perfectly normal for scientists to change their conclusions and opinions after learning new information. This is not a sign of weakness; it is, in fact, an essential feature of the scientific method. When scientists later publicize their experiment-based conclusions, you can be sure that they will be scrutinized. Dozens, if not hundreds, of qualified experts will ask: Are the premises true? Are the conclusions supported by all of the data? Are the arguments and conclusions logically strong? Were all relevant factors considered? It is this stage of critical scrutiny that fortifies the strength of the scientific process.

    • A distinguishing feature of science is that its claims are expected to be verifiable. 

    • Public misunderstandings about science often follow a familiar route: Scientist conducts an experiment and makes preliminary claims (for example, “eating and digesting celery burns more calories than it adds”) → Reporters take the idea and grossly overgeneralize the findings, adding cute titles to their articles to get attention (for example, “Celery, the New Wonder Food”) → Public consumes incorrect or misleading interpretations of science → Scientist attempts to correct the misinterpretations for the next two decades → Public rarely gets the full story and recalls the incorrect or misleading interpretations → Public feels that scientists change their stories and call “bullshit” on science altogether, dismissing and distrusting any subsequent scientific claims that challenge their ideologies or interests. 

  • Better information doesn’t always result in better decision-making, but better decision-making almost always requires better information.

___________________________________________________________________________

1: Costs of Bullshit

  • Usually, people think of their judgments and decisions as being the result of reason, but often it is the other way around. Intuitions and feelings can shape judgments and decisions, and reasoning comes afterward to support those judgments and decisions…The intuitive way of thinking is often favored because relative to a more formal reasoning system, it is faster, takes less effort, and can draw on context. 

  • Many of our memories, beliefs, attitudes, and decisions are based on bullshit rather than evidence-based reasoning. 

___________________________________________________________________________

2: Bullibility

  • Bullibility: Accepting bullshit as fact by failing to infer from available social cues that the bullshitter has a disregard for the truth or has failed to take reasonable action to find truth. 

  • Bullible: Prone to being influenced by arguments not grounded in genuine evidence or a concern for truth.

  • Schemes usually fall apart for one of three reasons: the operator escapes detection and runs off with the remaining investment money, the flow of incoming money dies out as a result of failing to find new investors, or as in both Ponzi and Madoff’s cases, too many investors simultaneously begin to pull out of the fund and request their returns. 

  • Big Five Personality Traits: Agreeableness, extraversion, openness, conscientiousness, and neuroticism. 

    • A person high in agreeableness is often described by others as warm, friendly, tactful, helpful, selfless, sympathetic, kind, considerate, and trusting. They tend to hold optimistic views of human nature and get along well with others. Highly agreeable people tend toward conformity, avoid violating social norms, eschew upsetting other people, and comply with social expectations. Agreeable people tend to see others through rose-colored glasses, trying to find the positive side in everyone. By definition, agreeable people often have trouble saying no. 

    • In replicating Milgram’s procedures, Bègue included a Big Five measure of personality. As expected, those highest in agreeableness were the most likely to continue delivering painful electric shocks at the prods of the game-show host. Being pleasant, warm, and nice, as highly agreeable people tend to be, conflicts with an ability to critically analyze and discard bullshit. This is why the techniques employed by bullshitters should be particularly effective with agreeable people. Agreeable individuals may be more likely to conform to bullshit by endorsing the bullshit publicly and obeying suggestions more readily than others. 

  • Persuasion and influence always work best under conditions of uncertainty because uncertain people are looking for answers and clarity, and frankly, they usually don’t know enough to detect bullshit. 

  • Arguments v. Evidence

    • Argument: A reason or theory provided for a claim. 

    • Evidence: The available body of information and facts supporting a belief or proposition as true or valid. 

    • Very few people form and communicate their beliefs after a rigorous consideration of evidence and existing knowledge. Although arguments for one’s beliefs are often misconstrued as evidence, most attitudes and beliefs are formed on the basis of subjective, emotional reactions rather than a fair assessment of evidence. 

    • When people are asked to explain why they think something is true or why they think something works, it tends to elicit the relatively easier goal of providing opinions and mere arguments. However, when people are asked to explain how they know something is true or how something works, it tends to elicit the relatively challenging goal of providing evidence and proof. Practice asking “how” questions. 

  • The two general styles of thinking are intuitive and reflective thinking. Intuitive thinking is a relatively effortless form of thinking that relies on instinct and reaching decisions quickly on the basis of automatic reactions. Intuitive thinkers often go with their gut feelings. Reflective thinking is the opposite: it requires considerable analytic effort, deliberation, and systematic reasoning. Reflective thinkers often question their first instincts and consider other possibilities. People who score high on the Cognitive Reflection Test (CRT), i.e., reflective thinkers, also tend to accurately discern fake news from real news, even for news headlines that align with their political ideologies.

___________________________________________________________________________

3: When and Why People Bullshit

  • Why Do People Bullshit?

    • The production of bullshit is stimulated whenever a person’s obligations or opportunities to speak about some topic are more excessive than his knowledge of the facts that are relevant to that topic.

    • People who have some knowledge about a topic or no knowledge about a topic appear equally willing to bullshit when they feel obligated to provide their opinions. 

    • Implicit social pressures to have an opinion can create an obligation to provide one.

    • There are at least four individual factors that impact a person’s motivation to bullshit: the need for evidence; one’s level of domain-specific knowledge; a desire for attention, fame, or wealth; and a need to belong. 

    • If unknowledgeable people feel knowledgeable about a topic, they are especially likely to bullshit. And people are surprisingly bad at assessing their own competence and knowledge. It is true that people who say they have knowledge are less likely to report that they bullshit in their respective domains of “expertise”—but don’t take their word for it. In fact, people are likely to bullshit when they feel more knowledgeable about something than their audience. That is, a feeling of knowing, not actual knowledge, can be enough to produce bullshit. 

  • High-Need-for-Evidence Individuals: People who determine whether something is good or bad only after weighing the evidence. They tend to refrain from contributing their thoughts to a discussion until they have all the facts. People with a high propensity to bullshit possess none of these evidence-based goals. 

  • Low-Need-for-Evidence Individuals: People who appear willing to fabricate evidence or data. They often refer to data that only they have access to—and what often comes out is bullshit. 

  • People who are unskilled in a domain lack the ability to distinguish competence from incompetence. 

___________________________________________________________________________

4: Bullshit Artists

  • Tactics employed by bullshit artists

    • Disregard all evidence that disproves the claim.

    • Derail natural skepticism by distracting people from disconfirming evidence and dazzling everyone with anecdotal evidence.

    • Focus on unreliable, anecdotal evidence that supports the claim. The more anecdotal, the more convincing.

    • Focus on specific, identified victims of unfortunate events over unidentified, general, or “statistical” victims. 

    • Conflate observed relationships (correlation) with the idea that one thing causes the other (causation). 

    • Indirectly attack their critics’ positions using the straw man technique.

  • Dunning-Kruger Effect: In one study, Kruger and Dunning had Cornell University undergraduates complete a 20-item multiple-choice test of English grammar. Instead of giving the students feedback on their performance, they asked them to rate their overall ability to recognize correct grammar and how their performance compared with that of their peers. In this way, Kruger and Dunning could see if those who did poorly would recognize their poor performance. Four to six weeks later, participants received a packet of five of their peers’ completed, but not scored, tests. The packet reflected the range of performances that their peers had achieved. Participants then graded each test. Participants were then shown their own test and asked to once again rate their own ability and test performance relative to their peers. Kruger and Dunning were especially interested in what students who performed in the bottom and top quartiles thought about their abilities and performance. Consistent with the now-famous Dunning-Kruger effect, students in the bottom quartile grossly overestimated their test performance and ability. Although their actual test scores put them in the 10th percentile, they initially estimated themselves to be in the 66th percentile. After they reviewed their peers’ tests, they readjusted their perceived grammar knowledge to be at the 63rd percentile. Students in the top quartile underestimated their test performance and ability. Although their actual test scores put them in the 88th percentile, they initially estimated themselves to be in the 71st percentile. But after they reviewed their peers’ tests, top-quartile participants readjusted their perceived grammar knowledge up to the 77th percentile.
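
    • A minimal simulation can illustrate the quartile pattern (hypothetical numbers, not Kruger and Dunning’s data): if self-estimated percentiles track actual standing only weakly and anchor near a flattering point above average, bottom scorers overestimate and top scorers underestimate their standing.

      ```python
      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical model: actual percentile ranks are uniform by definition;
      # self-estimates follow the truth with a shallow slope (0.25), anchor
      # near the 60th percentile, add noise, and are clipped to 0-100.
      n = 1_000
      actual_pct = rng.uniform(0, 100, n)
      perceived_pct = np.clip(60 + 0.25 * (actual_pct - 50) + rng.normal(0, 10, n), 0, 100)

      for name, mask in [("bottom quartile", actual_pct < 25),
                         ("top quartile", actual_pct >= 75)]:
          print(f"{name}: actual {actual_pct[mask].mean():.0f}th percentile, "
                f"perceived {perceived_pct[mask].mean():.0f}th percentile")
      ```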

___________________________________________________________________________

5: Bullshit Detection

  • Bullshit Detection

    • Collect Data: Have I obtained and reviewed the right types, amounts, and levels of information to comprehend and evaluate the claim? 

    • Recognize Potential Bias: Have I treated the claim or its implications like ideas, refraining from assuming it to be true or false on the basis of my emotional reactions before evaluating the evidence? 

    • Minimize Bias: Have I accurately identified the positions, arguments, and conclusions of the claim and the degree to which any assumptions are sensible, false, or unfair? Have I considered and fairly weighed evidence that opposes, and evidence that supports, the claim? 

    • Assess the Validity of Conclusions: Have I reviewed and drawn logical, valid, and justifiable conclusions provided by the relevant evidence of multiple, independent viewpoints or sources? 

    • Elaborate and Apply: Can I reasonably draw my own well-informed, well-reasoned, and compelling conclusions from the evidence to form arguments convincing to other critical thinkers? 

    • Skepticism: Develop an attitude of skepticism and a practice of questioning; bring a healthy element of polite doubt when evaluating evidence for or against a claim. 

    • Interpret the Claim: What does the claim mean? How is it meant to be understood? Is there anything unclear, ambiguous, or not understood about the claim? How can the claim be best characterized and classified? 

    • Engage in Inference: What does the evidence imply? If the claim is true, what are the implications moving forward? If major assumptions supporting the claim are abandoned, how does the claim’s truth stand? 

    • Evaluate: How compelling is the evidence supporting the claim? Does the claim follow from a reasonable interpretation of the evidence? Do the results of relevant investigations speak to the truth of the claim? 

  • Four assumptions

    • You will encounter bullshit claims frequently—especially when an individual is motivated to influence you. 

    • Individuals rarely recognize and fully appreciate the complexity of the phenomena or systems they’re bullshitting about. 

    • Individuals do not always take the time and effort necessary to evidence-proof their claim(s). 

    • You need to ask questions to verify if claims are based on evidence or bullshit—treat ideas as ideas, not facts. 

  • 1) Question the Claim: What precisely is the claim? Does the claim rest on underlying assumptions or apply only to a certain group of people or in certain situations? Are there any unfair apples-to-oranges comparisons being made? Is the claim extraordinary? If so, is there extraordinary evidence in support of it? Does the claim sound too good or too bad to be true? Of all the ways the individual could have possibly stated the claim, why did they state it this way? 

  • 2) Assess the Individual Communicator: How credible is the individual making the claim? What is their expertise in the area? Do they appear concerned with evidence supporting their claim and willing to engage in a critical conversation without unfairly dismissing evidence that might oppose their claim? How does the individual know this? What is the individual’s motivation for sharing this claim with me? Are there any reasons for them to bullshit? Are they employing rhetorical signs of bullshit: “Some people say…,” “I read somewhere…,” and/or shifting the semantic goalposts when asked for clarification; inaccurately representing competing claims, explanations, and arguments; using tautological reasoning (repeating an idea or statement using different words that essentially say the same thing); or using pseudo-profundity? How does the individual react when asked questions? (Properly diagnose defensive responses by the individual. The individual could sound defensive or irritated for a number of reasons—they may sense that their bullshit was detected, they may feel discomfort realizing they were bullshitting, or they may be frustrated because you don’t appear to comprehend their claim.) 

  • 3) Evaluate the Claim and Its Evidence: What exactly are the plausible reasons for the claim? Do any reasons contradict common knowledge, such as mathematical or scientific laws, semantic or empirical knowledge? How strong or compelling is the relevant evidence? What problems does the claim have? What doesn’t the claim explain? How might the claim be wrong? What significant information might have been omitted? Are the statistics or how they’re displayed deceptive in any way? Are any mere arguments or anecdotal evidence being treated as genuine evidence? Is it possible to conduct a simple thought experiment to test the claim? Given the available evidence and established knowledge, what are the most reasonable conclusions? 

  • 4) Assess the Self: Are my conclusions about the claim based on genuine evidence? Have I sought any information beyond what I was given? Would someone say that I’ve thought this through like a fact-checker? Is there anything about the individual that would cause me to look beyond the substance of their claim and its evidence (for example, attractiveness, liking, disliking, etc.)? Am I being objective in my assessment of the evidence? Have I tried another perspective? Have I honestly considered the possibility that the opposite of the claim is true or am I jumping to conclusions? If I accept or reject the claim as true, what other beliefs of mine might need to change? Do these implications affect my evaluation of the claim’s validity or its evidence? 

  • 5) Clarify the Claim: What do you mean by ___? What can ___ look like? Is there another way to state ___? When and where is ___ most likely to occur? Could you give me an example or illustration? Are there specific conditions under which ___ is true and conditions under which ___ is not true? Ask how questions. 

  • 6) Evaluate the Evidence: How do we know ___ to be true? Why should we believe that ___ is so? What sort of evidence supports the conclusion about ___? What is the best single piece of evidence for ___? How could we check that to find out if that is true? Has anyone tested ___ with an experiment? What were the results of any experiments? Can you tell me a bit about the methods used in those experiments? How were the data collected? Were the right metrics for drawing valid conclusions used? Attack the claim, not the person. 

  • Devil’s Advocate with Disconfirmatory Questions

    • How might ___ be wrong? (Don’t ask, “How might you be wrong?” Keep the discussion about the claim.) 

    • Under what conditions could ___ be false? What would it take for ___ to always be true? What would it take for ___ to be rejected? 

    • What sort of evidence might someone provide—even hypothetical evidence—that would show a need to revise the claim? Have you, or has anyone else, considered alternative possibilities, like ___ or ___? 

    • Suppose ___ were tested and failed. Why might it fail? What kind of evidence would it take for you to conclude your claim was wrong?

    • Is confidence in ___ justified by available evidence? Does available evidence lead us to ___? 

    • Have you considered or ruled out any viable alternative hypotheses?

  • Don’t expect questioning bullshit to result in the pinnacle of rationality and logic. The goal is to reveal understandable errors in reasoning. Doing so is a more efficient method of getting to the truth than making accusations. 

___________________________________________________________________________

6: Expert Bullshit Detectors

  • What distinguishes a high-performing expert from the highly experienced professional is that the expert understands two things. First, experts understand the complexity of the sample space. Second, experts describe operating in their domain much like an investigator or scientist would. Experts know how to think critically about their domains of expertise.

  • Habits of Critical Thinking:

    • Having a passionate drive for clarity, precision, accuracy, relevance, consistency, logic, completeness, and fairness.

    • Having sensitivity to the ways in which critical thinking can be skewed by wishful thinking.

    • Being intellectually honest, acknowledging what they don’t know and recognizing their limitations.

    • Not pretending to know more than they do and not ignoring their limitations.

    • Listening to opposing points of view with an open mind and welcoming criticisms of their beliefs and assumptions.

    • Basing beliefs on facts and evidence rather than on personal preference or self-interest.

    • Being aware of the biases and preconceptions that shape the way the world is perceived.

    • Thinking independently and not fearing disagreement with a group.

    • Getting to the heart of an issue or problem without being distracted by details.

    • Having the intellectual courage to face and assess ideas fairly even when they challenge basic beliefs.

    • Loving truth and being curious about a wide range of issues.

    • Persevering when encountering intellectual obstacles or difficulties.

  • Self-Regulation: Assessing one’s own motivations and biases and asking whether these influence one’s interpretations, analyses, inferences, and evaluations of a claim. Self-regulation works best when engaging in metacognitive thought (thinking about one’s thoughts) by answering questions such as: How good was my method in evaluating the claim? Are my conclusions based on evidence and data, or are they based on anecdotal evidence or what I read in the news? Is there anything I might be missing (or wanting to miss), and are my conclusions about the claim motivated by something other than the truth in any way? 

___________________________________________________________________________

Conclusion: Life without Bullshit

  • The science of detecting bullshit may not change society—but it can have a life-changing impact on you personally. By adopting a critical posture and the power of inquisitive questioning, you awaken the natural scientist and critical thinker inside yourself. If enough people join the collective stand against bullshit, our world will become a very different place. We won’t have to listen to people talk about things they know nothing about. We won’t be exposed to baseless arguments. We won’t have to rely on incompetent people trying to do important jobs. Rather, we will collectively replace bullshit with evidence-based communication and reasoning, making more rational decisions based on facts, evidence, and reality. 

  • To get people to stop spreading bullshit, misinformation, fake news, and the like, we will have to get comfortable asking bullshitters, How do you know this to be true? To the best of your knowledge, is the claim accurate? What sort of evidence supports your conclusion? 

  • Social Norms: Unwritten rules of behaviors that are considered acceptable in society.

    • Descriptive Norms: What we think most people would do in a particular situation.

    • Injunctive Norms: What we think we should do in a particular situation.

    • Calling bullshit is currently an injunctive norm—it is what we should do when we are exposed to bullshit. For calling bullshit to become more commonplace, calling bullshit will either need to transform from an injunctive norm to a descriptive norm or the injunctive norm must become a more salient feature of our society. 

  • Two ways people influence others

    • Normative Social Influence: A result of our social desire to be liked and accepted by other people, which stems from our human identities as social beings with affiliation and companionship needs. We go along with what other people do in order to be liked and accepted by them. We often conform to a group’s beliefs and behaviors publicly without necessarily believing what other people are doing or saying. 

    • Informational Social Influence: Our need for information and our tendency to believe what others tell us results in conformity to what others are doing or saying. These pressures of conformity drive the widespread acceptance and use of bullshit as an acceptable form of communication. 

  • Littering Study: The baseline percentage of littering when the experimenter had modeled neither the descriptive nor injunctive norm was about 38%. It didn’t matter if the area was clean (37%) or a mess (38%). When the descriptive norm was modeled by the experimenter littering, it communicated two different messages, depending on the state of the parking lot. When the parking lot was littered, the experimenter’s behavior reminded patrons that people often litter here. The experimenter served as just one more example of the type of behavior that leads to a messy parking lot in the first place. In this situation, patrons littered about 30% of the time. In the clean parking lot, however, the experimenter’s littering behavior communicated a different message. Now the behavior stood out as unusual, by reminding patrons that most people don’t litter in this area—which is why it looked so clean. In this situation, patrons littered 11% of the time. However, when the injunctive norm was made salient by seeing the experimenter picking up someone else’s litter—suggesting that littering is wrong—littering among patrons was substantially reduced to about 5% in both the clean (7%) and littered (4%) parking lots. 

  • Cialdini’s Energy Study: Neighborhood households were first identified as consuming energy either above or below the average for the neighborhood. Some households were mailed a basic postcard highlighting the descriptive norm by detailing how much energy they had used that week and how much energy the average household in their neighborhood had used. Other households were mailed a more detailed postcard highlighting both the descriptive and injunctive norms by including a smiley face if they had consumed less energy than the average household or a sad face if they had consumed more energy than the average household. Weeks later, Cialdini measured energy usage again. Households mailed the basic postcard significantly cut back and conserved energy if they had been consuming more energy than average, but significantly increased usage if they had been consuming less energy than average. However, households mailed the more detailed postcard, which also highlighted the injunctive norm, significantly cut back and conserved energy if their usage had been above average and did not significantly increase their usage if it was already below average. A simple smiley face reminded people to either join the rest of the neighborhood in conserving energy or that they were already doing the right thing and to keep on doing it.

    • Cialdini’s research demonstrates that what we think we should do (injunctive norms) can become more powerful than what we usually do (descriptive norms) and produce more desirable behavior. 

___________________________________________________________________________

Examples of Bullshit

  • Essential Oils: In a systematic review of 201 published studies on essential oils as alternative medicines, only 10 were found to be of acceptable methodological quality, and even these 10 were still weak by scientific standards. As for the ability of aromatherapy to alleviate hypertension, depression, anxiety, pain, and dementia, the authors of the review found the evidence unconvincing.

  • Diamonds: Ever since the early 1940s, the De Beers Diamond Syndicate has fabricated the illusion of diamond scarcity, justifying the cost of diamonds by carefully restricting their supply (they’ve tried to purchase all the diamond mines). Lab-grown and natural diamonds are virtually the same chemically and visually. 

  • MBTI: Developed by Katharine Cook Briggs and her daughter, Isabel Briggs Myers, in the 1940s as a game loosely based upon a conceptual theory by Swiss psychiatrist Carl Jung. Importantly, Jung did not base his theory on systematic data. Jung speculated that people experience the world using four principal psychological functions: sensation, intuition, feeling, and thinking. Presumably, these functions are modified by two personality types (extroversion and introversion), and individuals tend to rely on one dominant function over the others.

    • The MBTI purports to offer a psychological breakdown of how respondents perceive the world and make decisions. Respondents are asked to choose one of two possible answers to each of 93 questions. Based on their answers, respondents’ personalities are described by a four-letter code that represents four areas: whether one is outwardly (Extraverted) or inwardly (Introverted) focused; if one prefers to take in information through Sensing or INtuition; if one prefers to make Thinking- or Feeling-based decisions; and if one prefers to live one’s outer life as Judging (preference for conformity to an established structure, neat and orderly) or Perceiving (preference for a flexible, spontaneous, and adaptable lifestyle). 

  • Bernie Madoff: Founded in 1960, Bernard L. Madoff Investment Securities, LLC (BLMIS), operated as a securities broker-dealer in the United States and abroad, providing stock market executions for banks, broker-dealers, financial institutions, and wealthy investors. By 1989, the company had become one of the largest independent trading operations in the securities industry; by 2005, it was handling 5% of the trading volume on the NYSE. But in December of 2008, BLMIS was liquidated. One of the most esteemed men in the financial industry, former NASDAQ chairman, manager of three successful companies, and founder of BLMIS—Bernard Madoff—was accused of, convicted of, and sentenced to 150 years in prison for perpetrating the largest fraud in US history. Instead of running a legitimate investment fund, Madoff was operating the nation’s longest-running Ponzi scheme.

    • Madoff claimed to achieve his returns by using a split-strike conversion strategy. This involved purchasing blue-chip stock shares of companies listed within the S&P 100 or S&P 500 (that is, large corporations listed on stock exchanges in the United States with a reputation for quality, reliability, and profitability in good and bad markets) and then insuring the stocks with put options, which gave him the right to sell at a specified price by a specified date. By allegedly purchasing put options, Madoff could further obscure his operation by making it harder to determine at what prices he was allegedly selling his stock shares. Madoff could hide the footprints of his alleged stock trades in this way because he wouldn’t be expected to explain how his hedge funds consistently outperformed all the others on the basis of published stock prices. 

    • Madoff’s scheme began to deteriorate when his investors requested a total of $7B back in returns when he only had $300M left in the bank to give back. 

  • Ponzi Scheme: An investment scheme that lures in new investors by guaranteeing unusually high returns. Named for con artist Charles Ponzi, who cost investors ~$20M (~$200M today).
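
    • A toy cash-flow sketch (all numbers hypothetical) shows why such schemes must eventually collapse: the balances investors are promised compound at the advertised rate, while the operator’s real cash grows only by what new investors pay in.

      ```python
      # Toy model: 'owed' is what investors believe they hold (compounding at the
      # promised rate); 'cash' is what the operator actually has (inflows only).
      # Redemptions and skimming are ignored; both only widen the gap faster.
      promised_rate = 0.20                        # advertised annual "returns"
      inflows = [100, 150, 200, 120, 60, 20, 5]   # new investment dries up over time
      owed = cash = 0.0

      for year, new_money in enumerate(inflows, start=1):
          owed = owed * (1 + promised_rate) + new_money  # paper balances compound
          cash += new_money                              # real cash only accumulates inflows
          print(f"year {year}: investors believe ${owed:,.0f}M, real cash ${cash:,.0f}M")

      # The growing gap between owed and cash is an unpayable hole: any wave of
      # redemption requests larger than the cash on hand ends the scheme.
      ```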

  • Autism: A developmental disorder characterized by difficulties with social interaction and communication and restrictive and repetitive behavior. It is diagnosed in 1.5% of children in developed countries. 

    • In Wakefield’s original study, he examined 12 autistic children. He collected case histories, blood tests, colonoscopies, and spinal fluid from the children, and his findings revealed that eight of the children had received the MMR (measles, mumps, and rubella) vaccine shortly before experiencing developmental delays. Wakefield published his findings in the highly respected medical journal The Lancet. Shortly following the paper’s publication, Wakefield held a press conference where he openly criticized the “triple‐jab” MMR vaccine. Wakefield claimed that the triple‐jab vaccine could affect children’s immune systems, speculating that the measles virus in the vaccine caused proteins to leak from the intestines and impair neurons in the brain. Although Wakefield’s publication never actually claimed that the MMR vaccine caused autism specifically, he did make this claim consistently afterward. 

    • By 2011, investigative journalist Brian Deer had enough data to prove that his accusations against Wakefield were correct. As it turned out, 5 of the 12 children in Wakefield’s study had case histories reflecting developmental issues prior to getting the vaccine, and three of the children didn’t have autism at all. Most damning was the fact that before his 1998 article in The Lancet, Wakefield had been funded by the Legal Aid Board, which was financing lawyers planning to bring a lawsuit against vaccine manufacturers—an apparent conflict of interest. Had he been like other scientists, Wakefield would have disclosed any conflicts of interest, but he didn’t identify any in his report.

  • Intermittent Fasting: Krista Varady, a professor of kinesiology and nutrition at the University of Illinois at Chicago, offers some clarity. Varady has run dozens of clinical trials on weight loss and fasting in hundreds of people. Her data show that people do lose weight through intermittent fasting methods. However, her data also show that intermittent fasting does not cause weight loss by putting the body in a special fat-burning mode (as intermittent-fasting promoter Cynthia Thurlow suggests). Rather, her data suggest that people on intermittent fasting diets lose weight because they eat less overall—they are consuming fewer calories.

    • Autophagy: The process by which the body replaces old and damaged cellular components with new ones. 

  • Pennington Plan: Dr. Andrea Pennington, author of The Real Self Love Handbook and The Pennington Plan, is one such example. Pennington followed the latest fad in fast-tracking a career. The track includes five simple steps: (1) get an advanced educational degree from anywhere; (2) read a few books (possibly in quackery similar to what you will try to produce); (3) write a book or two of your own; (4) set up a super fancy website to sell the book(s); and (5) give a clichéd self-help TED Talk to market the book(s)—all to create an illusion of credibility to score guest speaker gigs for thousands of dollars. 

  • Sugar Hyperactivity: People continue to insist that giving children sugar makes them hyperactive, despite the fact that virtually all tests show that sugar does not cause hyperactivity. 

  • Flat Earth: When online pollster YouGov asked over 8,000 US adults, “Do you believe that the Earth is round or flat?,” only 84% of respondents felt certain that the Earth is round. A total of 5% expressed doubts, 2% affirmed a flat Earth, and 7% weren’t sure.

    • It wasn’t until after confirmed reports that Magellan and Elcano completed a circumnavigation of the globe from 1519 to 1522 that the true shape of the Earth was commonly accepted. 

___________________________________________________________________________

Problems in Thinking

  • Affect Heuristic: Making snap judgments and decisions on the basis of feelings rather than facts (i.e., going with your gut). 

  • All-or-Nothing Thinking: Endorsing a dichotomous, black-and-white style of thinking by viewing situations categorically and ignoring other possibilities. For example, “If this investment for the future isn’t a total success, it’s a complete failure.” 

  • Anchoring and Insufficient Adjustment: Estimating frequencies or likelihoods based on incomplete computations or insufficient adjustments from arbitrary initial values. 

  • Anecdotal Fallacy: Use of broad generalizations based on anecdotal evidence or vivid examples. For example, “My uncle smoked a pack of cigarettes every day and chewed tobacco for 50 years and he never got lung cancer. Therefore, tobacco products don’t cause lung cancer.”

  • Appeal to Authority: Using arguments made by an “authority figure” in support of an idea, even when the authority figure has no expertise in the area.

  • Appeal to (False) Consensus: Believing and arguing that something is true because everyone in the vicinity appears to believe or like the idea.

  • Appeal to Ignorance: Arguing that something must be true because it can’t be proven false.

  • Appeal to Nature: Arguing that something is good because it is natural or that something is bad because it is artificially produced.

  • Argument from Consequences and a Belief in a Just World: Basing the truth of a claim on its alignment with a dogmatic just-world ideology. For example, “Good things come to those who are good.”

  • Availability Heuristic: Judging the frequency of a category or the probability of an event based on the ease with which it comes to mind. For example, many believe car accidents and homicides are leading causes of death in the United States because the media devotes more coverage to homicides and car accidents than diabetes and stomach cancer, which kill twice as many Americans annually.

  • Backfire Effect: Occurs when people respond to evidence that refutes their misinformed beliefs by doubling down on their original beliefs.

  • Behavioral Contagion: The pervasive tendency of people to unconsciously copy others’ behavior. 

  • Catastrophizing: Predicting negative future outcomes without considering other likely outcomes or falsely enhancing the probability that a negative event will result in a complete disaster.

  • Cognitive Dissonance: The mental tension created when someone simultaneously holds two conflicting thoughts. 

  • Composition Fallacy/Erroneous Induction: Suggesting that a general principle is true because some parts of it appear to be true.

  • Confirmation Bias: Selectively collecting data and failing to seek out objective facts, interpreting new information to support existing beliefs, recalling details that uphold one’s preexisting beliefs, and ignoring any information that challenges one’s beliefs.

  • Confusing Correlation with Causation: Assuming that because two events or variables co-occur that one causes the other. Correlation does not imply causation.

  • Disqualifying or Discounting: Unfairly and unreasonably concluding that positive or negative evidence does not count.

  • Division/Erroneous Deduction: Recklessly arguing that because a group is believed to possess a characteristic, each member of that group must possess the same characteristic.

  • Emotional Reasoning: Making misguided conclusions that something is true because one “feels” it strongly while ignoring or discounting evidence to the contrary.

  • Equivocation: Mistakenly equating two meanings of a word/phrase or failing to see that using a word/phrase in two different ways can make the argument invalid.

  • False Analogy: Using perceived similarities to infer additional similarities that have yet to be observed.

  • False Dichotomy: Assuming there are only two sides to a claim.

  • Fixed Labeling: Putting a fixed, global label on something without considering that the evidence might more reasonably lead to a much different conclusion.

  • Framing Effect: Forming judgments and making decisions based on whether claims are framed negatively or positively. For example, “This frozen yogurt has 20% fat. Nope, I don’t want it. Ah, this frozen yogurt is 80% fat free. Yes, I’ll take three.”

    • Most often, people in a bad mood tend to process information in a more detailed and systematic manner that is conducive to detecting deception. On the other hand, those in a good mood tend to adopt a superficial processing style less conducive to detecting deception. 

    • People tend to evaluate the correctness of information based on their emotional reactions to that information. Do I like or dislike what the information says about me, my loved ones, or our future? 

  • Gambler’s Fallacy: Falsely believing that good things will occur directly following unfavorable/undesirable outcomes.

  • Genetic Fallacy: Arguing solely based on someone’s or something’s history, origin, or source, overlooking any differences to be found in the present situation. 

  • Hasty Overgeneralization: The practice of making sweeping generalizations on the basis of pure coincidence.

  • Hindsight Reasoning: Perceiving events that have already occurred as having been more predictable than they actually were. One can always explain outcomes (or data) better once the results are known.

  • Hot-Hand Fallacy: Falsely believing that good things will occur directly following favorable/desirable outcomes. For example, “I’ve won three hands of poker in a row. In the next hand I’m raising the stakes. Tonight, I can’t lose.”
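
    • Both the hot-hand and gambler’s fallacies dissolve under simulation. A minimal sketch, using fair coin flips as a stand-in for any independent game of chance:

      ```python
      import random

      random.seed(42)

      # One million fair coin flips: True = win, False = loss.
      flips = [random.random() < 0.5 for _ in range(1_000_000)]

      def win_rate_after_streak(streak_outcome):
          """Win rate on the flip immediately following three identical outcomes."""
          nxt = [flips[i] for i in range(3, len(flips))
                 if flips[i - 3] == flips[i - 2] == flips[i - 1] == streak_outcome]
          return sum(nxt) / len(nxt)

      print(f"win rate after three losses: {win_rate_after_streak(False):.3f}")  # ~0.500
      print(f"win rate after three wins:   {win_rate_after_streak(True):.3f}")   # ~0.500
      ```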

  • Illusory Correlation: Perceiving a relationship between two or more things where none actually exists or perceiving a stronger relationship than one that actually exists.

  • Illusory Truth Effect: The tendency to believe false information is correct after repeated exposure. 

  • Impossible-to-Disprove Claims: Forgetting that extraordinary claims require extraordinary evidence and equating things that cannot be effectively or entirely disproven with truth.

  • Mind Reading: Believing that one knows what others are thinking and failing to consider other likely possibilities. 

  • Outcome Bias: Basing the perceived correctness of decisions on the emotional reactions to the outcomes of those decisions.

  • Overconfidence and Bold Statements: Forgetting that just because someone makes a claim with confidence and conviction doesn’t mean that the claim is true.

  • Post-Hoc Fallacy: Illegitimately assuming that an earlier event caused a later event due to the order in which they occurred.

  • Prosecutor’s Fallacy: Assuming the conditional probability of A given B is the same as the conditional probability of B given A.
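
    • A numeric sketch (all numbers hypothetical) shows how different the two conditional probabilities can be:

      ```python
      # Suppose a forensic test matches an innocent person only 0.1% of the time.
      # P(match | innocent) = 0.001 does not mean P(innocent | match) = 0.001:
      # with one true culprit in a population of 1,000,000, most matches are
      # innocent people who matched by chance.
      guilty, innocent = 1, 1_000_000 - 1
      p_match_given_guilty = 1.0      # the culprit always matches
      p_match_given_innocent = 0.001  # false-positive rate

      expected_matches = guilty * p_match_given_guilty + innocent * p_match_given_innocent
      p_guilty_given_match = guilty * p_match_given_guilty / expected_matches

      print(f"P(match | innocent) = {p_match_given_innocent:.4f}")  # 0.0010
      print(f"P(guilty | match)   = {p_guilty_given_match:.4f}")    # ~0.0010, not 0.999
      ```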

  • Pseudo-Profundity: Use of pseudo-profound language; intentionally obscured through exaggerations, ambiguous references, insider jargon, buzzwords, and the authoritative pretense that the speaker knows about things that no one else can possibly comprehend. 

  • Representativeness Heuristic: Judging the similarity or belongingness of one thing N to a category Z based on the degree to which N is similar to the essential properties of the category Z.

  • Scientific Language: Automatically assuming that something is valid because it sounds science-y.

  • Simulation Heuristic: Judging an event based on the ease with which alternatives to reality are mentally simulated.

  • Slippery Slope Fallacy: Insisting that a relatively small first step leads to a chain of related events culminating in some significant effect.

  • Spurious Correlations from “Big Data”: Believing that statistical anomalies found in large data sets after conducting a large number of random analyses reveal the true relationships between variables.
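
    • A minimal sketch of the problem: even pure noise yields many strong pairwise correlations once enough comparisons are made.

      ```python
      import numpy as np

      rng = np.random.default_rng(0)

      # 200 variables of pure noise measured on 50 observations: no variable is
      # actually related to any other, yet fishing finds "strong" correlations.
      n_obs, n_vars = 50, 200
      data = rng.normal(size=(n_obs, n_vars))

      corr = np.corrcoef(data, rowvar=False)       # all pairwise correlations
      upper = corr[np.triu_indices(n_vars, k=1)]   # each of the 19,900 pairs once

      print(f"pairs tested: {upper.size}")
      print(f"largest |r| found in pure noise: {np.abs(upper).max():.2f}")
      print(f"pairs with |r| > 0.4: {int((np.abs(upper) > 0.4).sum())}")
      ```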

  • Straw Man Fallacy: Distorting or misrepresenting an argument in order to make it easier to dismiss/defeat; mischaracterizing an opponent’s argument or position in a distorted or oversimplified way, and then knocking down that mischaracterization.

  • Truth-Default Theory: When people are presented with a new idea, they accept that idea as true—at least temporarily—to aid their comprehension of the idea. When people communicate with each other, they tend to passively presume that another person’s communication is honest and true, independent of its actual truth. After we accept something to be true, we have a hard time unlearning it if it turns out to be false.

___________________________________________________________________________

Misc Quotes

  • “The truth may be puzzling. It may take some work to grapple with. It may be counterintuitive. It may contradict deeply held prejudices. It may not be consonant with what we desperately want to be true. But our preferences do not determine what’s true”-Carl Sagan.

  • “Facts are stubborn things; and whatever may be our wishes, our inclinations, or the dictates of our passions, they cannot alter the state of facts and evidence”-John Adams.

  • “When you want to help people, you tell them the truth. When you want to help yourself, you tell them what they want to hear”-Thomas Sowell

  • “Falsehood flies, and truth comes limping after it”-Jonathan Swift

  • “Nullius in verba (take nobody’s word for it)”- Motto of the Royal Society. 

___________________________________________________________________________

Terminology

  • Accountability: The condition of having to answer, explain, or justify one’s actions or beliefs to an audience. 

  • Best Alternative to a Negotiated Agreement (BATNA): The course of action a party will take if a negotiation fails; whoever has the stronger BATNA holds the advantage in a negotiation.

  • Credibility: A person’s ability and motivation to provide accurate and truthful information. 

  • Critical Thinking: A learned process of deliberation, fact-finding, and self-reflection used to comprehend and appropriately evaluate information in order to decide what to believe or what to do. 

  • Cultural Truisms: Beliefs that most members of a society accept uncritically and have never so much as considered defending.

  • Established Knowledge: Semantic knowledge, like the things found in dictionaries; knowledge justified by the rules of logic: if A > B and B > C, then A > C; knowledge justified by a system of information, such as 30 + 11 = 41; knowledge supported by empirical information.

  • Influencer Marketing: Marketing of brands and products by influential and well-known people, like celebrities. 

  • Markup: The difference between the cost of the good or service and its selling price.

    • Purchasing wine at a restaurant will come with a markup of about 400%. As a result, some people opt for soda instead, believing they are saving money. Yet the markup of restaurant soda is outrageously high, at over 1,000%. When the markup is beyond that which covers the costs of doing business and a reasonable profit, it is a bullshit markup. 
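
    • A quick check of the arithmetic (illustrative prices, not the book’s figures): markup is the difference between price and cost expressed as a percentage of cost.

      ```python
      def markup_pct(cost: float, price: float) -> float:
          """Markup as a percentage of cost: (price - cost) / cost * 100."""
          return (price - cost) / cost * 100

      # Hypothetical restaurant numbers in the spirit of the example above:
      print(f"wine: {markup_pct(cost=10.00, price=50.00):.0f}% markup")  # 400%
      print(f"soda: {markup_pct(cost=0.20, price=2.50):.0f}% markup")    # 1150%
      ```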

  • Neuroplasticity: The brain’s ability to change continuously throughout an individual’s life; neural networks in the brain can form and reorganize synaptic connections. 

  • Pseudo-Profundity: Pseudo-profound language; intentionally obscured through exaggerations, ambiguous references, insider jargon, buzzwords, and the authoritative pretense that the speaker knows about things that no one else can possibly comprehend. 

  • Texas Sharpshooter Fallacy: Refers to a man with a gun, but with no shooting skill. He fires a large number of bullets into the wall of a barn, proceeds to paint a bull’s-eye around a cluster of bullet holes, and declares himself a sharpshooter. 

  • Unclarifiable Unclarity Test: A test posed by philosopher G. A. Cohen that asks: would the claim have a different effect if it were reversed? If the reversed claim seems just as plausible as the original, the claim is likely unclarifiable nonsense.

  • Veblen Goods: Luxury goods whose prices do not follow the typical laws of supply and demand. They are in demand because they are expensive; e.g., fine wines.

___________________________________________________________________________

Chronology

  • 25 Oct, 2009: Suicide bombers in Iraq detonate two tons of explosives in downtown Baghdad, killing 155 people and destroying three ministry buildings (Bullshit by Petrocelli).

  • 1971: Stanford prison experiment; a team of psychologists led by Philip Zimbardo converted their psychology lab into a mock prison. Private cubicles were turned into prison cells, and college students were recruited to fill the prison and were randomly assigned to play the roles of prisoner or prison guard. After making it clear that cruelty on the part of the guards was part of their identity and necessary for the advancement of prison correction, Zimbardo and his colleagues watched what happened. What they saw was healthy, normally functioning college students turn into prisoners who suffered such intense emotional stress reactions that the mock prison began to look like the real thing. Many prisoners acted like zombies, obeying the demeaning orders of other college students who had transformed into ruthless and dehumanizing prison guards. So intense and unexpected were the transformations that the experiment, planned to last two weeks, was terminated by the sixth day. The research is an example of what occurs when one group of people is encouraged to take total power over a group of derogated others in a dehumanizing environment such as a prison (Bullshit by Petrocelli).

  • 1962: The Great Chinese Famine; due in large part to ecological destruction caused by Mao’s Great Leap Forward, ~36M Chinese starve to death; the largest famine and man-made disaster in human history (Bullshit by Petrocelli).

  • 1958: CCP Chairman Mao institutes the Great Leap Forward, an economic and social campaign to transform China from an agrarian economy into a communist society through the formation of communes. One of Mao’s plans was to increase agricultural yields and bring industry to the impoverished countryside. In addition to outlawing private farming, Mao declared war on four pests blamed for inadequate grain yields: rats, flies, mosquitoes, and sparrows (Bullshit by Petrocelli).

    • May, 1958: ~4.31M sparrows are exterminated throughout China (16-year-old Yang Seh-mun was deemed a national hero for killing 20K sparrows by strangling them with his hands). Without the sparrows, crop-eating insect populations exploded. The devastation to grain fields was so palpable that by April of 1960, Mao ended the Smash Sparrows Campaign, redirecting the fourth focus to bedbugs. In the end, the CCP made peace with sparrows, importing 250K specimens from the USSR (Bullshit by Petrocelli).

  • 1920: Charles Ponzi cons investors out of ~$20M (~$200M today) (Bullshit by Petrocelli).

___________________________________________________________________________