First, I think we should all give proper credit to whoever came up with such a nifty, intriguing title. Second, I think we should all read this book now, and until then we should refrain from watching the news, or voting, or talking to anybody about anything, because ten to one we’ve all been just a bit manipulated. This book will help protect you. This book will open your eyes. This book is where neurology and statistics have a torrid affair and produce a love child called Magic.
Magic, as I see it, is the application of uncommon knowledge to make the apparently impossible possible. If you know how to wave a wand and produce a rabbit where no rabbit was before, that obviously qualifies, but understanding how the mind works, and where our undeniable points of foolishness are qualifies, too. Advertisers and campaign coordinators regularly conjure our money right out of our wallets using these sorts of spells, and while D. Gardner never uses the word “magic,” and might well object to my use of the term in this context, it pleases me no end to think of this little book as a collection of counter-charms. And it’s my blog.
So, you need examples. I open the book, at random, to page 80, and a principle the author calls the Example Rule. The Example Rule is that things seem more real and more important if you can easily picture a specific example. This rule (as explained earlier in the book) drives the sort of silliness whereby a single freak accident can trigger fear all out of proportion to actual risk. Right after I read the book, life provided a perfect illustration when we heard that twenty-some people nationally had gotten sick from contaminated cantaloupe. My husband and I looked at each other in fear; we had a cantaloupe right on the kitchen table! Oh no! Of course, given the number of people in the United States, anything that only sickens twenty is probably pretty safe. So, back on page 80, Gardner describes a study in which highly trained forensic psychiatrists were given a patient assessment and asked whether they would release such a patient. Half the psychiatrists were told the patient had a 20 percent chance of getting violent; the other half were told that “20 out of every 100 patients similar to Mr. Jones” would get violent. Notice that these two statements mean exactly the same thing, and that you can’t get to be a forensic psychiatrist if you don’t know what “percent” means. Yet of the psychiatrists who got the abstract term “percent,” only a fifth decided not to release Mr. Jones. Of those who got the easy-to-picture twenty violent people, two fifths kept the fictional Mr. Jones locked up. And these are the guys who make decisions like this for real.
The point isn’t that forensic psychiatrists are idiots; the point is that all of us have psychological pressure-points, and that even those of us who know better will usually buckle when one of these points is touched.
The reason, apparently, is that we’re not living in the world our brains are designed for. Until the invention of the telegraph and the locomotive, communication was slow enough, and most communities were small enough, that if twenty people got food poisoning and you heard about it, they were probably your neighbors, and thus not only a large portion of a small group but also the people likely eating the same food you were. Better throw out the cantaloupe. Since the emotional mind is faster than the rational mind, such intelligent but somewhat clumsy fear can help a person play it safe until the rest of the facts are in and clear thinking becomes possible. The Example Rule is only one of several principles Gardner articulates by which our emotional minds make quick judgment calls while we wait for our intellects to catch up. They’re not bad; we just need the means to use our intellects to supplement these shortcuts, since they don’t work all the time.
Here’s where statistics become magical. The Example Rule is a hard-wired psychological tendency, and we have it because it works, or used to, anyway. It’s rooted in evolutionary trial and error, not reason, but a rational principle can nevertheless be extracted from it: the more often something has happened in the past, the more confident we can be that it will happen again. Until recently, the most familiar incidents were almost always either the most common (everyone knows somebody who got sick from bad cantaloupe) or the most extreme (no one will ever forget the year when half the tribe died from bad cantaloupe), so the Example Rule provided a neat shortcut to identifying both high-probability events and high-impact events. Probability and impact together make risk. Now look at statistics. The simplest statistic (and the only one I really understand mathematically) is risk expressed as a percent, as we saw earlier with Mr. Jones: you define the situation clearly enough that you can recognize it when it repeats, then divide the number of times patients like Mr. Jones get violent by the number of times they eat cantaloupe, and you get a fraction that expresses the risk involved in feeding cantaloupe to Mr. Jones. Then you convert the fraction to the standard denominator of 100, so you can easily compare the risk of violence in the 37 mental patients who ate cantaloupe with that of the 43 who ate watermelon without bending your brain too badly. Now, silliness aside, this is just the Example Rule made explicit. Used well, clearly defined numbers take the basically sound rationale of native human intelligence out of the shadows of the emotional mind and into the sunlight, where you can see whether anyone is monkeying with it. And checking for monkeying is exactly what all of us need to do.
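(For the mathematically curious, here is a tiny sketch of that conversion in Python. The group sizes are the 37 and 43 from the joke above; the incident counts are pure invention, just to show the arithmetic.)

```python
# Toy sketch: expressing risk as a rate per 100 (a percent) so two groups
# can be compared on the same denominator. The group sizes (37 and 43) are
# the joke numbers from the paragraph above; the incident counts are made up.

def risk_per_100(incidents, group_size):
    """Divide incidents by opportunities, then rescale to a denominator of 100."""
    return 100 * incidents / group_size

cantaloupe_risk = risk_per_100(incidents=3, group_size=37)  # hypothetical: 3 of 37
watermelon_risk = risk_per_100(incidents=2, group_size=43)  # hypothetical: 2 of 43

print(f"Cantaloupe group: {cantaloupe_risk:.1f} per 100")
print(f"Watermelon group: {watermelon_risk:.1f} per 100")
```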
The point of The Science of Fear is that under certain circumstances we are all overwhelmingly likely to make certain predictable mistakes and yet be absolutely convinced we are right. We feel right. Knowing what these common mistakes are is not enough, because we will still feel right. Feeling right will lead us to believe we must be smarter than all those idiots out there who reflexively fear yellow melons. We’re not. The key is to recognize the weak spots where the head knows better than the heart, and think carefully. Whether there is intentional monkeying or honest cloudiness, the magic in this book breaks the spell.
I will close, by way of explaining why all this is important, with the same example Daniel Gardner opened with: the attacks of September 11th, 2001. On that day, of course, airplanes full of people were appropriated for use as missiles in order to kill and terrify other people. Naturally, no one wanted to get on a plane for about a year after that; the example of planes exploding was too fresh in the collective mind, and flight seemed too dangerous. Instead, many people drove. Except that the risk of dying in a car crash is so much higher than the risk of dying in a plane crash that, according to Gardner,
“even if terrorists were hijacking and crashing one passenger jet a week in the United States, a person who took one flight a month for a year would have only a 1-in-135,000 chance of being killed in a hijacking—a trivial risk compared to the annual 1-in-600 odds of being killed in a car crash.”

Since car crashes are generally considered routine unless you’re in one, they don’t often make the news, and that risk isn’t uppermost in anyone’s mind. But since the risk of dying by car is fairly constant, more people driving means more people dying in cars; if 6,000 people drive for a year, then roughly ten will die. If 12,000 people drive, roughly twenty will die. Based on the number of extra people on the roads during the year when few people were flying, 1,595 people died in cars because they were afraid of airplanes. That's a powerful spell, and we need an equally powerful spell to counteract it.
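(If you want the proportionality spelled out, here is a quick Python sketch of that arithmetic. The 1-in-600 and 1-in-135,000 figures are the ones quoted from Gardner above; everything else simply scales with them.)

```python
# The arithmetic behind "more people driving means more people dying in cars,"
# using the annual 1-in-600 car-crash risk quoted from Gardner above.

ANNUAL_CAR_CRASH_RISK = 1 / 600          # per driver, per year (Gardner's figure)
MONTHLY_FLYER_HIJACK_RISK = 1 / 135_000  # per year, under Gardner's worst-case scenario

def expected_car_deaths(drivers):
    """Expected fatalities if this many people each drive for a year."""
    return drivers * ANNUAL_CAR_CRASH_RISK

print(expected_car_deaths(6_000))    # ~10, as in the example above
print(expected_car_deaths(12_000))   # ~20

# Even against weekly hijackings, driving would be roughly 225 times riskier
# per year than flying once a month:
print(ANNUAL_CAR_CRASH_RISK / MONTHLY_FLYER_HIJACK_RISK)
```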
This looks very good - thanks for the recommendation! I will look for it at the library.
The misuse of data, stats, and science in the media is a huge problem. Consumers need to be better educated, and we need better mechanisms to hold the media accountable.
Yes, that's true--I think that's part of the reason the book was written. Maybe one of the most effective mechanisms is for consumers to demand better media coverage? Like, maybe the education needs to come first.
By the way, the first "Caroline" comment was not me--the commenter was using a computer I have used, and accidentally signed in with my profile. I didn't thank myself for the recommendation.