Afraid of the Wrong Things

Around the world, people are grappling with the risks posed by the COVID-19 pandemic. How do our minds process that risk, and why do some of us process it so differently? This week, we talk with psychologist Paul Slovic about the disconnect between our own assessments of risk and the dangers we face in our everyday lives. 

Additional Resources:

"How Safe Is Safe Enough? A Psychometric Study of Attitudes Towards Technological Risks and Benefits," by Baruch Fischhoff, Paul Slovic, and Sarah Lichtenstein, in Policy Sciences, 1978

"Risk as Feelings," by George Loewenstein, Elke Weber, Christopher Hsee, and Ned Welch, in Psychological Bulletin, March 2001

"How Personality and Policy Predict Pandemic Behavior: Understanding Sheltering-in-Place in 55 Countries at the Onset of COVID-19," by Friedrich Götz, Andrés Gvirtz, Jon Jachimowicz, and Adam Galinsky, in American Psychologist, 2020

"Money, Kisses, and Electric Shocks: On the Affective Psychology of Risk," by Yuval Rottenstreich and Christopher Hsee, in Psychological Science, 2001

“The Importance of Prior Probabilities in Coronavirus Testing” by Paul Slovic in Medium.com, May 20, 2020 

"The More Who Die, the Less We Care: Psychic Numbing and Genocide," by Paul Slovic and Daniel Västfjäll, on ResearchGate.net/publication, September 2015

"If I Look at the Mass I Will Never Act: Psychic Numbing and Genocide," by Paul Slovic, in Judgment and Decision Making, Vol. 2, No. 2, April 2007

"Insensitivity to the Value of Human Life: A Study of Psychophysical Numbing," by Paul Slovic and James Friedrich, in Journal of Risk and Uncertainty, May 1997

"Compassion Fade: Affect and Charity Are Greatest for a Single Child in Need," by Daniel Västfjäll, Paul Slovic, Marcus Mayorga, and Ellen Peters, in PLOS ONE, 2014

"Judgment and Decision Making," by Baruch Fischhoff and Stephen B. Broomell, in Annual Review of Psychology, 2020


"Public Understanding of Ebola Risks: Mastering an Unfamiliar Threat," by Baruch Fischhoff, Gabrielle Wong-Parodi, Dana Rose Garfin, E. Alison Holman, and Roxane Cohen Silver, in Risk Analysis, 2018

The transcript below may be for an earlier version of this episode. Our transcripts are provided by various partners and may contain errors or deviate slightly from the audio.

Shankar Vedantam: This is Hidden Brain. I'm Shankar Vedantam. It's one of the most iconic movie soundtracks of all time. In 1975, a young Steven Spielberg scared the living daylights out of millions of people with Jaws.

Clip from the film Jaws: Everybody please get out of the water.

Shankar Vedantam: A great white shark terrorizes a New England beach town. As one victim becomes two and then three and then four, people respond first with denial, then fear, and finally, outright hysteria. After watching the movie, I remember being scared to even stick my toe in the ocean. And even today, when I go to the beach, I can't help but peer out at the water and ask myself, "Is that a dorsal fin?" This week on Hidden Brain, the disconnect between our fears and the real dangers we face in our daily lives. As the world grapples with a devastating pandemic, we consider how our minds assess risk. What makes us focus on some threats and not on others, and how can we use this knowledge to prepare for the future? Paul Slovic is a psychologist at the University of Oregon. For decades, he has studied how people think about risk and the mismatch between the intuitive feelings we have about risk and the way we analyze risk scientifically. Paul Slovic, welcome to Hidden Brain.

Paul Slovic: Thank you, Shankar. Glad to be here.

Shankar Vedantam: For years, Paul, the movie Jaws made people afraid of going to the beach. Did you ever think twice about swimming in the ocean after watching the movie?

Paul Slovic: I laugh, because I'm not a swimmer. I was a child in Chicago in the 1940s during the polio epidemic. We weren't allowed to go swimming, because they thought it made us susceptible to polio, so I never learned to swim very well, so I stay away from water, so I don't worry about sharks. But it was clear that many people who lived near oceans were quite worried about it.

Shankar Vedantam: There's a serious mismatch, of course, between how afraid people are of sharks and how afraid we ought to be. Sharks kill maybe five or six people a year, and that's worldwide. Meanwhile, humans kill about 100 million sharks a year. If anything, it's the sharks who should be making horror movies about us.

Paul Slovic: Right. The movie created vivid images in our mind and a sense of experience. And so that creates a sense of risk of shark attacks much more powerfully than the statistics do.

Shankar Vedantam: Yeah. And of course, this is true not just of shark attacks. It's true of all manner of things that Hollywood has told us about over the years. Everything from snakes to serial killers, the risks in our minds vastly exaggerate the actual risks of those things affecting us.

Paul Slovic: Yes. What we find is that our sense of risk is influenced by the direct experiences we have and the indirect experiences we have through media such as film or the news media that's very powerful in influencing us.

Shankar Vedantam: When people think about risk, I think many people automatically assume that risk is something that you're analyzing. You analyze what the risks are in a situation. But you and many others argue that, most of the time, when people are thinking about risks, they're actually not using analysis to evaluate risks. You mentioned a second ago that people use their feelings. Can you talk about this idea that, for many people, our emotions, our affect, is closely tied up in our perceptions of risk?

Paul Slovic: We originally thought that people were analyzing risk, doing some form of calculating in their minds about what the probability of something bad happening would be and how serious that would be, and perhaps even multiplying the severity of the outcome by the probability to get some sort of expectation of harm. As we started to study this, we found out that, basically, we can do those calculations, but it's certainly easier to rely on our feelings. It's easy to do, it feels natural, and it usually gets us where we want to go, except when it fails. And there are certain ways that our feelings deceive us, and that's what my colleagues and I study: when can we trust our feelings, and when should we stop and think more carefully and reflectively and look to data and argument and science to make a decision?

Shankar Vedantam: I was remembering one time, I was in Costa Rica, I believe, and we were going zip lining. You're attached to this wire that's about 200 feet above the Earth. And I remember the moment at which I was about to step off this ledge. I was just gripped with this sense of lunacy, that what I was doing was absolutely insane. At that point, of course, I was not calculating, "What are the risks that the rope will break? What are the odds that the harness will come loose?" It was entirely what I felt in my stomach that essentially told me, "This is an extremely risky activity."

Paul Slovic: Yes. That's the way it goes. After I had come to appreciate the concept of risk as feelings, I looked back in my own experience and recognized a very dramatic moment when my feelings were guiding me very powerfully. That was a time when I was driving on a busy freeway near Chicago and ran out of gas. I pulled the car off to the side of the road, and then I thought, "Well, okay, I'd better go find a gas station, get a gas can, and fill it and come back." But to do that, I realized, "I have to cross this freeway." And so I started to cross the freeway, and I would take a step onto the pavement and be looking at the cars approaching at 60 miles an hour and how far away. As I put my foot down, I'd be gripped by this fear, and I would retreat back and wait in hopes that I would find a bigger gap where I could step out and wouldn't be afraid.

Shankar Vedantam: Yeah. And in many ways, this makes total sense. As you're telling the story, I'm gripped by a sense of fear thinking of you, Paul, stepping out across six lanes of traffic in Chicago. And at a certain level, this system works very well much of the time, it's worth saying. I mean the fact that you didn't have to calculate the speed of the moving cars and how much time they would take to reach you, and write all that down on a sheet of paper. You just were gripped by a sense that this is extremely unsafe, and you stepped back. That kind of fear, that kind of risk perception holds us in good stead much of the time, does it not?

Paul Slovic: Exactly. That's why we do it and we keep doing it, is because most of the time, relying on our feelings works for us, if our feelings have been conditioned properly by experience. It's very adaptive, except when it goes wrong, as it sometimes does.

Shankar Vedantam: I want to talk about some of the times and ways in which this intuitive sense of risk that we have runs up against our analytical approach to thinking about risk. There have been a number of experiments that have teased out this tension very beautifully. I want to start with an experiment that the researcher Christopher Hsee once ran. Volunteers were told they either had a low probability of losing $20 or a high probability of losing $20. They were then asked how much they would be willing to pay to avoid this risk, and the results were exactly what you'd expect. People were willing to pay about a dollar to avoid the low probability risk of losing $20, but were willing to pay about $18 if they faced a high probability risk of losing the $20. Very rational. Then, the researchers tweaked the experiment in a rather cruel fashion. Do you remember what they did, Paul?

Paul Slovic: Yes. They said that, "If the bad event happens, you're going to get a strong electric shock. Not one that is truly dangerous, but it's going to be very unpleasant." One group was told there was a 99% chance of the shock; the second group, a 1% chance of the shock.

Shankar Vedantam: What happened? How did people react? Did they react in the same rational way when they confronted the low probability and high probability risk of losing $20?

Paul Slovic: Well, the group that faced a 1% chance of shock were willing to pay almost as much as the 99% group to avoid that shock. The reaction was not sensitive to the probability of the shock.

Shankar Vedantam: What is actually going on here? Why is it that when people are facing a 1% risk of an electric shock, in their minds, it feels as if you're talking about a 99% risk of getting an electric shock? Why is it different when it comes to the electric shock compared to when it comes to losing $20?

Paul Slovic: Well, the loss of $20, we would say, is relatively affect-free. Sure, we don't want to lose $20, but it's not a strongly emotional reaction as much as the potential shock was. When you think about getting a shock, that thought creates a feeling in you of anxiety, and that feeling of anxiety is the same feeling if you're thinking about it with a 1% chance or 99%. You're still thinking about the shock, and therefore, the mind does not modulate or multiply that feeling from the shock image by its probability. That is, our "feeling system" doesn't do multiplication.

Shankar Vedantam: When I read the experiment, I tried to put myself in the shoes of the volunteers, and of course, the moment I tried to do that, the thing that my mind went to was the last time I experienced an electric shock. As you point out, at that point, asking me to multiply that feeling by either 1% or 99%, that's not really possible to do, because my brain now is in the realm of affect and emotion, as opposed to calculation.

Paul Slovic: That's right. There is one way we can do that multiplication: if we push that 1% down to the realm of, "It's not going to happen. It's zero," we can then turn off the feeling that way. But that's a rather crude calculation. Once you get above zero to a probability that you think might actually happen, then it's very difficult to modulate the feeling by that probability.
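To make the contrast concrete, the "analytic" answer to how much a risk is worth paying to avoid is the expected loss, probability multiplied by severity. A minimal sketch, using the approximate figures mentioned in this conversation, shows why the money condition looks rational while the shock condition does not:

```python
# Expected loss = probability x severity: the "analytic" answer to how much
# a risk should be worth paying to avoid.
def expected_loss(probability, severity_dollars):
    return probability * severity_dollars

# Losing $20: willingness to pay roughly tracked the expected loss.
print(expected_loss(0.01, 20))   # 0.20 -> people offered about $1
print(expected_loss(0.99, 20))   # 19.80 -> people offered about $18

# Electric shock: the dread of the outcome dominates, so offers to avoid a 1%
# chance were nearly as high as for a 99% chance. Feelings don't do the multiplication.
```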

Shankar Vedantam: Of course, in the real world, most things are not zero probability or 100%. They're usually somewhere in between. When risks produce a feeling of fear or dread, our capacity to think analytically is impaired. Paul says this is why we worry about getting attacked by sharks rather than the far more likely prospect of getting in a car crash on the way to the beach. It's also the case that, sometimes, our brains get so overwhelmed with fear that they can't accurately process any additional fear. This idea builds on an area of psychology called psychophysics.

Paul Slovic: Some of the very earliest experiments in psychology were looking at how we perceived brightness or the loudness of a sound. What they found was that we're very sensitive at very low levels of brightness or very low levels of loudness. In a quiet room, you can hear a whisper. But then, as the loudness of the sound or the brightness of the light increased, it took more of a difference to make us notice.
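That classic psychophysical finding is often summarized by a Weber-Fechner-style relationship, in which perceived intensity grows roughly with the logarithm of the physical stimulus, so equal increments matter less and less as the stimulus gets larger. A minimal sketch, with arbitrary illustrative constants and units:

```python
import math

def perceived_intensity(stimulus, k=1.0, threshold=1.0):
    # Weber-Fechner-style relation: perception grows with the log of the stimulus.
    return k * math.log(stimulus / threshold)

# A 2-unit increase near the bottom of the scale feels like a big jump...
print(perceived_intensity(4) - perceived_intensity(2))      # ~0.69
# ...but the same 2-unit increase on top of a loud 100-unit stimulus barely registers.
print(perceived_intensity(102) - perceived_intensity(100))  # ~0.02
```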

Shankar Vedantam: As we've seen, our feelings about risk are rarely shaped by data or by the data alone. Our feelings are shaped by stories, by images, and by the consensus of our groups. When we come back, we'll look at how our perceptions of risk shape how we think about homicide, climate change, and global pandemics. You're listening to Hidden Brain. I'm Shankar Vedantam.

Shankar Vedantam: This is Hidden Brain. I'm Shankar Vedantam. Over several decades, psychologists have explored how people arrive at their conclusions that something is risky, or that something is not risky. They have identified a number of factors that shape our perceptions of risk. These studies have found a significant gap between the way we analyze risks and the way we feel about risks. The two don't always match. Psychologist Paul Slovic has explored what happens in real life when these two ways of thinking produce different answers. Paul, if you ask Americans how many people are killed by homicide and by terrorism, they are likely to overestimate the risk. If you ask how many people die from heart disease and diabetes, we tend to underestimate the risk. Can you explain how a mental shortcut that's sometimes called the availability heuristic might shape these perceptions, Paul?

Paul Slovic: Yes. The availability heuristic refers to a mechanism whereby we judge the frequency or the probability of something by how easy it is to imagine it happening or to remember it happening in the past. We use imaginability and memorability as a shortcut way of judging probability and frequency.

Shankar Vedantam: Mm-hmm (affirmative). And again, at a very everyday level, this system makes perfect sense. Things that do come more readily to mind might actually be things that are more important to the context that we find ourselves in. And so this rough rule of thumb, this shortcut, this heuristic, is not always a bad thing. In fact, for much of our lives, this might be very useful.

Paul Slovic: Yes, because as you say, imaginability and memorability are related to frequency, but not always. Something that's a very dramatic event that's easy to remember or imagine happening, because we've seen it in a movie or something, will lead us to have a sense that this thing is frequent or likely, when in fact, statistically, it's very unlikely. Particularly, if the event is not only dramatic so that it sticks in our memory, but if it carries affect or emotion, that feeling then amplifies the effects of memorability and makes it seem even more likely.

Shankar Vedantam: Paul, the researcher Tali Sharot once conducted a study where she asked people a number of questions about potential negative life events. She asked them about the likelihood that someone they knew personally was going to die or that they would suffer a serious illness or that they would seriously embarrass themselves. She found that people generally underestimated the likelihood of bad things happening to them compared to the likelihood of the bad thing happening to other people. Do optimism and the optimism bias shape our ability to look danger in the eye?

Paul Slovic: Yes. It leads us to have more confidence in being able to cope with a situation than perhaps is warranted. I think even Professor Sharot would say that, in many cases, the optimism bias is adaptive. It leads us to take action, where otherwise we might just be rather passive in situations. It leads us to take chances that are often beneficial, so it's a good thing, but it can also lead us to be very overconfident in our ability to handle certain types of situations that are really quite dangerous and beyond our capability. And there's the fact that we often feel we're in control of some of these events. We also saw the optimism bias clearly with regard to cigarette smoking, where people recognized that smoking is in general not good for your health, but felt that they could smoke in ways that minimized those risks.

Shankar Vedantam: That's what's so fascinating about this, because the optimism bias is not just that you're underestimating risks in general. You're underestimating the risk for yourself. You think that other people are just as vulnerable to getting killed in a highway crash or getting cancer from smoking. You just think that somehow you're special.

Paul Slovic: Yes, that you're not going to smoke very long or that you're going to smoke the cigarettes that are less harmful or fewer cigarettes per day. All of these things will enable you to control the risk in ways that you don't think other people are doing.

Shankar Vedantam: Talk for a moment about the idea of cumulative risks. I might have a very low probability of getting killed in a car crash if I don't wear a seat belt on one drive, but if I don't wear a seat belt over many years, the cumulative risk might be quite large. How good are we in our minds at keeping track of these kinds of risk that gradually accumulate over time?

Paul Slovic: I don't think we really do the cumulative assessment. This was, I think, very evident early on. When seat belts were first introduced, there were some very high-powered advertising campaigns to try to motivate people to use them. Only about 10 or 15% of drivers were wearing seat belts, a very low percentage, so they had these campaigns to get people to buckle up for safety. And they didn't work. They had very little impact. Thinking about that, say we take 50,000 trips in a lifetime. These individual trips are really pretty safe, so people were not rewarded for wearing a seat belt that was a little bit uncomfortable. And if they didn't put it on, they weren't punished either, because they didn't need it. The problem is that, over 50,000 trips, the likelihood that you'll need a seat belt on one or more of those trips becomes significant. It might be that one in three people will actually be in a serious accident where they would benefit from a seat belt. So the cumulative probability was high enough to warrant having people wear seat belts. I wrote an op-ed piece saying that only new laws would produce seat belt use. People started to wear seat belts because it was a law, and then it became a norm. Now, we have relatively high seat belt usage.
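The seat belt arithmetic can be sketched directly. Assuming, purely for illustration, a lifetime risk in line with the rough one-in-three figure over 50,000 trips mentioned above (these are conversational numbers, not crash statistics), the per-trip risk is tiny, yet it compounds to something substantial:

```python
# Per-trip risk implied by a roughly 1-in-3 lifetime chance over 50,000 trips.
trips = 50_000
lifetime_risk = 1 / 3

per_trip_risk = 1 - (1 - lifetime_risk) ** (1 / trips)
print(per_trip_risk)       # ~8e-06, i.e. about 1 chance in 120,000 on any single drive

# Going the other way: a risk that feels negligible on each drive
# still accumulates, because you must escape it on every one of 50,000 trips.
cumulative_risk = 1 - (1 - per_trip_risk) ** trips
print(cumulative_risk)     # ~0.33
```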

Shankar Vedantam: There's a deep philosophical insight here in what you're saying, because especially in the United States, we are a country that believes in individual liberty and autonomy and freedom, and people want to have the sense that they are making the choices that are best for them. But of the many examples you've talked about, the seat belt is perhaps the classic one: if you leave things up to individual choice, it is not irrational to say, "On this particular drive, on this particular Tuesday that I'm driving, my risk is actually not very high." And it might actually be the rational thing to say, "Okay, I can forgo the seat belt," and nothing much happens. Our minds simply are not equipped to deal with these kinds of risks gradually accumulating over many, many decades for that one event, you know, 25 years from now when the seat belt actually is really useful. And in situations like this, this is part of what you were alluding to a second ago, you really do need the intervention of systems that protect people, in some ways, from themselves. Can you talk about this idea for a moment? It seems to me that that's one of the philosophical implications of the work that you've done.

Paul Slovic: Yes. And you get this broader perspective through science, through collecting data, which can show how these risks accumulate and affect both individuals and populations. The same thing happens with cigarette smoking. It's not that smoking this next cigarette is really going to harm you significantly; it doesn't work that way. The harm that comes is the cumulative harm of smoking thousands of cigarettes. That leads to quite a significant increase in the risk of not just lung cancer, but many diseases. And it can be demonstrated through data, through statistics, both for the individual and for the population. But at the individual level, this very next cigarette or this very next drive is not likely to be a problem for you.

Shankar Vedantam: Some mismatches between our analytical approach to risk and our emotional response to risk come about because of our inability to do certain kinds of math in our heads. I was looking for examples of this yesterday and came by an interesting puzzle. If you take a piece of paper and you fold it over, it now has double the thickness it had at first. And then if you fold it over again, it's now four times as thick as the original sheet. And the puzzle is, if you had an endless amount of paper, how many times would you have to fold it over to get a tower that stretches all the way to the moon? When I first saw the puzzle, I guessed that you must have to fold it about a billion times. And the correct answer is 45. Just 45 folds, and you get a tower that basically stretches all the way from the Earth to the moon, which is a quarter of a million miles away. Can you talk a moment, Paul, about how people often experience difficulty appreciating the nature of exponential growth, where two becomes four, four becomes eight, eight becomes 16, and so on, and how this shapes our perception of risk?

Paul Slovic: Yes. This is a very interesting challenge for the human brain. Some experiments that were done in the 1970s in the Netherlands demonstrated this very clearly, where people were given a series of numerical measures of pollution increasing, and the pollution was doubling or tripling every year. They were asked, "Where will this be after 10 years?" What they found was that people projected in a straight line from this very low level at the beginning of this exponential growth and greatly underestimated where this was leading. So the hallmark of exponential growth and what makes it so challenging and insidious in a way is that it looks very benign at the beginning. Even though it is changing exponentially, the numbers are still small. What happens is it suddenly roars up like a fire that erupts and overwhelms us with very high numbers. So, we don't anticipate how quickly it's going to explode.
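The paper-folding puzzle is a compact way to see why linear intuition fails: the thickness doubles with every fold, so it takes only a few dozen doublings to cross hundreds of thousands of miles. A minimal sketch follows; the exact fold count depends on the paper thickness you assume, and the figure of 45 quoted above corresponds to thinner paper than the 0.1 mm assumed here.

```python
# Doubling thickness per fold versus the average distance to the moon.
thickness_m = 0.0001              # assume 0.1 mm paper
distance_to_moon_m = 384_400_000  # roughly 239,000 miles

folds = 0
while thickness_m < distance_to_moon_m:
    thickness_m *= 2
    folds += 1

print(folds)   # 42 with 0.1 mm paper; thinner paper pushes the answer a few folds higher
```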

Shankar Vedantam: I want to talk about exponential growth in the context of the COVID-19 pandemic. In early March, I believe it was on March 9th, 2020, New York City Mayor Bill de Blasio had this to say about the coming pandemic.

Bill de Blasio: Some places like Italy are doing mass school closures. That's not on the menu here. Is there a theoretical scenario where that could happen? Of course. But is it anywhere near to where we are now? No.

Shankar Vedantam: This was on March 9th, Paul. New York City closed its public schools one week later on March 16th. I think that speaks in some ways to what you were just saying, which is that it's really hard, even if you're a public official with the best of intentions and you have a lot of data, to actually truly appreciate how staggeringly fast a pandemic can grow.

Paul Slovic: Absolutely. Governments all over the world were slow to appreciate what was going to happen when their small numbers of cases began to grow exponentially. I think there was a general delay in responding. Not everywhere, fortunately, but the majority of nations were slow to react because the numbers were small, increasing slightly, and it didn't look all that bad. But by the time we really start to take it seriously, it's out of control.

Shankar Vedantam: What do you think it is about the mind that makes it difficult for us to appreciate exponential growth?

Paul Slovic: Well, probably because it is relatively rare compared to straight-line growth, like counting. Counting is a linear system, and we're much more familiar with things that grow in a linear way than those that grow exponentially.

Shankar Vedantam: In so many ways, our minds struggle with the mathematics of risk. Exponential growth is hard for us to conceptualize, and as we heard earlier, the difference between a 1% risk and a 99% risk becomes difficult to take into account when our emotions become part of the equation. Then, there's the issue of control and whether we feel like we're the ones making choices about which risks to take. More than 50 years ago, the researcher Chauncey Starr discovered something curious. People were willing to accept far greater risks for activities that they chose over activities that did not involve personal choice. They accepted a higher level of risk for, say, skiing or bungee jumping, but found a similar level of risk unacceptable when it came to things like building safety or the use of preservatives in food. Paul and other researchers have refined this idea in subsequent studies, but this core finding about the importance of personal control may help us to understand why some people say they are not worried about becoming ill with the COVID-19 virus but are worried about the safety of the COVID-19 vaccine. You feel you have control when you go to a restaurant. You convince yourself the risk of the virus is small. But you don't have control over how a vaccine is made. You have to trust the results of studies conducted by scientists whom you will never meet. Paul says our sense of control plays a significant role in our perceptions of risk.

Paul Slovic: Someone once gave me an example. Supposing that you're slicing a loaf of bread, how close to the knife would you put your fingers? Supposing that someone else was slicing the bread, how close would you put them? You probably wouldn't be as close, and I think that's a nice example of the sense of control. I think it's a very important element in driving, where the driver feels that they are controlling the risk because they're controlling the speed and other aspects. They don't realize all the elements of the situation that are not under their control, like road conditions, hazards, or what other drivers are going to do.

Shankar Vedantam: Where do you think the origins for many of these biases are, this mismatch between the way our brains operate and the challenges that modern life places before us, Paul?

Paul Slovic: When we were earlier in our phase of evolution and our brain was forming, we were shaped very much by the experience that we faced at that time, which had to do with things that were up close and personal, things that were right in front of us, like an animal lurking in the bushes or a hostile tribe. We had that sensitivity to things that were relatively small in number, and we could sense directly through our senses. In the modern world, the hazards are far more diverse. Many of them are invisible, like things that have to do with bacteria or viruses and things that we can't easily see, things that happen at a distance from us but at some point will affect us... Things related to climate change, where the problem seems still fairly distant, something that's happening perhaps elsewhere to other people. The modern world has a whole different array of hazards from the ancient world, and a lot of these modes of thinking were shaped in the cave, so to speak.

Shankar Vedantam: Yeah. I remember going for a walk a few months ago in the woods. At one point in the trail, I remember leaping backwards. I looked down and I saw that there was a snake five feet in front of me. I was struck by the speed of my response to that threat. It was almost instantaneous. It almost felt like a reflex. And through all of the months of the COVID pandemic, I have never experienced a moment of fear like I felt when I saw that snake. And of course, when you think about it, the risk of COVID is probably much greater than the risk that that snake posed me, because most snakes, of course, in the wild, in the woods in suburban communities, are probably going to be relatively safe. But there's something in some ways about the Stone Age brain that has remained in my brain, where I'm much more vigilant to the risk of a snake than I am to the risk of this invisible virus that could do me and others harm.

Paul Slovic: Yes. Again, a difference between hazards that existed a long time ago, which shaped the way our brains formed, and the risks of today. Another variation on what you said is, instead of seeing the snake in front of you, you had heard an ominous sound in the bushes. And then the question is, do you stop and analyze the sound and debate with yourself as to whether this is really something you should worry about? Or do you just take it as something that sounds scary and move away from it? What likely will happen is that you accept the first reaction. This is something that my colleague Daniel Västfjäll and I have discussed. There's no gatekeeper that leads us to analyze information that conveys feelings in us. We just take it for what it is, and the brain lets these feelings in, and we react to them. We don't vet our feelings the way we vet arguments, and I think this goes way back as something that was very adaptive a long time ago.

Shankar Vedantam: And for good reason, because again, as you point out, if you actually stopped and analyzed every threat and drew up a cost-benefit calculation every time you heard a growl in the bushes, likelihood is you'd be dead.

Paul Slovic: Yes. Not only would your calculations likely be wrong because it's so hard to do those calculations, as you say, you may not survive. You have to just move fast.

Shankar Vedantam: There are many risks where, if I take the risk, I'm the one who is bearing the cost of that risk. If I decide to go mountain climbing or BASE jumping, I'm the one who's incurring the risk. Presumably, others are affected as well if something happens to me, but from a logical standpoint, I bear the consequences of my actions. That logic breaks down when it comes to the risk of something like a pandemic, where my actions in fact might affect the wellbeing of other people. Economists sometimes call these phenomena negative externalities. In other words, my actions that I'm undertaking freely might affect somebody else's ability to be free and their ability to do what it is that they want to do. If our minds are not very good at appreciating the things that are risky even to us, how effective can they be in appreciating the things that are risky to other people?

Paul Slovic: It's particularly true that we can't sense it when the consequences are not direct. Some things that we do have collective consequences, for example, stopping at red lights. If you violate those, you get to your destination a little quicker, but there will be a massive increase in collisions and deaths, which is very obvious harm. There, we get good feedback on the harm that your individualism is producing. The problem with COVID that makes it so insidious and difficult is that, with protecting yourself and others by wearing a mask, social distancing, staying home rather than going to school or to the workplace, you don't have the sense that you're creating harm when you violate those, because you don't see the harm immediately or directly. We rely heavily on what's right in front of our eyes in terms of sensing risk. We don't see the damage that is caused, but we feel the benefits of not doing these things. We get to hang out with our friends and go to restaurants and bars and work, which we need to do. So we feel the benefits of doing the wrong thing, but we don't see the benefits of doing the right thing, and that's a recipe that leads even the most responsible people to ease off over time.

Shankar Vedantam: Yeah. And of course, when you think about this from the point of view of natural selection, the reason viruses have thrived among human populations for thousands of years is really because they've taken advantage of how our minds work.

Paul Slovic: Yes. It is interesting. Well, they succeed when they have characteristics that take advantage of our cognitive limitations. In fact, as we go to more and more remote parts of the Earth and start to inhabit rainforests and other places and the climate changes, we are perhaps coming into contact with more of these viruses that heretofore we haven't had contact with. That's a worry in the future, but it's certainly the case with COVID. COVID has adapted to thrive in ways that are very difficult for humans to combat.

Shankar Vedantam: Yeah. I know the virus has not been designed by an evil psychologist, but sometimes, it feels as if the virus has been designed by an evil psychologist when you see how it's taking advantage of the fallibilities in our cognitive architecture.

Paul Slovic: And the very fact that the harm can be spread invisibly makes it ambiguous enough that politicians can then play on that to manipulate us to say that it is really not a serious problem.

Shankar Vedantam: Jodi Doering is an emergency room nurse in South Dakota. Late in 2020, she wrote about patients she was treating who were in denial about the COVID-19 pandemic. She said, "I can't help but think of the COVID patients the last few days. The ones that stick out are those who still don't believe the virus is real, all while gasping for breath. They tell you there must be another reason they are sick. They call you names and ask you why you have to wear all that stuff, because they don't have COVID, because it's not real."

Jodi Doering: And the reason I tweeted what I did is it wasn't one particular patient. It's just a culmination of so many people. And their last dying words are, "This can't be happening. It's not real."

Shankar Vedantam: You're listening to Hidden Brain. I'm Shankar Vedantam.

Shankar Vedantam: This is Hidden Brain. I'm Shankar Vedantam. We've seen how many subtle biases shape what makes us afraid. We are more likely to fear things that do not involve activities of our own choosing. We are less likely to notice dangers when they grow exponentially or add up cumulatively. Paul, you've done a lot of work exploring how the same phenomenon might play out in the context of compassion, our ability to care about others. Can you talk about the phenomenon of psychic numbing, please?

Paul Slovic: Yes. I started to look at that when I became concerned about the failure of the world to respond to the genocide that was happening in Rwanda, where 800,000 people were murdered in about 100 days. The world knew it was happening and turned a blind eye to it, refused to intervene in any way. I started to study that, and one way to study that was to look at why we help some people who are in danger and not others. We found that this was very much related to how many people there were at risk. And again, we had been sensitized to the notion that our sense of risk was driven by our feelings, so we looked at how feelings work in this context as well. We realized that one life, we believe, is immensely important and valuable to protect, and we make an emotional connection to an individual in need, and will then do a lot to protect that person or to rescue them, but that it doesn't scale up. Why, if we value individual lives so greatly, do we do so little to protect thousands or millions of lives at risk? That was the puzzle, and we started doing experiments to help us understand that.

Shankar Vedantam: Some of the experiments that you've done have looked at what happens in people's minds as you expand the number of victims of a tragedy. What's striking is that we're not just talking about very large numbers here. Even at fairly small numbers, our ability to empathize seems to shrink as the numbers grow. Talk about some of those experiments, Paul.

Paul Slovic: Yes. We asked people to donate money on behalf of children who were facing starvation, and we found that, as the number of children at risk increased, the propensity of an individual to donate money to help them did not grow as the number of potential victims grew, but rather was strongest for a small number of individuals, and then either flattened out or in some cases even declined. We had a phrase for that. We said, "The more who die, the less we care." We don't respond proportionally as the needs get greater, and there are several reasons that we discovered for that. One is this notion of psychic numbing. When things become large and become statistics, we don't get the same emotional connection to the people at risk that we do when there's only a few of them. As the numbers increase, we say that statistics are human beings with the tears dried off. You don't get the same emotional reaction to the numbers that you do to the individual or small numbers of people.

Shankar Vedantam: And of course, all of us have experienced this in the course of the COVID-19 pandemic, even those of us who want to exercise empathy and compassion. It just simply isn't possible to muster the same sense of tragedy when the death toll goes from 221,355 to 221,356. That extra death doesn't count in our minds, because our minds simply are not calibrated to deal with that level of tragedy.

Paul Slovic: You could add not one life, but 10,000 or 20,000 lives there, and again, you feel the same. It's just a big number. If we don't see the ill people around us or know them or if we don't feel personally vulnerable, these statistics don't move us as they should.

Shankar Vedantam: Many years ago, the philosopher Peter Singer came up with a thought experiment that we've talked about before on Hidden Brain. Very simply, the thought experiment is imagine you're walking by the side of a pond and you see a child drowning in the pond. You can jump in and save the child at no risk to your own life, but you have to act quickly, and if you act quickly and jump in the pond, you're going to ruin a very fine pair of shoes you're wearing. You know, most people say, "Of course, I would jump in the pond to save the child's life. A child's life is worth more than my pair of shoes." The question that Peter Singer asks is, "Well, in that case, why would you not donate $200 of your money to save the life of a child halfway around the world?" It's the same trade-off in that case, a child's life versus $200. You came up with a heartbreaking twist on that thought experiment that in some ways is very revealing about the conversation we're having about how empathy and compassion work when numbers start to get larger than one. Tell me about the refinement to the thought experiment, Paul.

Paul Slovic: We ask people to imagine that they're walking by this pond and they see a child in the water drowning, and they're about to jump in and risk their own lives. And then, they see that, off in the distance, there's a second child also in danger of drowning. The question is, "Well, would you not go into the water and rescue the child that is nearby because there's another child that you can't rescue?" And you would say, "Of course not. I can rescue this child. Let's do it." But what we find in experiments is actually that people do get demotivated from helping people they can help by the bad feelings they get when they realize that there are others that they cannot help. So what's going on here is that we help others not only because they need our help, but also because we feel good about doing that. We're doing the right thing, and we can do it. And when you're made aware of others who you cannot help, this creates negative feelings that come in and mix with the good feeling and dampen the good feeling you have about what you can do, so then you no longer do it. Obviously, in a real situation when you're right there with a child, you're not going to be demotivated. But it's a more subtle kind of thing when you're asked to donate to a charity on behalf of starving children like this one, and then, by the way, you're told that this is a big problem, the starvation in this region, that there are thousands or millions of children starving. If anything, you should be even more motivated to help this child. But we found in experiments that the donations dropped by half when people were made aware of the fact that this child that they could help was one of many. This is crazy. You should not be demotivated from doing what you can do by the fact that you can't do it all. Do what you can do.

Shankar Vedantam: Mm-hmm (affirmative). You can look at what's happened with the COVID pandemic almost as a dress rehearsal for even more serious challenges that we might face collectively in the years to come. I'm wondering if you can connect our discussion about risk to the challenge of dealing with a problem like climate change. How do the workings of our minds predict what we have done and what we might be failing to do?

Paul Slovic: That's a very interesting question. First, COVID spreads exponentially, and the same thing may happen to us with climate change. Sure, it's on a different scale, but the processes that are contributing to climate change in terms of the buildup of certain types of pollutants and the changes in temperature are growing exponentially. The hallmark of an exponential growth process is that the really severe, unacceptable, unlivable consequences will be here more quickly than we think. We have to pay attention to the scientists who are showing us with the data that this is happening. Scientists were showing us with the data that COVID was growing exponentially, and we didn't take that seriously; we can't make the same mistake with regard to climate change.

Shankar Vedantam: We talked a little while earlier about the idea of externalities, where my actions that I'm undertaking with autonomy might affect you. It's clear that there are things I would not do to harm you if I thought that they would harm you. I wouldn't come up and punch you in the face, but I might say, "What's the harm in my going to a bar? How is that possibly going to affect Paul?" because I can't see the chain that causes you harm. I'm wondering in some ways if climate change puts that on steroids, because here, the externalities are not just other people, they're not just people living in other countries, but they are people who are not yet born who will be inhabiting the Earth 50, 100, 200 years from now. If our minds are not well calibrated to think about the wellbeing of other people who are living next to us or in the next city or the next country, surely, it must be even harder for our minds to contemplate the wellbeing of people who haven't yet arrived on the planet.

Paul Slovic: Yes. I think you're right, because we already devalue the lives of people who are currently living, and when they don't even exist yet, it's even easier to devalue them. It's not necessarily that we are deliberately saying their lives don't matter, because if you ask people who are doing things that are harmful to the climate, "Are these future lives, people in future generations, important?", they would say, "Yes, of course, they are important." What we found is that there's often a disconnect between our values and our actions, and that comes from the fact that, when we have to act, we've got a conflict between protecting unborn future generations and getting the near-term conveniences and comforts that come from doing the wrong thing to the environment. And so we act in ways that contradict our values, and we have to be aware of that. That implies that we have to do more than just educate people about the importance of protecting future generations. We also have to enforce safe practices through regulations, and we have to provide motivation and economic incentives for doing the right thing, creating jobs in the industries that protect the environment. We have to recognize that we need these external carrots and sticks, often provided by government and industry, to produce and maintain climate-friendly behavior. Just creating a moral obligation by itself is not going to do it.

Shankar Vedantam: Yeah. It's interesting. You've been studying these issues for some four decades now, Paul. There must be a part of you that's a little disheartened when you see how little these insights have actually been used in the face of a global pandemic or climate change. What gives you hope as you look out at the landscape in terms of having these ideas actually applied to turn things for the better?

Paul Slovic: It's a challenge. I do get energized by facing challenges. Also, I think the information environment is different now, so hopefully, the awareness of the findings that we're coming up with and their implications can be spread far and wide very quickly. I find it an exciting challenge to try to synthesize and communicate the knowledge of the judgment and decision-making community to address these problems that are global in scope and potentially catastrophic.

Shankar Vedantam: Paul Slovic is a psychologist at the University of Oregon. To learn more about his work, go to arithmeticofcompassion.org. Paul, thank you for joining me today on Hidden Brain.

Paul Slovic: You're very welcome, Shankar. It's my pleasure.

Shankar Vedantam: Hidden Brain is produced by Hidden Brain Media. Midroll Media is our exclusive advertising sales partner. Our production team includes Brigid McCarthy, Kristin Wong, Laura Kwerel, Ryan Katz, Autumn Barnes, and Andrew Chadwick. Tara Boyle is our executive producer. I'm Hidden Brain's executive editor. Our unsung hero this week is Michael Costagliola. Michael is a composer whose music and sound design work has been featured in theater productions across the country. Since the pandemic has upended the theater world, Michael has expanded into making music for podcasts. Composing for podcasts is difficult, because the music has to be understated to allow the story and ideas to shine. Michael intuitively understands how to do this, and his work has the added benefit of being beautiful and distinctive. You heard some of it in today's episode. Thank you, Michael. For more Hidden Brain, you can follow us on Facebook and Twitter. If you like this episode, please be sure to share it with a friend. I'm Shankar Vedantam. See you next week.
