SHANKAR VEDANTAM, HOST: This is HIDDEN BRAIN. I'm Shankar Vedantam.

(SOUNDBITE OF MUSIC)

VEDANTAM: During the Middle Ages, word spread to Europe about a peculiar plant found in Asia. This plant had a long stalk with heavy pods attached. When you cut those pods open, inside you would find a tiny little lamb.

(SOUNDBITE OF LAMB BLEATING)

CAILIN O'CONNOR: Complete with flesh and wool like a live animal lamb.

(SOUNDBITE OF LAMB BLEATING)

VEDANTAM: This creature, half-plant, half-animal, came to be known as the Vegetable Lamb of Tartary.

O'CONNOR: Various travel writers wrote that they had either heard about this or that they had eaten one of these lambs. And many of them said they had shorn the kind of downy wool from the lamb.

VEDANTAM: When these narratives made their way to Europe, people felt they had a view of a different world. Of course, no one in Europe had ever seen the Vegetable Lamb of Tartary because there was no such thing. But for centuries, people kept talking about this fantastical creature as if it were real. It even came up in scholarly works right next to pictures of oak trees and rabbits.

O'CONNOR: If people hadn't been telling each other about these things, nobody would believe that there were vegetable lambs because nobody had ever seen them, right?

(SOUNDBITE OF MUSIC)

O'CONNOR: And this is by no means a unique happening at that time.

VEDANTAM: At that time. Of course, we would never fall for vegetable lambs.

(SOUNDBITE OF MUSIC)

VEDANTAM: We live in an era of science, of evidence-based reasoning, of calm, cool analysis. But maybe there are vegetable lambs that persist even today, even among highly trained scientists, physicians and researchers.
Maybe there are spectacularly bad ideas that we haven't yet recognized as spectacularly bad.

(SOUNDBITE OF MUSIC)

VEDANTAM: This week on HIDDEN BRAIN, we're going to look at how information and misinformation spread in the world of science and why evidence is often not enough to convince others of the truth.

(SOUNDBITE OF MUSIC)

VEDANTAM: Cailin O'Connor is a philosopher and mathematician at the University of California, Irvine. She studies how information, both good and bad, can pass from person to person. She is co-author, with James Weatherall, of the book "The Misinformation Age: How False Beliefs Spread." Cailin, welcome to HIDDEN BRAIN.

O'CONNOR: Oh, thank you for having me.

VEDANTAM: So one of the fundamental premises in your book is that human beings are extremely dependent on the opinions and knowledge of other people, and this is what creates channels for fake news to flourish and spread. Let's talk about this idea. Can you give me some sense of our dependence on what you call the testimony of others?

O'CONNOR: So one reason we wrote this book is that we noticed that a lot of people thinking about fake news and false belief were thinking about problems with individual psychology - so the way we have biases in processing information, the fact that we're bad at probability. But if you think about the things you believe, almost every single belief you have has come from another person. And that's just where we get our beliefs because we're social animals. And that's really wonderful for us. That's why we have culture and technology. You know, that's how we went to the moon. But if you imagine this social spread of beliefs as opening a door, when you open a door for true beliefs to spread from person to person, you also open the door for false beliefs to spread from person to person.
So it's this kind of double-sided coin.

VEDANTAM: And what's interesting, of course, is that if you close the door, you close the door to both, and if you open the door, you open the door to both.

O'CONNOR: That's right. So if you want to be social learners who can do the kinds of cultural things we can do, it has to be the case that you also have this channel by which you can spread falsehood and misinformation, too.

VEDANTAM: So as I was reading the book, I was reflecting on the things that I know, or the things that I think I know, and I couldn't come up with a good answer for how I actually know that it's the Earth that revolves around the sun and not the other way around.

O'CONNOR: Yeah. That's right. Ninety-nine percent of the things you believe, you probably have no direct evidence of yourself. You have to trust other people to find those things out, get the evidence and tell it to you.

(SOUNDBITE OF MUSIC)

O'CONNOR: And so one thing that we talk a lot about in the book is the fact that we all have to ground our beliefs in social trust. So we have to decide what sources and what people we trust, and therefore what beliefs we're going to take up, because there's just this problem where we cannot go verify everything that we learn directly.

(SOUNDBITE OF MUSIC)

VEDANTAM: We trust the historian who teaches us about Christopher Columbus. We trust the images from NASA showing how our solar system is organized. Now, we say we know Columbus was Italian, and we know the Earth revolves around the sun. But really what we mean to say is we trust the teacher and we trust NASA to tell us what is true.

O'CONNOR: And the social trust and ability to spread beliefs - I mean, it's remarkable what it's let humans do. You know, no other animal has this ability to sort of transfer ideas and knowledge dependably from person to person, generation after generation, to accumulate that knowledge.
But you do just see sometimes very funny examples of false beliefs being spread in the same way.

(SOUNDBITE OF MUSIC)

VEDANTAM: Now, many of us believe there is a way to separate fact from fiction - science. But as Cailin points out, science rarely offers permanent truths.

O'CONNOR: Well, first, I would say that in the book we really encourage trust in science. It's not a book trying to undermine scientific knowledge. But if you look at the history of science, there have been a lot of examples of cases where people believed something and then discovered that it wasn't true. So one classic example is the miasma theory of disease. Before we had the germ theory of disease, everyone thought diseases were caused essentially by bad vapors in the air and that you would have these bad vapors near swamps, for example. But this led to all sorts of problems, of course, for diagnosing various medical issues. If you believe that illness is coming from bad vapors in the air and there's a cholera outbreak, you're not going to go check the local well to see if there's some kind of, you know, germ or bacteria in there. So that's one kind of example.

Of course, there have been really dramatic changes in the way we understand the physical world. So Aristotle believed that things fall to the ground because they have the element of earth in them, and they're trying to go to their natural place at the center of the Earth. Newton argued, no, they fall to the ground because there is a force of gravity that acts between any two massive bodies and pulls them together. Now we don't believe in that force anymore. We trust Einstein's theory of general relativity, which says that we're all in a curved space-time, and when something falls to Earth, it's moving along its natural trajectory in that curved space-time.

(SOUNDBITE OF MUSIC)

VEDANTAM: As a philosopher of science, Cailin studies how scientists communicate and share information.
If we rely on scientists to tell us what to believe, who do they rely on? Turns out, other scientists. Now, showing that this is the case isn't easy. The process by which scientists change their minds on questions such as the spread of disease or the movement of objects through space is very complex. Studying this complex process can be mind-boggling. Say, for instance, Dr. A...

UNIDENTIFIED PERSON #1: Hello.

VEDANTAM: ...Talks to Dr. B one day about her research.

UNIDENTIFIED PERSON #2: Hello.

VEDANTAM: It also turns out that Dr. B is collaborating with Dr. C, who recently met Dr. D at a conference. Now, Dr. D frequently reads Dr. A's papers but doesn't know about Dr. C's research. A couple of years later, Dr. E reads what Dr. B has written about what Dr. A said in an article that Dr. C cited before Dr. F had even published her results.

(SOUNDBITE OF MUSIC)

O'CONNOR: Empirically, it's hard to study scientists because things like theory change will happen over the course of 10 or 20 years and involve thousands and thousands of interactions between different scientists. You know, how would you ever study that?

VEDANTAM: How would you ever study that? Because Cailin can't follow all these interactions, she recreates them in a computer simulation.

O'CONNOR: You'd want to think of it as a really kind of simplified representation of what's happening in the real world.

VEDANTAM: She creates groups of fictional scientists, and she gives them a series of rules, like who they can talk to and who they trust. These simulated scientists collect data and discuss their simulated research.
Cailin sits back and watches what happens.

O'CONNOR: Even if you look at completely idealized agents - so you would think of these as simple representations of totally rational scientists, or totally rational people testing the world - sometimes they do end up, you know, coming to a false belief about the world, even though they're able to experiment in the model and they're able to draw really good inferences based on those experiments. Now, one factor in this is that sometimes in the model what you have is spurious results. So if you think about scientific data, usually it's equivocal. You know, it doesn't just tell you what the answer is. If it did, we wouldn't have to do science on it. Instead, it's probabilistic. You have to use statistics to figure out what's true.

So one thing we find sometimes in these models is that one agent or scientist will get data supporting the false belief, they'll share it with the entire community of scientists, and then everyone will come to believe the false thing at once and sort of ignore a better theory. And part of what happens there is this social spread of knowledge and belief causing everyone to turn away from a good theory. So if you have almost too much social influence within a community, that can be really bad, because everyone can stop gathering data since the entire community is exposed to the same spurious results.

VEDANTAM: You know, we've talked on HIDDEN BRAIN about the psychological reasons people sometimes believe in fake news. We've talked about irrationality and biases and tribalism. We've featured cognitive scientists like Tali Sharot and Danny Kahneman. If I hear you correctly, what you're saying is that psychological factors can have an effect, but you can have the spread of bad information even in the absence of biases or stupidity.

O'CONNOR: Yeah.
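The kind of model O'Connor describes here - idealized, rational agents sharing equivocal data - can be sketched in a few lines of code. The sketch below is a simplified, hypothetical version of a bandit-style "epistemic network" model, not her actual code; every function name, parameter, and value is illustrative. Agents who lean toward the new theory test it, everyone sees the pooled results, and everyone updates by Bayes' rule. Because the data are probabilistic, a run of misleading results shared with the whole community can push everyone away from the true theory at once.

```python
import random

def simulate(n_agents=10, n_rounds=200, trials=10, p_true=0.6, p_alt=0.4, seed=None):
    """Simplified epistemic-network model (illustrative only).

    Each agent holds a credence that a new treatment succeeds at rate
    p_true (the truth) rather than p_alt. Agents leaning toward the new
    treatment test it; all agents see the pooled data and update by
    Bayes' rule. Returns the community's final credences.
    """
    rng = random.Random(seed)
    credences = [rng.random() for _ in range(n_agents)]
    for _ in range(n_rounds):
        # Only believers run the experiment; skeptics gather no data.
        pooled = [sum(rng.random() < p_true for _ in range(trials))
                  for c in credences if c > 0.5]
        for successes in pooled:
            failures = trials - successes
            like_true = p_true ** successes * (1 - p_true) ** failures
            like_alt = p_alt ** successes * (1 - p_alt) ** failures
            for i, c in enumerate(credences):
                # Bayes' rule over the two hypotheses p_true vs. p_alt.
                credences[i] = c * like_true / (c * like_true + (1 - c) * like_alt)
    return credences

# If every agent's credence dips below 0.5, experimentation stops and the
# community is locked into the false belief - the failure mode described above.
```

Run it many times with different seeds and you can measure how often the community converges on the truth; shrinking `trials` or narrowing the gap between `p_true` and `p_alt` makes spurious shared results, and false consensus, more common.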
So one way that the models we look at are really useful is that you can kind of pare away things that are happening in the real world and ask - well, suppose we didn't have any psychological biases. Suppose we were perfectly rational. Would we always come to the right answer in science and in our day-to-day lives? And we see that the answer is no.

(SOUNDBITE OF MUSIC)

VEDANTAM: Coming up - how the real world compares to the models that Cailin builds in her lab. We explore case studies from science that show how good information can sometimes fail to spread even as bad information metastasizes.

(SOUNDBITE OF MUSIC)

VEDANTAM: Mathematician and philosopher Cailin O'Connor studies how information spreads through social networks. People who know and trust one another efficiently pass information back and forth and learn from one another. Unfortunately, the same rules of social trust can sometimes be a roadblock for the truth. Mary Wortley Montagu learned this lesson hundreds of years ago. She was an English aristocrat who found herself living for a while in what is modern-day Turkey.

O'CONNOR: Mary Montagu seems to have been really enchanted by Turkish culture. You know, she was coming from England, an aristocratic culture there. In Turkey, she discovered these beautiful shopping centers. Bathhouses - she seems to have been enchanted by bathhouses, where there would be a lot of women sort of lounging naked, going in the hot water, drinking hot drinks together.

VEDANTAM: Another thing that struck Mary about Turkish women - they used an innovative technique to limit the spread of smallpox. It was called variolation.

O'CONNOR: What this involved - I mean, it's a bit like vaccination now. You would scratch, maybe, the arm of a patient and take pus from a smallpox pustule and put that pus into the scratch. What would happen after you did that is that the patient would get a very mild smallpox infection.
Some small percentage of patients would die, but many, many fewer than would die of an actual smallpox infection. And after they had that more mild infection, they would actually be immune to smallpox. So this was practiced commonly in Turkey - and basically unheard of in England at the time.

Mary Montagu had herself had smallpox and survived when she was younger. She had lost a brother to smallpox. And so when she encountered variolation in Turkey, she decided, well, you know, why don't we do this in England? She had her own son variolated, and she decided she was going to try to spread this practice in her native country.

VEDANTAM: So when she returns to Britain, in some ways, Mary Montagu here functions like one of the agents in your computer models, because you have, you know, one cluster over here in Turkey and one cluster over here in Britain, and essentially, you have an agent walking over from Turkey to Britain. And Mary Montagu says, here's this wonderful idea. We can limit the spread of smallpox in Britain. Britain, in fact, at the time was actually facing a smallpox crisis. How were her ideas received?

O'CONNOR: So her ideas were not received very well when she first came back. One thing we talk a lot about in the book is that almost everyone has what you might call a conformist bias. We don't like to publicly state things that are different from the people in our social networks. We don't like to have beliefs that are different from the people around us. It's somehow very socially uncomfortable to do that. And we don't like our actions to not conform with the people who we know and love.

So when she got back to England, you know, it was already the case that all these physicians in England didn't believe in variolation.
They thought this was a crazy idea, and none of them were going to stand out from the pack of physicians and say, yeah, I'm the person who's going to try this or who's going to believe that this practice works, because they were all busy conforming with each other.

VEDANTAM: And, of course, these ideas were coming from another country, a country with very different cultural practices that seemed, in some ways, very foreign. The idea and the country itself seemed very foreign.

O'CONNOR: That's right. So it's not just that it's a weird new idea that nobody in their kind of in-group believes. It's also that it's coming from Turkey. And furthermore, it's coming from women in Turkey - it was a practice mostly done by women. And a woman is bringing it to England as well, so they also don't really trust her as a woman and as someone who's not a physician.

So social trust is a really important aspect in understanding how people form beliefs. Because we can't go out and figure out for ourselves whether the things people tell us are true, usually we just always have to decide who to trust. And people have little shortcuts in how they do this. They tend to trust those who are more like them. They also tend to trust those who share beliefs and values and practices with them. So, for example, if you are a physician, you might tend to trust a physician. If you believe in homeopathy, you might tend to trust someone who believes in homeopathy. We all use these kinds of tricks. So what we saw in the variolation case with Mary Montagu is that the physicians aren't going to trust this woman who doesn't share their beliefs and practices, who isn't much like them.

(SOUNDBITE OF MUSIC)

VEDANTAM: Now, you could argue that the physicians who rejected Mary Montagu's ideas were not behaving like real scientists. They weren't being dispassionate. They weren't being objective. They were bringing psychological biases into the picture - sexism, xenophobia, tribalism.
In the real world, misinformation spreads because of some combination of network effects and psychological and cognitive biases.

You see the same thing in the case of the Hungarian physician Ignaz Semmelweis. He was an insider - a man and a doctor. He even had scientific evidence to support his claims. But it turned out even these were not enough to overcome the barriers that confront the truth.

O'CONNOR: Ignaz Semmelweis was a physician living in Vienna. He was put in charge of this clinic, the first obstetrical clinic in Vienna. Next door was the second obstetrical clinic of Vienna. He was in charge of training new doctors in obstetrics, and at the second clinic, they were training midwives. And shortly after he took over, he realized that something really terrible was going on, because in his clinic, 10 percent of the women were dying, mostly of childbed fever, while next door - with the midwives who, presumably, they would have thought had less expertise - only 3 to 4 percent of the patients were dying. So Semmelweis was obviously really worried about this. He had patients who would be begging on their knees to be transferred to the other clinic.

He had this kind of breakthrough moment when a colleague of his was conducting an autopsy and accidentally cut himself. And then shortly thereafter, he died of something that looked a lot like childbed fever. Semmelweis realized, well, I've got all these physicians who are conducting autopsies on cadavers and then immediately going and delivering babies. And he thought, well, maybe there's something transferred on their hands, and he called these "cadaverous particles." Of course, now we know that that is bacteria, but they didn't have a theory of bacteria at the time.
So he started requiring the physicians to wash their hands in a chlorinated solution, and the death rate in his clinic dropped way down.

VEDANTAM: And, of course, the way we think about science, we say, all right - someone's discovered something wonderful. Everyone must have instantly adopted this brilliant new idea.

O'CONNOR: You would think, right? And he has this wonderful evidence, right? It was 10 percent; he introduced the practice; it goes down to 3 percent. But that's not what happened. So he published his ideas, and the other gentleman physicians did not take them up. In fact, they found them kind of offensive. They thought, you know, he's writing that we have dirty hands, we have unclean hands, but in fact, we're gentlemen. They also thought it was just really far outside the range of theories that could possibly be true, so they didn't believe him despite the really good evidence and the deep importance - you know, people's lives were really at stake. And it took decades for his handwashing practice to actually spread.

VEDANTAM: In fact, I understand that Semmelweis himself eventually suffered a nervous breakdown. How did his own story end?

O'CONNOR: So the way the story goes - though this is a little hard to verify - is that he was so frustrated that people weren't adopting his handwashing practice that he had a nervous breakdown as a result. He was put into a Viennese mental hospital, where he was beaten by guards and died of blood poisoning a few weeks later.

(SOUNDBITE OF MUSIC)

VEDANTAM: We've seen how being an outsider or breaking with tradition can be a barrier to the spread of good scientific information. But you could argue that these examples were from a long-gone era of gentlemen physicians and amateur scientists. Even in the modern day of science, where researchers demand hard evidence to be convinced, it turns out that false, inaccurate and incomplete information can still take hold. In 1954, E.D.
Palmer published a paper that changed how doctors thought about stomach ulcers.

O'CONNOR: So what he did was look at a lot of stomachs - I believe somewhere in the range of a thousand - and he found that there were no bacteria whatsoever in the stomachs that he investigated. A lot of people at that time had been arguing over whether stomach ulcers were caused by stomach acid or some kind of bacteria. This was taken as really decisive evidence showing that, OK, well, it can't be bacteria - because everyone thought Palmer's study showed there are no bacteria in stomachs - so it absolutely must be stomach acid.

VEDANTAM: And, of course, in this case, Palmer was not trying to fabricate his data or make up data. He was sincerely arriving at what he thought was a very good conclusion.

O'CONNOR: That's right. And it seems that it just was a problem with his methodology. Of course, there are bacteria in our stomachs. He just didn't see them because of the way he was doing his particular experiment. This was not a fabrication at all.

VEDANTAM: One of the things that's interesting about this episode involving Palmer and the stomach ulcers is that as individuals came over to believe what Palmer was telling them, a consensus started to grow. And as each new person added to the consensus, it became a little bit stronger, which made it even harder to challenge.

O'CONNOR: Yeah. So although they had been arguing for decades about whether ulcers were caused by acid or by bacteria, at this point people started to share Palmer's results. Pretty much everybody saw them. And this consensus was arrived at: OK, it's acid. And everyone who had been studying the possibility that bacteria caused stomach ulcers stopped studying that.
And many people turned to looking at, OK, how can we treat stomach acid in order to treat ulcers?

(SOUNDBITE OF MUSIC)

VEDANTAM: When Australian physician Barry Marshall came along a few decades later to challenge this theory, he was met with stony-faced resistance. He couldn't get his articles published. Scientists sniped at him behind his back even though, as it turned out, his data was far better than E.D. Palmer's stomach studies.

(SOUNDBITE OF MUSIC)

VEDANTAM: Coming up - what Barry Marshall did to fight misinformation and what we can learn from his story about how to spread the truth.

(SOUNDBITE OF MUSIC)

VEDANTAM: Australian physician Barry Marshall tried and failed for years to convince doctors that stomach ulcers were caused by bacteria. Like Ignaz Semmelweis, he found that mere evidence was no match for conventional wisdom.

(SOUNDBITE OF ARCHIVED RECORDING)

BARRY MARSHALL: People were bleeding in my practice and dying from ulcers in my hospital. I could see it. The only person in the world at that time who could make an informed consent about the risk of drinking Helicobacter was me. So I had to be in my own experiment. So we cultured a patient with gastritis. I underwent a baseline endoscopy. I drank the bacteria - 10 to the ninth colony-forming units. Then I had this vomiting illness - no acid present in my vomit. And when I vomited early in the mornings, still half asleep - it was just, like, water coming up.

VEDANTAM: What on Earth was he doing, Cailin?

O'CONNOR: (Laughter) So he decided that his idea that, in fact, ulcers are caused by bacteria wasn't spreading fast enough for his taste in the scientific community. And so he did this demonstration. He gave himself H. pylori and gave himself stomach ulcers.
And then he later cured them with antibiotics in this publicity stunt, almost, to convince people that, in fact, ulcers were caused by bacteria.

VEDANTAM: Eventually, Barry Marshall and Robin Warren went on to win the Nobel Prize in medicine for their discoveries.

(SOUNDBITE OF MUSIC)

VEDANTAM: Mary Montagu, the woman who faced resistance in bringing variolation to England, never won a prestigious prize, but she also found a way to spread the truth. Like Barry Marshall, she found it had more to do with her sales pitch than with the evidence.

O'CONNOR: So in the end, she did something really smart, which took advantage of the ways that we use our social connections to ground our beliefs and our trust. She ended up convincing Princess Caroline of Ansbach to variolate her own two small daughters and to do it in this kind of public way. So she got one of the most influential people in the entire country to engage in this practice. That did two things. No. 1, because she did it in this public way and her daughters were fine, it gave people evidence that this is, in fact, a safe practice and a good idea. But it also made clear to people that if they want to conform to the norm, if they want to share a practice with this really influential person, then they should do the same thing. And after Princess Caroline did this, variolation spread much more quickly, especially among people who had a personal connection to either Mary Montagu or to the princess.

VEDANTAM: What's fascinating here is that this wasn't, in some ways, an irrational way to solve the problem. It wasn't saying, look; there's really convincing evidence here. You're almost using a technique that's pretty close to propaganda.

O'CONNOR: It is a propaganda technique. Absolutely. So propagandists tend to be very savvy about the ways that people use their social connections to ground trust and knowledge and choose their beliefs. And they take advantage of those.
In this case, it was using that social trust for good. But in many cases, people use it for bad. And if you look at the history of industrial propaganda in the U.S., or if you look at the way Russia conducted propaganda before the last election, people have taken advantage of these kinds of social ties and beliefs to try to convince us of whatever it is they're selling.

VEDANTAM: One last idea on how you counter bad information. Semmelweis, as we saw, did not succeed in persuading other doctors during his lifetime to wash their hands thoroughly before treating patients. But, of course, now that idea is widely adopted. What does that tell us, Cailin, about how science in some ways might be self-correcting? It might not be self-correcting at the pace that we want, but over time, it appears that good ideas do beat out the bad ones.

O'CONNOR: Yeah, so we have thousands and thousands of examples in science of exactly that happening - of good ideas beating out the bad ones. Of course, now we can look back and say, oh, well, that good idea won out and that good idea won out. We can't actually look at right now and know which of the ideas we believe now are the correct ones or good ones. So there are actually philosophers of science, like Larry Laudan and Kyle Stanford, who argue for something called the pessimistic meta-induction, which goes something like this: because scientific theories in the past have always eventually been overturned, we ought to think that our theories now will probably be overturned as well.

But there is actually an optimistic side to this, which is that if you look at many theories in the past - ones that were overturned - often the reason people believed them is that even if they were wrong, they were a good guide to action. So Newtonian physics got us to the moon. It's not right, but it was really successful. Even the theory of stomach acid causing ulcers - well, if you treat stomach acid, it actually does help with ulcers.
You know, it wasn't a completely unsuccessful theory. It's just that it wasn't totally right, and it wasn't as successful as the bacteria theory of ulcers, because antibiotics do better.

VEDANTAM: Of course, when it comes to something like handwashing, you know, you can say that over the last 150 years or so, people have adopted that idea, but it didn't actually mean that the people in Semmelweis' time changed their minds. It really was that those people essentially left the stage and new people came along. There's an old joke in science which says science progresses funeral by funeral.

O'CONNOR: (Laughter) Yeah.

VEDANTAM: And in some ways, that's what you're talking about here.

O'CONNOR: Yeah. So theory change can happen because good ideas spread throughout a community, and then more people start to test them, and then they communicate them to more people, and eventually you reach a consensus. One thing that the philosopher Thomas Kuhn really argued is that when you're having these kinds of big paradigm shifts in science, often it's young people coming up with a new paradigm and then switching to it because they don't have any skin in the game in the old one. You know, they haven't spent their life defending the older theory. And then, you know, maybe eventually the people who are defending that older theory retire or die (laughter), and then you have theory change.

VEDANTAM: One of the interesting implications of all of this is how we should think about the truth. And in some ways, I think the picture that I'm getting from you is a picture that says the truth is not a binary question. It's not, you know, is it true, is it false? I mean, some questions, of course, perhaps can be reduced to is it true, is it false?
But, really, science is in the business of producing probability estimates for various claims, and I think what you're saying is that for us to actually be on the right side of the misinformation-information divide, it's helpful for us to think in probabilistic terms rather than in binary terms.

O'CONNOR: Yeah, that's absolutely right. So we do think it's really important to think about belief in terms of degrees and evidence and believing something strongly enough. And part of the reason is that there has been this strategy where people who are trying to subvert our beliefs will say, but we're not sure about something. They'll say, evolution is just a theory, or, there's some doubt about global warming.

But, ultimately, not being sure about something is not what matters. We're never really 100 percent sure about anything. I mean, think about any belief you could have - you know, that the sun will come up tomorrow. Well, it always has in the past, but that doesn't mean we're 100 percent sure it will tomorrow. There's a really good chance it will. We shouldn't be looking for certainty. Instead, we need to be asking ourselves, when do we have enough evidence to make good decisions?

VEDANTAM: Cailin O'Connor is a philosopher and mathematician at the University of California, Irvine. She studies how social networks can spread both good information and bad. Along with James Weatherall, she is co-author of the book "The Misinformation Age: How False Beliefs Spread." Cailin, thank you for joining me today on HIDDEN BRAIN.

O'CONNOR: Oh, thank you so much for having me.

(SOUNDBITE OF MUSIC)

VEDANTAM: This week's show was produced by Camila Vargas Restrepo and edited by Tara Boyle and Jenny Schmidt. Our team includes Rhaina Cohen, Laura Kwerel, Parth Shah and Thomas Lu.

Our unsung heroes this week don't work at HIDDEN BRAIN or even at NPR, but the longer we do the show, the more we realize how many helping hands go into building it.
We've often found we turn to the scientists and forecasters at the National Weather Service to tell us when we need to get an episode wrapped up by Friday because there's a storm arriving on Monday that could keep us from getting to the office. Government departments like the National Oceanic and Atmospheric Administration do their work so quietly and so well that we often take them for granted. Today, we recognize the folks at NOAA, as it's called, for their vital work.

For more HIDDEN BRAIN, you can find us on Facebook and Twitter. You can find information about the research we discuss on this show on our website, npr.org/hiddenbrain. If you like this episode, please think of one friend who might enjoy our show and drop them a word about it. I'm Shankar Vedantam, and this is NPR.

Transcript provided by NPR, Copyright NPR.