The word "kind" originated from the word "kin." When you are kind to someone, it means you are treating them like family.

And here our dependence on other minds reinforces the problem. If we, or our friends, or the pundits on CNN, spent less time pontificating and more trying to work through the implications of policy proposals, we'd realize how clueless we are and moderate our views. Presented with someone else's argument, we're quite adept at spotting the weaknesses. Hugo Mercier explains how arguments are more convincing when they rest on a good knowledge of the audience, taking into account what the audience believes, who they trust, and what they value.

If someone you know, like, and trust believes a radical idea, you are more likely to give it merit, weight, or consideration. False beliefs can be useful in a social sense even if they are not useful in a factual sense.

In a new book, "The Enigma of Reason" (Harvard), the cognitive scientists Hugo Mercier and Dan Sperber take a stab at answering this question. The Gormans, too, argue that ways of thinking that now seem self-destructive must at some point have been adaptive.

In a separate conversation on the same trip, Trump referred to the more than 1,800 marines who lost their lives at Belleau Wood as "suckers" for getting killed.

A third myth has permeated much of the conservation field's approach to communication and impact, and it rests on two truisms: 1) to change behavior, one must first change minds; 2) change must happen individually before it can occur collectively. Change their behavior or belief so that it's congruent with the new information.
Of course, what's hazardous is not being vaccinated; that's why vaccines were created in the first place. You can't expect someone to change their mind if you take away their community too. In Atomic Habits, I wrote, "Humans are herd animals."

One of the most famous of these was conducted, again, at Stanford. People believe that they know way more than they actually do. New facts often do not change people's minds.

By Elizabeth Kolbert.

Consider what's become known as confirmation bias, the tendency people have to embrace information that supports their beliefs and reject information that contradicts them.

I have been sitting on this article for over a year. With a book, the conversation takes place inside someone's head and without the risk of being judged by others. But here they encounter the very problems they have enumerated.

I allowed myself to realize that there was so much more to the world than being satisfied with what one has known all one's life, believing everything that confirms it and disregarding anything that goes even slightly against it, thereby contradicting Kolbert's idea that confirmation bias is unavoidable and one of our most primitive instincts.

Appealing to their emotions may work better, but doing so is obviously antithetical to the goal of promoting sound science. A very good read. You have to give them somewhere to go. The further away an idea is from your current position, the more likely you are to reject it outright.
This, I think, is a good method for actually changing someone's mind. It's the reason even facts don't change our minds. In many circumstances, social connection is actually more helpful to your daily life than understanding the truth of a particular fact or idea.

Voters and individual policymakers can have misconceptions. They can only be believed when they are repeated. But, on this matter, the literature is not reassuring. So, basically, when hearing information, we pick a side, and that, in turn, simply reinforces our view.

"The challenge that remains," they write toward the end of their book, "is to figure out how to address the tendencies that lead to false scientific belief." The Enigma of Reason, The Knowledge Illusion, and Denying to the Grave were all written before the November election.

Confirmation bias is the tendency to selectively pay attention to information that supports our beliefs and ignore information that contradicts them. But how does this actually happen? The students in the second group thought he'd embrace it.

Growing up religious, the me that exists today is completely contradictory to what the old me believed, but I allowed myself to weigh the facts that contradicted what I so dearly believed in.

We've been relying on one another's expertise ever since we figured out how to hunt together, which was probably a key development in our evolutionary history. What allows us to persist in this belief is other people. The opposite was true for those who opposed capital punishment.

Cognitive psychology and neuroscience studies have found that the exact opposite is often true when it comes to politics: people form opinions based on emotions, such as fear, contempt, and anger, rather than relying on facts.
The Dartmouth researchers found, by presenting people with fake newspaper articles, that people receive facts differently based on their own beliefs. Julia Galef, president of the Center for Applied Rationality, says to think of an argument as a partnership.

Fiske identifies four factors that contribute to our reluctance to change our minds. The first is the tendency to hold on to initial beliefs even after receiving new information that contradicts or disaffirms the basis for those beliefs (Anderson, 2007).

In the meantime, I got busy writing Atomic Habits, ended up waiting a year, and gave The New Yorker their time to shine (as if they needed it). They're saying stupid things, but they are not stupid.

The British philosopher Alain de Botton suggests that we simply share meals with those who disagree with us: "Sitting down at a table with a group of strangers has the incomparable and odd benefit of making it a little more difficult to hate them with impunity."

In their groundbreaking account of the evolution and workings of reason, Hugo Mercier and Dan Sperber set out to solve this double enigma. As Julia Galef so aptly puts it: people often act like soldiers rather than scouts. Living in small bands of hunter-gatherers, our ancestors were primarily concerned with their social standing, and with making sure that they weren't the ones risking their lives on the hunt while others loafed around in the cave.

I believe more evidence for why confirmation bias is impossible to avoid and is very dangerous, though some of it became more prevalent after the article was published, could include groups such as the KKK, neo-Nazis, and anti-vaxxers.

Friendship does. Not usually, anyway.
Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. A recent example is an anti-vax leader saying that drinking your urine can cure Covid; meanwhile, almost any scientist and major news program would tell you otherwise. They want to save face and avoid looking stupid.

The Atlantic never had to issue a retraction, because they had four independent sources who were there and could confirm Trump in fact said this. Prejudice and ethnic strife feed off abstraction.

Next, they were instructed to explain, in as much detail as they could, the impacts of implementing each one. It is painful to lose your reality, so be kind, even if you are right. It led her to Facebook groups, where other moms echoed what the midwife had said.

When the handle is depressed, or the button pushed, the water, and everything that's been deposited in it, gets sucked into a pipe and from there into the sewage system. In the mid-1970s, Stanford University began a research project that revealed the limits to human rationality; clipboard-wielding graduate students have been eroding humanity's faith in its own judgment ever since. Providing people with accurate information doesn't seem to help; they simply discount it. Next thing you know, you're firing off inflammatory posts to soon-to-be-former friends.

Humans also seem to have a deep desire to belong. But a trick had been played: the answers presented to them as someone else's were actually their own, and vice versa. And this, it could be argued, is why the system has proved so successful. "I must get to know him better." Nor did they have to contend with fabricated studies, or fake news.

You already agree with them in most areas of life.
In an ideal world, people's opinions would evolve as more facts become available. "It is so, so easy to Google 'What if this happens' and find something that's probably not true," Maranda says.

Check out Literally Unbelievable, a blog dedicated to Facebook comments of people who believe satire articles are real. A group of researchers at Dartmouth College wondered the same thing. It is intelligent (though often immoral) to affirm your position in a tribe and your deference to its taboos.

Elizabeth Kolbert, The New Yorker, February 2017.

The economist J.K. Galbraith once wrote, "Faced with a choice between changing one's mind and proving there is no need to do so, almost everyone gets busy with the proof."

People's ability to reason is subject to a staggering number of biases. So well do we collaborate, Sloman and Fernbach argue, that we can hardly tell where our own understanding ends and others' begins. And they, too, dedicate many pages to confirmation bias, which, they claim, has a physiological component. In a study conducted in 2012, they asked people for their stance on questions like: Should there be a single-payer health-care system?

Imagine, Mercier and Sperber suggest, a mouse that thinks the way we do. In Kolbert's article, "Why Facts Don't Change Our Minds," various studies are put to use to explain this theory. She has written for The New Yorker since 1999. A recent experiment performed by Mercier and some European colleagues neatly demonstrates this asymmetry. Humans' disregard of facts in favor of information that confirms their original beliefs shows the flaws in human reasoning.
"And they were just practically bombarding me with information," says Maranda. When Kellyanne Conway coined the term "alternative facts" in defense of the Trump administration's view on how many people attended the inauguration, this phenomenon was likely at play.

The backfire effect is a cognitive bias that causes people who encounter evidence that challenges their beliefs to reject that evidence, and to strengthen their support of their original stance.

Convincing someone to change their mind is really the process of convincing them to change their tribe. The act of change introduces an odd juxtaposition of natural forces. As one Twitter employee wrote, "Every time you retweet or quote tweet someone you're angry with, it helps them."

She says it wasn't long before she had decided she wasn't going to vaccinate her child, either. There was little advantage in reasoning clearly, while much was to be gained from winning arguments.

"But I know where she's coming from, so she is probably not being fully accurate," the Republican might think while half-listening to the Democrat's explanation. So clearly facts can and do change our minds, and the idea that they do is a huge part of culture today. One way to visualize this distinction is by mapping beliefs on a spectrum. When it comes to the issue of why facts don't change our minds, one of the key reasons has to do with confirmation bias.

The New Yorker's Elizabeth Kolbert reviews The Enigma of Reason by cognitive scientists Hugo Mercier and Dan Sperber, former Member (1981-82) in the School of Social Science: "If reason is designed to generate sound judgments, then it's hard to conceive of a more serious design flaw than confirmation bias."
Habits of mind that seem weird or goofy or just plain dumb from an intellectualist point of view prove shrewd when seen from a social-interactionist perspective. If the goal is to actually change minds, then I don't believe criticizing the other side is the best approach. If they abandon their beliefs, they run the risk of losing social ties.

At the end of the experiment, the students were asked once again about their views. In the other version, Frank also chose the safest option, but he was a lousy firefighter who'd been put on report by his supervisors several times. In each pair, one note had been composed by a random individual, the other by a person who had subsequently taken his own life.

Becoming separated from the tribe, or worse, being cast out, was a death sentence. The book has sold over 10 million copies worldwide and has been translated into more than 50 languages.

But back to the article: Kolbert is clearly onto something in saying that confirmation bias needs to change, but she neglects the fact that in many cases, facts do change our minds. Now, they can change their beliefs without the risk of being abandoned socially.

If your position on, say, the Affordable Care Act is baseless and I rely on it, then my opinion is also baseless. The New Yorker, "Why Facts Don't Change Our Minds." It's easier to be open-minded when you aren't feeling defensive. The Gormans don't just want to catalogue the ways we go wrong; they want to correct for them.

"If people counterargue unwelcome information vigorously enough, they may end up with more attitudinally congruent information in mind than before the debate, which in turn leads them to report opinions that are more extreme than they otherwise would have had," the Dartmouth researchers wrote.
I'm not saying it's never useful to point out an error or criticize a bad idea. It also primes a person for misinformation. (Don't even get me started on fake news.) But some days, it's just too exhausting to argue the same facts over and over again. Silence is death for any idea.

Sometimes we believe things because they make us look good to the people we care about. Nearly sixty per cent now rejected the responses that they'd earlier been satisfied with. Don't waste time explaining why bad ideas are bad. Rioters gathered there on the false pretense of election fraud, wanting justice for something that had no facts to back it up. The students were then asked to distinguish between the genuine notes and the fake ones. But hey, I'm writing this article and now I have a law named after me, so that's cool.

getAbstract recommends Pulitzer Prize-winning author Elizabeth Kolbert's thought-provoking article to readers who want to know why people stand their ground, even when they're standing in quicksand.

In step three, participants were shown one of the same problems, along with their answer and the answer of another participant, who'd come to a different conclusion. Mercier and Sperber prefer the term "myside bias." Humans, they point out, aren't randomly credulous. What happened? "As a rule, strong feelings about issues do not emerge from deep understanding," Sloman and Fernbach write.
At this point, something curious happened. In the second phase of the study, the deception was revealed. Finding such an environment is difficult. Clear's Law of Recurrence is really just a specialized version of the mere-exposure effect. If you use logic against something, you're strengthening it.

So why, even when presented with logical, factual explanations, do people still refuse to change their minds? If you're not interested in trying anymore and have given up on defending the facts, you can at least find some humor in it, right? Some students discovered that they had a genius for the task.

Government and private policies are often based on misperceptions, cognitive distortions, and sometimes flat-out wrong beliefs. The midwife told her that years earlier, something bad had happened after she vaccinated her son. Surveys on many other issues have yielded similarly dismaying results.

Article Analysis of "Why Facts Don't Change Our Minds" by Elizabeth Kolbert: Every person in the world has some kind of bias. The way to change people's minds is to become friends with them, to integrate them into your tribe, to bring them into your circle. For example, we often reason emotionally, and a lot of the time that leads to particular sets of thoughts that shape our behavior; only later do we discover the unresolved anger lying beneath the emotional reasoning. Are we arguing for the sake of arguing? Inevitably, Kolbert is right: confirmation bias is a big issue.
Shaw describes the motivated reasoning that happens in these groups: "You're in a position of defending your choices no matter what information is presented," he says, "because if you don't, it ..."

That means, even when presented with facts, our opinion has already been determined, and we may actually hold that view even more strongly to fight back against the new information.

Leo Tolstoy was even bolder: "The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of doubt, what is laid before him."

As everyone who's followed the research, or even occasionally picked up a copy of Psychology Today, knows, any graduate student with a clipboard can demonstrate that reasonable-seeming people are often totally irrational.