Why Science Needs a Scout Mindset

Episode 1887 April 10, 2024 00:28:00
Intelligent Design the Future

Show Notes

Scout or soldier? When it comes to our opinions and beliefs, there's a bit of both in all of us. But which mindset is more beneficial? On this ID The Future, host Andrew McDiarmid welcomes Dr. Jonathan McLatchie to discuss the characteristics of a scout mindset and how it relates to the debate over evolution and the evidence for intelligent design. Get full show notes at idthefuture.com.

Episode Transcript

[00:00:04] Speaker A: ID the Future, a podcast about evolution and intelligent design. [00:00:11] Speaker B: Welcome to ID the Future. I'm your host, Andrew McDiarmid. Today I have with me Dr. Jonathan McLatchie to discuss why science needs a scout mindset. Dr. McLatchie is a fellow and resident biologist at the Discovery Institute's Center for Science and Culture. He was previously an assistant professor at Sattler College in Boston, where he lectured in biology for four years. McLatchie holds a bachelor's degree in forensic biology, a master's degree in evolutionary biology, a second master's degree in medical and molecular bioscience, and a PhD in evolutionary biology. His research interests include the scientific evidence of design in nature, arguments for the existence of God, and New Testament scholarship. He's also the founder and director of talkaboutdoubts.com. Jonathan, thanks for joining me. [00:01:02] Speaker A: Great to be here. Thanks for having me back. [00:01:04] Speaker B: Absolutely. You're always welcome. Well, today I wanted to speak with you about a recent article you wrote at evolutionnews.org, our flagship source for news and commentary about intelligent design and the debate over evolution. Your piece is called "Why Science Needs a Scout Mindset: Lessons from Julia Galef." So first, can you tell us who Galef is and why her recent book caught your attention? [00:01:29] Speaker A: Absolutely. So Julia Galef is an American writer and public speaker. She's also the co-founder of the Center for Applied Rationality. She used to host a podcast called Rationally Speaking, which was the official podcast of New York City Skeptics. She has a bachelor's degree in statistics from Columbia University that she graduated with in 2005, and then she spent several years doing research with social science professors at Columbia, Harvard, and MIT, including a year writing case studies on international economics for Harvard Business School. She started to do a PhD in economics, but soon thereafter she determined that she didn't actually want to be in academia, and so she left graduate school and moved back to New York, and she's been a freelance journalist since then. So she published The Scout Mindset in 2021, which is a book that I read with great interest. I found it to be a very big influence on my thinking as a scholar. And she also has a fantastic TED talk on the subject as well, so I definitely recommend that for your viewing. She also has a great YouTube channel. But anyway, Julia Galef is a Bayesian epistemologist, just like myself. So for those that haven't listened to the previous podcast that we did on Bayesian reasoning, Bayes is basically a way of structuring our thinking about evidence and how to update our beliefs in response to new data. And so as Bayesians we maintain that evidence is defined in terms of a likelihood ratio: the probability of the evidence existing given the hypothesis being true on the numerator, versus, on the denominator, the probability of that same evidence existing given the falsity of that hypothesis. And the extent to which that likelihood ratio is top-heavy is the extent to which we have evidence for the proposition under review. So Julia Galef and I both see eye to eye on a Bayesian approach to evaluating hypotheses and updating our beliefs in response to new evidence and so forth. So I certainly resonated very much with her book, The Scout Mindset. [00:03:42] Speaker B: Yeah, yeah.
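For listeners who want to see the likelihood ratio Jonathan describes worked out, here is a minimal sketch in Python. The function names and the sample probabilities are hypothetical, chosen only to show how a "top-heavy" ratio (numerator larger than denominator) raises one's credence in a hypothesis:

```python
# A minimal, hypothetical sketch of the likelihood-ratio (Bayes factor) idea
# described above; the function names and probabilities are illustrative only.

def bayes_factor(p_e_given_h: float, p_e_given_not_h: float) -> float:
    """How much more expected the evidence E is if hypothesis H is true than if it is false."""
    return p_e_given_h / p_e_given_not_h

def update_credence(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Posterior probability of H after seeing E, using Bayes' theorem in odds form."""
    prior_odds = prior / (1.0 - prior)
    posterior_odds = prior_odds * bayes_factor(p_e_given_h, p_e_given_not_h)
    return posterior_odds / (1.0 + posterior_odds)

# A "top-heavy" ratio (here 0.8 / 0.2 = 4) counts as evidence for H:
# it moves a 50% prior up to 80%.
print(update_credence(prior=0.5, p_e_given_h=0.8, p_e_given_not_h=0.2))  # 0.8
```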
And listeners, if you haven't heard that episode on Bayesian thinking that Jonathan and I did, go ahead and look for it. We did a whole episode covering that and how it relates to intelligent design. So Galef has written a book called The Scout Mindset: Why Some People See Things Clearly and Others Don't. And this book has resonated with you. What does Galef mean by the soldier mindset and the scout mindset? [00:04:10] Speaker A: So Julia Galef uses the soldier mindset and the scout mindset as kind of metaphors to describe two forms of reasoning. So soldier mindset is essentially a metaphor for motivated reasoning, and it leads us to loyally defend the stronghold of our belief commitments against intellectual threats, come what may. You can think about it as being a criminal defense attorney for your belief. Your belief is the client. And as a soldier, you're out there to defend your client, as it were, come what may. And so, in other words, you're there to defend your territory, which is what a soldier does. A soldier defends his territory. And so being in soldier mindset, which we can also call engaging in motivated reasoning, involves actively seeking out data that tends to confirm our beliefs while rationalizing or ignoring contrary data that tends to disconfirm them. Whereas, on the other hand, the person who is adopting a scout mindset attempts to honestly determine how the world really is. As Galef defines it, the scout mindset is the motivation to see things as they are, not as you wish they were. So for someone in the soldier mindset, Galef argues, reasoning is like defensive combat. She says, I'm quoting, it's as if we're soldiers defending our beliefs against threatening evidence. End quote. For the soldier, to change one's mind, to admit that one was wrong, is seen as surrender and failure or a sign of weakness. One's allegiance is to one's cherished beliefs rather than to the truth, even if those beliefs conflict with the balance of evidence. For the soldier, determining what to believe is done by asking oneself, can I believe this, or do I have to believe this, depending on what your motivations are. Whereas if you're in scout mindset, reasoning can be likened to mapmaking, and discovering that you are wrong about one or more of your beliefs simply means revising your map over time so that you have a better map of reality. And so scouts are more likely to seek out and carefully consider data that tends to undermine their own beliefs, thereby making one's map a more accurate reflection of reality, deeming it more fruitful to pay close attention to those who disagree with their own opinions than to those whose thinking aligns with their own. [00:06:37] Speaker B: Okay. Makes a lot of sense. Well, in her book, Galef reports on a study that probed a connection between scientific intelligence and divergent opinion. She writes that being smart and being knowledgeable on a particular topic are two more things that give us a false sense of security in our own reasoning. Why is this an important finding? [00:06:58] Speaker A: So this relates to a study that was cited by Galef. It's really quite sobering, and it aptly demonstrates the prevalence of the soldier mindset in our society today. In this particular study, participants were tested in regards to their scientific intelligence with a set of questions.
So these questions were divided into four categories: basic facts, methods, quantitative reasoning, and cognitive reflection. And remarkably, when conservative Republican and liberal Democrat participants were also asked whether they affirmed the statement that there is solid evidence of recent global warming due mostly to human activity such as burning fossil fuels, there was a positive correlation between scientific intelligence and divergent opinion. That is to say, the higher one's scientific intelligence, the more likely a liberal Democrat was to affirm the statement and the more likely a conservative Republican was to disagree with it. And this is quite sobering, because you would expect that as one becomes more educated in academic fields, opinions on controversial matters should converge rather than diverge, right? The more educated people become, the more they ought to converge on a common consensus opinion. And what this study showed is that, in fact, this isn't necessarily the case. The more people become educated in the sciences, the better and smarter they become at justifying what they already believed in the first place. This isn't the only study to reveal the tendency of more educated people to diverge in opinion on controversial topics. Another study surveyed people's views on ideologically charged subjects, including stem cell research, the Big Bang, human evolution, and climate change. And the finding was that individuals with greater education, science education, and science literacy display more polarized beliefs on these issues, although the study found, quote, little evidence of political or religious polarization regarding nanotechnology and genetically modified foods, end quote. So Julia Galef, in her book, summarizes the implications of those studies. She says, and I quote, this is a crucially important result, because being smart and being knowledgeable on a particular topic are two more things that give us a false sense of security in our own reasoning. A high IQ and an advanced degree might give you an advantage in ideologically neutral domains like solving math problems or figuring out where to invest your money, but they won't protect you from bias on ideologically charged questions. [00:09:38] Speaker B: A very sobering finding indeed. Well, Galef lays out five characteristics that distinguish the scout from the soldier mindset. What's the first one she discusses? [00:09:48] Speaker A: Sure. So the first is the ability to tell other people when you realize that they were right. And she, of course, qualifies this by noting that, quote, technically, scout mindset only requires you to be able to acknowledge to yourself that you are wrong, not to other people. Still, a willingness to say I was wrong to someone else is a strong sign of a person who prizes the truth over their own ego, end quote. And so, I mean, how often do we go to someone and tell them that we were wrong in the information that we provided? Or, I mean, let's say that we're teaching a class, for example, and we have a discussion with a student, and it turns out the student was right and we were wrong about a particular fact. A person with a scout mindset is more likely to go to the student later and say, yes, I was wrong about this; here's the correct information.
Likewise, if we're debating with a peer about a particular topic, and it turns out that the person we were dialoguing with was correct about a particular point, or we were mistaken about something, then we should chase them down and present to them the fact that we were actually wrong, and here's the correct information. To make concessions about being wrong is actually, I would argue, an intellectual virtue. And it sometimes can be a blow to our pride. It takes humility, but I think it's something that is important to do for someone who is really passionate about truth over being right. [00:11:25] Speaker B: Yeah, very much so. Well, the second characteristic of a scout mindset is a willingness to look at your track record on reacting to criticism. How do you typically handle criticism? Why is this important? [00:11:39] Speaker A: Yeah. So Galef explains in her book, I'm quoting, she says, to gauge your comfort with criticism, it's not enough just to ask yourself, am I open to criticism? Instead, examine your track record. Are there examples of criticism you've acted upon? Have you rewarded a critic, for example, by promoting him? Do you go out of your way to make it easier for other people to criticize you? And of course, in the discussion over evolution, this becomes quite applicable, because denying critics of evolution tenure, or firing them from faculty positions, or preventing them from being able to publish are hardly characteristics of a scout mindset. I mean, how easy do we make it for people to criticize us? How do we react to criticism? Do we shut people down when they criticize us? Do we reprimand them for criticism? Or do we encourage them and say, yes, thank you for this constructive criticism? You may or may not agree with that criticism, but do you set up an environment where people are comfortable coming to you with disagreements as to your approach or positions? [00:12:50] Speaker B: Yeah. Yeah. Well, the ability to prove yourself wrong is another characteristic that Galef pulls out. Give us an example of what that would look like. [00:13:00] Speaker A: Yeah, so she also writes, I'm quoting, she says, can you think of any examples in which you voluntarily proved yourself wrong? Perhaps you were about to voice an opinion online, but decided to search for counterarguments first and ended up finding them compelling. Or perhaps at work, you were advocating for a new strategy, but changed your mind after you ran the numbers more carefully and realized it wouldn't be feasible. And so it's really important to seek out the best responses and rejoinders to arguments that we put forward publicly. If you don't know what the best arguments against your position are, then that puts you in a vulnerable position, because then you're less likely to be fully informed and it's easy to make mistakes. But trying to find out what the best counter-responses to a particular line of reasoning are, searching online or in books and so forth, trying to find out who the best proponents of the opposing perspective are, or even subjecting your work to peer review, finding a peer or colleague who has expertise in this particular subject and asking them, can you give me an honest appraisal of my argumentation here? Trying to find vulnerabilities and holes in your own arguments, I think, is an important characteristic of a scout mindset. [00:14:20] Speaker B: Yeah.
And that ties into the last two characteristics that she mentions that distinguish the scout mindset from the soldier: one is avoiding biasing one's own information, and, which is really important, the ability to recognize good critics. Do you want to say something on those? [00:14:38] Speaker A: Absolutely. So the fourth feature, as you said, of the scout mindset is to avoid biasing one's information. So she writes, and I quote, for example, when you ask your friend to weigh in on a fight you had with your partner, do you describe the disagreement without revealing which side you were on, so as to avoid influencing your friend's answer? When you launch a new project at work, do you decide ahead of time what will count as a success and what will count as a failure, so you're not tempted to move the goalposts later? The fifth feature that Galef lists is about being able to recognize good critics, as you mentioned, and she comments that, quote, it's tempting to view your critics as mean-spirited, ill-informed, or unreasonable. And it's likely that some of them are, but it's unlikely that all of them are. Can you name people who are critical of your beliefs, profession, or even choices who you consider thoughtful, even if you believe they're wrong? Or can you at least name reasons why someone might disagree with you that you would consider reasonable, even if you don't happen to know of specific people who hold those views, end quote. So, I mean, let's take our position in the intelligent design community, for example. It's tempting to look online at less sophisticated and nuanced critics, like Richard Dawkins, for example, or Dave Farina or PZ Myers or someone. But there are also more sophisticated critics of design that we should be reading and being familiar with. So, for example, in philosophy, Paul Draper is a very thoughtful critic, or Graham Oppy is a very thoughtful critic of religion and the existence of God and that sort of thing. When we listen to critical voices that are skeptical of the positions that we adhere to, it's very tempting to look at the less sophisticated and less nuanced critics, because it gives us a feeling of superiority and we enjoy being able to mentally tear their arguments apart. But it's also important to seek out the more informed, more reasonable, and more charitable critics and interact with their material as well. [00:16:41] Speaker B: Yeah, very much so. It sounds like, to develop our scout mindset, it's good practice to test for our own biases. Galef offers five tests for bias in our reasoning. Can you walk us through those briefly? [00:16:54] Speaker A: Sure. So the first of those is the double standard test, which essentially asks whether we apply the same standards to ourselves that we would apply to others. Another test that she suggests is the outsider test, which attempts to determine how you would assess the same situation or data if you had no vested interest in the outcome. The third is the conformity test, which attempts to discern the extent to which one's opinion is in fact one's own. And so she explains, I'm quoting, if I find myself agreeing with someone else's viewpoint, I do a conformity test. Imagine this person told me that they no longer held this view. Would I still hold it? Would I feel comfortable defending it to them?
End quote. It's very easy for people to be influenced by a particular thought leader, so that their positions are in alignment with that particular thought leader, and if that leader changes their mind on that particular subject, then one changes one's mind also. And that's not a very rational or reasonable thing to do. Our beliefs should be based on the evidence and arguments and facts, not on who gave us those arguments or facts in the first place. So the fourth test is the selective skeptic test. She says, imagine that this evidence supported the other side. How credible would you find it then? End quote. Which, of course, relates to the outsider test. And then the final test is the status quo bias test. So she says, imagine your current situation was no longer the status quo. Would you then actively choose it? If not, that's a sign that your preference for your situation is less about its particular merits and more about a preference for the status quo. Of course, you can also go the other direction and be more of a contrarian, and so some people might enjoy holding to a fringe position. So that's one that can work both ways. But if you have a propensity to follow the consensus or follow the status quo, would you change your mind? Would you still hold to the position that you hold if it were no longer the status quo, no longer the academic consensus in a given field? [00:19:06] Speaker B: Makes sense. Yeah, those sound like useful tests. Well, Galef says that scouts revise their opinion incrementally over time on a topic, viewing errors as opportunities, and learning to see the experience of being wrong as valuable, not just painful. Why is that good advice? [00:19:25] Speaker A: She suggests in her book that we should drop the whole wrong confession altogether and instead talk about updating. So she explains, an update is routine, low-key. It's the opposite of an overwrought confession of sin. An update makes something better or more current without implying that its previous form was a failure. And she points out that we should not think about changing our minds as a binary thing. Rather, we should think of the world in shades of grey and think about changing our mind in terms of an incremental shift. People generally don't change their mind in one fell swoop. People who are rational generally change their mind incrementally. And so we should think about beliefs as being on a continuum. It's not a binary thing where you're either believing a proposition or disbelieving it. But we should be able to change and alter and adjust our credence levels over time as we encounter new information, as we encounter new data. So she notes that thinking about revising one's beliefs in this way makes, quote, the experience of encountering evidence against one of your beliefs very different, since each adjustment is comparatively low stakes. So, for example, she says, quote, if you're 80% sure that immigration is good for the economy, and a study comes out showing that immigration lowers wages, you can adjust your confidence in your belief down to 70%, end quote. And so, of course, this applies to scientific disciplines, evolutionary theory, intelligent design, and those sorts of things as well, because, as I said, people generally don't change their mind in one fell swoop.
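To make the 80-percent-to-70-percent example concrete, here is a small sketch of incremental credence updating in Python. The Bayes factors and credences are assumed for illustration and are not taken from Galef's book:

```python
# Illustrative numbers only (not from Galef's book): one mildly disconfirming
# study nudges an 80% credence down to roughly 70%, while several individually
# modest confirmations compound into a strong cumulative case.

def update_credence(prior: float, bayes_factor: float) -> float:
    """One step of Bayesian updating in odds form."""
    prior_odds = prior / (1.0 - prior)
    posterior_odds = prior_odds * bayes_factor
    return posterior_odds / (1.0 + posterior_odds)

credence = 0.80                               # 80% sure of a proposition
credence = update_credence(credence, 0.58)    # evidence modestly favoring the other side
print(round(credence, 2))                     # ~0.70: a routine "update", not a surrender

credence = 0.50
for _ in range(5):                            # five small pieces of supporting evidence
    credence = update_credence(credence, 2.0) # each only doubles the odds
print(round(credence, 2))                     # ~0.97: a cumulative case built incrementally
```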
But people on both sides should learn to adjust their beliefs and their credence levels based on the information that they encounter as they read and listen to people on both sides of each argument. [00:21:20] Speaker B: Okay, so when we're confronted with discordant data that might differ from the position we hold, it's nice that it doesn't have to be this high-stakes, all-or-nothing showdown. It's more about updating our confidence in the beliefs we hold. And in that way, it's similar to, as you say, Bayesian thinking or reasoning, and that likelihood ratio of a hypothesis being true or false. Now, Galef calls the absorption of new data into a belief more like an update, as you're mentioning. So is this an approach that you champion or agree with? [00:21:53] Speaker A: Absolutely. I totally agree with your assessment. I mean, in developing a cumulative case, you might have many pieces of data, none of which are of particularly great weight, but they can amount to a massive cumulative argument. Every mountain can be broken down into tiny motes of dust, but that doesn't mean that the mountain itself isn't heavy. We might encounter these data points one at a time or in small clusters. We might not change our mind in one fell swoop, but we should update our confidence as we encounter those data points. And if there is a sufficient cumulative case that can be mounted for a position, then over time, our view should grow in increasing alignment with that position, with that hypothesis supported by the evidence. [00:22:43] Speaker B: Okay. Yeah. In this polarized climate we're living through right now, how do we know what types of people and outlets will give us the best chance of learning from disagreement? Galef has some things to say here about the types of not just people, but media outlets and institutions that we will turn to to practice learning from disagreement. [00:23:09] Speaker A: Yeah. So Galef points out that when it comes to intentionally exposing ourselves to content representing the other side of a debate in which we are interested, people tend to make the mistake of always ending up listening to people who initiate disagreements with us, as well as the public figures and media outlets who are the most popular representatives of the other side. But she explains that those are not very promising selection criteria. First of all, what kind of person is most likely to initiate a disagreement? A disagreeable person. Second, what kind of people or media are likely to become popular representatives of an ideology? The ones who do things like cheering for their side and mocking or caricaturing the other side, that is, you. So instead, Galef suggests, to give yourself the best chance of learning from disagreement, you should be listening to people who make it easier to be open to their arguments, not harder. People you like or respect, even if you don't agree with them. People with whom you have some common ground, intellectual premises, or a core value that you share, even though you disagree with them on other issues. People whom you consider reasonable, who acknowledge nuance and areas of uncertainty, and who argue in good faith. And you find this with respect to scientific theories like evolution, and its championing in the public arena, as well as religions like Christianity or Islam, etcetera.
And you find it with respect to atheism, that many of the most popular figures that defend those positions actually turn out to be the least nuanced, the least careful, the least charitable, the least erudite individuals that one could inquire of. And that's unfortunate. But it's a sad reality that people who are more confident and less nuanced are likely to become more popular, because people like affirmation of their own position, and they like it to be said confidently. And people don't like nuance and reservation and conservativeness in conclusions. And so unfortunately, people like Richard Dawkins or the late Christopher Hitchens, and Sam Harris and Daniel Dennett, who are not very good thinkers on the subject of, say, philosophy of religion, ended up becoming very, very popular, whereas much more nuanced thinkers like Jeff Lowder or Graham Oppy, or other great philosophers of religion who are atheists, John Mackie, for example, haven't had the same level of public exposure. And so that's a sad state of affairs. So we should be careful, when we're looking for outlets and resources that make the best and most robust case for the other side, not to merely assume that those that are the most popular are also the best, because it tends to be the opposite oftentimes. [00:26:13] Speaker B: Yeah. And I think it is good news that we don't have to engage everyone who disagrees with us. You know, it's better to move towards those that make it easier to be open to their arguments, not harder; ones that might share common ground and share core values, and that practice that nuance and demonstrate a degree of uncertainty. You know, nobody has this all buttoned up, and you've got to be able to show that as you express your beliefs. Well, as you say in your article, Jonathan, there's an element of scout and soldier in all of us. But the bottom line here is caring about truth, being desirous of the truth, as Galef puts it, even if it's not what you were hoping for or not what's most convenient. So, Jonathan, thanks for sharing with us some ways of thinking about how we can reason so that we can do it more effectively out there in the marketplace of ideas. [00:27:06] Speaker A: Thank you. Thanks for having me on. [00:27:08] Speaker B: Absolutely. Well, find more of Jonathan's work at his website, jonathanmclatchie.com, and you'll also find his articles, along with many other great ones, at evolutionnews.org, your go-to source for news and commentary about intelligent design and the debate over evolution. Well, we'll have Jonathan back very soon. As you can tell, it's a bit of a pattern. I'm having him back regularly to share his research and his insight into lots of different related topics, and it's a lot of fun. So thanks again, Jonathan. For ID the Future, I'm Andrew McDiarmid. Thank you for listening. [00:27:45] Speaker A: Visit us at idthefuture.com and intelligentdesign.org. This program is copyright Discovery Institute and recorded by its Center for Science and Culture.
