The following interview is a transcript of the first episode of the "Capitalism, Climate, and Culture" podcast series from GMU Cultural Studies. Richard Todd Stafford sat down with John Cook, research assistant professor at the Center for Climate Change Communication, to talk about climate change denial and what can be done to inoculate the public against misinformation in a "post-truth" society.
The audio version of this podcast episode can be found on the iTunes site: https://itunes.apple.com/us/podcast/gmu-cultural-studies-podcast/id1437690638
John Cook: And they found that the link between political ideology and climate belief was strongest in countries that were the most reliant on fossil fuel, and so their conclusion was that ideological groups fueled by fossil fuel money pouring misinformation into the public sphere is what's driven this polarization.
Adam Proctor (Narrator): That was the voice of John Cook, who was recently interviewed by Richard Todd Stafford in conjunction with the Cultural Studies Colloquium at George Mason University.
The cultural studies program at George Mason University is a center for interdisciplinary research and doctoral training. This year's colloquium series examines capitalism and climate change. How did we get into this mess, and where do we go from here? This podcast series will explore these questions for a wider audience.
Richard Todd Stafford: I'm Richard Todd Stafford, and I will be interviewing John Cook, a Research Assistant Professor here at Mason's Center for Climate Change Communication. This is part of the Cultural Studies Colloquium Series, so our podcast interview today is associated with a talk he's giving later on called "Fake News, Educational Opportunities." As we understand it, the topic of your talk, John, will be the role of ideologically driven misinformation, how we respond to science denial, and ultimately how we can turn misinformation into an educational opportunity.
Welcome to the podcast, John.
John Cook: Thanks, Todd. Great to be talking to you.
Richard Todd Stafford: I wanted to start out with your contributions to the literature involving how the consensus of climate science can be quantified. These two articles have been sort of widely used including in a variety of sort of non-academic settings. What I really want to know is, how would you like non-specialists and non-scientists to understand the relationship between the consensus among scientists and the coherence of the science itself?
John Cook: Maybe I should start by just giving a brief history of the work I did on consensus to give people some context.
In 2013, we published the paper "Quantifying the Consensus on Human-Caused Global Warming" in the journal Environmental Research Letters. We basically looked at 21 years of studies about climate change, identified all the papers that stated whether humans were causing global warming or not, and amongst those papers we found 97% agreement that humans were causing global warming. Pretty straightforward. It was non-controversial. We weren't the first paper to find 97% consensus. We weren't even the second. But when we published, all hell broke loose. A lot of people criticized our study and said, "There's no consensus," and were determined to, I guess, create as much uncertainty and doubt about the consensus as they had been for the decades leading up to that.
And then in 2016, three years later, we published another follow-up study which basically took our study, the earlier studies on consensus, the subsequent studies on consensus, all the authors of all the consensus studies, we got together. It was kind of like an Avengers -- consensus Avengers -- author team, and we published a synthesis of all of the studies finding that all these studies doing different methods all find overwhelming agreement amongst climate scientists that humans are causing global warming. That's it. Like climate scientists agree that we're causing it. That's a pretty simple message. But I guess opponents of climate action and climate deniers have been casting doubt on the consensus since the early 1990s.
Richard Todd Stafford: One of the responses among those people who deny climate change has been, not just to deny that there is a consensus. In some cases, you'll encounter individuals that accept that there's a consensus, but then they say that the consensus actually has no bearing on the scientific truth, and they often point to Galileo as a figure for this.
John Cook: Right. Okay. Yep.
Richard Todd Stafford: And so it strikes me that the consensus of scientists is standing in for the coherence of the science itself, the way that a variety of different findings sort of fit together.
I guess my question to you is, what do we want non-specialists to understand about that relationship between consensus and the scientific coherence?
John Cook: That's a really good point, and we were thinking about this when we published our first study in 2013. When we published it, we published a FAQ anticipating these kinds of questions. We used this great quote by John Reisman, who said, "Science isn't a democracy. It's a dictatorship. Evidence is the dictator." Which I thought really sums up the idea that you don't decide our understanding of the universe by a show of hands. The evidence dictates our scientific understanding. So given that, what's the point of consensus? Why does it even matter? It matters because of how the average person, the average non-scientist, thinks about complicated scientific matters. We don't have time to collect all the studies and do all the research on every topic because there's so much information out there, and time is short. It's hard enough keeping up with all the characters on Game of Thrones, let alone all the science. Unfortunately, we have a two-year gap before the last season to catch up, but I digress.
So what the average person does is use consensus as a heuristic or a mental shortcut: "What do the experts think?" Usually that's a fairly good indicator of what the actual scientific evidence indicates. That's a psychological reality. That's how people think, and therefore it's important that we not only communicate the evidence on climate change, but also communicate the state of scientific agreement, because if we don't, then other people will be communicating misinformation that tries to confuse the level of agreement.
Richard Todd Stafford: One of the things that really stands out about the response you just gave is this gap between the way experts engage with the evidence and with the ongoing science, and the way that the public engages with the science as non-specialists without time to assess every individual study, and in many cases without the skills and background that would be necessary to do so.
One of the ways that plays out strikes me as very important, both among those who deny climate change and among some mainstream media sources who sort of misguidedly want to present a balanced view: the nuanced everyday debates within expert communities around climate are presented, or mischaracterized, as evidence that the core scientific knowledge about anthropogenic climate change is unsettled.
I guess my question as someone who's very interested in science communication more broadly is, is there a way to represent the excitement of scholarly debate in scientific discovery in a way that doesn't mislead the public to underestimate consensus about the basics?
John Cook: Right.
I think there's different messages for different contexts. If I was stuck in an elevator with someone, and I had like 20 seconds, and they said, "What's up with climate change?" I would probably just say, "Well, 97% of climate scientists agree that we're causing it." If I had time for a longer conversation -- if it was like 30 stories and we could really get into it, or we were having beers in a bar or something -- I would then try to explain how science works and explain that science isn't a monolith where we understand everything at the same level. There are some areas that we understand really well, and there are other areas where we're still exploring and trying to figure out exactly what's going on.
And so having explained that idea, that nuance, that there's different levels with different understanding, then you can point out that in climate change there's some things that we understand really well and we've known for decades. There's been a consensus since the 1990s that we are causing most of global warming. As for questions like how are clouds going to change under global warming? What's El Niño going to do? How do the different feedbacks interact? What do climate models do as you get higher and higher resolution as computers get more powerful? These are all ongoing questions, and scientists are exploring, and we'll continue researching and trying to get more and more detailed answers over time. But the basic answer that we know enough to act on climate change has been known for decades.
Richard Todd Stafford: Okay.
I'm going to press you a little bit on this though because in the examples or thought experiments that you presented, you're in these sort of face-to-face or personal interactions. But I think that when we're communicating with the public, very often it's through these mass mediated forms, whether it's the newspaper, YouTube videos, or what have you, where we don't have the sort of ability to know whether the person is only going to take -- you know, is this a situation where I should just present the core message? Like, "We know that the climate is changing. We know that humans are the primary driver, and we know that we have to take action now." Right? That's the sort of settled piece.
But in a lot of these cases, do you want to sort of communicate these more nuanced things like, "Is El Niño going to be stronger or weaker? Is the Atlantic current going to reverse or stop?" These are real scientific questions. They're exciting, and in fact important, scientific questions that the public probably does want to know about, and probably has reasons to want to know about. But it's the decontextualized way those debates get taken up in popular media representations that then becomes the argument, "The science isn't settled on this."
John Cook: Right.
Richard Todd Stafford: I guess my challenge is, how do we deal with this outside of the face to face setting?
John Cook: I think the same principle applies because you have a whole range of different contexts. You might Tweet. You might post a Facebook post where you have a bit more room to read. A blog post, you have more room than that, then a long form article, and then a book. So there's a whole range of different contexts. And similarly, the type of communication I've been doing is mainly responding to misinformation, and so we've written the same debunking in all these different forms, from a meme, to a tweet, to a Facebook post, to seven minute online videos, to textbook chapters. All of those are using the same information structure, but just going into different levels of detail.
The interesting thing is I'm currently doing this fascinating eye-tracking research with some communication researchers here at George Mason, and we're able to really understand exactly how people consume social media, and it's mind-boggling how little time people spend. Even looking at tweets, they look at one for like five, ten seconds, and they move on to the next one, so you have so little time to grab people's attention. So that principle of keeping your communication simple applies, but ... I forget the exact quote. It's like, "Keep it simple, but not too simple," or something along those lines. Keeping it simple enough, without distorting the science, is the key to being effective.
Richard Todd Stafford: In creating these interventions that sort of point people towards the techniques of misinformation and help people identify what misinformation looks like, you're relying on this model of inoculation. For instance, with Daniel Bedford and Scott Mandia, you've made a case for using agnotology or this deliberate study of how doubt, ignorance, and misunderstanding are systematically cultivated both at the college level and in popular media.
I guess what I want to know more about is what does it look like in practice to inoculate someone against these techniques of misinforming?
John Cook: Yeah.
As I just described, the idea of inoculation can take so many different forms. It's a really adaptive, flexible way of communicating. Let me just explain the principle of it and what it is first. Inoculation takes the idea of vaccinating in the medical sphere and applies it to knowledge. We can expose people to a weak form of misinformation, and that helps them build up immunity or resistance so that when they encounter the actual misinformation, they don't get influenced by it in the same way that injecting people with a weak form of a virus gives them immunity so that they don't get infected by the real virus.
What do I mean by a weak form of misinformation? Usually it involves introducing someone to the myth, but also at the same time that you tell them that this misinformation exists, you warn them of the danger that they might be misled. The warning is really important. And you explain the fallacies, or the techniques that the misinformation uses to mislead, so you're essentially explaining the magician's trick. You're showing the sleight of hand that they're using to trick people. And once people can see behind the magician's trick, they're immune from being deceived by it.
There are so many different ways that you can inoculate, you can explain the techniques of denial, and we're just now exploring that in various studies. I started by working with some critical thinking philosophers on how you actually build an inoculation. What's a rigorous, systematic way to find the fallacies in misinformation? And they used all their critical thinking methods to show me the step-by-step ways to deconstruct arguments, identify any fallacies in them, and then, having identified the fallacies, you can explain to people, "Hey, this argument commits the fallacy of non sequitur, or red herring, or jumping to conclusions, or whatever."
But then the philosophers explained this or pointed out to me this technique which has been intriguing me ever since called “parallel arguments,” taking the bad logic in a piece of misinformation and translating it into a different situation, usually an absurd situation, and saying, "If you use that logic in this situation, you can see how ridiculous the logic is, and therefore ... " That's a way of pointing out to people that argument is logically false. "It's just like doing this." And I realized that I see that technique used every night on late night comedian shows. They're always debunking. "Some politician said this, and
that's just like being in this situation and saying the same thing," and everyone laughs. They instantly get how ridiculous that original statement is. It essentially inoculates people against false arguments without having to get into -- I mean ideally I would love to communicate all the science and explain all the facts of climate change, but you can potentially inoculate people just with a simple parallel argument.
Richard Todd Stafford: It seems really valuable to me to sort of help people develop the ability to identify those arguments that truly aren't logically valid. But it also seems like some denialist messaging might be logically valid, but depends on an empirical state of affairs that is not true.
John Cook: Right.
Richard Todd Stafford: Is there a way to use agnotological methods to sort of prepare people to see that?
John Cook: Yeah. I mean, that's what the critical thinking method did. If you deconstruct any argument, every argument has the structure of a set of premises and then a conclusion. And so what we did was deconstruct step by step: you work out all the premises and the conclusion. The beauty of critical thinking is it's kind of like X-ray vision into thinking. You see what the actual structure of the argument is. The first step is, is it logically valid? Do those premises lead to the conclusion? If not, then already you've shown that the argument is false. But often it is logically valid, and just one of those premises is based on a falsehood of some sort. So, if it is logically valid, then you go to the next step and interrogate all those premises. It's just a systematic way to try to surgically pinpoint exactly where an argument goes wrong.
Richard Todd Stafford: In order to get people to do that, you have to sort of generate a habit of mind that doesn't involve that five second engagement with the tweet. Is that right?
John Cook: There's a real tension between me as a communication researcher and my colleagues who are philosophers on our goals. My goal was to come up with ways to develop communication messages. Their goal was to turn the entire world into critical thinking philosophers. And while I would love that everyone was deep critical thinkers, it's never going to happen. It's not realistic. I mean it's a goal that we should aim for, and I think that we should develop educational curricula that tries to promote that goal. But in the meantime, in the real world where everyone's time-pressed and will look at tweets for five seconds, what do we do? That critical thinking method can be a way of thinking that we teach the people, but it can also be a technique that communication uses. Communication like science communicators, educators, scientists use to analyze misinformation or just statements in general to find out whether they're true or not. And having identified fallacies, then create your five second messages, or your 60 second messages, or your seven minute videos, or your lesson plans, or whatever.
Richard Todd Stafford: It does seem like one of the issues that you would run up against -- if your goal was on one hand to sort of cultivate a community where people think more about what the premises are and whether the conclusions logically follow from them, and on the other hand to present people with communications that model that kind of critical interrogation of the evidence -- is the evidence you've found that many people who deny anthropogenic climate change assert contradictory claims, in many cases quite close to each other, with no seeming acknowledgement that they're contradictory. Like it's not just a symptom that you're trying to address. I guess the distinction is whether they're doing that in bad faith, or whether they're doing that in all earnest.
What are your thoughts about why some people seem in all earnest to sort of endorse mutually contradictory views? Is it that they just haven't internalized these habits of mind?
John Cook: I think that firstly almost all climate deniers are earnest. I think very few of them are intentionally being deceptive. And understanding the psychology of climate denial is really important in order to understand that. If someone is motivated to reject the science for whatever reason -- because it threatens their ideology, or because it threatens their social identity -- regardless of the reason, they employ these psychological kind of techniques, or they just process information in a biased way. If you communicate some climate information to someone who's threatened by it for whatever reason, they'll be biased in the way they process that. And someone who processes information in a biased way looks exactly the same as someone who's being intentionally deceptive because essentially they're deceiving themselves.
And so I think we can't know what's going on in a person's mind. We don't know whether they're being deceptive or in good faith. And so the approach I always take is to be agnostic about their motive and just focus on their technique. Because otherwise, if you're just accusing everyone of lying, firstly, you're probably going to be wrong. Secondly, it's just not a constructive, useful approach. So having established that, the next thing to recognize is that all those psychological biases manifest in different ways, different fallacies of logic.
What we did in my work is create this taxonomy of all the different types of fallacies that people employ when they reject climate science, and so understanding all those fallacies is key to being able to spot the mistakes in bad arguments, and then being able to inoculate against them.
Richard Todd Stafford: You've done some work related to this that suggests that exposure to information about anthropogenic climate change tends to strengthen denial in some populations. I guess this is that phenomenon that you were describing where they may in some ways be lying to themselves, but it seems surprising that it strengthens their denial. You know, their denial doesn't stay at a steady state. And this is even as it strengthens acceptance of the science in others.
How do you explain this polarization of reactions to the same piece of information?
John Cook: I ran experiments where I presented consensus information to people, basically told them that 97% of climate scientists agree, and then I measured their response -- whether their acceptance of climate change went up or down after hearing this information. I found that there was a spectrum of people across the political spectrum, from liberal to conservative. Liberals were more trusting of climate scientists. Conservatives were less trusting. And strong conservatives, right at the end of the spectrum -- a fairly small proportion of the population, those with the strongest distrust of climate scientists -- when they were told that 97% of climate scientists, who they distrust, agree on human-caused global warming, their distrust got even stronger. It kind of activated their distrust. It was like, "I don't trust those pointy-hatted boffins, so yeah, I believe it even less than ever now."
So telling people about, or giving people, climate information essentially activates their conspiratorial thinking, if they were already prone to conspiracy theories. And what that tells me is that conspiracy theorists are essentially immune to facts. Because any fact that conflicts with their conspiracy theory, they just assume is part of the conspiracy. It's very difficult, almost impossible, to convince a conspiracy theorist. And so I think that we could spend our time banging our heads against a brick wall trying to convince the conspiracy theorists or the climate deniers. But a much more productive way to spend our time is on the vast majority of the rest of the population, who are open to information. And I think the implicit assumption underneath inoculation is that it's not about cure. It's about stopping denial from spreading to the rest of the population.
Richard Todd Stafford: The more you use the inoculation metaphor, the more it strikes me that the sort of vaccine denialism takes very much the same structure as climate denialism.
John Cook: It does.
Richard Todd Stafford: There's a certain irony to this metaphor.
John Cook: In our eye tracking research that we're currently doing, we're trying to inoculate people against misinformation on the topics of climate change, gun control, and vaccination. I think there's a particular irony or poetic justice in using vaccination -- or using inoculation to try to stop vaccination myths. I hope that it turns out to be fruitful because it would just be really cool.
Richard Todd Stafford: It will certainly make for a good paper title.
John Cook: Yeah. That's right. It'll be fun coming up with our cleverest kind of play on words.
Richard Todd Stafford: Both in your work about polarization, and I think in a couple of the other articles that you've contributed to, you've used self-reported belief in free markets to illuminate the relationships between the effectiveness of climate change related communications and political orientation. The question I have about that is, why use this measure? Why not use another indicator like political party affiliation, whether someone has a more hierarchical, or egalitarian viewpoint, or whether someone's more individualistic or communitarian?
John Cook: Right. I mean that's a good question.
In my initial surveys, in the interest of keeping the surveys as lean and mean as possible, we just went with one measure of political ideology. And we chose free market ideology because it was one of the strongest predictors of climate beliefs, if not the strongest. But in hindsight since then -- and it's really only been the last couple of years -- I've really come to realize that tribalism is actually much stronger than ideology in driving people's beliefs and actions. I think the last few years in the US have shown just the strength of tribalism. And this bears out in subsequent research since I did my initial surveys. One meta-analysis that looked at a whole range of surveys into climate beliefs and ideology found that political ideology was the second strongest driver of climate beliefs, but political affiliation was the strongest, so nowadays I actually try to use both measures, political affiliation and political ideology, when I'm measuring people's political backgrounds.
Richard Todd Stafford: I think that you were just getting at it a little bit, but I'd like for you to define a little bit more what you mean by tribalism in this context.
John Cook: Well, humans are social animals. That's just how we evolved. That's how our brains work. Social norming, the behavior of our social groups, is one of the strongest drivers of our beliefs and behavior, and so that means that what our social group thinks, believes, and does has a really powerful influence on our beliefs. Now, climate change didn't used to be a socially or culturally polarized issue. Back in the '80s, George H.W. Bush said, "We're going to fight the greenhouse effect with the White House effect." But over time, starting with ideological groups, misinformation just started pouring into the public sphere. And gradually, over several decades, the public came to associate attitudes about climate change with what their social group believed. Republicans thought, "My social group doesn't really trust those climate scientists, and doesn't accept the science and all those lefty policies that go with the science." And Democrats came to see that, "My group accepts the science and all the policies that go with accepting the science," and it just became associated with our political groups or our social groups.
Richard Todd Stafford: I'm going to repeat back what I think I just heard. For a given individual, a sense of group identification might be the best predictor of how they're going to react to information when they're exposed to it. But the groups themselves came to have sort of beliefs about climate scientists or other kinds of expert groups, public health experts telling them to vaccinate their children, etc., by virtue of
outside interests that sort of feed misinformation that's targeted to their social group, the group they identify with. Is that what you're saying?
John Cook: Yeah. Generally, yes. There's nuances to that, but what a social group thinks they should think can change over time, like as you get misinformation coming in. And in fact a couple of studies have found that one of the biggest predictors of changes in a group's attitudes has been elite cues, or cues from their political leaders. In around 2009, for instance, suddenly public attitudes about climate change dropped quite strongly. And at that same time, the conservative leaders started throwing out all this misinformation, and attacking the science, and casting doubt on climate science. And those elite cues, those cues from their tribal leaders had a big impact.
So whether it's misinformation coming from conservative think tanks, or coming from conservative media, or political leaders, or coming originally from think tanks and then flowing through these other sources, that's how the misinformation disseminates to social groups. And then, I guess, social identity forms around these beliefs.
Richard Todd Stafford: Since you're coming from another national context, I wonder, are there national and regional differences in this? Or do we see the same model globally, of misinformation flowing from elites into social groups and generating a sort of conservative disidentification with climate scientists?
John Cook: Some of my colleagues at the University of Queensland did a study of different countries' attitudes about climate change and their predictors. And they found that the link between political ideology and climate beliefs was strongest in countries that were the most reliant on fossil fuel, and so their conclusion was that ideological groups, fueled by fossil fuel money, pouring misinformation into the public sphere is what's driven this polarization. Naomi Oreskes uses the phrase "a perfect storm" for the combination of ideology and vested interests, or the fossil fuel industry.
I've forgotten your question, but I guess -- Yeah, across countries, it's strongest in the US. It's also quite strong in Australia. But generally speaking, it's countries where the fossil fuel industry had a vested interest in keeping the public confused about climate change.
Richard Todd Stafford: I'm going to pivot back to a theme that we were discussing a little bit earlier, and it's really the way we frame the problem of misinformation. You've just given a sort of broader political economic context for the problem of information. But when people are thinking about the audiences that receive these messages, I am hearing two sort of major frames. One of those frames emphasizes that the general public has information deficits. They don't understand the scientific knowledge. They don't understand the degree of scientific agreement. And crucially, they don't understand the internal scientific processes, norms, and institutions. So that's sort of one frame that I hear very commonly.
But it seems like this frame that you're describing in terms of tribalism is often presented as a competing frame. That is, the primary problem is not information deficits, but rather that group identities and cultural ideologies shape the way people react to and make use of the information that's available to them. What's the relationship between these views, and is there a way to reconcile them? Can these both be true, or are these mutually exclusive ways of framing the problem?
John Cook: Right.
When I'm asked, "Is it information deficit or is it cultural values?" I always answer "Yes," because I think it's a real danger to treat it as a dichotomy. It's a false dichotomy because it's not solely one or solely the other. I think that both have elements of truth. Information does matter. Even for people who are biased, information matters to some degree. It just matters to different degrees. And similarly, not everyone is driven by cultural values either. The people at the edges of the political spectrum are stronger in terms of political biases. People in the middle are less so. So there's a mix: facts matter, information matters, but culture also matters, and we need to be aware of how culture can bias how people process information. And understanding both is key to making sure that our communication is framed in a way that gives the facts a fighting chance, so we try to avoid triggering the psychological minefields that conservatives might have with climate change.
I'll give you an example. What I found with my inoculation study was that when I inoculated against climate misinformation, I didn't mention climate change. I didn't mention the exact myth that I was inoculating against. I just mentioned the technique used to mislead people, because no one wants to be misled. Whether you're conservative or liberal, wherever you are on the political spectrum, no one likes being tricked. And so that's a potential way to reduce the influence of misinformation without triggering those psychological booby traps.
Richard Todd Stafford: Fantastic.
Thank you, John. I really appreciate you coming and talking with us today. We look forward to hearing your lecture in just a few moments.
John Cook: My pleasure.
October 13, 2018