Misinformation, Disinformation with Lee McIntyre

Ceejay Hayes:

This is CounterPol. Today, we're exploring the origins of disinformation and misinformation with Dr. Lee McIntyre. Lee is a research fellow at the Center for Philosophy and History of Science at Boston University and an instructor in ethics at Harvard Extension School. Lee brings to light a through line between opportunistic and public-facing disinformation campaigns that starts well before 2020 or 2016. I recognize that this is a sensitive topic, mostly because misinformation and disinformation research fall squarely on a political fault line. It's easy to hear the phrase and immediately make an assumption as to what group of people or ideologies are being lambasted. That is not the point of this conversation. Disinformation and misinformation are real threats to social order, and it is imperative that we as individuals do what we can to ensure that our worldviews are informed by properly contextualized truths. I hope you enjoy.

Lee McIntyre:

I'm a philosopher of science and I spent most of my career studying what's special about science and why anybody would attack it. Who are the science deniers? What do they have against science? What's their understanding of science? And after writing about that for many, many years, I started to realize that science deniers for the most part weren't born, they were made. Someone made them become deniers and the way that they did that was through disinformation. Because if you look at what deniers do, they don't deny all science, they deny the science that they've been radicalized on about vaccines, about climate, about smoking. And those are the subject of disinformation campaigns. Somebody wanted them to be deniers. And then I realized that it was the same in politics. The same thing was happening in American politics. And that's when I wanted to write my new book on disinformation, because it was something of a realization to me that this was all due to disinformation.

Ceejay Hayes:

At a very base level, what is disinformation? What is misinformation? I know you make a distinction between the two, even though in this space we tend to use them interchangeably.

Lee McIntyre:

The way that I use the term is that misinformation is an accident, but disinformation is a lie. Some people like to use the word misinformation as kind of an umbrella term, and that's fine. I like to draw a hard distinction between the two, because I think that most people don't get it. Newscasters in particular don't get it. And if you let them get away with saying misinformation when it's really disinformation, then they're going to report on it like it was a hurricane or some natural disaster, which means that nobody's at fault. With disinformation, somebody's at fault. And so I need journalists, especially cable journalists in the United States, to start using the term disinformation because I need them to start reporting on it as what it is, which is a strategic, organized campaign of denial. That's why I use the term disinformation. And I don't mean any disrespect to folks who use the term misinformation because, technically, disinformation is a type of misinformation. But I think it's important to make that distinction because it helps us learn how to fight it.

Ceejay Hayes:

Also related to our understanding of disinformation is post-truth, which you've also written about. Can you let the listener know what post-truth is and what it means for how we understand what is factual and what exists in reality?

Lee McIntyre:

It's easy to misunderstand the term post-truth. Some people think that the term post-truth means, oh, nobody cares about truth anymore. That's not what it means. Post-truth was the OED's word of the year in the fall of 2016, even before Trump was elected. But given Brexit and the whole campaign that was going on in the United States, the usage of that term post-truth spiked something like 2,000 percent. And they defined it as an era when feelings meant more than facts. They gave it not an inaccurate definition, I mean, they're the OED after all, but I didn't think that it went to the deepest level of what was behind it, why it was happening. So in my book, Post-Truth, I give a very specific definition of that notion as what I call the political subordination of reality. Post-truth is a warning. It's an alarmist term for the fact that there are people out there who don't want the truth to be the truth. They want reality to be different from what it is so that they can get what they want, whether it's money or political power, whatever it is. I see post-truth as a tool in the authoritarian's toolkit. And so I define it in political terms, because when I say we're in the post-truth era, I don't mean that nobody cares about truth or that nothing can be done about this. I mean we're in an era in which truth is under threat. Disinformation is one of the ways that truth is under threat, because when you're in a polluted information environment, it's very, very hard to know what's true and who's speaking in good faith and who's not. So some philosophers, unfortunately, just don't get it and keep saying over and over in different ways, well, if we were in the post-truth era, I mean, why do people even look both ways before they cross the street? That's ridiculous. I expect more from philosophers than that. Post-truth is a claim that truth is under threat, and I think they should understand that.

Ceejay Hayes:

So now we've got a quick understanding of disinformation. We have a quick understanding of post-truth. Can you give us a quick history on the use of disinformation and misinformation to destabilize democracies and societies in general?

Lee McIntyre:

Myths and disinformation have probably been around as long as human speech. The term disinformation is a Russian term, I can't pronounce it in Russian, but it started in the 1920s when V.I. Lenin appointed Felix Dzerzhinsky the first director of the Cheka, because they needed to engage in some asymmetric warfare. They needed to fight the counter-revolutionaries to the Russian Revolution, and they came up with all of these psychological warfare tactics that really founded the disinformation era. The other strand of this is that on December 15, 1953, the tobacco executives in the United States met at the Plaza Hotel to decide what to do about a forthcoming study which was going to show that there was a causal link between smoking and lung cancer. And of course, they couldn't have that. They were worried about this. So they hired a public relations expert to come in and tell them what to do. And he said, fight the science. And he didn't mean do better science. He meant fight it through public relations, you know, take out full-page ads, start to really hit that idea that this link hasn't been proven. Well, of course, given the problem of induction, nothing has ever been proven. I mean, it's just the nature of science. It's about probability; there's always some uncertainty, you can't have proof. But they rode that wave of uncertainty for the next 40 years. Those are the roots, and the problem is this. I think that politicians paid great attention to the way that science deniers used disinformation for 70 years, starting from that tobacco strategy, which, as Naomi Oreskes and Erik Conway argue in their book Merchants of Doubt, really paved the way. It's the same strategy that the climate deniers use. It's the same strategy that the anti-vaxxers use. And so we, who are educated people, understand that's ridiculous. But you've got to realize that it was wildly successful.
That campaign of science denial for 70 years has been wildly successful and they've used it for everything that they want. And I think that politicians looked at that and said, I want to get me some of that. I want to deny the reality that I don't like. I don't like it for the purpose of my campaign that the crime rate is going down. I want it to be going up. So I'll talk about it going up. I don't like it that immigration is not as big of a problem as it could be. So I'll make it sound like immigration is a much worse problem. Politicians began to discover the usefulness of denying the actual consensus narrative and providing an alternative narrative. That's how we really arrived in the post-truth era, because those techniques of science denial and disinformation found their way into politics. And it culminated, I think, the thing that woke me up was January 6th. I think there's a straight line between what happened in that meeting at the Plaza Hotel in the 1950s and January 6, 2021. What was Trump doing? He was radicalizing the troops with his alternative narrative, which was based on a lie that served him but didn't serve them. That is just textbook what the science deniers were doing.

Sander van der Linden:

You mentioned something important about the uncertainty of science that's sometimes weaponized by bad actors. You talked about the politicization of science, this idea of alternative facts. Sometimes I think people can get confused about this, and I was wondering if you could explain the difference between postmodernism and post-truth. Is one a gateway to the other? So the idea that, even amongst philosophers and scientists, there are multiple ways of knowing things. Is that a slippery slope to post-truth, or should we see that as something entirely different?

Lee McIntyre:

In my book Post-Truth, I've got a whole chapter on postmodernism, in which I claim that the idea that there is no truth, or that the truth cannot be known, or that there are many truths, or that all truth is relative, or that all truth is political, that you can only understand truth within a political context. Those are dangerous notions. Now, the people who came up with them were doing this in the context of literary criticism; they weren't meaning to hurt anyone, maybe. But those tools got picked up and became very useful in something called the science wars of the 1990s, where all of a sudden people were claiming, well, you know, if there's no truth, then same thing for scientists. And anything that a scientist says, they're not objective. They're just making a claim about reality that serves their own power. Well, that's ridiculous. And most philosophers of science recognize that it's ridiculous and fought back against it. But boy, did that turn out to be a very useful narrative for people in politics who wanted to say, well, there's no truth, there's only narrative. So now you have right-wing postmodernists. It's odd because the literary critics were from the left. And I think that they invented this really powerful tool and left it on the battlefield. And one day, the right wing came along and said, wow, look at this flamethrower. This is really useful. We can use this. And I don't think that Trump is reading Derrida and Foucault in his spare time. I don't think he probably reads at all. But some people around him are right-wing postmodernists. And so I think that that has been one influence. I don't want to place too much blame on the postmodernists who came up with it. But the thing that upsets me is this: philosophers right now who understand the importance of the notion of truth, and will argue like cats and dogs over what truth is, whether the correspondence theory, the coherence theory, or the pragmatic theory is correct, they think truth is important.
So we need to say that. And I'm a public philosopher, and I'm not afraid to say that. Truth is important. And there is such a thing as reality, and there is such a thing as truth, and it may be difficult to know. Go back to Plato and his cave. But that doesn't mean that there is no truth, there's only narrative. And the reason that's the hill I'm willing to die on is this: to make the claim that there's no such thing as truth, or that truth can't be known, is precisely what the authoritarian wants. There's a wonderful quotation I put on page three of my book on disinformation, from Hannah Arendt, who says the ideal subject of totalitarian rule is not the convinced Nazi or the convinced communist, but people for whom the distinction between fact and fiction, true and false, no longer exists. That's the danger, right? It's the danger that if people give up on truth, if they're cynical, if they think there is no such thing as truth, or you've got your truth, I've got my truth, that is the path to submission to authoritarian rule.

Alan Jagolinzer:

In my field, we are cynical, and we do vet information, and to me, this notion of post-truth and disinformation is inherently exploitative. Do you have any sense of why an audience bites into this and goes along with it when it seems to be self-harmful?

Lee McIntyre:

Which audience bites into it? You mean the people who are fighting disinformation or the people who are disinformed?

Alan Jagolinzer:

I mean, in order for an authoritarian, as you noted in your last discussion, there have to be people who basically go along with the notion that the bogus information is in fact the correct information. They don't vet it, they just take it as given. And I'm just curious if you have a sense for why that is.

Lee McIntyre:

They're opportunists. I mean, they don't care about the truth. They're not acting in good faith. They're not like scientists or philosophers who are really trying to discover the truth. They're just saying whatever gets them from one moment to the next or gets them to their next goal. I mean, that's the one really important thing about disinformation, I think. First, if the disinformer can get you to believe a falsehood, that's all to the good. That's what they want. They tell you that something false is true, and if you believe it, you become part of their army. You may not, as a believer, get anything out of it. In some sense, the believers are victimized because they believe a falsehood. But the person who invented the disinformation, they get something out of it. But I think there's a secondary goal to disinformation, and that's to polarize us. That's to polarize us around a factual issue to create team building, so that what we end up with is not just somebody who believes one falsehood but somebody for whom the path to truth is poisoned, because they've been radicalized by this demagogue who's using disinformation to erode their trust in the truth-tellers. That's the really insidious part of disinformation. It doesn't just wreck one fact or one truth. It gets you to distrust the people on the other side, to see them as your enemy, even to hate them. And that's what we now find so much of. And then the third goal is the one we just talked about, where you're so cynical. And this is another insidious part. Even if you don't believe that the falsehood is true, you've still maybe been polarized. Maybe you're cynical. Maybe you think, oh, what's the use? I can't fact-check everything. Or worse, it's not even worth talking to the people on the other side. You can get so polarized by disinformation that, even though you don't think it's true, you think, well, those other people are in a cult. Why should I even bother with them?
And boy, if you do that, then they're really down the rabbit hole and gone.

Ceejay Hayes:

You're touching on a very important point that speaks to the underlying reality of our polarized society, which is that it breaks consensus on where we can identify truth. And I think that's a huge obstacle in countering disinformation that I would like to get to in a couple of questions. But one thing I want to ask is this: I'm going to assume that people who are thinking about disinformation casually, and sort of locating it from 2016 to now, have an impression that disinformation originates on the political fringes and then moves inward toward the mainstream. But if I think historically, and bring in my sense of what disinformation can be, I personally identify something different. I think about the Tuskegee study, where Black men were not given the information they needed to provide informed consent for participation in the study over 40 years. I think about Ronald Reagan and his 1976 presidential campaign, where he invokes the story of Linda Taylor, this woman who admittedly defrauded the government through government welfare programs, but he wildly exaggerated the figure into the hundreds of thousands. I think about Colin Powell misleading the UN Security Council to justify invading Iraq. I think of these scenarios. These are not the fringes. These are institutions and people within the mainstream pursuing their own goals. So what would you say about where disinformation lives? And maybe we could only talk about U.S. democracy, but where it lives within U.S. democracy.

Lee McIntyre:

Here's the hard part. Sometimes distrust is legitimate. If you were lied to by the medical community for 40 years, you might be a little shy about signing up for a study or taking that vaccine. I mean, sometimes that is the actual reality and lived experience that people have, that they shouldn't trust the scientists or the medical professionals. And what do you do about that? That's a very difficult thing to figure out, how to overcome that. Political disinformation, I think, originates from someone who wants something. Sometimes it's money, sometimes it's power, usually it's power. And they don't like reality as it is, so they pretend that it's otherwise. But it's not simply to convince you, it's to assert their power. Now, maybe in some ways this isn't too far from Foucault, so pardon me for that, but the idea here is this: when you're sharing disinformation, and I think of Trump here saying that his inauguration was bigger than Obama's, did he really expect anybody to believe that? Maybe not, but he did it anyway. Why? Because he was saying, I'm so powerful that I can say something that's untrue and there's nothing you can do about it. And guess what? That's your new reality. It's a kind of gaslighting, but it's also a kind of bending reality to your will. And people of good faith understand that we don't always have consensus around factual issues, that reasonable people can disagree, and the disinformers exploit that. Because a disinformer, by definition, knows that what they're saying isn't true, but they're trying to get you to believe it. Why? Because they want something, and by getting you to believe it, they can get that thing. Now, that is such a radical misuse of the idea that truth is hard to know, that objectivity is not perfect, that there's not 100% consensus around any scientific issue. You know, we all understand that. But it's a misuse because they're not acting in good faith.
It is a reasonable thing, I think, for people to say, but look, there's no consensus on what's objectively true in some areas. So how do we know that the denier isn't right? You do science. That's how you find out. Scientists understand that there's not 100% consensus around things. Scientists understand that objective truth is hard to come by. But their response is not, well, hell, let's kick in the door and let the flat-earthers in. It's let's see what the evidence is. Let's test it. Let's be people of good faith who understand that no matter how good our theory is, it could be overthrown by future evidence. So we have to be humble. It's an idea called fallibilism in philosophy. I think scientists need to lean into that, to understand that uncertainty is not anything to be ashamed of. That's how scientists learn. I wrote an earlier book called The Scientific Attitude, in which I argue that, in fact, that's the defining characteristic of science: the ability to say, I care about evidence so much that I'm willing to change my mind on the basis of new evidence. That is, I think, the greatest invention of the human mind, that attitude, that ability to do that. Disinformers don't do that. Liars don't do that. They're not people of good faith, and they're just exploiting the rest of us. And so I think the antidote to post-truth is the scientific attitude. The antidote to disinformation is people of good faith and knowledge saying, no, that's wrong. Not here's 100 percent proven absolute truth, because we don't know that. But here's what science has found about vaccines. Here's what science has found about climate change.

Ceejay Hayes:

Your work locates the origins of our current disinformation crisis in science deniers and denialism. Can you trace that journey for us, but also talk about how politicians are attacking scientists in this moment?

Lee McIntyre:

I think that the science denial and the political denial and the reality denial are now one thing, and it's a feedback loop, where you now find people like Trump, for political purposes, denying scientific truths about vaccines, et cetera. So it's the same thing, and the umbrella term is reality denial, but science denial led the way there, okay? How do I locate it? I mean, I guess it was that first moment when people figured out that strategic denial worked. Well, look, we all understand from the time we're children, the first time we ever tell a lie: damn, that was easy. I didn't get caught. I lied and they believed me and I got what I wanted. That was awesome, right? As individual human beings, we understand that, and people lie as individuals for different reasons. Sometimes it's that we don't want to hurt somebody else's feelings. It's not always nefarious intent. Disinformation is not like that. Disinformation is a strategic, organized campaign. It's a type of information warfare, is what it really is, meant to get an army of people to believe a falsehood. And why do they want them to believe that falsehood? It's because it serves them. I'll give you an example. Everybody knows the ridiculous, insane idea that there are tracking microchips in the COVID vaccines. Almost no one knows where that came from. That came from Russian intelligence. Russian intelligence officers invented that and pumped it out in April 2020, one month into the pandemic, before the Pfizer and Moderna vaccines were even a glimmer. They published an article in an English-language propaganda arm of the SVR, which, you know, used to be the KGB, which said any future Western vaccines for COVID will have tracking microchips in them, courtesy of Bill Gates, who holds patent 666 on this technology. Then the following month, 28% of the American public believed it. Now, why did Russia do that? There were two reasons. One was money.
The Pfizer and Moderna vaccines had not been invented yet, and the Russians had a vaccine called Sputnik. And if they could get Sputnik into the Asian and African markets, they could have made billions of dollars, which they really desperately needed. But the other reason is power, is pride. Sputnik? Why did they name it Sputnik? And it destabilized American science. The Russians had been undermining American science for 20 years. There's an article in the New York Times in 2019 called Putin's Long War Against American Science. So even Putin started with science denial against Western science, but he figured out, he's a KGB officer who formerly served in East Germany, he figured out, oh wait, we can do this too: memes in the 2016 election, et cetera, et cetera. Like I said, it's all one thing. It's people of bad faith and selfishness who don't care if they kill thousands of people, as long as they get what they want.

Ceejay Hayes:

It just kind of feels like confirmation bias is informing how we understand truth. And so how do you solve for that when people are so emotionally and ideologically tied and attached to what they believe to be true?

Lee McIntyre:

We've all got cognitive biases. There are about a hundred of them. The Wikipedia page on this is actually pretty good if you want to take a look at all the weird ones. Confirmation bias is the big one. That's the one that everybody knows. How do we solve for it? I think that the best way to fight what we're up against is through enlightenment. People who have already gone down the rabbit hole, it's very hard to get to them. I wrote an earlier book called How to Talk to a Science Denier, which is based on some empirical research by Cornelia Betsch and Philipp Schmid, that shows that sometimes you can get to people, even once they've gone down the rabbit hole. Pre-bunking is better, all props to the brilliant cognitive scientists we've got on this call, but is there a way to help people who have already gone down the rabbit hole? Yes. But better is to keep them from going down the rabbit hole by not amplifying that disinformation. And one way that you pre-bunk is to expose the plot, to let people know in advance that it is a plot. And how many people who hear disinformation know it's disinformation? A lot of them think, oh, it's just misinformation, because that's what the newscasters have told them. But I think the best way to fight disinformation is to do, I guess, a kind of pre-bunking, to say to people, look, there are nefarious actors out there who want you to believe things that are untrue because it serves their interest. And if you can expose what those interests are, you can show who the people are, you can name names, you can show what they get out of it, whether it's power or whether it's money, whether it serves their ideology or something like that. If you can show why someone would be lying, then I think that that can go a long way toward fighting disinformation. At the end of my new book on disinformation, I have ten practical steps that any citizen can use to fight back.
Rule one is you cannot win an information war unless you know you're in one. Because one cognitive bias we have is that we're too damn trusting. The primacy bias: the first thing we hear, we think is true. And I mean, that's been wired into us since caveman times. I'm sure that great old philosopher Mark Twain said it best: it's easier to fool somebody than to convince them that they've been fooled. People hear something and they're fooled, and then you come along and say, don't you realize you were fooled? And they go, no, no, and they'll fight you to the end. Their ego's on the line. They don't want to admit that they were wrong. But if you can get to them first and say, this next guy who's going to talk to you, he's a shill. He's a political operative. He's a politician. He's going to be lying to you. Be really careful. He might even tell you a conspiracy theory. Here's why that's dangerous. We have tools in our toolkit to fight back against disinformation, and there are many books out now, one of them called Foolproof, that tell us how to do this. I think that we, all of us, need to do this. The government is not going to save us. The social media companies are not going to save us. Individual citizens need to figure out that we are in an information war and we are all warriors in this. We all have cognitive biases. That does not mean that we cannot fight back. It means that we need to enlighten ourselves. And once we become enlightened, we can push back, and you push back with the truth. Another bias is the repetition effect, right? We believe things the more we hear them. So why aren't we using the repetition effect for truth? Why do we let the liar say it a million times, and then we say the truth once, and they don't listen, and then we just say, ah, no use talking to him, and we walk away? Sometimes I'll answer hate mail by saying, I'm so sorry you were duped. They don't like it, but it's true, and I'm pushing back in just the way they need to hear.

Alan Jagolinzer:

I wholeheartedly agree with you with respect to that. My question is, how do we collectively push back against the threats? There are threats of violence in many of these cases, and there's also the threat of being called into congressional testimony by gaslighting politicians.

Lee McIntyre:

It's the latest thing. And I don't want to say I don't understand it. What I don't understand is why journalists aren't taking it more seriously. There is right now coordinated pushback against disinformation researchers. We see it from our fellow academics. We see it from politicians. We see it from some ideological journalists who are just pretending that, oh, this is just all alarmism, you know, what's the big worry? They'll cherry-pick out some statistic about what small percentage disinformation is of the information stream, ignoring the question of how many people saw it and how much it influenced them, right? Zuckerberg is always saying, you know, oh, well, look at how many fake accounts we took down. But he doesn't quote the statistic of how many people saw it before he took it down. In 2016, he said, oh, it's crazy to think that Russian memes influenced the 2016 election. The next year, he admitted that 126 million people saw those fake posts. So, I mean, I think we call that out, too. What Jim Jordan is doing with his weaponization of government committee in the U.S. Congress is itself disinformation. I wrote an op-ed that I never got published, because the journalists wouldn't take it seriously, called the disinformation about disinformation, because all these people who are claiming that disinformation researchers are biased know they're the ones who are biased. That's a disinformation tactic they're using, right? And who would defend the right of disinformers to be in the information stream but somebody who's benefiting from disinformation? So they can cry all they want about how disinformation research is this terrible thing, that it's censorship. It's not censorship. Refusing to amplify someone else's lie is not censorship. Facebook and Twitter could overnight decide to stop amplifying lies, but they benefit from amplifying them, and maybe that's why they don't. But I think we need to call that out.
I think we need to say, and what I wanted to say in my op-ed, but maybe I'll just say it here, is that Jim Jordan is defending the right of disinformation to exist because he's a disinformer. I heard Gillum, was it, who ran for governor in Florida, quote, a hurt dog is going to holler. That's why you hear Jim Jordan and others hollering about disinformation research and how awful it is, because they're hurt. And so what that tells me is: score. We're doing the right thing. Push back, expose the plot. What does Jim Jordan have to benefit in attacking disinformation researchers? Maybe he'll get them to shut up. Maybe he'll intimidate them into not doing their work. Maybe he'll threaten them; maybe they can't afford a lawyer, maybe they get in trouble at their university. That's the trouble. I mean, one of the hallmarks of disinformation is to hide the fact that it's disinformation, and that's what they're doing. So I think that's kind of next-level chess, isn't it? And they're subtle. One of the most frightening books I've read recently is this free book called The Handbook of Russian Information Warfare. This is a NATO training manual. This is for NATO soldiers and commanders about how the Russians fight information war. And the important point about this book, and it's free, is that you can get it online for free as a PDF. You can also write to NATO and get a free copy like I did. What's so frightening about it is that disinformation used to be government to government. Now governments use it against their own citizens, and governments use it against the citizens of other countries. So we're all warriors in this, and we all need to read this type of manual to learn how to fight back. And I wrote my book to be a short, pocket-size citizen's guide, because my book also, I mean, I'm not in NATO, but it's a training manual. It's a training manual for how individual citizens can fight back.
And the first thing you have to do is realize you're in an information war and that you're not helpless, because that's what the disinformers want you to think. Oh, I'm helpless. I can't do anything. I'll just give up. The 2024 election in the United States is coming up. It scares the hell out of me because I watch the polls. You know, I watch the horse-race polls. Oh, Trump and Biden are even, okay, but that's not the swing states. Then you look at the swing states, and oh my god. People need to be woken up, they need to be enlightened, the plot needs to be exposed, and I think it's perfectly okay to say these people are lying. And the media is just starting. They're just starting to wake up to this. There's one newscaster that I follow religiously because she always does a good job. I don't know if you get MSNBC in England, but Nicolle Wallace, 4 p.m. here in Boston time on MSNBC, she always gets it right between mis- and disinformation. And she educates people on how to fight back.

Ceejay Hayes:

It's quite scary when you have any sort of consciousness about how deeply polarized we are and how embedded disinformation is in how different groups understand the world around them. What's giving you hope? What can we look to? An organization, a campaign, maybe a different country's context. What's giving you hope that there is a positive end to this?

Lee McIntyre:

I think there are more people who care about truth than don't. And I think that people hate to be fooled. They hate to be fooled. So the more we can do to expose what disinformation is, that it's a strategic denialist campaign, cynically put forward by people who want something and are willing to exploit other people and even to see them die for their own interest. I think if we could get that message out, that could turn the tide. That was the motivation. I mean, I'm a scholar, I'm a philosopher of science, what am I doing writing this manifesto about fighting disinformation? And I'm doing it because I'm trying to wake people up. And what I tell people is, buy the book, but leave it on the bus, leave it in your Uber, pass it to someone else. There's barely enough time to wake enough people up to do this. And there are nine other tips in the book that people can read before they pass it on to someone else. But I mean, we can get organized. In the United States, we organized and fought back against Trump after he was elected. All of that resistance, the Women's March, the March for Science, we can do that again. We need to do it before the next election. And we need to do it within the context of understanding the lie.

Ceejay Hayes:

Lee McIntyre, thank you so much.

Lee McIntyre:

Thank you, guys. That was a lot of fun. And by the way, one thing that gives me hope is the work that you're doing. You guys are the tip of the spear. You're really doing very important work. It's my honor to be on your podcast.

Ceejay Hayes:

Thank you so much. One of the elements of countering disinformation that deserves attention is the emotional and psychological work of processing the fact that the information one used to construct one's worldview is false or improperly contextualized. As Lee mentioned before, it is much harder to shift one's understanding of the truth once one has decided what is true. One of the obstacles to that shift is ego. I have certainly resisted confrontations to my beliefs, because to concede that my worldview, inextricably linked to how I move through life, is built on a fallacy and/or can present a threat to others felt like an assault on my character. The emotional and psychological obstacles to confronting any personally held worldview are one of the reasons why disinformation and misinformation are such powerful tools for sowing societal discord. With opportunists at the individual and institutional level mobilizing myths and disinformation, it is crucial that we all adopt a practice of critical analysis and scrutiny of the information that comes our way. You are not a bad person when you recognize that a way of thinking that you've been socialized, educated, or acculturated into produces harm. Nor are you any less of a person when you realize that part of your worldview is based on a falsehood or an improperly contextualized piece of information. There is a responsibility, however, once such realizations are made. Once you make these assessments of your worldview, you must then make the conscious choice either to meaningfully shift your thinking, which is a process, not a step, or to continue unchanged. Worldviews are allowed to evolve, and you have no obligation, cultural, familial, personal, or otherwise, to adhere to a perspective that is unfit for purpose. Ask questions and be critical. Explore your perspectives and understand why you hold them. And when challenged, resist your urge to be defensive and engage with your opposition.
Life is a process of learning. The information we internalize can water us or poison us. Thanks again to Lee McIntyre for his reflections on the origins and purposes of disinformation. His book, On Disinformation, is available wherever you buy books. Do what you can to support your local bookshop. I've also included links to some of the other pieces mentioned throughout. And thanks as well to Drs. Alan Jagolinzer and Sander van der Linden. You may recall a reference made by Lee to a brilliant cognitive scientist, that would be Sander. And his book, Foolproof, is also available wherever you buy books. Shout out to our editor, Jac Boothe of Neon Siren Studios. And thanks to you, the listener, for taking the time out of your day to engage with this conversation. I hope you're left with new insights. Please do share the CounterPol Pod with your loved ones and give us a review. Until next time.
