Ben Shapiro’s Guide To Avoid Getting Taken In By Conspiracy Theories
www.dailywire.com


There’s a difference between a conspiracy and a conspiracy theory. A conspiracy is something that actually exists in the real world. We have an entire statute devoted to uncovering and prosecuting conspiracies: the RICO Act, a law directly tied to the idea of conspiracy connected to crime. You can tell when a conspiracy is happening because of the evidence. For example, when you say that there was a conspiracy by Anthony Fauci and his buddies to silence people who did not share his viewpoint on the Wuhan virus, that is true because there are emails showing that Fauci did that exact thing. When you say that there was a conspiracy of silence around Joe Biden’s health condition, that is obviously true, not just because you watched the media basically cover it up, but because we know from contemporaneous reporting that everyone around Joe Biden knew exactly what was happening. The evidence makes it not a conspiracy theory, but a conspiracy.

Conspiracy theories are something different. And they’re not just creeping in at the fringe anymore; they’re staging a full-blown takeover of the public square. You’ve seen it pretty much everywhere. You know we’ve reached peak absurdity when even Alex Jones, who has done a lot of this himself in the past, is suddenly demanding evidence from certain malefactors.

Conspiracy theorizing is not skepticism. Skepticism is questioning something and then looking for the evidence. Conspiracy theorizing is questioning something and then throwing out a theory without looking for any evidence to back it. It is usually intellectual cowardice dressed up as critical thinking. If we don’t dismantle this rhetorical playbook, we end up with a society where facts are optional, where people who actually know things are mocked, and where every basement-dwelling keyboard warrior thinks that fanfiction trumps reality.
There are people in basements on keyboards who are doing good, hard work, and that work is important, provided they’re basing it on the evidence. But that’s not happening right now. In order to have a functioning polity, you have to have a common set of facts. Conspiracy theories are directly tied to a lack of facts.

Let’s take a look at how to tell the difference between a conspiracy and a conspiracy theory, and what the hallmarks are of somebody who’s pitching you a conspiracy theory. This is what I call the “QED” of conspiracy thinking. It’s a simple framework with three pillars:

Q is for Fake Questions
E is for Fake Evidence
D is for Fake Defenses

This framework is designed to expose the intellectual dishonesty at the core of conspiracy thinking. And once you see it, it’s difficult to unsee, because you’ll notice these tactics almost everywhere people are retailing particular narratives. We’ll start with the first pillar of the conspiracy theory playbook: fake questions.

Q: Fake Questions

Fake questions are the foundation of conspiracy theories. These questions are not genuine inquiries that seek truth or evidence or facts. They’re rhetorical devices designed to create doubt, even where there’s no evidence that doubt should exist. The most obvious example?

Tactic #1: “Just asking questions”

This is when somebody poses a loaded question while pretending to be a neutral observer. They’ll say, “I’m just asking questions. Why won’t the government discuss the second shooter on the grassy knoll in the JFK assassination? Why can’t we even talk about the JFK assassination?” The question presupposes two premises: first, that someone is trying to hide the truth from you, and second, that they’re trying to “silence” you. Many of the people who use this technique are speaking to literally millions of people. “Why can’t we talk about the JFK assassination? Hmmm??” Guess what?
I notice you guys are talking about the JFK assassination in front of millions of people. Nobody’s actually silencing you; they’re simply asking for the evidence of your contention, and you’re not providing it. “Just asking questions” is rhetorical sleight of hand. And here is the problem: the tactic puts the burden of proof on the wrong side. In logical discourse, if I make a contention, I have to provide evidence for my claim. I can’t just make a claim and then ask somebody else to disprove it. I can’t just throw out a claim like “The aliens landed at Area 51! Show me the evidence they didn’t land at Area 51!” No. The burden of proof is on me to show they landed at Area 51. It’s my job to present the evidence and then challenge the other side to refute it. That’s how an actual conversation works. But that’s not what’s being done here.

Tactic #2: Motive over evidence

This is when the person retailing a conspiracy theory suggests that because someone benefits, or could benefit, from an outcome, they are responsible for the outcome. They show no actual connection between the person and the outcome, but if you benefit, you must have done it. Example: Lyndon Johnson became president after JFK died, therefore he assassinated JFK. That is not a logical piece of reasoning; it wouldn’t pass muster in a freshman logic class. Motive may suggest where to look for evidence, but it is not evidence itself. By the conspiracy theorist’s logic, if you benefit from a sale at your local grocery store, you must have designed the sale. Benefiting from something does not mean you did it. If your aunt dies and leaves you $1 million, it doesn’t mean you murdered your aunt.

Tactic #3: The appeal to ignorance

This one is particularly insidious, and you see it all the time, particularly on the interwebs.
It is arguing that something must be true because it hasn’t been definitively disproven. But here’s the thing: you can very rarely definitively disprove anything. I can claim, as Richard Dawkins suggests, that there’s a spaghetti monster controlling all of human activity: “There’s a spaghetti monster controlling all human activity. Disprove it.” There’s no way to disprove it. And then you say, “Well, I’m not even claiming there’s a spaghetti monster doing this. I’m just saying it might be happening. Can you disprove that?” This fundamentally misunderstands how knowledge works. Absence of evidence is not evidence of absence, but neither is it evidence of presence. The burden of proof lies with the person making the extraordinary claim.

Tactic #4: “You don’t know the whole story”

This is a classic. This is when somebody retailing a conspiracy theory implies that there’s hidden knowledge only the initiates can access. “They’re not telling you everything. Do your own research.” Of course, this is effective because sometimes people aren’t telling you everything, and you should do your own research. But nobody knows everything about complex events, and the conspiracy theorist exploits that gap to insert a preferred narrative. “Do your own research” is a way of saying, “You don’t know as much about this as the guy who wrote several books of credible evidence on the subject, but you shouldn’t trust him. You should do your own research.” When people say this, what they typically mean is, “You shouldn’t actually do your own research; you should listen to me, because if you did your own research, you might come to a different conclusion. Instead, just listen to the theory I’m retailing, because I have done the research.” That is rarely true.

That’s how you structure a conspiracy theory’s questions.

E: Fake Evidence
Fake evidence is where somebody retailing a conspiracy theory attempts to build a case using what appears to be evidence but doesn’t stand up to scrutiny at all.

Tactic #1: Cherry-picking

You’ll see somebody retailing a theory seize on a single data point while ignoring a mountain of evidence that contradicts the theory. They will say, “Did you know that Trotsky was a Jew? That means the Sovietization of Russia was a Jewish plot.” Lenin wasn’t a Jew; Stalin wasn’t a Jew. Tons of people in the Soviet infrastructure were not Jews, and even those who claimed Jewish background were not observant Jews in any way. But you’re not going to hear about any of that, because they are cherry-picking. This very often happens when someone is attempting to credit a group with outsized power. They’ll say, “Have you seen how many Jews there are in Hollywood?” It’s true, there are a lot of Jews in Hollywood. There are also a lot of non-Jews in Hollywood, and very few religious Jews in Hollywood. You can do pretty much anything by cherry-picking: you just pick one thing and ignore all the rest.

Tactic #2: Secret sources

You hear this all the time: “I have a friend who works for the government, a credible person. This person told me, and I believe that person.” Such claims are unfalsifiable: if you don’t reveal the source, how am I supposed to falsify the claim? In actual journalism and academic research, you have to have multiple sources to confirm something. This is why people have become more and more skeptical of anonymous sources, even from legacy media, because it’s very difficult to fact-check a claim resting on anonymous sources. But it’s a great way of foisting your claim onto a source you won’t even name or describe, because if you describe them, “it might let the cat out of the bag as to who they are.
You wouldn’t want them to get in trouble.”

Tactic #3: False cause

This is rampant: people retailing conspiracy theories will take two events that are merely correlated and treat them as causally connected. They will say, “The government conducted a training exercise near New York, and then 9/11 happened; thus they are connected.” This is the basic “post hoc ergo propter hoc” fallacy: because one thing followed another, the first thing caused the second. “I wore my lucky socks and my team won; therefore my socks caused my team to win.” That is not an actual argument. Correlation has to be proved causative; correlation alone is not enough.

Tactic #4: The appeal to authority

One of the great problems in our society right now is that the experts have discredited themselves on so many topics. So many people who claimed to be experts turned out to have totally blown it. This has opened the door to a bunch of people who will now term anyone an expert on a topic. The appeal to authority is usually a matter of finding an authority with a PhD in an unrelated field and then saying that this person is an expert on this particular thing. Claiming expertise is quite easy in today’s world: you just say you’ve read a lot of books, and that makes you an expert on the topic. If we can’t appeal to expertise, evidence becomes more important, not less important. If you don’t trust the experts, then you should ask for the evidence across the board: “Don’t cite your authority. Just bring the evidence.” But people don’t bring the evidence. They cite some expert saying something out of the box, and you don’t know enough to question them. That is a cheap and easy way of avoiding responsibility for the theory you are retailing.
Tactic #5: Overestimating coordination

This is usually where conspiracy theories fall apart, because they posit a vast and complex secret conspiracy that would require hundreds or thousands of people to be involved. Most conspiracy theories have to get more and more complex. Let’s say you believe the moon landing was faked. That would require not a few dozen people to be in on it, but thousands, probably tens of thousands, all maintaining a perfect lie for literally decades. Have you ever tried to organize a surprise party for a friend? With 20 people at the party, the chances are really good the friend is going to find out. Multiply that complexity by a thousand, with people you don’t know who are part of a government organization. You think nothing is going to leak? Ever? People talk; they make mistakes; they have crises of conscience. Conspiracies that succeed are usually relatively small, and they become open while they’re still relatively small and gaining power. The original Soviets, for example, began as a conspiracy that started off secret but pretty quickly became public about its ambitions, and then expanded them. The idea that you can have a secret, massive conspiracy involving tens of thousands of people across decades strains rationality.

Tactic #6: “Us vs. Them” thinking

This is not even an appeal to evidence. This is: if you’re in on it, if you’re in the know, then you’re us. And if you’re a credulous dupe who doesn’t believe us, then you’re them. You’re a sheeple. This is a great way of alienating everyone who disagrees with you and calls for evidence. It’s very emotionally satisfying, because you feel like you’re in the know and have gained the secrets of the universe, but it’s totally intellectually stunting.
The truth is that reality is quite complex. Most events result from a mix of intentional actions, unintended consequences, systemic factors, and lots of random chance. That’s life; you know that from your own life. Reducing that complexity to heroes and villains makes for great movies, but it makes for really, really bad analysis.

Tactic #7: Confirmation bias

This is where you interpret information in a way that confirms your preexisting beliefs. If you believe in a conspiracy, everything becomes evidence for the conspiracy, and any contradictory evidence you simply dismiss as part of the cover-up. Let’s say you believe that JFK was assassinated by the Mafia, and it turns out that Lee Harvey Oswald didn’t have any ties to the Mafia. You say, “That’s just because we haven’t looked hard enough. And anybody who makes that claim is probably in on it.”

Tactic #8: Apophenia

Apophenia is seeing meaningful connections or patterns in unrelated or random things. This happens all the time; human brains look for patterns. You see the media do this: “These three unrelated politicians all visited the same city in different years. It must be a conspiracy.” No, it’s just a coincidence, which is a real thing that exists in our universe and occurs in your life pretty much every day.

D: Fake Defenses

This is what happens when somebody asks for evidence over and over: “What’s your evidence? Show me the evidence.” There are a bunch of tactics conspiracy theorists use to deflect, starting with non-falsifiability.

Tactic #1: Non-falsifiability

When it comes to claims about the world, you should have a falsifiable theory. But conspiracy thinking is non-falsifiable: you structure a claim so it can never be proven false. A good theory makes predictions that could be disproven. Conspiracy theories are designed to be immune to evidence; they refuse to offer anything falsifiable.

Tactic #2: Moving the goalposts

When it comes to global warming, there are a couple of claims you can make that are plausible and falsifiable.
One: Is the world getting warmer over time? Two: Does that correlate with human activity with regard to carbon? The answer to both questions is yes. “What is the level of causation?” is still an open question. The conspiracy theory version is when they said, “The world is going to end in 2012,” and it doesn’t end in 2012, so they say, “I really meant 2024,” and it doesn’t end then either. “What I really meant was 2036.” In the real world, if you have a predictive failure, you have to revise your theory; in conspiracy thinking, you simply revise the prediction.

Tactic #3: Circular reasoning

They will say, “We know you’re corrupt because you won’t report on this conspiracy theory. And we know the conspiracy is real because you won’t report on it.” It’s a closed logical loop; no external information is possible. You’re lying because the conspiracy theory is true; the conspiracy theory is true because you’re lying. Around the circle you go, and at no point does evidence ever enter the equation.

Tactic #4: The Kafka trap

This one is named after Franz Kafka’s novel “The Trial,” in which denial of guilt is taken as evidence of guilt. This is rampant on X. Someone says, “I’m not involved in a conspiracy, nor do I think that that conspiracy — ” “Probably because you’re involved in the conspiracy! That’s why.” It’s a character attack, and it makes the accusation unfalsifiable, because denial, silence, and confirmation are all taken as confirmation. If somebody accuses you of complicity in putting microchips in the blood of your enemies and you deny it: “You’re only denying it because it’s true!” If you confirm it, it’s true. If you stay silent, it’s because you won’t answer, which means it’s probably true.

Tactic #5: Information overload

This happens frequently when the theorist is challenged for evidence.
What you get is a bunch of unrelated gobbledygook facts stacked on top of each other super-fast. That is very difficult to combat, because it’s the equivalent of a terrorist rocket barrage: a bunch of $50 rockets are sent up, and each one requires a $50,000 Iron Dome interceptor to take down. And by the time you’ve done that, they’re already firing the next argumentative rocket. None of it has anything to do with the central argument, but it is incredibly time-consuming. And so people give up defending the truth because it’s so tiring.

Tactic #6: Weaponizing doubt

Conspiracy theorists excel at this: they find a minor error in the official account of an event and then say, “The entire thing is wrong.” When there’s a controversial issue, one should wait 48 hours, because it usually takes a while for the truth to be established. But what usually happens is that people jump to a conclusion, and that conclusion is then used to discredit the actual truth, because somebody made a mistake in the first report.

Tactic #7: False equivalence

This is a defense mechanism for a bad conspiracy theory where you say all sources are equally biased. “My evidence comes from some schlub in a YouTube video who doesn’t know anything about the topic, has no credentials, and hasn’t studied anything, but also, the legacy media lies.” Yes, the legacy media does lie. That does not mean all people are equally dishonest, or that all claims are equally verifiable or have equal veracity. You have to actually establish the evidence.

Tactic #8: “No true Scotsman”

This is when someone says something like, “No true conspiracy researcher would deny the moon landing was faked.” This allows the community to maintain ideological purity by excluding dissenters: that person is not actually a member of our community; they’re not pure enough.

Tactic #9: The deep play

This is really devious.
This is where every debunking becomes evidence of a deeper conspiracy. “The fact that the legacy media is so intent on focusing on this theory shows that they’re hiding something. The fact that they keep spending time on this thing I’m bringing up shows that they are part of it.”

Tactic #10: The motte-and-bailey

This is where somebody makes a totally implausible claim, such as, “The government is run by lizard people.” You respond, “That’s not true. The government is not run by lizard people.” They respond, “I’m just claiming that the government lies to us sometimes.” Well, yes, the government lies to us sometimes, but that does not justify the original claim.

To conclude: I am not saying that questioning established narratives is bad. We have to do it. Skepticism is healthy, but healthy skepticism is rooted in a request for evidence. There’s a world of difference between evidence-based skepticism and stringing together random events, stretching the truth beyond what it can bear, or engaging in simple speculation. These are not the same things. Skepticism leads to better understanding through a search for actual truth and evidence. If you’re not doing that, you’re just entering an intellectual rabbit hole from which pretty much nobody returns.

The QED framework I just outlined, fake questions, fake evidence, and fake defenses, is a great way to distinguish between legitimate inquiry and the “just asking questions” kind of conspiracy thinking. This is not about Left versus Right; you see this all across the political spectrum. The difference is how we approach it. We need to approach it with intellectual rigor, be willing to change our minds based on actual evidence, and hold our own side to the same standards.
The QED framework is a great way of telling who exactly is lying to you and who is not. If people keep insisting they don’t need to provide any evidence, if they’re “just asking questions,” or if they retail theories without any evidence to support them, use the QED framework. It will help you tell who actually cares about the truth, who does not, and who is making money off of you. The next time you hear somebody throw out a wild theory, ask yourself: Is this a serious question? Are they presenting real, credible, verifiable evidence? Are they open to real critiques that might actually correct the theory? If the answer to any of those is no, you might want to click somewhere else.