Tuesday, March 26, 2019

The Illusory Truth Effect: How Millions Were Duped By Russiagate


“Mueller Finds No Trump-Russia Conspiracy”, read the front-page headline of Sunday’s New York Times. Bit by bit, mainstream American consciousness is slowly coming to terms with the death of the thrilling conspiracy theory that the highest levels of the US government had been infiltrated by the Kremlin, and with the stark reality that the mass media and the Democratic Party spent the last two and a half years monopolizing public attention with a narrative which never had any underlying truth to it.

There are still holdouts, of course. Many people invested a tremendous amount of hope, credibility, and egoic currency in the belief that Robert Mueller was going to arrest high-ranking Trump administration officials and members of Trump’s own family, leading seedy characters to “flip” on the president in their own self-interest and thereby providing evidence that would lead to impeachment. Some insist that Attorney General William Barr is holding back key elements of the Mueller report, a claim which is premised on the absurd belief that Mueller would allow Barr to lie about the results of the investigation without speaking up publicly. Others are still holding out hope that other investigations by other legal authorities will turn up some Russian shenanigans that Mueller could not, ignoring Mueller’s sweeping subpoena powers and unrivaled investigative authority. But they’re coming around.

The question still remains, though: what the hell happened? How did a fact-free conspiracy theory come to gain so much traction among mainstream Americans? How were millions of people persuaded to invest hope in a narrative that anyone objectively analyzing the facts knew to be completely false?

The answer is that they were told that the Russiagate narrative was legitimate over and over again by politicians and mass media pundits, and, because of a peculiar phenomenon in the nature of human cognition, this repetition made it seem true.

The rather uncreatively named illusory truth effect describes the way people are more likely to believe something is true after hearing it said many times. This is because the feeling of familiarity we get when hearing something we’ve heard before closely resembles the feeling of knowing that something is true. When we hear a familiar idea, its familiarity provides us with something called cognitive ease, the relaxed, unlabored state we experience when our minds aren’t working hard at something. We also experience cognitive ease when we are presented with a statement that we know to be true.

We have a tendency to select for cognitive ease, which is why confirmation bias is a thing; believing ideas which don’t cause cognitive strain or dissonance gives us more cognitive ease than doing otherwise. Our evolutionary ancestors adapted to seek out cognitive ease so that they could put their attention into making quick decisions essential for survival, rather than painstakingly mulling over whether everything they believed was as true as they thought it was. This was great for not getting eaten by saber-toothed tigers in prehistoric times, but it’s not very helpful when navigating the twists and turns of a cognitively complex modern world. It’s also not helpful when you’re trying to cultivate truthful beliefs while surrounded by screens that are repeating the same bogus talking points over and over again.

I’m dealing with a perfect example of the perils of cognitive ease right now. Writing this essay has required me to move outside my familiar comfort zone of political commentary and read a bunch of studies and essays, think hard about new ideas, and then figure out how to convey them as clearly and concisely as possible without boring my audience. This movement away from cognitive ease has resulted in my checking Twitter a lot more often than I usually do, and seeking so much distraction that this essay will probably end up getting published about twelve hours later than I had intended. Having to read a bunch of scholars explaining the precise reasons why I’m acting like such an airhead hasn’t exactly helped my sense of cognitive ease any, either.

Science has been aware of the illusory truth effect since 1977, when a study found that subjects were more likely to evaluate a statement as true when it had been repeatedly presented to them over the course of a couple of weeks, even if they didn’t consciously remember having encountered that statement before. These findings have been replicated in numerous studies since, and new research in recent years has shown that the phenomenon is even more drastic than initially believed. A 2015 paper titled “Knowledge Does Not Protect Against Illusory Truth” found that the illusory truth effect is so strong that sheer repetition can change the answers test subjects give, even when they had been in possession of knowledge contradicting those answers beforehand. This study was done to test the assumption, unchallenged until then, that the illusory truth effect only comes into play when there is no stored knowledge of the subject at hand.

“Surprisingly, repetition increased statements’ perceived truth, regardless of whether stored knowledge could have been used to detect a contradiction,” the paper reads. “Reading a statement like ‘A sari is the name of the short pleated skirt worn by Scots’ increased participants’ later belief that it was true, even if they could correctly answer the question ‘What is the name of the short pleated skirt worn by Scots?’”

Stored knowledge tells pretty much everybody that the “short pleated skirt worn by Scots” is a kilt, not a sari, but simply repeating the contrary statement can convince them otherwise.

This explains why we all know people who are extraordinarily intelligent but still bought into the Russiagate narrative just as much as our less mentally apt friends and acquaintances. Their intelligence didn’t save them from this debunked conspiracy theory; it just made them more clever at finding ways of defending it. This is because the illusory truth effect largely bypasses the intellect, and even one’s own stored knowledge, because of the way we all reflexively select for cognitive ease.

Another study, titled “Incrimination through innuendo: Can media questions become public answers?”, found that subjects can be manipulated into believing an allegation simply by exposure to innuendo or incriminating questions in news media headlines. Questions like “What If Trump Has Been a Russian Asset Since 1987?”, printed by New York Magazine in July of last year.

What If Trump Has Been a Russian Asset Since 1987? https://t.co/1nYhadj0Wo via @intelligencer

 — @HamillHimself

You can understand, then, how a populace that is consuming repetitive assertions, innuendo, and incriminating questions on a daily basis through the screens they look at many times a day could be manipulated into believing that Robert Mueller would one day reveal evidence that would lead to the destruction of the Trump administration. The repetition leads to belief, the belief leads to trust, and before you know it people who are scared of the president are reading the Palmer Report every day and parking themselves in front of Rachel Maddow every night and letting everything they say slide right past their skepticism filters, marinating comfortably in a sedative of cognitive ease.

And that repetition has been no accident. CNN producer John Bonifield was caught on video nearly two years ago admitting that CNN president Jeff Zucker was personally instructing his staff to stay focused on Russia even in the midst of far more important breaking news stories.

“My boss, I shouldn’t say this, my boss yesterday we were having a discussion about this dental shoot and he goes and he was just like I want you to know what we are up against here,” Bonifield told an undercover associate of James O’Keefe’s Project Veritas. “And he goes, just to give you some context, President Trump pulled out of the climate accords and for a day and a half we covered the climate accords. And the CEO of CNN said in our internal meeting, he said good job everybody covering the climate accords, but we’re done with it, let’s get back to Russia.”

(And before you get on me about O’Keefe’s shady record, CNN said in a statement that the video was legitimate and disputed none of its content, saying only that it stands by Bonifield and that “Diversity of personal opinion is what makes CNN strong, we welcome it and embrace it.”)

Zucker, for his part, told the New York Times in an article published yesterday that he was “entirely comfortable” with CNN’s role in promoting the Russiagate conspiracy theory the way that it did.

“We are not investigators. We are journalists, and our role is to report the facts as we know them, which is exactly what we did,” Zucker said. “A sitting president’s own Justice Department investigated his campaign for collusion with a hostile nation. That’s not enormous because the media says so. That’s enormous because it’s unprecedented.”

CNN prez Jeff Zucker: "We are not investigators. We are journalists, and our role is to report the facts as we know them, which is exactly what we did." https://t.co/DiUjr7Nkbg

 — @brianstelter

“We are not investigators”? What the fuck kind of dumbass shit is that? So it’s not your job to investigate whether what you’re reporting is true or false? It’s not your job to investigate whether the anonymous sources you’re basing your reports on might be lying or not? It’s not your job to investigate whether or not you’d be committing journalistic malpractice with the multiple completely bullshit stories your outlet has been humiliated by in the last two years? It’s not your job to weigh the consequences of deliberately monopolizing public attention on a narrative which consists of nothing but confident-sounding assertions and innuendo?

“We are not investigators.” So? You’re not dentists or firefighters either, what’s your point? That has nothing to do with the mountains of journalistic malpractice you’ve been perpetrating by advancing this conspiracy theory, nor with the inexcusable brutalization you’ve been inflicting upon the American psyche with your deliberate nonstop repetition of bogus assertions, innuendo, and incriminating questions.

The science of modern propaganda has been in research and development for over a century. If you think about how many advances have been made in other military fields over the last hundred years, that gives you some idea of how sophisticated an understanding the social engineers must now have of the methods of mass manipulation of human psychology. We may be absolutely certain that there are people who’ve been working to drive the public narratives about western rivals like Russia, and that they are doing so with a far greater understanding of the concepts we’ve touched on in this essay than any of us have at our disposal.

The manipulators understand our psyches better than we understand them ourselves, and they’re getting more clever, not less. The only thing we can do to keep our heads while immersed in a society that is saturated with propaganda is to be as relentlessly honest as possible, with ourselves and with the world. We’ll never be able to out-manipulate the master manipulators, but we can be real with ourselves about whether or not we’re selecting for cognitive ease rather than thinking rigorously and clearly. We can be truthful with our friends, family, coworkers and social media followers wherever untruth seems to be taking hold. We can do our very best to shine the light of truth on the puppeteers wherever we spot them and ruin the whole goddamn show for everyone.

It may not seem like a lot, but truth is the one thing they can’t manipulate, whether it’s truth about them, truth about the world, or truthfulness with yourself. The lying manipulators got us into this mess, so only truth can get us out.

________________________

Thanks for reading! My articles are entirely reader-supported, so if you enjoyed this piece please consider sharing it around, liking me on Facebook, following my antics on Twitter, throwing some money into my hat on Patreon or Paypal, purchasing some of my sweet merchandise, buying my new book Rogue Nation: Psychonautical Adventures With Caitlin Johnstone, or my previous book Woke: A Field Guide for Utopia Preppers. The best way to get around the internet censors and make sure you see the stuff I publish is to subscribe to the mailing list for my website, which will get you an email notification for everything I publish.

Bitcoin donations: 1Ac7PCQXoQoLA9Sh8fhAgiU3PHA2EX5Zm2
