How the truth was murdered

[Illustration: sharing misinformation, by Najeebah Al-Ghadban]

How the truth was murdered: Pandemic, protest, and a precarious election have created an overwhelming flood of disinformation. It didn’t have to be this way.
MIT Technology Review, October 7, 2020
Intelligent Machines
by Abby Ohlheiser

“Instead, I’m here to point out, as others have before, that people had a choice to intervene much sooner, but didn’t. Facebook and Twitter didn’t create racist extremists, conspiracy theories, or mob harassment, but they chose to run their platforms in a way that allowed extremists to find an audience, and they ignored voices telling them about the harms their business models were encouraging.”


Hundreds of thousands of Americans are dead in a pandemic, and one of the infected is the president of the United States. But not even personally contracting covid-19 has stopped him from minimizing the illness in Twitter messages to his supporters.


Meanwhile, suburban moms steeped in online health propaganda are printing out Facebook memes and showing up maskless to stores, camera in hand and hell-bent on forcing low-paid retail workers to let them shop anyway. Armed right-wing militias are patrolling western towns, embracing online rumors of “antifa” invasions. And then there’s QAnon, the online conspiracy theory that claims Trump is waging a secret war against a ring of satanist pedophiles.


QAnon drew new energy from the uncertainty and panic caused by the pandemic, growing into an “omniconspiracy theory”: a roaring river fed by dozens of streams of conspiratorial thinking. Researchers have documented how QAnon is amplifying health misinformation about covid-19, and infiltrating other online campaigns by masking outlandish beliefs in a more mainstream-friendly package. “Q,” the anonymous account treated as a prophet by QAnon’s believers, recently instructed followers to “camouflage” themselves online and “drop all references re: ‘Q’ ‘Qanon’ etc. to avoid ban/termination.” Now wellness communities, mothers’ groups, churches, and human rights organizations are trying to deal with the spread of this dangerous conspiracy theory in their midst.


When Pew Research polled Americans on QAnon in early 2020, just 23% of adults knew a little or a lot about it. When Pew surveyed people again in early September, that number had doubled—and the way they felt about the movement was split down party lines, Pew said: “41% of Republicans who have heard something about it say QAnon is somewhat or very good for the country.” Meanwhile, 77% of Democrats thought it was “very bad.”


Major platforms like Facebook and Twitter have started to take aggressive action against QAnon accounts and disinformation networks. Facebook banned QAnon groups altogether on Tuesday, aiming directly at one of the conspiracy theory’s more powerful distribution networks. But those networks were able to thrive, relatively undisturbed, on social media for years. The QAnon crackdown feels too late, as if the platforms were trying to stop a river from flooding by tossing out water in buckets.


Many Americans, especially white Americans, have experienced the rise of online hate and disinformation as if they’re on a high bridge over that flooding river, staring only at the horizon. As the water rises, it sweeps away anyone who wasn’t able to find such a safe and sturdy perch. Now that bridge isn’t high enough, and even the people on it can feel the deadly currents.


I think a lot of people believe that this rising tide of disinformation and hate did not exist until it was lapping at their ankles. Before that, the water just wasn’t there—or if it was, perhaps it was a trickle or a stream.

But if you want to know just how the problem got so big and so bad, you have to understand how many people tried to tell us about it.

Read the Full Article »

About the Author:

Abby Ohlheiser: I’m a senior editor at MIT Technology Review focused on internet culture. Before that I covered digital life for the Washington Post and was a staff writer at The Atlantic Wire.