It’s hard to believe it’s been a decade, but in 2011, we were at the Counselors Academy Spring Conference when rumors began to circulate that a well-known PR firm had just been caught running a whisper campaign against Google.
With 200 communicators in the room, interest in the story was high, which is probably why I remember the imbroglio so clearly.
The PR firm had been hired by an unnamed client to pitch a story claiming that Social Circle, a service Google was launching, endangered consumer privacy by scraping private data and building “dossiers” on millions of users.
In at least one case, the firm offered to ghostwrite OpEds for bloggers and help get them placed in top news outlets.
Just a few days later, the Daily Beast revealed that the PR firm’s client behind the whisper campaign was none other than Facebook.
According to Beast writer Daniel Lyons, the company claimed it was motivated by privacy concerns in general and, in particular, by concerns that Google was gathering its information from Facebook.
The incident, however, reflects the start of a larger phenomenon: whisper campaigns beget astroturfing beget fake news beget disinformation—and we’ve now arrived at a point where we’re faced with the weaponization of misinformation.
Three Vulnerabilities In How We Get Our Information
It’s not news that the proliferation of social media has democratized the dissemination and consumption of information. We’ve certainly learned in the last four years that this has eroded trust in traditional media outlets and undercut authority figures. It has also created an environment ripe for fringe groups and alt-tech platforms.
I use this example a lot, but it’s super relevant and paints a beautiful picture of what we’re up against—both as communicators and as humans.
Last summer, a conspiracy theory about Wayfair surfaced on Reddit. An anonymous QAnon conspiracy theorist falsely claimed that the company operated a vast child trafficking operation, shipping and selling children inside its industrial cabinets.
The non-evidence? A June tweet from another QAnon supporter and the insupportable conclusion that the $14,500 cabinets bore the names of child victims.
It’s easy to dismiss stuff like this as ridiculous, but it cost Wayfair time, money, reputation, and trust while it fought the completely false claims.
As communicators, we can all empathize. Can you imagine? Argh!
The problem is, as Cindy Otis said in a Barron’s article, “Wayfair won’t be the last. You should prepare yourself for when.”
The rate at which the weaponization of misinformation spreads is astonishing. There are significant vulnerabilities in how we all get our information today and they revolve around three primary elements:
- The medium: the platforms (typically social media and other online networks) where it’s incredibly easy for people to share misinformation;
- The message: what is being conveyed and how our cognitive biases allow us to share it without fact-checking; and
- The audience: well, us.
The first two go hand-in-hand. Social media is designed to deliver information as quickly as possible to as many people as possible.
Human Beings Are Not Rational Creatures
We’ve all experienced this as we optimize content for clicks and shares—and, in some cases, revenue.
But that also means misinformation that is sensationalized will catch the eye and be shared—true or not.
This misinformation also ranges from biased half-truths to conspiracy theories to outright lies. The intent is to manipulate the way people think and the way they behave.
Research shows that, on average, a false story reaches 1,500 people six times more quickly than a factual one.
For all that has changed about the weaponization of misinformation and the ability to disseminate it, arguably the most important element has remained the same: the audience.
People are not rational consumers of information. They seek swift, reassuring answers and messages that give them a sense of identity and belonging.
The truth can be compromised when people believe and share information that adheres to their worldview.
Research also reveals that individuals are ill-equipped to assess and cope with the volume of information we have at our fingertips today, leading them to quickly discard what they perceive as irrelevant or unwanted information.
Although people like to believe they are rational consumers of information, psychologists describe a phenomenon known as the need for cognitive closure: the desire for certainty in an uncertain world.
This can, in some instances, create the conditions conducive to the extremism and polarization that allows misinformation to flourish.
The Biases We Have that Prevent Us From Being Rational
But it’s not just cognitive closure that motivates human beings to share misinformation they believe to be true without checking facts. We also have confirmation bias. This means we prefer information that confirms preexisting beliefs.
These two things interact with two other types of bias: motivated reasoning and naïve realism.
While confirmation bias leads people to seek information that fits their current beliefs, motivated reasoning is the tendency to apply greater scrutiny to unwelcome information that contradicts those beliefs.
In this way, people use motivated reasoning to further their quest for social identity and belonging.
And naïve realism leads people to believe that their perception of reality is the only accurate view and that those who disagree are simply uninformed or irrational.
(Sound familiar? I don’t know how many times I’ve seen people at parties, indoors and not wearing masks, and thought, “Am I the crazy one here?” Of course, I don’t think I’m the crazy one, but some of what I see on social media from otherwise seemingly rational human beings does make me wonder.)
So what is one to do when the weaponization is used against our organizations and, as communicators, we have to pick up the pieces?
Manage the Weaponization of Misinformation
This, unfortunately, goes beyond the need to be crisis ready. It goes beyond having your binder chock-full of a crisis communications plan and best practices. It even goes beyond having a website ready to be turned on at the first whiff of a crisis.
It means you must be risk prepared.
No one’s crisis communications plan included a global pandemic that would require them to first figure out how to send workers home safely and efficiently, and then figure out how to communicate that to customers, all in mere days.
They didn’t include social injustice movements, or the shift from avoiding religion, politics, and societal issues so as not to alienate anyone to taking a very clear, hard line on where the organization stands.
They didn’t include having to fight off mass coordination of a false event like Wayfair and others have had to do.
And they certainly didn’t include having to deal with employees or leaders who participated in an attempted coup of the U.S. government.
Yet, here we are…
Being risk prepared looks similar to being crisis ready. It means crafting scenarios and best practices for the things we can imagine might happen (and we certainly have many more to add after the past 12 months). And it looks beyond what might harm reputation and market share (or stock price if the company is public).
It looks at environmental, social, governance, and technology risks. It brings in the usual suspects (leadership, legal, communications, and marketing) but adds human resources, regulatory, and risk teams, too.
The largest brands have crisis communications teams worldwide that should evolve into risk preparedness teams.
For organizations that don’t need that level of sophistication, being prepared for any type of risk is still necessary.
I often think about what would happen to my business if QAnon got pissed off at me for saying they’re crap—and writing about them.
What if someone were angry with me over a miscommunication or misunderstanding and decided Spin Sucks was the next brand they’d take down?
Certainly, it wouldn’t have the same effect as harming Wayfair, but it would be costly to us, too.
No organization is safe, and we never know when we might be next, as we saw with Robinhood and GameStop.
Don’t stick your head in the sand. Don’t wait until something happens to deal with it.
Start now to stop the weaponization of misinformation about your brand, your leaders, or yourself.