A few years ago, we worked with a U.K.-based company whose technology helps communicators uncover weaponized information—from bad actors out to get their executives or bring their organizations down, to truly dangerous people hiding in the darkest corners of the internet who want to do more than brand damage. The software was originally built for crisis response and has evolved into a tool every in-house communicator should use.

It was FASCINATING work, but also pretty scary. I’m just naive enough to believe the best in people, no matter what, so I was pretty floored to see what happens right in front of our noses. Stuff I’m not sure I’ll ever get over.

Our job, however, was not to scare communicators but to educate them on all of the weaponized information out there—and how to best handle it when it’s presented as fact about your brand.

Fast forward to last week, when PRSA released its Tackling Misinformation guide, which is meant to help comms pros navigate the ever-changing landscape of misinformation, disinformation, and malinformation—or, put another way, weaponized information.

But what is the difference between the three? And how can you spot them, let alone educate your audiences on each so they don’t fall for false information about your brand?

What Is Weaponized Information?

Weaponized information exists on a spectrum: it starts with the relatively benign (misinformation), escalates to deliberate deception (disinformation), and ends with truth twisted to harm (malinformation). Let's start with the definitions of each to better understand it all.

Misinformation is false information that the person sharing believes to be true. My wonderful uncles share misinformation. They truly believe that the attorney general in New York is out to get Trump—not that he’s broken any laws—and they spread that information. Misinformation is false but does not intend to harm any person or group.

Disinformation is false information, and the person disseminating it knows it is false. It is a deliberate, intentional lie. A good example of disinformation was Alex Jones on the air, claiming Sandy Hook was a hoax. Or Holocaust deniers. Or Flat Earthers. Or climate change deniers. The information is false—and they know it—but they keep spreading it anyway.

Malinformation is true information, but it is used to inflict harm on a person, organization, or country. It is often information shared without context. A good example is when a brand uses a doctored image of a celebrity to show their support of the product or service. It can also be a true headline—perhaps a bit sensationalized—designed to encourage people to click through to a damaging website or download harmful content.

Dan Ariely’s new book, Misbelief, talks about malinformation—and how he became a target during the pandemic when he was vocal about being pro-vaccine. People would take videos of things he had said, clip them out of context, and use them to support their anti-vax stance.

If you think about a weaponized information Venn diagram, misinformation sits on the left side, malinformation on the right, and disinformation is in the middle. 

So it goes from false information the sharer thinks is true (pretty benign) to false information that the sharer knows is false (outrageous) to true information that the sharer has modified to inflict harm (downright criminal).
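The three definitions really boil down to a few questions: Is the information false? Does the sharer know it's false? Is harm intended? A minimal sketch of that taxonomy as a decision function—the function and examples are hypothetical, just to make the distinctions concrete:

```python
def classify(is_false: bool, sharer_knows_false: bool, intends_harm: bool) -> str:
    """Classify shared content using the three definitions above."""
    if not is_false:
        # True information weaponized (e.g., stripped of context) to cause damage.
        return "malinformation" if intends_harm else "ordinary information"
    # The content is false; the sharer's knowledge separates the remaining two.
    return "disinformation" if sharer_knows_false else "misinformation"

# A falsehood shared by someone who believes it:
print(classify(is_false=True, sharer_knows_false=False, intends_harm=False))  # misinformation
# A deliberate, intentional lie:
print(classify(is_false=True, sharer_knows_false=True, intends_harm=True))    # disinformation
# A true quote clipped out of context to attack someone:
print(classify(is_false=False, sharer_knows_false=False, intends_harm=True))  # malinformation
```
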

Goes Beyond Mere Acknowledgment

For several months, I’ve been taking screenshots of stuff I see online that fits one of these categories. One of my favorite examples of misinformation is the photos of Taylor Swift and Travis Kelce doing whatever they’re doing, with Bernie Sanders sitting in the corner in his mittens and mask. They’re clearly fake, but the number of times they’re shared as if they’re real is astonishing!

Now that we’re clear on the different types of fake news, let’s discuss what this means for communicators.

A survey of 1,500 global experts ranks misinformation and disinformation as the most concerning issue for 2024. And with elections in the U.S. this year, Americans are concerned that the spread of fake news will only get worse—it’s shaking democracy at its core.

All of this mis, dis, and malinformation has a detrimental effect on trust, democracy, and civil discourse. 

Certainly, with the popularity of generative AI, the speed at which weaponized information can be manufactured and spread has accelerated. Moreover, with so much news and information bombarding us through digital channels, many people are ignoring dedicated news sources and/or reading only curated headlines on social media.

Research shows that people are 70% more likely to share falsehoods than the truth, which demonstrates, in part, why the problem is so vast. The issue is not limited to nefarious or ill-informed actors, or to technology and generative AI acting as an accelerant.

History itself offers evidence that weaponized information will always exist. It is a human behavior issue, not a technology problem. If digital technology, social media platforms, and AI were removed from the world tomorrow, misinformation and disinformation would still exist.  

So what are we to do? For communicators, this goes beyond mere acknowledgment. We need proactive strategies to counter misinformation effectively.

How to Counter Weaponized Information

The PRSA guide suggests three things:

  1. Learn from bad actors and out-communicate them; 
  2. Educate our stakeholders; and 
  3. Return to the basics of PR.

The first thing PRSA suggests is learning from bad actors and then out-communicating them. But what does this look like in real life? 

You must study the bad actors, learn what makes them tick, and pay attention to how they’re spreading weaponized information. 

This will allow you to do three things:

  1. Discern what excellence should look like;
  2. Gain insights into what not to do; and
  3. Identify the traits and behaviors that hinder effective communication. 

We Are Surrounded By Bad Actors

In Misbelief, the book I mentioned earlier, Ariely describes his experience when bad actors created deepfake videos of him saying and doing things that never happened. Likewise, many of us have had someone call and pretend to be a loved one, claiming to need money, hoping to trick us into sending our life’s savings.

My publisher, may she rest in peace, got caught in a scam like that. A guy pretended to be her brother-in-law and knew just enough to get her to send money. She did it three times before she realized she was being scammed.

The bad actors are getting better and better at tricking us into believing what they’re saying is true—from AI-generated voices that aren’t distinguishable from your loved ones’ real voices to videos that make it look like you were somewhere you were not.

Now, put that on steroids, and it’s hard to know what to believe. As new technologies evolve, it’ll be easier and easier for the bad guys to use them for evil.

If we study the bad actors, we can pretty easily identify how they disseminate weaponized information—and then use the same tactics to provide factual, ethical, and honest information. This might look like fostering emotional intelligence, prioritizing speed and accessibility, engaging across multiple channels, educating audiences, monitoring and responding, and building and maintaining trust by returning to PR basics.

You already do some of this, but you will need to tweak some of it to ensure you are fiercely protecting your brand and its reputation. 

Foster Emotional Intelligence

What’s the number one thing bad actors do well? Think about some of them you’ve encountered during your lifetime or career. Heck, think about what Trump does extraordinarily well.

They all exploit emotions. Part of the reason Trump has been as successful as he has is that he says things many, many people think but have been afraid to say out loud. He builds loyalty among those who have felt their feelings and opinions have been diminished by a “woke” and liberal society. Now, in my opinion, he’s using those skills for bad—but imagine doing the same and using those skills for good.

Instead of manipulation, you are connecting authentically. You tell your brand’s story so that it resonates on a personal level, fostering a genuine connection and trust. 

While the bad actors are out there weaponizing information and creating fear, anger, and sadness, you can aim to evoke positive emotions, such as hope, joy, and pride.

Prioritize Speed and Accessibility

One of the challenges we face every day is responding to things in real time. Typically, there are approval processes to work through, which put us days or weeks behind.

But here’s the thing about weaponized information: it spreads quickly. This is because it’s designed to be easily digestible and shareable and because it’s created and shared in real time.

To counter this, work internally to build a process that allows you to respond immediately. Your messages should be accurate, timely, and easy to understand—and shared in real time.

Engage Across Multiple Channels

Bad actors often use a multi-platform approach to disseminate their content widely. 

Most of us do this, but we can get stuck in the “post the same thing to all media” cycle without tailoring our message to fit the format or audience.

Consider how your content will be viewed on each social media platform, as well as traditional media outlets, podcasts, and webinars. 

Educate Your Audience

One of the most powerful tools against weaponized information is an informed public. Your job hasn’t changed—you still protect your brand’s relationships with its audiences. But now you might think about how to invest in educational campaigns that teach your communities how to identify credible sources and check facts independently.

This one is a bit more challenging because it may be seen internally as a non-necessity. Still, if you can incorporate education into your existing brand-building campaigns, you can do both.

Monitor and Respond

At the start of this article, I mentioned a former client whose software is built to help communicators monitor weaponized information. Social listening can help, too, but building something more robust with bad actors in mind can be even more beneficial. 

No matter how you monitor, pay attention to what false narratives are being created and shared—and then go back to the process we talked about earlier to respond swiftly and in real time. 
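To make that concrete, here is a minimal sketch of what narrative monitoring might look like in code. The narrative phrases and mentions are made-up placeholders; a real setup would pull mentions from a social listening API or dedicated monitoring software rather than a hardcoded list:

```python
# Phrases tied to false narratives you are tracking (hypothetical examples).
FALSE_NARRATIVES = {
    "secret-recall": ["secret recall", "hiding defects"],
    "bankruptcy-rumor": ["going bankrupt", "mass layoffs coming"],
}

def flag_mentions(mentions):
    """Return (narrative, mention) pairs whenever a tracked phrase appears."""
    flagged = []
    for mention in mentions:
        text = mention.lower()
        for narrative, phrases in FALSE_NARRATIVES.items():
            if any(phrase in text for phrase in phrases):
                flagged.append((narrative, mention))
    return flagged

mentions = [
    "Heard they're hiding defects in the new model",
    "Great customer service today!",
]
for narrative, mention in flag_mentions(mentions):
    print(f"[{narrative}] {mention}")  # hand off to the rapid-response process
```

Simple keyword matching like this misses paraphrases, which is why purpose-built monitoring tools layer in more sophisticated matching—but the workflow is the same: detect the narrative, then trigger the real-time response process described above.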

One thing to note: people share content because it supports a belief or opinion they hold, even if the facts show it’s false. That’s why it’s so important to foster emotional intelligence. You must appeal to their emotional side because people are often not rational. So don’t lead with facts. Lead with emotion.

Build and Maintain Trust

In the long term, the most effective way to out-communicate bad actors is to build a trusting relationship with your audience. 

This is what we do. Every day. 

We’ve discussed it in nearly every episode this year—creating content that demonstrates experience and expertise to build trust and authority.

Keep doing that. Consistently provide valuable and accurate information. Transparently admit and correct mistakes. Engage with the audience to understand their concerns and questions. If you do those things every day, you will always have the trust you need to stay ahead of the bad actors.

A Weaponized Information Quick Recap

To recap, you will foster emotional intelligence, prioritize speed and accessibility, engage across multiple channels, educate audiences, monitor and respond, and build and maintain trust. 

If you carefully study the bad actors and use these tips, you’ll contribute significantly to the integrity of public discourse. I don’t know about you, but being able to do more for society is exciting. I hope it is for you, too.

Gini Dietrich

Gini Dietrich is the founder and CEO of Spin Sucks, host of the Spin Sucks podcast, and author of Spin Sucks (the book). She is the creator of the PESO Model and has crafted a certification for it in partnership with Syracuse University. She has run and grown an agency for the past 15 years. She is co-author of Marketing in the Round, co-host of Inside PR, and co-host of The Agency Leadership podcast.
