The Facebook Trending Topics Debacle Creates Bias Debate

By Gini Dietrich

In episode nine of this season (and the final season—sad face!) of The Good Wife, the legal team had to defend Chumhum (aka Google) when a restaurant owner saw a decrease in guests because her location was listed as “unsafe” on a Chummy Map.

(Don’t worry…I’m not spoiling anything and I am four episodes from the finale so don’t spoil it for me in the comments!)

The maps are said to automatically provide safety ratings of neighborhoods to tell users which parts of cities should be avoided.

The restaurant was in a neighborhood designated as dangerous, and in the episode the owner wants to sue, arguing the safety ratings are racist code for minority neighborhoods.

The company, of course, blamed the algorithm and said it couldn’t be racist because it’s all computer-generated. But it comes out that the coding was done by humans, and the episode makes a point of the fact that the coders were white and male.

Trending Topics Debacle

As it turns out, fiction isn’t so far from reality.

I was reminded of this episode when I read all the hubbub about Facebook routinely suppressing conservative news in its Trending Topics section.

Here’s what is going on, according to Gizmodo:

Facebook workers routinely suppressed news stories of interest to conservative readers from the social network’s influential “trending” news section, according to a former journalist who worked on the project. This individual says that workers prevented stories about the right-wing CPAC gathering, Mitt Romney, Rand Paul, and other conservative topics from appearing in the highly-influential section, even though they were organically trending among the site’s users.

Several former Facebook “news curators,” as they were known internally, also told Gizmodo that they were instructed to artificially “inject” selected stories into the trending news module, even if they weren’t popular enough to warrant inclusion—or in some cases weren’t trending at all.

How Trending Topics Are Created

At first, I was incredulous. I thought Trending Topics was based on how many people liked, commented on, and shared an article.

It’s all an algorithm, I thought.

How would it suppress any news, if it’s based on topics people like, comment on, and share?

How would Kim Kardashian get so much attention if it weren’t for an algorithm?

Why else would I care that Orlando Bloom has been seen with two different women in the course of a week?

But then I read the account of Justin Osofsky, Facebook’s vice president of operations, on how they surface Trending Topics, and it begins to make sense.

According to his blog post on the matter, topics are first identified by how many likes, comments, and shares a piece of news gets. But all that does is add the topic to a queue for review by human beings.

The curators then do the following:

  • Confirm the topic is tied to a current news event in the real world (for example, the topic “#lunch” is talked about during lunch every day around the world, but will not be a trending topic).
  • Write a topic description with information that is corroborated by reporting from at least three of a list of more than a thousand media outlets. A list of these media outlets is available here.
  • Apply a category label to the topic (e.g. sports, science) to help with personalized ranking and to enable suggestions grouped by category for the various tabs on the desktop version.
  • Check to see whether the topic is national or global breaking news that is being covered by most or all of 10 major media outlets—and if it is, the topic is given an importance level that may make the topic more likely to be seen. A list of these outlets is available in the guidelines.

So, just like Chumhum, the Trending Topics are biased, even though they say they:

have a series of checks and balances in place to help surface the most important popular stories, regardless of where they fall on the ideological spectrum, as well as to eliminate noise that does not relate to a current newsworthy event but might otherwise be surfaced through our algorithm.

Do We Have a Responsibility to Not Be Biased?

Mark Zuckerberg, himself, has addressed the issue, stating they:

are committed to building a platform for all ideas. Trending Topics is designed to surface the most newsworthy and popular conversations on Facebook. We have rigorous guidelines that do not permit the prioritization of one viewpoint over another or the suppression of political perspectives.

He goes on to say he has invited leaders from conservative groups to Facebook to discuss the important issues and their points of view, so he can have a direct conversation about what the company stands for and how it affects the daily lives of each of us.

But it raises an interesting conundrum: As content creators, business owners, business leaders, marketers, and communicators, what is our role in presenting unbiased information to our customers, prospects, stakeholders, and communities?

This very blog is a great example. It’s not difficult to figure out I’m not unbiased and that I have very strong opinions.

While I try extraordinarily hard to keep politics out of our content here, I’m willing to bet most of you can tell which way I lean, just by the mere fact that I won’t call he-who-shall-not-be-named by his name. I’d rather call him Voldemort or Beetlejuice than say his name.

You also know I am not unbiased when it comes to communications integration (the PESO model), PR metrics, and the antiquated way most communicators approach their job.

I’ve even been kicked out of a PR LinkedIn group because of my stance on measuring what we do.

On this week’s Inside PR, we discuss the very issue and Martin Waxman suggests we disclose.

I’m not sure that’s the right answer, nor am I sure we should go so far as to present both sides of a story just so it has balance. After all, what makes for great blog content is taking a side, or a stance.

So I leave the comments to you because it’s a conversation I’d really like to have: What is the responsibility of a content creator to not show bias? And how do we prevent it when it’s being created by humans?

Gini Dietrich

Gini Dietrich is the founder, CEO, and author of Spin Sucks, host of the Spin Sucks podcast, and author of Spin Sucks (the book). She is the creator of the PESO Model and has crafted a certification for it in partnership with Syracuse University. She has run and grown an agency for the past 15 years. She is co-author of Marketing in the Round, co-host of Inside PR, and co-host of The Agency Leadership podcast.
