Facebook Experiment: Why Mess With Our Emotions?
By Eleanor Pierce

Yesterday, Gini Dietrich wrote about the Facebook experiment, covering it from a communications perspective.

Not to outdo my boss, but the fact remains that some people are very mad at Facebook (what else is new, right?).

Others are more freaked out.

It’s all because this wasn’t your typical news feed manipulation. It was an actual study of the psychological effects your news feed can have on your emotions.

The Tenuous Results

The conclusion the recently published paper on the Facebook experiment drew was that emotional states can be transferred via social media.

Personally, I don’t draw the same conclusion from that data.

For one, just because I post a positive message on Facebook doesn’t mean that’s how I feel today. Maybe I’m actually feeling miserable and I just don’t want to be a Debbie Downer.

Plus, the way the researchers decided whether a message was positive or negative is pretty flawed.

Regardless, the bottom line is this: Facebook participated in research that intended to manipulate the emotions of its users.

And, to be clear, that’s why it bothered people.

We all know Facebook experiments with, adjusts, and re-jiggers the news feed all the time.

With all of our Facebook friends, brands we like, and the paid content on the site, it would be absolutely unwieldy if our feeds showed all possible content in an ever-flowing fountain of updates.

So Facebook decides what to show us. And they won’t disclose the algorithm they use to decide what, exactly, to show us.

The Ethical Quagmire

The difference here is that Facebook—along with researchers from Cornell and the University of California-San Francisco—was doing more than just manipulating their product. They were running an actual experiment on their users. And when you’re running a psychological experiment, the ethical stakes are high, and they’re clearly defined.

On Slate, Katy Waldman took a pretty good look at the issue.

She addressed some of the biggest ethical problems in the paper, namely the idea that subjects of research are required to give informed consent.

She wrote:

Here is the only mention of ‘informed consent’ in the paper: The research ‘was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.’

That is not how most social scientists define informed consent.

Here is the relevant section of the Facebook data use policy:

For example, in addition to helping people see and find things that you do and share, we may use the information we receive about you … for internal operations, including troubleshooting, data analysis, testing, research and service improvement.

I know Waldman did her research, but I wanted to talk to someone who understands this world, so I spoke to a family member who’s a psychology professor and spends a great deal of time thinking about these issues.

Psychologists Say

Billy Hills, who is a professor in the psychology and sociology department at Coastal Carolina University, told me that pointing to the Facebook data use policy as informed consent doesn’t really cut it. And as he sees it, there are serious risks at play.

What if someone gets angry and violent or sad and commits suicide? These may seem like extreme versions but things do happen and the rules are written to protect all from unethical research practices. Additionally, there was no way to opt out of the study.

If you don’t agree to the condition set forth by Facebook that your data is potentially usable by them, then you can’t have an account with them. This amounts to coercion, another big no-no for research practices involving human subjects.

The Facebook Experiment is Different

We generally get mad when Facebook messes with the news feed because it affects our ability to see what we want to see, to show what we want to show.

I understand that Facebook is a money-making enterprise, so I understand they will tweak the algorithm in order to maximize the amount of money they make. Ostensibly, they will do this by making a better product and getting all of us to spend even more time on Facebook.

This doesn’t trouble most of us too much—after all, how is it that different from A/B testing email subject lines? Or testing a new advertising campaign? Trying something different with PPC and comparing the data to see what works better?

But this situation is different. Facebook delved into the world of psychological research without following that field’s pretty clearly defined code of ethics.

I’ll also say this: I don’t know exactly why the professionals who live and breathe this world went along with it, given that this was an edited and reviewed paper, but they did.

It’s just another ugly PR situation for a brand that already has some serious trust issues.

I’d say people have pretty good reason to be freaked out by it.

Here’s what I want to know: Exactly how freaked out will you have to be to reconsider your relationship with Facebook?
