Facebook Experiment: Why Mess With Our Emotions?

By: Eleanor Pierce | July 15, 2014

Yesterday, Gini Dietrich wrote about the Facebook experiment, but covered it from a communications perspective.

Not that I'd try to outdo my boss, but the fact remains: some people are very mad at Facebook (what else is new, right?).

Others are more freaked out.

It’s all because this wasn’t your typical news feed manipulation. It was an actual study of the psychological effects your news feed can have on your emotions.

The Tenuous Results

The conclusion the recently published paper on the Facebook experiment drew was that emotional states can be transferred via social media.

Personally, I don’t draw the same conclusion from that data.

For one, just because I post a positive message on Facebook doesn’t mean that’s how I feel today. Maybe I’m actually feeling miserable and I just don’t want to be a Debbie Downer.

Plus, the way the researchers decided whether a message was positive or negative is pretty flawed.

Regardless, the bottom line is this: Facebook participated in research that intended to manipulate the emotions of its users.

And, to be clear, that’s why it bothered people.

We all know Facebook experiments with, adjusts, and re-jiggers the news feed all the time.

With all of our Facebook friends, brands we like, and the paid content on the site, it would be absolutely unwieldy if our feeds showed all possible content in an ever-flowing fountain of updates.

So Facebook decides what to show us. And it does so with an algorithm it won’t disclose.

The Ethical Quagmire

The difference here is that Facebook—along with researchers from Cornell and the University of California-San Francisco—was doing more than just manipulating their product. They were running an actual experiment on their users. And when you’re running a psychological experiment, the ethical stakes are high, and they’re clearly defined.

On Slate, Katy Waldman took a pretty good look at the issue.

She addressed some of the biggest ethical problems in the paper, namely the idea that subjects of research are required to give informed consent.

She wrote:

Here is the only mention of ‘informed consent’ in the paper: The research ‘was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.’

That is not how most social scientists define informed consent.

Here is the relevant section of the Facebook data use policy:

For example, in addition to helping people see and find things that you do and share, we may use the information we receive about you … for internal operations, including troubleshooting, data analysis, testing, research and service improvement.

I know Waldman did her research, but I wanted to talk to someone who understands this world, so I spoke to a family member who’s a psychology professor and spends a great deal of time thinking about these issues.

Psychologists Say

Billy Hills, who is a professor in the psychology and sociology department at Coastal Carolina University, told me that pointing to the Facebook data use policy doesn’t really cut it as informed consent. And as he sees it, there are serious risks at play.

What if someone gets angry and violent or sad and commits suicide? These may seem like extreme versions but things do happen and the rules are written to protect all from unethical research practices. Additionally, there was no way to opt out of the study.

If you don’t agree to the condition set forth by Facebook that your data is potentially usable by them, then you can’t have an account with them. This amounts to coercion, another big no-no for research practices involving human subjects.

The Facebook Experiment is Different

We generally get mad when Facebook messes with the news feed because it affects our ability to see what we want to see, to show what we want to show.

I understand that Facebook is a money-making enterprise, so I understand they will tweak the algorithm in order to maximize the amount of money they make. Ostensibly, they will do this by making a better product and getting all of us to spend even more time on Facebook.

This doesn’t trouble most of us too much—after all, how is it that different from A/B testing email subject lines? Or testing a new advertising campaign? Trying something different with PPC and comparing the data to see what works better?

But this situation is different. Facebook delved into the world of psychological research without following the pretty clearly defined code of ethics laid out there.

I’ll also say this: I don’t know exactly why the professionals who do live and breathe this world went along with it (this was, after all, an edited and reviewed paper), but they did.

It’s just another ugly PR situation for a brand that already has some serious trust issues.

I’d say people have pretty good reason to be freaked out by it.

Here’s what I want to know: Exactly how freaked out will you have to be to reconsider your relationship with Facebook?

About Eleanor Pierce

Eleanor Pierce is a recovering journalist who can't decide which part of the country to call home. She's happiest when she's reading, though she also really likes writing, baking, dogs, and sarcasm. No, seriously.

  • You already know how I feel about this from a communications perspective, so I’ll address it from a personal perspective. This, to me, is akin to companies that allow their customers to come into their locations with guns on their hips. Is it okay for them to allow this or should they ban it? How does one approach it, from a communications perspective? It’s a challenge, to be sure. What Facebook is doing is disappointing, but we are not the customer. We are the product. They are doing what they can with us – as the product – to monetize their business. And good for them. They don’t owe any of us an apology. But it’s icky and it’s unethical and, to your point, it could be very, very bad.

  • Your last question is SUCH a good one! There are a lot of things in life that aren’t right or ethical, but it takes a lot for most people to make a change from what’s comfortable or habitual. Facebook, at this point, causes a passive pain but a very active reward for most people. Until that reverses, very few folks will move away (in my never humble opinion).

  • ginidietrich Right! It’s hard because, if you ARE a paying customer, Facebook actually provides a great product. The more advertising product they roll out, the more impressed I am by what we can do there – if we’re willing to spend the money.

  • LauraPetrolino You’re so right! I keep waiting for Facebook to turn into a ghost-town, as MySpace did (quite suddenly it seemed to me at the time, too …), but until it ceases to be the place my far-flung friends and family members share jokes and photos of their kids, it’ll be a place I hang out. Darn you Facebook (shakes fist at the Zuckerberg in the sky)!

  • Well thought out and researched. You’ve touched on some really key points that disturbed me the most about this. And yes, I want to know why the academics agreed to tactics they must have known were shady, to say the least. There is another aspect to consider, too: how Facebook and other social media operate on the same variable-ratio reinforcement schedule as slot machines and therefore breed addiction. So this experiment would have forced some people to go ‘cold turkey’ and others to get extra hits of notifications. Stepping away, therefore, only becomes harder…

  • LauraPetrolino And indeed- addiction. It does go well beyond habit, I believe.

  • ginidietrich My thoughts very much echo yours.

    Brands have conducted studies on unsuspecting people for goodness knows how long in formal and informal ways. Careful observation of buying habits, not always announced to customers, is why milk is in the furthest corner of the supermarket.

    The Facebook thing feels icky because it is our lives whereas before it might be a privacy issue, though we’re putting our laundry on a clothesline we don’t own, and which is located on property that isn’t ours.

    The thing is, all the outrage doesn’t hurt Facebook. I think a big part of the reason is 81 percent of the more than 1 billion users do not live in the US or Canada. And different parts of the world have very different views on privacy.

  • RebeccaTodd LauraPetrolino Oh absolutely. I believe I have a case of what Professor Hills would call “denial.”

  • RebeccaTodd It seems to me that they looked at it as a complete data set that they were just analyzing. But that’s an irresponsible way to go – in academia, you need to be responsible for your data set.

  • ClayMorgan ginidietrich  “different parts of the world have very different views on privacy” << True story! I think it’s psychologically related to the bigole personal space bubbles Americans have (and Canadians, too, I think …).

  • Great analysis and companion piece to Gini’s post yesterday.

  • Eleanor Pierce What also really disturbs me is that the very people who probably needed to have positive interactions, people who were depressed or struggling, may have had their statuses (stati?) withheld for ‘negative content’ and therefore received no positive encouragement from their online peers; and consider that Facebook also may have flooded their feeds with more negativity. Also, I wanted to thank you for seeking an academic opinion. It brings a solid perspective.

  • JoshuaJLight

    Nicely written.  One additional note…Zuckerberg is a big fan of psychology so don’t expect the experiments to stop 🙂

  • Gini Dietrich

    Such a troublemaker. ^ep

  • JoshuaJLight  He’s an interesting character – I recently read a piece the New Yorker did about his education campaign in New Jersey … I guess if I had that kind of money and power, I’d follow my interests, too …

  • biggreenpen Well thanks!

  • RebeccaTodd Eleanor Pierce Yeah, it was an interesting conversation. He said a lot more interesting stuff – too much to fit into the blog post. He also talked about how informed consent should include the choice to withdraw from participation and a debriefing that provides a full explanation (including an explanation of any deception involved, if deception is necessary to keep people from changing their behavior). He said he didn’t think his internal review board would have let all that fly.

  • JoshuaJLight

    Eleanor Pierce JoshuaJLight yeah….I watched an interview where he discussed his passion for psychology.  Anyways…you’re a fantastic writer…seriously.  I read a lot of articles, and you’ve just done a great job here.  Keep it up.

  • JoshuaJLight Eleanor Pierce Aw shucks, thanks!

  • Eleanor Pierce JoshuaJLight Clearly she has a great editor.

  • ginidietrich Eleanor Pierce JoshuaJLight TRUE STORY.

  • JoshuaJLight

    Eleanor Pierce ginidietrich JoshuaJLight haha…you two are great 🙂

  • Great write-up, Eleanor, and raises some very important questions. There’s data mining for data’s sake, then there’s this very unethical (personal opinion) move that Facebook took. I’m seriously considering my use of the platform – it won’t hurt Facebook (at least in the short term), but it won’t let them benefit from me, either.

  • Eleanor Pierce ClayMorgan ginidietrich Funny. I didn’t have any problem with it, as I commented on Gini’s piece. I get that I’m the “product” – and I also made that Faustian deal when I signed up for this new online life of Google and social media – in order for YOU to make my life easier, I will have to give up my privacy. It’s evolution, baby. Don’t like it? Don’t use it. Simple as that.

  • ginidietrich had you write this to make me happy I am sure 8)
    As I mentioned in her post this isn’t A/B testing. And after the suicide of a friend a year ago this month and the suicide of a son of someone I know from Social Media also last summer I found this pretty incredulous. I want to come back to this…….
    You asked whether it would affect our emotions. Depends on the news. If it is a bunch of my friends all having a bad day, or dwelling on a close person in our group who is really sick, then yes, it would amplify sadness, especially if I was trying to block it out and came to Facebook. If they show just negative news stories, then of course it won’t. I mean, isn’t 90% of the news out there horrific? We are so immune to stuff, and the US censors have protected us for so many decades from ‘Reality’. They show us bombs exploding but no broken bodies. They show us mug shots of murderers but not the bodies of the victims. So no, it won’t affect us.
    Which brings me back to suicide. I have been screaming against the war on terror since the Patriot Act was signed. It is the biggest waste of taxpayer money. Our security system worked on 9/11. No guns or bombs were used. They used box cutters. I mean, how humiliating for us. Since 9/11, on average, the US has had 14-15,000 murders a year. In fact it is almost 200,000 murders that NO ONE IN THE USA cares about. (Proof we just block out bad stuff.)
    BUT getting back to suicide. What I didn’t know until last summer was that we have more suicides than we do murders. And that has me feeling Facebook put its stock value and company at risk. One suicide from this could have wiped the company out.
    So how STUPID is Facebook for taking that risk? If they funded proper third-party research with volunteers, there is no risk. Sometimes the smartest people are actually dumber than a doorknob. Facebook has been very lucky to get where they are, more luck than savvy, trust me, and it could be running out.

  • Danny Brown I’m really curious to hear what you do. Right now, for me, the benefits (contact with friends/family all over the country) is much higher than the cost … but who knows how long I’ll still feel that way …

  • Howie Goldfarb I do think we can take some comfort in how poorly the study was designed. The way they determined whether a post was positive would rate both of the following sentences as positive:

    I am having a great day
    I am not having a great day

    Because the word “great” is in both of them. Which is part of why I think their results were so negligible; the effects they reported were tiny.
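To make the flaw concrete, here is a minimal sketch of that kind of word-list classifier. The word lists and function names here are invented for illustration; this is not the actual LIWC dictionary or the study’s code, just the style of word counting the comment describes:

```python
# Hypothetical word lists, for illustration only -- not the real LIWC dictionaries.
POSITIVE_WORDS = {"great", "happy", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "terrible", "hate", "awful"}

def classify(post: str) -> str:
    """Label a post by counting matched words.

    Negation ("not") is ignored entirely, which is exactly
    the weakness being pointed out.
    """
    words = post.lower().split()
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(classify("I am having a great day"))      # positive
print(classify("I am not having a great day"))  # also positive
```

Both sentences match only the word “great,” so both come out “positive” even though the second one says the opposite.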

  • Eleanor Pierce I’m in the process of creating an alternative profile. No real name; no family pictures; no personal pictures. No email address to tie it to me. I’ll keep it locked down, and only invite a very small number of people to become “friends”. Then I’ll delete my existing profile.
    The way I’ve begun to look at it is this: belllindsay may say “we’re the product and if you don’t like it, tough,” but for me, that’s an unfair “rule” to put on people who shouldn’t have to give up all privacy just to use a crappy network. As for the experiment they carried out, are we really saying we’re happy that they carried that out on friends, family, loved ones? We may accept “we’re the product” (I don’t), but I don’t accept Facebook playing with the emotions of friends and loved ones like they’re some sort of plaything.

  • Nicely done Eleanor as with the initial post yesterday.  I have been following these posts and trying to get my head around Gini’s thoughts strictly from a business communications perspective. 

    Personally I don’t feel that their manipulation would or could have any effect on my emotional state, with the exception of frustration; however, coming from the health care field, I have deep concerns. We have been working very hard to develop the Health IT field so that consumers feel safe, secure, and trusting of their medical data online. Big data will be so beneficial for population management issues as well as for disease, disorder, and illness states. It is great to analyze big data gathered (with privacy intact), but once we begin to manipulate the data for expected outcomes, without consent or knowledge, we have embarked on a very slippery slope. What was most disconcerting, as repeatedly highlighted, was the academic (scientific) participation, which appears to be warped in basic research ethics.

    As with Facebook being the owned platform and our content or participation a product, I can make that analogy with our participation in the health system. The hospital, operating room, or doctor’s office is the owned platform and our bodies (physical and psychological) the content. Granted, when we sign into the hospital we give permission to share, use, and perform tasks on our content, especially so in a teaching hospital, but we do so with informed consent and the understanding that we still own our content (bodies) and are not to be manipulated for desired outcomes for experimental purposes without our knowledge. Sure, a procedure, medical device, drug, etc. may be for trial purposes, but we are given the risks and benefits in full disclosure so as to make informed decisions. There are certainly instances where negative psychological outcomes have come to light from ordinary circumstances (ICU psychosis from the isolation and constant beeping of machinery), but they hopefully are not induced by intentional and proactive manipulation to see if psychosis can be brought about!

  • Danny Brown Eleanor Pierce belllindsay Yeah, I have two Facebook profiles, too, one that’s kind of a shell, but for different reasons. I guess that does mean down the line if I get fed up I’ve already got step one taken care of …

  • Danny Brown Eleanor Pierce belllindsay I see all sorts of silly psychological post tests on Facebook: What age are you emotionally, what geographical area do you belong to, are you really a southerner or a northerner, what type of animal would you be? The difference is I understand it is a psychological game, and I agree to play rather than be played. No harm, no foul.

  • annelizhannan Most of these aren’t really “psychological tests”; they’re simple data mining questions. The testing in this experiment went much deeper – they deliberately controlled the type of sentiment you’d see in an update.
    Let’s say you’re a teen, with hormones all over the place. You’ve just split with your boyfriend/girlfriend; your grades are suffering; your parents are on your back; you’re being made fun of. Your emotions aren’t anywhere near under your control – then you see negative post upon negative post upon negative post. 
    Or let’s say you’re an adult with psychological issues. Or financial worries. Or job worries. And you see nothing but negativity in your feed.
    It’s more than just games and silly questions that are on “trial” here. Eleanor Pierce belllindsay

  • Danny Brown Exactly, and that is why I have a problem with it. They crossed the line into the health arena with this ‘research’, which I find dishonorable if not unethical in legal terms.

  • Danny Brown I also posted some lengthier thoughts earlier but must have been caught in the spam filter again as it is not here.  I will have to check with Livefyre or belllindsay to see what is up.

  • annelizhannan Danny Brown belllindsay Ruh roh!

  • Eleanor Pierce Howie Goldfarb Yeah, see, now that’s the sort of “research” that no journal should ever publish. Sounds to me like some hotshot MIT engineers w/little understanding of sociology worked that up.

  • ClayMorgan ginidietrich I agree w/you – but I do think part of the public vs. private perception thing is that when you’re in a public place you know the potential. I would be absolutely shocked if FB wasn’t running these same sorts of “studies” on private messages and communications. Also consider that with the ability to limit who reads posts, there’s an implication of privacy as well. And, hey, let’s not just pile on FB, because Google has been doing it with Gmail for years.

  • susancellura

    “It’s just another ugly PR situation for a brand that already has some serious trust issues (http://www.washingtonpost.com/blogs/the-switch/wp/2013/12/31/facebook-wants-to-know-if-you-trust-it-but-its-keeping-all-the-answers-to-itself/).”
    Yes, we don’t quit FB. Hmmmm…