Five reasons to be freaked out by the Facebook experiment


By Mark Schaefer

Like most of the online world, I’m stunned by the Facebook experiment designed to surreptitiously toy with the emotions of its customers. But it goes beyond the simple shock value of a company manipulating people.

The news emerged that in 2012, Facebook conducted a study to determine whether it could alter the emotional state of its users. The company’s data scientists enabled an algorithm, for one week, to automatically omit content that contained words associated with either positive or negative emotions from the central news feeds of 689,003 users. They examined whether content from the subjects was more positive or negative based on the manipulated tone of the news feed.

There are broad reasons for concern that are deeper than what we see on the surface.

1. This was a corporate decision

I’m not a person who hates Facebook and is looking for a reason to ding them. I actually think it is understandable that Facebook wanted to conduct this kind of fundamental research.

However, the rational decision would have been to pay a university to get the same results under controlled and honest conditions. No company should ever make a decision to turn its customers into lab rats. The more disturbing issue is that Forbes reported that this research was approved by an internal Facebook review board. So this was not the case of a lone wolf embarrassing the company. This breach reflects the dysfunctional corporate culture of Facebook. That makes my head spin.

2. Facebook is hiding behind legalese

At this moment, days after the furor erupted, Facebook has still not issued any apology. One of the researchers, Adam Kramer, created a Facebook post explaining the methodology and describing the impact on people as “minimal.”

Facebook justified the news feed mind game by saying it was covered by the company’s “Data Use Policy” (part of the terms and conditions nobody reads), which contains one cryptic line about how your information could be used for research. Christopher Penn did a nice job distinguishing between “legal” research and “ethical” research in his post about Facebook emotional testing.

So the message here is that the company will do whatever it wants as long as it can cover itself legally.**

3. They haven’t learned their lesson

Facebook’s arrogant approach to customers and privacy was so extreme that it was the subject of a U.S. Congressional investigation in 2012. The company was found guilty, fined, and subjected to 20 years of privacy audits by the government. The government essentially ruled that Facebook needs a babysitter. This Facebook experiment shows that the company still has the attitude and maturity of a petulant 5-year-old, doing whatever it wants unless it has adult supervision.

What if somebody was already experiencing depression and this experiment made them more depressed … even dangerously depressed? What is the probability that, among more than 689,000 people, somebody was pushed into an inescapably dark place? Did they even THINK about the fact that their “users” are real people who may already be suffering?

4. Its arrogance will be its undoing

Facebook made a terrible error in judgment. But it gets worse. It published the study in the March issue of the Proceedings of the National Academy of Sciences. The message here is, “We screwed our customers and we also want to stroke our egos by getting academic credit for it.” Facebook put its ego above its customers.

Here’s the chilling thought: This is the only experiment we KNOW about because it was published.

5. Facebook: The world’s Valium?

One takeaway of the study was that taking all emotional content out of a person’s news feed caused a “withdrawal effect.” Facebook concluded that it should subject you to happy content to keep you coming back. The implication is that to increase usage (i.e., maximize profits through ads), Facebook must not just edit your news feed through EdgeRank; it should tweak the emotional tone of its world like a digital Valium.

The actual experiment is only the tip of the iceberg. What are they going to DO with the results of this research? I doubt the answer is “nothing.”

Implications of the Facebook Experiment

One camp has emerged supporting Facebook, claiming that we are all subject to digital manipulation by every company and that Facebook has the right to do whatever it pleases with its data. Some contend this is simply the normal A/B testing conducted by any company involved with e-commerce. It is more complex than that. Intentionally making sad people sadder crosses an ethical line beyond the day-to-day work of improving a user experience.

Last year, before the Facebook IPO, I wrote a post called “Why Facebook Will Become the Most Dangerous Company on Earth.” The premise was that with the unrelenting pressure to increase profits — every quarter without end — the company eventually would be forced to use its only real asset, our personal information, in increasingly bold and risky ways.

I think this is proving to be true.

The implication of a strategy that disrespects customers is not just a temporary emotional furor. There is an economic implication, too. Facebook is the world’s dominant social network, and its only significant threat is itself. As history proves, corporate arrogance is a sure path to self-destruction.

What are your thoughts on this experiment and its implications?

** I think you could make an argument that Facebook is NOT covered by its terms and conditions in this episode. The policy states that the company “uses the information it receives about you … for internal operations, including troubleshooting, data analysis, testing, research and service improvement.” The word “research” was added to the terms and conditions four months after the experiment started. It is questionable whether changing the data you see in an experiment fits under this data usage policy. Clicking a box on a website does not constitute informed consent.
