C'mon, you have to admit, you have a bad day every now and then. But what if the folks at Facebook purposely made it worse without telling you? Mind games for sure!


Trending on the internet over the weekend is news that Facebook did just that, and a lot of Facebook users are steamin' mad!

It has been revealed that Facebook researchers manipulated the content some users were shown in an attempt to gauge their emotional response. In early 2012, for a period of one week, Facebook changed the content mix in the news feeds of almost 690,000 users. Some people were shown a higher number of positive posts, while others were shown more negative posts.

It worked! The study found that users who were shown more negative content were slightly more likely to produce negative posts, while users in the positive group responded with more upbeat posts. Facebook was able to successfully change the emotional state of its users. While the mood changes were small, the researchers argued that the findings have major implications given the size and scale of the social network.

The study was conducted by researchers from Cornell, the University of California, San Francisco, and Facebook. The results were published in the academic journal Proceedings of the National Academy of Sciences.

You need to read the fine print: Facebook's terms of service give the company permission to conduct this kind of research. Still, many users have reacted with anger at what they say is a dangerous social experiment. There is no indication that the 690,000 subjects were asked whether they wanted to take part in the study.


A Facebook spokesman told NPR:

We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people's data in connection with these research initiatives and all data is stored securely.


Given the company's terms of service, it does not appear that Facebook faces any legal consequences. But the guinea-pig nature of the experiment, and the decision to run it without the explicit consent of participants, raises ethical questions.

Lesson learned: Beware of what you see in your news feed!