A leading privacy group filed a formal complaint on Thursday with the US Federal Trade Commission (FTC) over a 2012 study in which Facebook manipulated the news feeds of nearly 700,000 users to see what effect the changes would have on their emotions.
The group, the Washington-based Electronic Privacy Information Center, said Facebook had deceived its users and violated the terms of a 2012 consent decree with the commission, which is the principal regulatory agency overseeing consumer privacy in the US.
“The company purposefully messed with people’s minds. At the time of the experiment, Facebook did not state in the Data Use Policy that user data would be used for research purposes. Facebook also failed to inform users that their personal information would be shared with researchers,” the advocacy group wrote in its complaint.
In the study, which lasted for one week in January 2012, Facebook changed the number of positive and negative posts that users saw in their feeds to see how that affected the emotional tone of the posts they made afterward. The research, which was published in an academic journal last week, found that people who were exposed to more negative material went on to write slightly more negative posts, and vice versa.
Facebook never sought formal permission from users for the study, arguing that its data use policy gave it sufficient permission to conduct the research. However, the privacy group noted that at the time of the emotion study, the company’s data policy did not tell users that their personal information could be used for research purposes.
Regulators in Ireland and the UK are also asking questions about Facebook’s actions over the study.
On Thursday, the journal that published the study, the US Proceedings of the National Academy of Sciences, issued an “expression of concern” regarding Facebook’s decision not to get explicit consent from the affected users before running the study.
“Obtaining informed consent and allowing participants to opt out are best practices in most instances under the US Department of Health and Human Services Policy for the Protection of Human Research Subjects,” the journal’s editor-in-chief Inder Verma wrote in the note.
Although academic researchers are generally expected to follow the policy, Facebook, as a private company, was not required to do so, Verma said.
“It is nevertheless a matter of concern that the collection of the data by Facebook may have involved practices that were not fully consistent with the principles of obtaining informed consent and allowing participants to opt out,” he said.
The company has apologized and said it has since adopted stricter internal standards for reviewing studies before they begin.
“When someone signs up for Facebook, we’ve always asked permission to use their information to provide and enhance the services we offer,” Facebook spokeswoman Jodi Seth said on Thursday.
On Wednesday, Facebook’s chief operating officer, Sheryl Sandberg, said the company was in communication with regulators all over the world and that “this will be OK.”