Facebook Promises a Deeper Review of Its User Research
SAN FRANCISCO - Facebook said Thursday that future research on its 1.3 billion users would be subjected to greater internal scrutiny from top managers, especially if it focused on "deeply personal topics" or specific groups of people.
But no outside body will review Facebook's research projects, and the company declined to disclose what guidelines it would use to decide whether research was appropriate. Nor did it indicate whether it would get consent from users for projects like its emotion manipulation study, which set off a global furor when it was disclosed this summer.
In essence, Facebook's message is the same as it has always been: Trust us, we promise to do better.
That was unlikely to calm the storm caused by publication of the emotion study in June. In the study, Facebook changed the number of positive and negative posts that a half-million users saw in their news feeds to assess the impact on the emotional tone of their future posts.
Many users saw the manipulation, which was never disclosed to the individuals affected, as a gross violation of the trust they had placed in the social network. The company's defenders, including sites like OkCupid, noted that Internet companies routinely performed even more manipulative research on customers as they sought to improve their products.
In a blog post announcing its new review process, Mike Schroepfer, Facebook's chief technology officer, said the company had "taken to heart the comments and criticism" that came after the emotion paper was published.
"It is clear now that there are things we should have done differently," he said. "For example, we should have considered other nonexperimental ways to do this research. The research would also have benefited from more extensive review by a wider and more senior group of people. Last, in releasing the study, we failed to communicate clearly why and how we did it."
Mr. Schroepfer, who declined an interview request, wrote that Facebook's researchers had been given clearer guidelines for their work, although he did not specify what those were. The company's engineers will get ethics training as part of their six-week boot camp when they join the company. And more intrusive research will now be reviewed by a panel of high-ranking Facebook officials, including people involved in the legal, policy and privacy arenas.
"We want to do this research in a way that honors the trust you put in us by using Facebook every day," he wrote. "We will continue to learn and improve as we work toward this goal."
Facebook said it consulted with many outside experts, including academics and policy advocates, in crafting its new guidelines.
Two of those experts said Thursday that Facebook's efforts, while admirable, were just a start.
Jeffrey T. Hancock, a Cornell University professor who was one of the authors of the controversial emotion study and has been working to improve future research practices, said he was pleased that Facebook was going to train its researchers in ethics.
But he said it was important to know what standards Facebook was going to use to judge internal research, including whether projects similar to the emotion manipulation study would be allowed in the future.
"Will they keep doing those and not publish them? Or does the review panel say we need to think about that?" he said. "They don't say anything about informed consent or debriefing."
Ryan Calo, an assistant professor at the University of Washington School of Law, said he had been urging Facebook and other companies to create internal research review panels similar to the boards that review research at universities.
"I'm very encouraged that they've taken the step to create this board to review not just consumer research going out the door but all the research they're doing at the company," he said.
But he said Facebook also needed to publicly share its research guidelines, both to get feedback and to properly disclose to its users what kind of research it considered fair game.
"This is a company whose lifeblood is consumer data. So mistrust by the public, were it to reach too critical a point, would pose an existential threat to the company," he said. "Facebook needs to reassure its users they can trust them."