Facebook has come under fire for certain practices on its site, such as a 2012 experiment that involved tinkering with users' news feeds in order to manipulate their emotions. In response, the social media company has promised that future research will receive deeper review than before, particularly work that touches on "deeply personal topics" or involves specific groups of users.

However, the company stopped short of detailing just how these practices would come into play, declining to reveal guidelines for what research would be subject to review or what practices would be considered appropriate. It also failed to clarify whether it would ask users for consent for certain projects, such as the emotion experiment described above.

It did confirm, however, that outside companies would not be involved when it came to this research. Still, some researchers feel that, despite the promise to essentially "do better," Facebook needs to do more.

“This is a company whose lifeblood is consumer data. So mistrust by the public, were it to reach too critical a point, would pose an existential threat to the company,” said Ryan Calo, an assistant professor at the University of Washington School of Law, who had previously called on Facebook to create a review panel for its research. “Facebook needs to reassure its users they can trust them.”

Speaking about the new review process on the company’s blog, Facebook chief technology officer Mike Schroepfer stated that the company has “taken to heart the comments and criticism.”

“It is clear now that there are things we should have done differently,” he said. “For example, we should have considered other non-experimental ways to do this research. The research would also have benefited from more extensive review by a wider and more senior group of people. Last, in releasing the study, we failed to communicate clearly why and how we did it.”

Some concern remains, according to Cornell University professor Jeffrey T. Hancock, who was one of the authors of the original emotion study. While he is pleased that the company is taking steps to improve the ethics of its studies, he is concerned that the results of future studies may not even be disclosed. “Will they keep doing those and not publish them? Or does the review panel say we need to think about that?” he asked. “They don’t say anything about informed consent or debriefing.”

What do you think? Should Facebook be given a chance to prove itself with its new research studies? Or are you worried that similar controversies could befall the social site?

Source: New York Times, Facebook