
Former researcher says Facebook's behavioral experiments had 'few limits'

Facebook's still trying to brush off that whole psychological-study-on-unwitting-users thing, but according to a former team member and outside researchers, the social network's data science department has had surprisingly free rein over how it polled and tweaked the site (changes have been promised). Andrew Ledvina, who worked as a Facebook data scientist from February 2012 to July 2013, told the WSJ: "Anyone on that team could run a test."

In 2010, the research team gauged how "political mobilization messages" delivered to 61 million users affected voting in that year's US congressional elections, while in 2012, thousands of users received a warning that they were being locked out of Facebook. The reason given was that Facebook believed they were bots or using fake names, but the network already knew that the majority were real users -- it was apparently all in aid of improving its anti-fraud system.

Since its beginnings, Facebook's data science team has apparently run hundreds of behavioral tests. As Ledvina put it: "They're always trying to alter people's behavior." Other tests and research are apparently less invasive: less emotional button-pushing, more button-testing aimed at getting users to click on more ads and generally waste more time on the site. As the WSJ notes, Facebook isn't the only one: Twitter, Google, Microsoft et al. also research and monitor their users.

[Image credit: AFP/Getty Images]