Facebook’s News Feed manipulation experiment got military backing?

What have you done, Facebook?

If statements from the US military about Facebook and Cornell University’s highly controversial News Feed manipulation experiment are to be believed, the study is more than just creepy and revolting.

Speaking to Mashable, an Army spokesperson said Cornell University had sent the military a funding proposal in 2008, and that it concerned the same Facebook social contagion study conducted by the company’s now-infamous Data Science team. As the report points out, the university has tried its best to distance itself from any military financial ties since rumours of such an agreement emerged. What makes the whole thing more than a little fishy, however, is that Cornell’s initial press release from June announcing the study mentioned funding from the Army Research Office, but it has since been amended to say otherwise. The university now maintains that the study received no external funding.

As revealed in several reports earlier this week, Facebook manipulated the News Feeds of nearly 700,000 active users to see whether positive and negative status updates and posts correlate with their real-world emotions. The study found that people who primarily saw negative posts ended up in a foul mood, while those viewing positive posts remained happy. In other words, Facebook intentionally made some users sad and miserable for the duration of the experiment. A UK regulatory authority will now probe the company’s role in the study, and whether it misled users and wrongly used their data in the experiment.

The Wall Street Journal further reports that Facebook’s Data Science team conducted these mass social experiments with very few limitations, and that nearly anyone within the team could request and get approval for an experiment. In one such test, thousands of Facebook users reportedly received a message telling them they were being evicted from the social network for being robots or using fake names. In truth, Facebook just wanted to test the anti-fraud capabilities of its own platform.

Proposed experiments were not required to pass a review stage before being deployed. “There’s no review process, per se,” Andrew Ledvina, a Facebook data scientist until July 2013, told the paper. “Anyone on that team could run a test. They’re always trying to alter people’s behaviour.” Ledvina added that scientists within the team had apprehensions about deploying some tests, but were pretty desensitised to it.
