If we want to understand the brave new world we are entering with social media, then this story is an important read. Facebook decided to manipulate the news feeds of hundreds of thousands of its users without telling them that they were part of an experiment. What Facebook did was to control what its users saw of the posts sent out by their social network: some were shown only positive posts, while others were fed negative ones, as a way to gauge the effect this had on their mood.
How do we know this? Because the results have now been published in an academic journal. It appears it did not occur to either Facebook executives or the academics involved that such research raised serious ethical issues. Facebook even defends its behaviour by stating simply that such experiments are covered by the terms of service all users agree to when joining Facebook.
One does not need to be a conspiracy theorist to understand the implications of this kind of research. Through the NSA revelations, we already know that Facebook and all the other providers of services in our increasingly digital lives are entirely subservient to government dictates, and that they are quite willing to hide such collaboration from us, their supposed customers.
So, aside from the immediate ethical issues raised by Facebook’s behaviour here, we also need to consider Facebook’s motives in conducting such research. Doubtless, it is looking at ways to improve its influence and profit margin. But the research will also be of consuming interest to our political elites, who want to know how to control and pacify us. Such information will become ever more vital in a world of depleting resources and greater social unrest.
Even Clay Johnson, who helped to manage Barack Obama’s online presidential campaign in 2008, understands the implications.
The Facebook ‘transmission of anger’ experiment is terrifying. Could the CIA incite revolution in Sudan by pressuring Facebook to promote discontent? Should that be legal? Could Mark Zuckerberg swing an election by promoting Upworthy [a website aggregating viral content] posts two weeks beforehand? Should that be legal?
In a sense, none of this is new, as one commentator, Jacob Silverman, notes. The internet, he points out, is already “a vast collection of market research studies; we’re the subjects”. But this story helps focus our attention on what is really at stake.
In Brave New World, Aldous Huxley foresaw a world where people were kept subdued, complacent and happy through the chemicals they ingested. In 1984, George Orwell predicted a world where media manipulation and surveillance kept everyone uninformed and fearful. The danger is that we will get a hybrid of the worst aspects of both: ignorance, pacification, surveillance and emotional manipulation through our sources of news and our interactions with each other. The new research suggests that information itself, rather than chemicals, could become the preferred way to manipulate our emotional lives.
Facebook’s blithe dismissal of all grounds for concern is only more reason to be deeply disturbed by where this is heading.