CHANCES ARE, YOU’RE on Facebook right now. About 1.7 billion people, almost a quarter of the world’s population, actively use the social media platform. And though it’s free, Facebook isn’t a charity. It has a product, and that product is you and me. The company brought in a tidy $5.2 billion from targeted advertising in the first quarter of 2016 alone.
To keep that business running, Facebook doesn’t just need users: It needs active, engaged users. Facebook needs to get inside your head, to understand how you’ll respond to a product, an offer, or a marketing campaign, and it is increasingly using internal experiments to predict those behaviors. But using those methods, commonly referred to as neuromarketing, means Facebook must confront the same ethical questions other behavioral scientists do.
In 2014, Facebook revealed it had run an experiment on more than half a million of its users, manipulating their feeds so that some people saw more positive posts while others were shown a more negative stream. The moods proved contagious: Those who saw more good news wrote happier posts, and those who saw more bad news wrote sadder posts. But Facebook never asked its users for permission; it has argued that its terms of service allow it to structure what you see. The blowback was massive, with some wondering whether the experiment had pushed depressed users toward suicide. In response, Facebook recently decided to draw on an essential element of ethics in behavioral science: an Institutional Review Board.