Last week, news of a Facebook experiment outraged users and even research and privacy experts. To test “emotional contagion,” about 700,000 Facebook users were unknowingly shown a filtered version of their news feeds—scrubbed of either positive or negative content. The result confirmed something many of us already suspected: people shown positive updates are more likely to post happy things themselves, and vice versa.
But it wasn’t the experiment’s decidedly un-revolutionary result that has people crying foul. Bloggers are calling the experiment “unethical,” saying that “Facebook intentionally made thousands upon thousands of people sad.”
Technically, and legally, I do think Facebook was in violation, as they had no “informed consent” to conduct these experiments. Apparently they even tried to retroactively cover their behinds by sneaking that little technicality into their terms of service four months AFTER the experiment was conducted, which further points to their culpability, and their knowledge thereof.
But we’re in a bit of a gray area, ethically. Facebook didn’t create happy or depressing content to show users; they simply filtered the content from those users’ friends. Our news feeds are already filtered: most of us have enough friends or follow enough pages that we can’t see ALL the content, so Facebook’s algorithm chooses what we see based on our previous interactions. For this experiment, they just tweaked that algorithm.
And in case you weren’t aware, this isn’t the first instance of emotional manipulation by a large company—that’s practically the definition of advertising. Brands try to make you feel all warm and fuzzy inside so that you associate that feeling with whatever they’re slingin’. Or they go for negative reinforcement: what about those ASPCA montages of super-sad-looking puppies and kittens? (Yes, I know it’s ultimately for the good of the puppies, so maybe it’s not so terrible.) Or products that make you feel uncool if you don’t have them? Why aren’t we up in arms about that?
Yes, the experiment was dishonest and a bit creepy. But is it really a big deal that some people were a little more bummed out than usual? I don’t think people realize that Facebook isn’t a free-for-all public forum—for example, when you post something on Facebook, they have every right to use that content. You’re choosing to use their website and abide by their terms, even if you’re not aware of them all. It’s naive to think that your activity on the internet isn’t being tracked or studied, because it definitely is. Armed with awareness of that fact, and of your own emotions, you might just be a little less susceptible to digital manipulation.