
How Should We Feel About the Facebook Emotion Manipulation Experiment?

So it turns out that Facebook has been emotionally manipulating users as part of a scientific experiment. And it's partly our fault.


At the end of June, several news outlets reported that Facebook, the social networking giant that now commands about $2.91 billion in profits, ran a bizarre — and probably unethical — experiment on hundreds of thousands of its users back in 2012. According to those reports, the company manipulated the news feeds of 689,003 users with a view to provoking shifts in their psychological and emotional dispositions.

To do that, Facebook played around with those users’ news feed algorithms to display different proportions of “positive” and “negative” emotional content, and then observed whether those adjusted proportions had any significant impact on the users’ emotions, which it assessed through the tone and content of their subsequent posts. Why? Because the company’s research team was interested in learning how emotions spread through social media.
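For what it’s worth, the core design is simple enough to caricature in a few lines of code. What follows is a minimal, purely hypothetical sketch of the logic the paper describes: withhold some fraction of emotionally loaded posts from a feed, then score the tone of the user’s own subsequent posts. Every name and number in it is invented for illustration; Facebook’s actual system is far more complicated, and the real study labeled posts as positive or negative using simple word counting.

    import random

    # A purely illustrative sketch of the experiment's logic, not Facebook's code.
    # Assumes each post carries a precomputed sentiment label ("positive",
    # "negative", or "neutral"), e.g. from word-count-based classification.

    def build_feed(candidate_posts, suppress="positive", omit_prob=0.5, seed=1):
        """Return a feed with a random fraction of `suppress`-labeled posts withheld."""
        rng = random.Random(seed)
        return [
            post for post in candidate_posts
            if not (post["sentiment"] == suppress and rng.random() < omit_prob)
        ]

    def positivity_rate(user_posts):
        """Fraction of a user's own subsequent posts that are labeled positive."""
        if not user_posts:
            return 0.0
        return sum(p["sentiment"] == "positive" for p in user_posts) / len(user_posts)

    # The finding, in caricature: users whose feeds had positive posts withheld
    # went on to produce slightly fewer positive posts of their own.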

In other words, Facebook leveraged its enormous place in the digital ecosystem to treat its users as guinea pigs.

Interestingly enough, we could have gone on without knowing about Facebook’s ethical transgressions if not for the company’s sense of academic fidelity. We know about this experiment largely because reporters picked up on the recent publication of the academic paper that came out of it, titled “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks” — you can read it in the Proceedings of the National Academy of Sciences — which, among other things, noted that yes, indeed, emotions do spread through social media.

The news provoked a firestorm in the press, with the general consensus being that this was an incredibly unethical, irresponsible, and troubling thing for a company with such financial and social power to do. Facebook has since apologized, but in the brief moment the company mounted an attempt to defend itself, it noted that users theoretically gave their consent when they clicked the “Agree” button at the end of a roughly 9,000-word Terms of Use agreement. It’s a pitiful defense, underscoring the fact that Facebook did a shitty thing in the name of conducting a social experiment.

I’m not going to use this column as an opportunity to discuss the need for a Silicon Valley ethics board, or to go over the history of the Institutional Review Board in academia and talk about how it can serve as a parallel for reading these data-wielding tech companies. I’m not going to rail against the outrageous world we’re living in, or issue a rallying cry about how we need to fight back. Truth be told, I’m not well-armed enough to speak on any of these matters, and truth be told, I think I’m too far gone.

(Truth be told, I first heard about the academic paper before I read the write-ups on the experiment, and my first thought about the experiment and its findings was “Oh, that’s really cool.” My third thought was “Hey, isn’t that illegal?” My second was about lunch. So, truth be told, I belong to the problem. My mind has already been grafted, my behavior already dependent. I am the techno-fool.)

We live in the wake of three contributing things: the dominance and ubiquity of tech culture and the tech industry, the revelations of a digital surveillance infrastructure that can very well be (and probably will be) used against us, and a rising social generation that is incredibly dependent on and accepting of the artifacts of the nexus between the first two. Our embrace of, and inseparability from, the infrastructure provided to us by the tech industry probably accounts for the fact that no matter what terrible things these companies may do – from shitty social behavior like sexual harassment of a female co-founder or flippant disrespect of a tragic situation to much headier stuff like handing our personal data over to a wolfish governmental structure or using somewhat defenseless users as guinea pigs – they will probably always get away with it. It’s a common refrain that as long as profits remain strongly in the green, companies can get away with whatever they want. Looking at the corporate behavior of American Apparel and, well, BP, that seems to be totally and irrevocably true.

The fact of the matter is that we’ve gone too far over the edge already. We’re so far over the edge, we’re now at the bottom of the ravine and those among us who survived have built a new colony deep in the shadows, with only the sliver of sky visible through the giant crack above. Hell, the tech industry is too strong, the digital infrastructure is too pervasive, and we are all now too dependent. The conceptual split comes between those who recognize that fact, and those who either don’t or are still hopeful that we’re able to somehow come back, or impose order, or install some form of checks and balances. Of course, the latter probably belong to the same sort of camp that believes we can still have a truly egalitarian society based purely on hand-signals. Whatever.

Over the past few weeks, I’ve spoken to people who have considered quitting Facebook for good – as if that’s a tangible form of rebellion, as if that’s even in the ballpark of the point of this whole thing. Simply quitting Facebook doesn’t hurt its operations; it doesn’t even solve the problem you want to solve. You are a walking ball of data, and everything is tracking you in one way or another: your phone, your laptop, your entrance through a digital security gate, your presence on a security camera, your dependence on public transit or street lights or taxis or tools of communication. You need the very digital infrastructure that is continually working toward complete surveillance.

Should we all just go Dr. Strangelove and learn to love the proverbial atomic bomb? But this isn’t the atomic bomb, because you aren’t intimately connected to and reliant on the bomb, and besides, you probably already love this thing. And your love is the thing that brought you here in the first place.

So, I don’t know, I’m at a loss for what to say about this. Other than, well: the Facebook Experiment was a terrible, terrible, terrible thing and we should feel terrible, terrible, terrible about it. There’s nothing that we can feasibly hang on to in an effort to feel good about any of this. And perhaps, given the happy-happy-joy-joy spirit that the tech industry wants us to feel all day every day forever, maybe feeling terrible, terrible, terrible is the only real way of rebelling against a social network that presumably wants us to love it.