Facebook Study a Rare Public Reminder of Corporate Big Data’s Unaccountable Power

July 2, 2014 - From Issue 1.42 - By Erica Portnoy

For one week in January 2012, a Facebook data scientist (collaborating with a psychology professor) strategically altered the Facebook News Feed content of 689,003 users. A computer automatically evaluated the emotional tone of each post users saw. Some users saw fewer emotionally positive posts than they otherwise would have, while others saw fewer emotionally negative posts.
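Mechanically, the filtering was simple: a word-count classifier flagged each post as emotionally positive or negative, and flagged posts were then omitted from the feed with some per-user probability. The sketch below illustrates that kind of pipeline in Python; the tiny word lists and function names are illustrative inventions, while the 10%–90% omission range noted in the comments comes from the published methods. This is a minimal sketch, not Facebook’s actual code.

```python
import random

# Illustrative stand-ins for LIWC-style emotion word lists. The actual
# study used the LIWC2007 dictionaries, which contain thousands of terms.
POSITIVE_WORDS = {"happy", "love", "great", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "lonely", "hurt"}

def tone(post):
    """Classify a post by whether it contains at least one emotion word.

    The study counted a post as emotional if it contained at least one
    word from the relevant dictionary; posts matching neither list are
    treated as neutral here.
    """
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress, omit_prob):
    """Return a feed with posts of the suppressed tone randomly omitted.

    suppress:  "positive" or "negative" (the experimental condition).
    omit_prob: per-post chance of omission; the study reportedly assigned
               each user a fixed probability between 10% and 90%.
    """
    return [p for p in posts
            if tone(p) != suppress or random.random() >= omit_prob]

feed = [
    "I love this wonderful day",
    "feeling sad and lonely today",
    "grabbing lunch downtown",
]
print(filter_feed(feed, suppress="positive", omit_prob=0.5))
```

Run with `suppress="positive"`, this drops roughly half of the emotionally positive posts while leaving neutral and negative posts untouched, mirroring the study’s positivity-reduced condition.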

The researchers found, as they explain in the Proceedings of the National Academy of Sciences, that seeing fewer positive News Feed items led users to post more negative status messages themselves, while seeing fewer negative items led them to post more positive statuses, an effect that the researchers themselves call “emotional contagion.” (The effect was very small on average, but consistent across many users.) Ironically, the researchers were actually interested in investigating the opposite possibility: the theory that seeing too many happy posts from our friends “may somehow affect us negatively, for example via social comparison” to people who seem happier than ourselves.

The study involved some collaboration with researchers at Cornell, but Cornell’s Institutional Review Board (which exists to protect human subjects who might be harmed in experiments) concluded that its approval was not required because the Cornell researcher, who did not control the users’ Facebook feeds, “was not directly engaged in human research.” If the experiment had faced such review, it would likely not have met the rigorous standards for informed consent set out by a federal policy called the Common Rule.

On the other hand, experiments like these are standard practice for consumer-facing online companies, which are not subject to the standard rules of academic research ethics. Facebook’s study is a reminder of the powerful influence that companies have on our daily lives. As Shoshana Zuboff points out, “Facebook, like Google, represents a new kind of business entity whose power is simultaneously ubiquitous, hidden, and unaccountable.” What Facebook does is engineering, not science, and our online actions are the product being engineered. While we call Facebook’s employees “data scientists,” they are not bound by the strict norms of ethics, rigor, and reproducibility that we associate with science.

The study points out that “the well-documented connection between emotions and physical well-being suggests the importance of these findings for public health.” It’s easy to imagine a News Feed where you might never see people’s sad posts, and never offer consolation, if Facebook were to decide that a purely happy feed were better for your health, or would draw more of your attention to the site. No matter how Facebook chooses to shape our online experience, its decisions will carry benefits for some and harms for others, effects that can quickly add up given how widely the site is used.


Facebook’s power could even extend to a form of digital gerrymandering. During the 2010 congressional elections, Facebook drove more people out to vote, partly by telling them which of their friends had already voted earlier in the day. (A study in Nature found the effort drove about 60,000 more voters to the polls.) What if Facebook were to use that knowledge during the next election, and selectively give such feedback only to users who matched the company’s political ideals? It’s within the company’s technological power to do so, and within its legal power too, given today’s lack of oversight. Only the company’s self-restraint stops it from going further.

The bottom line is that if not for this public study, we would have no insight at all into these manipulations. For the most part, we still have little idea how Facebook, Google, Twitter, and other major online firms use their algorithms to shape our lives. And given how infeasible outside scrutiny is, it may be time to start thinking about other mechanisms that could constrain how this new power is exercised. Otherwise, we are heading toward a world where manipulation like this goes effectively unchecked.


