Related:

25 March 2014, Proceedings of the National Academy of Sciences: Kramer (Facebook Core Data Science Team) et al., "Experimental evidence of massive-scale emotional contagion through social networks" (PDF)
http://blogs.wsj.com/digits/2014/06/30/commentary-its-time-to-talk-seriously-about-facebooks-awesome-power/

It's Time to Talk Seriously About Facebook's Awesome Power

By Christopher Mims

Jun 30, 2014

Facebook has been quietly experimenting with its power to influence everything from the way we express our emotions to how likely we are to vote -- and the world has finally noticed. Now, it seems, is the time to have a debate that's long overdue: What are the obligations of companies like Facebook and Google, which have the power to shape our collective reality?

Over the weekend, outrage [1] over an experiment Facebook conducted in 2012 [2] bloomed on the Internet. The week-long experiment found that seeing more happy stuff on Facebook makes you slightly more likely to post something happy on the site; the inverse is also true. Researchers found no evidence -- and this has been widely misreported -- that Facebook actually changed anyone's emotional state. [3]

Setting aside the ethics of this experiment -- some find it horrifying [4] while others think it's no big deal [5] -- its results demonstrate something important:

There may be no company in history with as much power to influence what we think and feel as Facebook.

Facebook is big, with a larger reach than any medium in history. And Facebook could, if it chose to, derive countless things about you whether or not you choose to reveal them: your sexual orientation, relationship status, propensity to use drugs, IQ, political orientation, and more. [6]

The question is, what happens if and when Facebook decides to act on all this data? Not just sell it to marketers, but use it to influence your state of being in order to achieve a particular aim.

For example, what if there is an optimal mix of positive and negative content in your news feed that will keep you using Facebook for the greatest number of minutes a day? With this experiment, Facebook has already revealed it has the power to shape what you read in exactly this way; connecting that power to the number of minutes you spend on the site is a trivial exercise in statistics.
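
To make that "trivial exercise" concrete, here is a minimal sketch of the statistics involved, assuming (hypothetically) that one has per-user data on the fraction of positive posts shown and the minutes each user then spent on the site. Nothing here reflects Facebook's actual systems; every name and number is invented for illustration.

    # Hypothetical sketch: relate the positive/negative mix of a feed to
    # time on site, then read off the mix that maximizes predicted minutes.
    # All data and variable names are invented for illustration.
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated observations: fraction of positive posts shown to each
    # user (0..1) and the minutes that user spent on the site that day.
    positive_fraction = rng.uniform(0.0, 1.0, size=10_000)
    minutes = (30 + 25 * positive_fraction - 20 * positive_fraction**2
               + rng.normal(0.0, 5.0, size=positive_fraction.size))

    # Least-squares fit of a quadratic: minutes ~ b0 + b1*x + b2*x^2.
    X = np.column_stack([np.ones_like(positive_fraction),
                         positive_fraction,
                         positive_fraction**2])
    b0, b1, b2 = np.linalg.lstsq(X, minutes, rcond=None)[0]

    # Vertex of the fitted parabola: the engagement-maximizing mix.
    optimal_mix = -b1 / (2 * b2)
    print(f"engagement-maximizing positive fraction: {optimal_mix:.2f}")

A curve fit and a vertex: that is the entire statistical exercise, which is the point -- the hard part is the power to run the experiment, not the analysis.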

Facebook would be foolish not to use this insight to manipulate our emotions in order to keep us on the site as long as possible. And because the algorithms Facebook uses to determine what shows up in your news feed aren't public, there's no way any of us would ever know if it did. Nor are there any regulations forbidding the company from doing so. (I reached out to Facebook to ask whether anything like this is already part of Facebook's algorithm. No response yet.)

Here's another example of Facebook's power: In 2010, Facebook showed it can increase voter turnout [7] in a U.S. election by pushing the right kind of message to users. Given the demographics of Facebook, which skewed younger and more tech-savvy in 2010 than they do today, it's worth asking whether, in so doing, Facebook unwittingly influenced the congressional elections of the time.

The algorithms that shape Facebook's news feed -- and the search results we see in Google, and the posts that appear in the "Discover" tab on Twitter, and on and on -- are all black boxes. We have almost no idea how Facebook has decided to influence the 1.2 billion people who use the site regularly.

If, in his dotage, Mark Zuckerberg decides to become a Hearst-type media mogul who actively shapes the news to further his own ends, there's nothing to stop him. In some ways, this makes Facebook a media company like any other in history. The difference is that with its infinite pools of data and its ability to micro-target changes in its algorithm to every single user, Facebook has more power than any media mogul of yore.

It's also worth asking whether Facebook has a moral obligation to use its data for good. If Facebook can infer our mood from our posts, should it attempt to develop an algorithm to determine which of its users is most likely to commit a violent act, or to commit suicide? Does Facebook, like those who argue we should be putting anti-depressants in the water supply [8], have an obligation to show its saddest denizens only the sort of posts that might cheer them up?

[1] http://online.wsj.com/articles/furor-erupts-over-facebook-experiment-on-users-1404085840

[2] http://www.pnas.org/content/111/24/8788.full

[3] http://www.scilogs.com/from_the_lab_bench/the-facebook-emotion-study-in-a-broader-context/

[4] http://www.theguardian.com/technology/2014/jun/30/facebook-emotion-study-breached-ethical-guidelines-researchers-say

[5] http://www.talyarkoni.org/blog/2014/06/28/in-defense-of-facebook/

[6] http://yro-beta.slashdot.org/story/13/03/11/218221/facebook-knows-if-youre-gay-use-drugs-or-are-a-republican

[7] http://www.nature.com/news/facebook-experiment-boosts-us-voter-turnout-1.11401

[8] http://www.theguardian.com/environment/shortcuts/2011/dec/05/should-we-put-lithium-in-water