Quora Questions are part of a partnership between Newsweek and Quora, through which we'll be posting relevant and interesting answers from Quora contributors throughout the week. Read more about the partnership here.
The issue of buried and obscured "informed consent" is real—this was why I felt some "outrage" over the manipulation of users' emotions via tweaking feed output—but I wonder if there is another facet not directly correlated with "ethics."
I suspect that the true source of outrage for many users stems from the publication of an experiment that detailed how users' emotions were manipulated. The paper reveals to users that their perceptions can be manipulated to propel them toward a hypothesized behavior, and, to add insult to injury, that the manipulation can be so simple.
In other words, this pierces the veil of our belief that we are rational, discerning beings, and reveals that we mostly act unconsciously via conditioned responses, vulnerable to being herded by the most mundane external forces.
There are enough social psychology experiments to prove that peripheral routes of persuasion cut to our hearts so much faster than any amount of fact-based central routes of persuasion, and that we can make key decisions based on the way we feel. This is exactly what Facebook's experiment did: manipulate how we felt, those mofos! And then told us about it, double mofos!!!!
While advertisers and politicians manipulate their targets' emotions "all the time," they are not parading in scientific journals their method for manipulating their targets and basically saying, "Look how easy it is for us to control your behavioral output by controlling your perception input." No, advertisers and politicians will tell you that you have made the right decisions based on your intelligence and critical thinking, when in fact they tap into impulse and noncritical thinking as their method.
So yeah, I'm pissed off because I'm reminded that I'm easily manipulated and that I'm a sheep. I'd like to think I'm above whatever Facebook can use to manipulate me, but then this peer-reviewed PNAS paper flips a finger at my free will.
Now that I've written the above, I have come to realize that the issue of informed consent matters specifically because research is supposed to recruit willing human subjects who know that they are part of a social experiment and who, as part of that experiment, get the benefit of a post-experiment debriefing.
In social experiments, being a responsible researcher means not only obtaining informed consent from participants but also following up after the experiment concludes: a debriefing to ensure that the mental health and well-being of the subjects have not been harmed as a result of the experiment. In this case, the subjects were not "specifically willing"; they merely agreed to the terms for using a service in exchange for what they thought was +1 membership, which, in aggregate, gives a company like Facebook the bargaining power to sell advertisement spots to other companies.
Because there was no true informed consent, as one would expect from a typical social psychology experiment (which I think this was), the subjects had no way of knowing how they were influenced, whether positively or negatively. Nor did they have any way to get information or tools that would offset whatever negative effects they may have experienced as a result of "participating" in this experiment.
Read all the responses to the question "Why is there so much outrage over Facebook manipulating users' emotions? Don't all advertisers/politicians do it all the time?"