
Trick-cyclists defend Facebook emoto-furtling experiment

'All REAL men ignore consent and privacy'


Facebook's “creepy” feed-manipulation experiment, which has generated an avalanche of outrage among users, isn't without its chums. A growing collection of psychologists and tech pundits is linking arms, standing next to Mark Zuckerberg, and singing “We Shall Overcome” in the direction of mobs carrying metaphorical pitchforks and flaming torches.

Facebook's own defence of its research has been so unconvincing that UK and Irish data watchdogs are now investigating the company.

Others, however, are wading in to defend The Social Network™.

The tl;dr version? You're all wrong, quite possibly ignorant, routinely manipulated, and why should we let ethics get in the way of science? Apparently.

Let's start with this post by Tal Yarkoni, director of the Psychoinformatics Lab at the University of Texas, who is dismissive of “people … very upset at the revelation that Facebook would actively manipulate its users’ news feeds in a way that could potentially influence their emotions.”

His points are:

  • 1. It's okay, because the effect turned out to be tiny.

“The manipulation had a negligible real-world impact on users’ behaviour”, Yarkoni writes.

That can't have been known to the researchers, nor to any hypothetical ethics committee, in advance of the experiment, and it is therefore irrelevant to whether or not what Facebook did was ethical.

Moreover, Yarkoni is reporting on the aggregate of all results. Rules surrounding psych tests on humans are there not to protect numbers, but individuals. Demonstrating a small effect across hundreds of thousands of people does not show that none of the individuals were harmed.
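As a back-of-the-envelope illustration (a minimal Python sketch with invented numbers, not a model of Facebook's data), a mean shift that is effectively zero across a hundred thousand people can coexist with a minority of individuals who are pushed a long way:

```python
import random

random.seed(0)

# Hypothetical emotional-state shifts for 100,000 users after a feed tweak.
# Most users barely move, but a small minority shift substantially.
shifts = []
for _ in range(100_000):
    if random.random() < 0.001:          # 0.1% of users strongly affected
        shifts.append(random.gauss(-5.0, 1.0))
    else:
        shifts.append(random.gauss(0.0, 0.1))

mean_shift = sum(shifts) / len(shifts)
worst = min(shifts)

print(f"mean shift across all users: {mean_shift:+.3f}")   # close to zero
print(f"largest individual shift:    {worst:+.3f}")        # far from zero
```

The aggregate number looks reassuringly negligible; the worst-affected individual does not.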

  • 2. It's okay, because the result tells us about what users posted, not how they felt.

“The fact that users in the experimental conditions produced content with very slightly more positive or negative emotional content doesn’t mean that those users actually felt any differently.”

This is a neat sophistry: since we don't know whether users were telling the truth about their feelings in their posts, we don't know whether the suppression of good or bad news in their feeds actually changed those feelings.

  • 3. It's okay, because all communication on Facebook is manipulated in some way.

“Every single change Facebook makes to the site alters the user experience, since there simply isn’t any experience to be had on Facebook that isn’t entirely constructed by Facebook.”

This is more difficult to answer. It's true, but it's also incomplete.

Facebook's “user engagement” manipulation is designed to make people want to use Facebook more often (and to get them clicking on more advertisements).

There's not much risk, for example, that re-weighting sponsored posts that pop up in a user's feed will make someone with clinical depression feel worse. Does Facebook know that its “emotional contagion” methodology carried no such risk?

  • 4. Human communication is manipulative

“Everybody you interact with–including every one of your friends, family, and colleagues–is constantly trying to manipulate your behaviour in various ways.”

Are all human interactions deliberately manipulative? Even if the answer is “yes”, interpersonal relations have things like visibility, trust, and consent which are all lacking in the Facebook experiment. There is simply no analogy between how partners in a relationship behave, and this experiment.

  • 5. Ends justify means

“If you were to construct a scale of possible motives for manipulating users’ behaviour – with the global betterment of society at one end, and something really bad at the other end – I submit that conducting basic scientific research would almost certainly be much closer to the former end than would the other standard motives we find on the web – like trying to get people to click on more ads.”

This may be perfectly true, but it's still a distraction from the ethics of the experiment.

Gartner: "Man up"

Yarkoni's analysis was then picked up by a Gartner research director, Martin Kihn.

We could suggest that Kihn's “Man up, people” is probably all you need to know about his response. Equating “worried about academics following ethical standards of behaviour” with a lack of manliness speaks volumes about tech sector culture, and says nothing about research ethics.

“The study itself strikes me as being routine, legal, ethical and unsurprising,” Kihn writes – without offering any support whatsoever for his assessment of its ethics. Kihn does reiterate two points from Yarkoni – the likely errors in the textual assessments and the smallness of the experiment's impact – before delivering this statement:

“If we start demanding an academic standard of 'informed consent' for routine A/B and multivariate tests run online, we’re skirting the boundaries of absurdity.”

It seems to have escaped Kihn that the research was published in an august academic outlet, the Proceedings of the National Academy of Sciences, with researchers affiliated with universities as well as with Facebook, making the question of informed consent perfectly legitimate.

And El Reg can't help but wonder why informed consent is a concept that requires scare quotes.

Burying ethics in detail

Even a very serious discussion of the issue, one that picks over the relevant laws and regulations, appears to skirt it at the same time. This piece, by Michelle Meyer, lingers over the detail as if it were a fine wine.

Since (as she explains) the Facebook study qualifies as “human subjects research”, Meyer views the question through the prism of legal requirements – was the study subject to the rules governing such research?

Beyond her discussion of methodology and results, Meyer makes the following key points:

  • 1. The research was “conducted and funded solely by an entity like Facebook”, meaning it “is not subject to the federal regulations”.
  • 2. The “involvement in the Facebook study by two academics nevertheless probably did not trigger Cornell’s and UCSF’s requirements” (for ethical review).
  • 3. The study may have passed ethical review if it had been submitted.

Meyer also restates the old saw that emotional manipulation is common, citing the advertising industry as an example – and, like Yarkoni, puts forward the apparently disingenuous idea that “everyone does this, so it's ethical”.

Meyer does, however, make the worthwhile point that corporates can do this sort of stuff without the same constraints that apply to academics. She would like to see academic restrictions lifted; others may not agree.

Don't chill the science

Brian Keegan at Northeastern University has been fairly extensively cited for this piece.

Skipping the now-obligatory recap of the research methodology, let's get to what seems to be the meat of Keegan's argument: (a) “every A/B test is a psych experiment”, (b) nobody's discussing what informed consent should look like anyhow, (c) don't chill science: “All this manning of barricades strikes me as a grave over-reaction that could have calamitously chilling effects on several dimensions.”

A possible conclusion: perhaps Facebook's publication has done the world a favour by lifting the lid on the kinds of behaviours that psych researchers admire and aspire to. ®


