Original URL: http://www.theregister.co.uk/2011/05/19/google_privacy/

Schmidt explains the Google way to self-erasure

Don't be evil ... don't build a creepy facial-recognition database

By Joe Fay

Posted in Security, 19th May 2011 14:16 GMT

Google bent over backwards yesterday to show that it has learned its lessons and is really finally taking individuals' concerns over privacy seriously. Honestly.

And while there were some tangible commitments, they were unlikely to satisfy the regiment of privacy activists, academics and bloggers the ads colossus had arranged to be delivered to its Big Tent privacy conference yesterday ... by coach, via the venue's goods entrance.

Alma Whitten, Google's director of privacy product and engineering, kicked off the day's defence. She flagged up Google's Data Liberation Front initiative, which commits the firm to allowing individuals who use its products to remove any information they have uploaded. Whitten, and Eric Schmidt, have promised a dashboard to achieve this with a single press of a button. Or two.

Beyond dealing with the security of information tied to individuals' Google accounts, Whitten said the firm, like all good computer engineers, sought to model threats to its users' security and privacy.

While some critics took an "extreme perfectionist way of doing this," Whitten said, "I think it's a mistake to only have this conversation about absolutes." Rather, the firm had to do "threat modelling" and decide what level of security was good enough for a given threat. "Not the security threshold for some kind of mythical super adversary."

Beyond that, she said, people had to realise that Google could manage the data it holds, and delete it if consumers demanded, but was limited in what it could do about information that has leaked or proliferated elsewhere on the internet.

"From where I sit, it looks like we're still very much adapting to the situation where the internet allows everyone to be a publisher," she said.

What Whitten didn't recognise as a privacy problem was the vast piles of data Google accumulates through user searches, cookies and the like.

"A lot of the really powerful information sets are built on information from people but not about people," she said.

This implies anonymised data, not traceable to any one individual, but delivering a great benefit – in Google's case normally an economic one, in the case of, say, the NHS, a medical or technological one.

Which sidesteps the point that people have a problem with Google collecting it in the first place. Although they could always go to another search engine, of course.

Whitten said there was no way individuals could be identified from this data and that Google anonymised data after nine months. Which makes you wonder why governments, among others, seem so intent on getting their hands on it.

It was down to Eric Schmidt to deliver a full-fat mea culpa, admitting the firm had learned the hard way that it had to work with users' data "with your permission".

Whether "the hard way" refers to individuals' anger over the firm's sometimes cavalier approach to privacy, or to the fact that the US government started sprinkling subpoenas around, wasn't totally clear.

He also made the point that the firm had withdrawn from China because of the pressures brought to bear on it by the Beijing regime, and highlighted its resistance to data requests from other governments.

Democracy's best friend...

And Schmidt pushed hard on the way technology, including Google's, would foster openness and enable the spread of democracy and topple bad governments around the world. And deliver economic benefits. Definitely deliver economic benefits.

Throughout his talk, Schmidt argued that it should be down to consumers to decide what they do with their data, while the tech industry should work out a sensible way of regulating itself. Governments should butt out, he argued, as he took pops at upcoming French and EU regulations covering data retention. He also took a swing at the UK's plans to deal with online piracy and institute site-blocking, saying they could constitute a threat to freedom of expression. Which would be a bad thing, especially on YouTube, say.

He warned of "well-meaning people in government who write something that is pretty broad" with unintended consequences. Better to have industry best practice emerge unimpeded by red tape, he suggested. Which implies the democracy he sees spreading around the world will have a distinctly US free-market tinge.

But every time he strayed into the political world, he didn't linger too long. After all, as he repeatedly said, he's really a computer scientist. And he used this fact as a basis for highlighting advances in facial recognition technology, while saying the idea of generalised facial databases creeped him out – and apparently committing Google to not developing such a product. So, instead of saying it wouldn't be evil, Google has apparently gone from the broad to the specific in pinpointing something it definitely won't do.

Whitten, too, was at pains to point out that she was speaking from an engineering perspective, while assuring us that there is plenty of crossover between Google's engineers and its lawyers and policy wonks.

And that seems to encapsulate Google's dilemma. It sees itself as an engineer-driven firm, and engineers can develop very precise, logically beautiful views of the world. But when it comes to privacy, other factors come into play – irrational human characteristics such as paranoia, personality, ideology, politics, and the like, to which engineering and modelling don't quite apply. And that is a vacuum that in most Silicon Valley firms tends to be filled with lawyers and public policy wonks.

Neither Schmidt nor any of his colleagues mentioned any of the numerous privacy debacles that have dogged the firm in recent years, from its problems with authorities in Germany and Switzerland over StreetView, to its Wi-Fi data slurping, and so on. Strangely, the audience didn't raise them either. None of these incidents involved data that people had provided to the firm themselves, and in each case it is information over which they have limited, if any, control.

Still, Schmidt did say, "One of the things we've learned is: let the engineers build the product, but don't let them launch it without a long conversation ... and ruthlessly implement the user permission."

And that conversation should include, said Schmidt, the public policy people, the lawyers, and the advocacy groups. In fact, pretty much everyone else who was in the tent yesterday.

Except regular users, of course, who may well be ignorant of what they're signing up for when they venture online.

Just by having the event, Google appeared to recognise it has a problem. And a couple of commitments give the Google watchers something to focus on. But as far as one privacy activist is concerned, by focusing on Google account holders' data and sidestepping the issue of its data mountain, it is at best a quarter of the way there. ®