
Schmidt explains the Google way to self-erasure

Don't be evil: build a creepy facial-recognition database

Democracy's best friend...

And Schmidt pushed hard on the way technology, including Google's, would foster openness, enable the spread of democracy, and topple bad governments around the world. And deliver economic benefits. Definitely deliver economic benefits.

Throughout his talk, Schmidt argued that it should be down to consumers to decide what they do with their data, while the tech industry should work out a sensible way of regulating itself. Governments should butt out, he argued, as he took pops at upcoming French and EU regulations covering data retention. He also took a swing at the UK's plans to deal with online piracy and institute site-blocking, saying they could constitute a threat to freedom of expression. Which would be a bad thing, especially on YouTube, say.

He warned of "well-meaning people in government who write something that is pretty broad" with unintended consequences. Better to have industry best practice emerge unimpeded by red tape, he suggested. Which suggests this democracy he sees spreading around the world will have a distinctly US free-market tinge.

But every time he strayed into the political world, he didn't linger too long. After all, as he repeatedly said, he's really a computer scientist. And he used this fact as a basis for highlighting advances in facial recognition technology, while saying the idea of generalised facial databases creeped him out – and apparently committing Google to not developing such a product. So, instead of saying it wouldn't be evil, Google has apparently gone from the broad to the specific in pinpointing something it definitely won't do.

Whitten, too, was at pains to point out that she was speaking from an engineering perspective, while assuring us that there is plenty of crossover between Google's engineers and its lawyers and policy wonks.

And that seems to encapsulate Google's dilemma. It sees itself as an engineer-driven firm, and engineers can develop very precise, logically beautiful views of the world. But when it comes to privacy, other factors come into play – irrational human characteristics such as paranoia, personality, ideology, politics, and the like, to which engineering and modelling don't quite apply. And that is a vacuum that in most Silicon Valley firms tends to be filled with lawyers and public policy wonks.

Neither Schmidt nor any of his colleagues mentioned any of the numerous privacy debacles that have dogged the firm in recent years, from its problems with authorities in Germany and Switzerland over StreetView, to its Wi-Fi data slurping, and so on. Strangely, the audience didn't raise them either. None of these incidents involves data that people provided to the firm themselves; rather, they concern data over which people have limited control, if any.

Still, Schmidt did say, "One of the things we've learned is: let the engineers build the product, but don't let them launch it without a long conversation ... and ruthlessly implement the user permission."

And that conversation should include, said Schmidt, the public policy people, the lawyers, and the advocacy groups. In fact, pretty much everyone else who was in the tent yesterday.

Except regular users, of course, who may well be ignorant of what they're signing up for when they venture online.

Just by having the event, Google appeared to recognise it has a problem. And a couple of commitments give the Google watchers something to focus on. But as far as one privacy activist is concerned, by focusing on Google account holders' data and sidestepping the issue of its data mountain, it is at best a quarter of the way there. ®
