Google to anonymize user data

It's about time

Google is to discard some of the information it stores about user search requests, in an effort to address concerns from privacy watchdogs and to defend itself against government demands for data.

The search giant will scrub personal information from cookies and remove some of the bits in IP addresses after that information has been stored for a set period of time, probably 18 to 24 months, a Google official wrote in a company blog. It expects to roll out the new policy by the end of the year.
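Google has not published the exact mechanics of its anonymization, but "removing some of the bits" of an IP address typically means truncating the host portion so the address identifies only a network, not an individual machine. A minimal sketch, assuming a hypothetical zeroing of the last IPv4 octet (a /24 prefix) and the lower 80 bits of an IPv6 address (a /48 prefix):

```python
import ipaddress

def anonymize_ip(ip: str) -> str:
    """Truncate an IP address to its network prefix.

    Hypothetical illustration only -- Google has not disclosed
    how many bits it actually discards.
    """
    addr = ipaddress.ip_address(ip)
    # Keep a /24 for IPv4 (drop the final octet), a /48 for IPv6.
    prefix = 24 if isinstance(addr, ipaddress.IPv4Address) else 48
    network = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
    return str(network.network_address)

print(anonymize_ip("203.0.113.42"))   # → 203.0.113.0
```

The truncated address can still be grouped by rough geography or ISP for analytics, but it no longer pinpoints a single user's machine, which is the trade-off Google appears to be aiming for.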

Until now, Google has kept information that can link specific searches to individual users indefinitely, potentially providing a trove of data to prosecutors or rogue employees with the proper credentials. Google will continue to log and store user activity but will anonymize it after a period of time. Google said the plan would be altered if laws governing the retention of data required it.

The change is sure to be welcomed by privacy advocates, who have been aghast at the permeability of the walls containing search data that can easily identify those who make the requests. Last year, AOL touched off a firestorm when it published 19m search queries made by more than 650,000 users. AOL had taken steps to anonymize the data, but some searches contained intimate information that allowed readers to identify the requesters. AOL had revealed the data as part of a research project.

Prior to that, the US Department of Justice, working on a case involving child pornography, issued subpoenas demanding several search engines surrender huge amounts of information related to searches. While Yahoo!, MSN and AOL largely caved, Google fought the demand, arguing it would violate user privacy. (The search king, perhaps more transparently, also objected on the grounds that the disclosure would reveal proprietary algorithms.) Google lost part of its bid, and now wisely believes a better tack to take is to discard some of the vast amounts of information it collects.

Google said its decision to continue holding identifying information for as long as two years was an attempt to strike a balance among the conflicting goals of personalizing its services, safeguarding user privacy, and complying with data retention laws around the world. ®
