Gripe to UK, Ireland, Poland: Ad tech industry inhales, then 'leaks' sensitive info on our health, politics, religion
Regulators asked to tackle 'systemic' GDPR breaches
A series of challenges to the practices used by the likes of Google in online behavioural advertising have been filed in the UK, Ireland and Poland, alleging that information slurped up on internet users is not only "highly intimate" but also improperly protected.
The ad tech industry is responsible for "systemic" breaches of the General Data Protection Regulation (GDPR) by allowing "massive leakage" of sensitive data on health conditions, religion and politics, and leaving data subjects unable to control how it is used, said the groups behind the formal objections.
The complaints were filed in the UK by Open Rights Group executive director Jim Killock and privacy researcher Michael Veale; in Ireland by Johnny Ryan of browser biz Brave; and in Poland by the Panoptykon Foundation.
They argue that the ad industry's real-time bidding (RTB) systems – which they say are obscure by design – breach the GDPR because they fail to protect data transmitted to advertisers by ad giants and data brokers against unauthorised access.
When a person visits a website, various info is transmitted through RTB platforms that allow advertisers to bid for that viewer's eyeballs. These bid requests can include what someone is watching or reading, their location, their IP address, the type of device they're browsing on and unique tracking IDs.
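To make the above concrete, here is a rough sketch of the kind of payload a bid request can carry, written as a Python dict. The field names loosely follow the public OpenRTB specification; every value is invented for illustration:

```python
# Illustrative sketch only: field names loosely follow the OpenRTB spec,
# all values are made up. Not a real bid request from any exchange.
bid_request = {
    "site": {"page": "https://example.com/articles/health-advice"},  # what the user is reading
    "device": {
        "ip": "203.0.113.42",  # user's IP address
        "ua": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...",  # device/browser details
        "geo": {"lat": 51.5074, "lon": -0.1278},  # location
    },
    "user": {"buyeruid": "a1b2c3d4"},  # unique tracking ID for this user
}

# Individually these fields can look innocuous; together they single out
# one person and one device across many requests.
identifying_fields = [
    bid_request["device"]["ip"],
    bid_request["device"]["ua"],
    bid_request["user"]["buyeruid"],
]
print(len(identifying_fields))  # 3 quasi-identifiers in a single request
```

The complainants' point is that every company receiving such a request sees all of these fields at once, on every page view.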
The fact that information specific to a person and their device, along with cookies, is routinely included in these bid requests means companies can build up detailed profiles, and these labels "stick to you for a long time".
And in fresh evidence submitted today, the groups emphasised that the boxes people are put into are defined in ways that would likely be classed as "special category" data under GDPR.
This is information on someone's race, ethnicity, politics, religion, trade union membership, genetics, health, sex life or sexual orientation – and it necessitates extra protections.
The complainants argue that many of the categories in the two main systems in online behavioural advertising – the Interactive Advertising Bureau's widely used OpenRTB and Google's proprietary Authorized Buyers (formerly DoubleClick Ad Exchange) – would fall into this group.
They include left and right-wing politics, "special needs kids", health conditions like eating disorders, cancer and STDs, drug and alcohol abuse and infertility, and various religious beliefs and ethnic groups.
A previous framework (annotated PDF here), which the IAB claims is "deprecated" but the complainants allege remains "widely used", includes categories for "incest/abuse support" and pornography.
"Actors in this ecosystem are keen for the public to think they are dealing in anonymous, or at the very least non-sensitive data, but this simply isn't the case," said Veale in a statement.
"Hugely detailed and invasive profiles are routinely and casually built and traded as part of today's real-time bidding system, and this practice is treated as though it's a simple fact of life online. It isn't: and it both needs to and can stop."
Ryan, who is chief policy and industry relations officer at Brave, stressed to El Reg that this doesn't mean the end of online advertising.
"If the IAB and Google remove personal data from bid requests, then ad auctions can operate safely," he said.
"We want to see ad auctions fixed, so that they can operate safely under the GDPR. Today we have a data protection free zone. This exposes marketers and publishers to hazard under the new Regulation."
Don't make this personal
The IAB's OpenRTB specification "strongly recommends" that "at least one" vendor-specific ID or buyer-specific ID for the user be included in the bid request.
These are not defined as "required", Brave's Ryan said, "because unique IDs are not strictly technically necessary for the ad auctions to run, or for ads to be targeted".
But the IAB "recommends" their inclusion in bid requests nonetheless, he said, because "their omission would not break the object, but would dramatically diminish its value". The result is that they are likely to be in all auctions.
Aside from these IDs, Ryan said that other data that is permitted in bid requests could allow someone to be identified. An example bid request on Google's documentation page contains "detected vertical" items that map onto content category codes and information on a user's latitude and longitude, as well as the following:
user_agent: "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.140 Safari/537.36 Edge/17.17134"
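To illustrate the complainants' argument that such fields combine into an identifier even without an explicit tracking ID, here is a minimal sketch (hypothetical, not drawn from any vendor's code) that hashes a user agent string and coarse coordinates into a stable fingerprint:

```python
import hashlib

def fingerprint(user_agent: str, lat: float, lon: float) -> str:
    """Hypothetical sketch: combine quasi-identifiers from a bid request
    into a stable fingerprint. Real tracking is far more sophisticated."""
    raw = f"{user_agent}|{round(lat, 2)}|{round(lon, 2)}"
    return hashlib.sha256(raw.encode()).hexdigest()[:16]

ua = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
      "(KHTML, like Gecko) Chrome/64.0.3282.140 Safari/537.36 Edge/17.17134")

fp = fingerprint(ua, 51.5074, -0.1278)
# The same browser at roughly the same location produces the same value,
# so repeated bid requests can be linked back to one person over time.
assert fp == fingerprint(ua, 51.5074, -0.1278)
```

The point is not that exchanges do exactly this, but that the data "permitted" in bid requests is sufficient for any recipient to do something like it.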
"Ad auctions can operate safely if all personal data are no longer permitted in bid requests. There are many non-personal data available for sophisticated ad targeting and ad auction trading. With minor changes, this can happen quickly," said Brave's policy officer.
'Mass data broadcasting'
The complaints set out three issues: first, what is described as a "mass data broadcast mechanism"; second, that the industry doesn't have control of the data once it has been transmitted; and third, that the data is often special category data.
Data is collected on a scale "well beyond" the information needed to provide the relevant ads, the complaints said, while it is shared "for a range of uses that go well beyond the purposes which a data subject can understand, or consent or object to".
Because there are so many recipients, "those broadcasting it cannot protect against the unauthorised further processing of that data, nor properly notify data subjects of the recipients of the data".
The New Economics Foundation estimated in a December 2018 report that ad auction companies broadcast profiles about an average UK internet user 164 times a day, which are received by thousands of companies. "There is no way of knowing what then is done with these intimate data," the complaint said.
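The scale implied by that estimate is easy to check with back-of-envelope arithmetic (the 164/day figure is NEF's; the annual extrapolation is ours):

```python
# NEF's December 2018 estimate: profile broadcasts per average UK user per day
broadcasts_per_day = 164

# Simple extrapolation to a year, assuming the daily rate holds
per_year = broadcasts_per_day * 365
print(per_year)  # 59860 broadcasts of one user's profile per year
```

Nearly 60,000 broadcasts a year per user, each to potentially thousands of recipients, is the "mass" in "mass data broadcast mechanism".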
Moreover, the speed at which real-time bidding occurs means that data, including special category data, may be disseminated "without any consent or control" over that data.
"Given that such data is likely to be disseminated to numerous organisations who would look to amalgamate such data with other data, extremely intricate profiles of individuals can be produced without the data subject's knowledge, let alone consent."
The complaints said that IAB Europe's Framework and Google's Guidelines do not provide adequate "integrity and confidentiality" over personal data – because they don't require notification to data subjects of the dissemination of their data or of any intention to broadcast it.
Nor do they give people the chance to make representations to the vendors or to formally object to processing, or provide enough control over unlawful or unauthorised further use.
The complaints allege that, for these reasons, the process falls foul of the GDPR, because that requires personal data to be processed "in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss".
The complaints called on regulators to recognise the "systemic nature" of the breaches and step in to ensure industry-wide compliance with data protection laws.
We've contacted the IAB for comment.
A spokesperson from Google told us: "We have strict policies that prohibit advertisers on our platforms from targeting individuals on the basis of sensitive categories such as race, sexual orientation, health conditions, pregnancy status, etc. If we found ads on any of our platforms that were violating our policies and attempting to use sensitive interest categories to target ads to users, we would take immediate action." ®