Original URL: https://www.theregister.com/2011/11/03/privacy_commissioner_speaks/

Want to avoid all private-data breaches, ever? Here's how

Wilderness guru speaks, world listens

By Dan Goodin

Posted in SaaS, 3rd November 2011 13:13 GMT

Interview As Ontario's information and privacy commissioner, Ann Cavoukian has a remit limited to the Canadian province. But that doesn't mean the effects of her post don't extend into territories across the globe.

“What I always say is privacy transcends jurisdiction,” she says. “It knows no boundaries. So if I'm going to protect the privacy of people in my jurisdiction, I'm going to protect privacy everywhere. Everyone is using Google, Facebook. How do you ensure that the information you give to these people or collect from them is safe anywhere? So to me, privacy is a global issue.”

Indeed, the Privacy by Design initiative she spearheaded has become an internationally recognized recipe for embedding privacy protections into the very fabric of a website or product.

[Photo: Ontario Information and Privacy Commissioner Ann Cavoukian]

She put those design principles to the test a few years ago when the Ontario Lottery and Gaming Corporation began using facial recognition technology in casinos to spot people who had identified themselves as gambling addicts. To prevent police, casino employees, or others from accessing the database and using its contents for unauthorized purposes, the system adopted what's known as biometric encryption.

Developed by researchers from the University of Toronto, biometric encryption binds a random key with the biometric data to create a private template that's unique and can't be cross-matched with other databases. The key can be retrieved only when a fresh biometric sample from one of the problem gamblers is presented, making it hard for the data to be tapped for other purposes.
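
To make that binding concrete, here is a minimal Python sketch in the style of a "fuzzy commitment" scheme, a common way to realize biometric key binding. The repetition code, the bit-vector biometric, and all names are simplifications of ours, not the Toronto researchers' actual construction:

```python
import hashlib
import secrets

FACTOR = 5  # toy repetition code: tolerates up to 2 flipped bits per key bit

def encode(key_bits):
    # Expand each key bit into FACTOR copies (a toy error-correcting code)
    return [b for bit in key_bits for b in [bit] * FACTOR]

def decode(code_bits):
    # Majority vote within each group of FACTOR bits
    return [int(sum(code_bits[i:i + FACTOR]) > FACTOR // 2)
            for i in range(0, len(code_bits), FACTOR)]

def enroll(biometric_bits, key_bits):
    # Bind key to biometric: store only the XOR offset plus a hash of the key.
    # The stored values do not directly reveal the biometric or the key.
    offset = [c ^ b for c, b in zip(encode(key_bits), biometric_bits)]
    return offset, hashlib.sha256(bytes(key_bits)).hexdigest()

def release(offset, fresh_bits, key_hash):
    # A fresh, slightly noisy sample of the same biometric recovers the key;
    # a different face or finger decodes to garbage and fails the hash check.
    candidate = decode([o ^ b for o, b in zip(offset, fresh_bits)])
    ok = hashlib.sha256(bytes(candidate)).hexdigest() == key_hash
    return candidate if ok else None

key = [secrets.randbelow(2) for _ in range(16)]
bio = [secrets.randbelow(2) for _ in range(16 * FACTOR)]
offset, check = enroll(bio, key)
noisy = bio[:]
noisy[3] ^= 1   # fresh sample arrives with
noisy[40] ^= 1  # a couple of bit errors
assert release(offset, noisy, check) == key
```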

El Reg recently caught up with Commissioner Cavoukian at the Web 2.0 Summit in San Francisco.

The Register: Tell us more about Privacy by Design.

Cavoukian: My message through Privacy by Design is try to prevent the harm from arising to begin with. It's a preventative approach. It's proactive, it's holistic. Don't wait for the privacy harm to arise and then you, the business, have to assume the regulatory burden of compliance with the legislation and the ramifications of that breach in terms of loss to your business, to your brand, and the cost of lawsuits.

So that's what Privacy by Design is. It's coming before the privacy harm has arisen. And if you can do that as a business, and we'll tell you how to do it, then you can gain a sustainable competitive advantage. There's a significant payoff to be had by protecting privacy.

Do you find by mandating that certain protections or practices are followed in your province, it ultimately means those practices are going to be followed universally?

It's not a definite, but this one I can say absolutely. I put forth a resolution on Privacy by Design to make it an international standard, and it was unanimously passed [at the International Data Protection and Privacy Commissioners Conference last year in Jerusalem]. What that means is Privacy by Design, this proactive framework, is now being adopted globally, in all jurisdictions, including here in the United States.

[The US Federal Trade Commission adopted Privacy by Design in December.]

So it is an international framework now for privacy protection, and people, regulators, everyone is saying if you can do this Privacy by Design thing you're way better because you prevent so much of the burden that arises after a privacy harm takes place.

'Positive sum model'

In a nutshell, what is Privacy by Design? Is it a set of programming interfaces, is it a concept, practices?

There are three things that are at the heart of Privacy by Design. You must be proactive, try to prevent the harm from arising, as opposed to the regulatory compliance model we have now, which is reactive, after the fact, offering redress. Second, privacy must be embedded in design. That can mean embedded in the design of technology, all IT, but it's not just technology. You must get smart about embedding privacy in business practices and networked infrastructure.

The third one is probably the most important. It asks people to adopt what I call positive sum, not zero sum. You know what zero sum is. You can have one or the other. It's a system of balance, it's a system of tradeoffs. In that model, privacy always loses out to some other functionality – security interests, business, marketing, biometrics, whatever.

In a positive sum model, you can have two positive gains, two positive functionalities, at the same time, so it's doubly enabling. So it's not that you have to have privacy versus security. You can have privacy and security. Now, it's harder to do both because it requires innovation and creativity, but what could be more important than finding ways to embed privacy into all that we do?

Can you give me an example of Privacy by Design in action?

Three years ago, the Ontario Lottery and Gaming Corporation came to me about [the biometric database]. The problem is that when you use a regular biometric program with a biometric template, even if you encrypt it, it can be decrypted. And police can subpoena it. That's an unintended secondary use of the information no one had contemplated at the time.

Biometric encryption uses the biometric as the tool of encryption, almost like a private key, to encrypt meaningless data, an alphanumeric string or a 100-digit PIN. What gets kept in the database is that encrypted other information. So if police want that data, you say you're welcome to it, [but] the key resides on the person's face or finger. I don't have the decryption key. That's the only thing that can decrypt it.

That's a long way of saying you can use the biometric for the narrow purpose it was intended for, and totally privately. And it protects the privacy of all the regular patrons who are just coming to gamble for recreational purposes: none of their data is collected or retained in the system. If there's not a match in the system, we keep nothing, so you can reassure innocent patrons of the casino there's no threat to them whatsoever.

That's a technical solution to prevent unintended secondary consequences. I've actually called this the year of the engineer. I'm almost talking exclusively to engineers.

Privacy for users of smart grids, too

This is not just a conceptual abstraction. We've done this with the smart grid and smart meters. We've operationalized the principles so that people can be assured that if they get a smart meter in their house, no one else is going to know about the activities within the house, which is sacrosanct. That's the last bastion of privacy. We embed the principles into the smart meter and the way the data are collected and used, such that it is exactly as it was before, when the guy used to come to your door. Except now there's two-way communication, which allows you to monitor your electrical usage for time of use and [for] electricity conservation, but no information is used by the utility for any additional purposes.

Do you see there being a market for companies to use privacy as a competitive advantage? Can companies use privacy as a feature to compete?

If they don't use privacy, then they will pay. When you build privacy in at the initial stages, it doesn't cost that much. Of course, there's a cost associated, but it's minimal because you're just at the design stage. Nothing's up and running. And you can introduce the protections very efficiently and effectively at the design stage, with minimal cost. If you don't do that, I can virtually guarantee you'll have some sort of data breach at some time.

Data breaches cost companies enormously. Think of Sony. [Millions of dollars] so far and they're facing a class action lawsuit.

The hit to your brand, the hit to your reputation, all of that, not to mention the actual payout to customers, is enormous. So we tell people: when you do Privacy by Design, don't do it for altruistic reasons. Yes, it's good for your customers and your users, but it's good for you as a business. It's going to save you a boatload of cash and resources and damage to your company. So that's why we've been getting such uptake on the part of companies.

We've heard regulators in Europe talk a bit about the right to be forgotten. Is that something you also advocate?

I'm not going to suggest that I'm opposed to it, because I respect the Europeans' wish to have that right. I think, realistically, it's very difficult in this day and age, though not impossible. Maybe design technology so that it self-destructs after 60 days, or something that has an end date. Right now, you keep it around forever.

To call it a right, I don't know. I haven't studied it legally. I can see how it might be a desirable goal. I don't know that it reflects the views of the constitution and the existing rights and freedoms that we have here. What I would say to companies, though, is: whether you think there's a right to be forgotten or not, there's a lot of advantage in acting as if there were.

If you've done some things in your youth you'd prefer to forget – you know, everybody's done something – people would prefer not to have that haunt them for the rest of their lives. And to companies, what I would say is: don't sit on data you have forever, because it will come back to haunt you. It will be out of context, it'll be inaccurate, it'll be the wrong thing. And you're not going to benefit from it. And you have a duty of care to protect that information as long as you hold it.

So there might be this win-win solution of figuring out how long you really need to keep this information, and offering your customers: we're going to destroy this data after this date unless you want us to keep it. Here's our retention practice. There's a lot of value to be gained by businesses offering it in that light, as opposed to necessarily relying on the right to be forgotten.
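
As a concrete sketch of that retention offer, consider the following Python fragment. The record fields, the 60-day window (borrowed from the self-destruct example above), and the opt-in flag are all our own assumptions, not any regulator's prescription:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Illustrative only: a toy retention sweep in the spirit of "we're going to
# destroy this data after this date unless you want us to keep it".

RETENTION = timedelta(days=60)  # hypothetical window, echoing the 60-day example

@dataclass
class Record:
    customer_id: str
    collected_at: datetime
    keep_past_retention: bool = False  # customer explicitly opted in to longer storage

def purge_expired(records: list[Record], now: datetime) -> list[Record]:
    # Keep a record only while it is inside the retention window,
    # or if the customer asked us to retain it beyond that.
    return [r for r in records
            if r.keep_past_retention or now - r.collected_at <= RETENTION]

now = datetime.now(timezone.utc)
records = [
    Record("cust-1", now - timedelta(days=10)),                            # in window: kept
    Record("cust-2", now - timedelta(days=90)),                            # expired: destroyed
    Record("cust-3", now - timedelta(days=90), keep_past_retention=True),  # kept by request
]
assert [r.customer_id for r in purge_expired(records, now)] == ["cust-1", "cust-3"]
```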

'The problem with anonymity'

What do you think about anonymity? Should people have the right to be anonymous on the internet, for instance?

That's a really tough question. Anonymity is wonderful for freedom of speech. People are less restrained to speak. The problem with anonymity is that it's negatively correlated with accountability. When people are anonymous, they tend to be less accountable. Think of cyberbullies, which I despise like the plague because they're such cowards: they go after young children and hide behind this anonymous veil. So I personally don't like anonymity when it's used to harm people.

Having said that, I still believe in anonymity, because it does promote freedom of speech and the ability to participate in a forum in ways you might otherwise feel unprotected doing.

My preference is persistent pseudonyms. They give you the protection if you feel you can't participate in an identifiable way. In Germany, in fact, their data protection law requires you to allow people to use a pseudonym. I think that is a much better solution than total anonymity because it does provide some accountability. If you're saying really outrageous things, the company knows your true identity and can link you to it if necessary, but it allows you the freedom to express views that may not be popular. I think anonymity is still a really important right.
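
One plausible way to implement the persistent pseudonyms she describes is a keyed hash over the real identity, so the handle stays stable across posts but only the operator, holding the key, can connect it to the person. This is a sketch under our own assumptions, not the mechanism Germany's law prescribes or any specific site's design:

```python
import hashlib
import hmac

# Hypothetical sketch: the service alone holds SERVER_SECRET, so pseudonyms
# are stable (accountability) yet unlinkable to the person by anyone else.
SERVER_SECRET = b"keep-this-key-out-of-the-public-database"

def pseudonym(real_identity: str) -> str:
    digest = hmac.new(SERVER_SECRET, real_identity.encode(), hashlib.sha256)
    return "user-" + digest.hexdigest()[:12]

# The same person always posts under the same handle...
assert pseudonym("alice@example.com") == pseudonym("alice@example.com")
# ...distinct people get distinct handles, and without SERVER_SECRET the
# handle cannot be reversed or cross-matched against other databases.
assert pseudonym("alice@example.com") != pseudonym("bob@example.com")
```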

What common misconceptions do you see people who are building websites or designing software or hardware have? Do they have common misconceptions about how to ensure privacy, or what privacy even is?

A lot of startups, understandably, don't have an understanding of privacy. For people like that, what I would say is: just put privacy on your radar, and talk to somebody. If you have any questions, you can just email me. We respond to people all the time. The essentials of privacy, I would say, are about control. It's about freedom of choice, so you have to give users control over their data. If it's linked to personally identifiable data, you have to allow your customers to access the data, be transparent, and tell them what you're going to do with their information.

You collect information from individuals for a purpose, called the primary purpose. You don't collect it to do whatever the heck you want with it. You have to tell your customer: here's why we want it. Give them full notice. You need to get their consent to use the data, then you limit your use of the data to that purpose. And if you want to use it for a secondary purpose, you go back to them and get additional consent. If you did all that, you wouldn't have any privacy problems.
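
That collect-notify-consent-limit loop maps naturally onto a small data structure. Here is a toy consent ledger in Python; the class, the method names, and the purpose strings are our own illustration, not any statute's or library's API:

```python
# A minimal consent ledger illustrating "purpose limitation" as described above.

class ConsentLedger:
    def __init__(self):
        self._consents: dict[tuple[str, str], bool] = {}

    def record_consent(self, customer_id: str, purpose: str) -> None:
        # Notice and consent happen at collection time, per purpose.
        self._consents[(customer_id, purpose)] = True

    def may_use(self, customer_id: str, purpose: str) -> bool:
        # Any use outside a consented purpose means going back to the
        # customer for additional consent first.
        return self._consents.get((customer_id, purpose), False)

ledger = ConsentLedger()
ledger.record_consent("cust-42", "order_fulfilment")   # the primary purpose
assert ledger.may_use("cust-42", "order_fulfilment")
assert not ledger.may_use("cust-42", "marketing")      # secondary purpose: ask again
```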

So what I would say to new startups and companies is: just get a primer on privacy. Just think about it, and you only have to think about it when data is linked to personal identities. If you're collecting data and there's no name or social security number on it, you're golden; do whatever you want with it. But if there are personal identifiers linked to it, you've got to think about these things. Otherwise, it will come back to bite you.

How do you think the Do Not Track initiative, spearheaded by the FTC, is going so far?

What drove Do Not Track is the lack of transparency on the part of companies. People didn't know they were being tracked. So when the Apple story broke that iPhones were tracking your geolocation data, nobody knew about it. Look, they need your geolocation data if you're asking for directions on how to get from Point A to Point B. But they hadn't addressed it. There wasn't the transparency necessary to tell people what they were doing and why it was beneficial.

Long way of saying companies really have to be transparent with their customers, engage their customers. Do Not Track was a reaction to many things like that.
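
One concrete mechanism that grew out of that push is the browser's DNT request header, where a value of "1" signals an opt-out. A minimal, framework-agnostic sketch of a server honoring it might look like this (the function name is our own):

```python
# A sketch of server-side respect for the Do Not Track header ("DNT: 1").
# Absence of the header means the user expressed no preference.

def should_track(request_headers: dict[str, str]) -> bool:
    return request_headers.get("DNT") != "1"

assert should_track({}) is True                 # no preference expressed
assert should_track({"DNT": "1"}) is False      # explicit opt-out honored
```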

A lot of tracking is going on behind the scenes. People don't know it's going on. And then it engenders distrust. All of a sudden there's no confidence. That's what Do Not Track is all about. I hope something comes out of it, at the very least in terms of getting businesses to come clean about what they're doing with your data. ®