Why Tim Cook is wrong: A privacy advocate's view
Apple should be unable to comply with this request
Opinion Apple has recently released an open letter explaining why it will challenge a judicial order requiring it to hack the iPhone of one of the accused San Bernardino terrorists. As someone who believes in individual civil liberties and personal privacy above nearly all other considerations, my first instinct is to applaud Apple. Upon reflection, however, I believe that in this instance it is in the wrong.
Apple's open letter calls the judicial order a "dangerous precedent" and also says that what the judge is asking for is a "back door" into the iPhone's encryption. Both of these are hot topics that instantly polarise debaters, with no apparent room to compromise. Unfortunately, the individual situation is a lot more complicated than black-and-white politics allows.
Things neither side wants to admit
There are two critical things neither side of the privacy/civil liberties versus law enforcement debate wants to admit. Before we can explore Apple's specific situation, we need to clear the air on these topics.
On the privacy/civil liberties side of the equation, the prevailing view has become that law enforcement must not be allowed any access to our computers, phones, data in the cloud and so forth. These devices – especially our smartphones – can carry every detail of our lives in them. Every secret we have, every dream, every dirty fantasy, it can all be there.
French clergyman Cardinal Richelieu reputedly said "If you give me six lines written by the hand of the most honest of men, I will find something in them which will hang him." What then could you do with my smartphone? It probably contains evidence that I've broken laws I didn't even know existed. I may even have used it to listen to music in a form or from a service that wasn't fully sanctioned, and only TPP negotiators and their demon spawn know how much money they could soak me for if they could prove that!
I try to be a good person, but with my smartphone you could destroy my life, and the lives of most people I know. This leads us to the argument that the law enforcement side of the debate refuses to even acknowledge.
Our law enforcement cannot be trusted, and neither can our government. While individual members may be decent, honest people who are truly dedicated to making the world a better place, the organisations they serve have proven time and again to be grasping, greedy and corrupt.
If we give law enforcement the ability to access our electronic data they will use it against us. It is only a matter of time.
It might seem reasonable at first to give law enforcement some form of mandated access to electronic data for use in exceptional cases. Terrorism. Treason. Missing minors with parental consent for the privacy invasion. Unfortunately, it never, ever ends there.
Even if the current administration decides to be restrictive about the powers given to law enforcement, there is absolutely no way of ensuring that the next administration would honour that. For that matter, the spy agencies seem to have zero compunction about breaking the law, so passing laws won't restrain them anyway.
And what about airports? It seems that – in the US at least – all your rights go out the window whenever there's an airport involved, so that's another crack in the armour. And we haven't even introduced bad actors, power-hungry law enforcement officials, politicians looking to score points with fringe groups, or those who want us to "think of the children".
How long before "special powers" are devolved into something half the agencies at all levels of government can do without a warrant? How long before smartphone searches become a tool of racism, oppression, or hunting for illegal immigrants to deport?
The risks involved in getting laws around encryption wrong are enormous. If we screw up we could end up in a police state dystopia of our own making, so it behooves us to get this right.
How Apple is wrong
Where Apple is wrong is in saying that the FBI is asking for a backdoor. It isn't. A backdoor in the context of encryption would be either a key escrow system or a "master key" system that would allow easy access for law enforcement into any Apple product encrypted with the back-doored system.
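To make the distinction concrete, here is what a key-escrow design looks like in miniature: the key that actually encrypts the device is stored wrapped twice, once under the user's secret and once under a master key held by a third party. This is a toy sketch only – the XOR "wrapping" and all names are illustrative, not any real cryptographic scheme:

```python
import os
import hashlib

def wrap(data_key: bytes, wrapping_key: bytes) -> bytes:
    """Toy key-wrap: XOR against a hash-derived pad.
    Illustration only -- not real cryptography."""
    pad = hashlib.sha256(wrapping_key).digest()
    return bytes(a ^ b for a, b in zip(data_key, pad))

unwrap = wrap  # XOR wrapping is its own inverse

data_key = os.urandom(32)                 # key that actually encrypts the device
user_key = b"user passcode secret"        # held by the owner
escrow_key = b"master key held elsewhere" # held by a third party, by design

# An escrowed design stores the data key wrapped BOTH ways:
blob_user = wrap(data_key, user_key)
blob_escrow = wrap(data_key, escrow_key)

# The owner can unlock normally...
assert unwrap(blob_user, user_key) == data_key
# ...but whoever holds the escrow key can unlock EVERY such device.
assert unwrap(blob_escrow, escrow_key) == data_key
```

The point of the sketch is that the second recovery path exists on purpose, for all devices, from day one. That is a backdoor – and it is not what the court order describes.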
What is actually being asked for here is that Apple write custom code that allows the FBI to perform a brute-force attack against the iPhone without triggering the "10 strikes and the phone is wiped" protection mechanism. This is a completely different animal.
There is no "backdoor" involved here. What appears to be involved is a design flaw. Something about the iPhone 5C in question is broken. Either it is possible to load a compromised firmware into the phone despite the fact that the phone is locked, or it is possible to read the data off the flash chips and attack it in a VM until the password is brute-forced.
Either way, it would appear to your correspondent that Apple screwed up when designing this device and it left open a means of attack. The judge is asking Apple to use its expertise to exploit this flaw. It's as simple as that.
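To see why that "10 strikes" counter is the only real obstacle, consider how small a four-digit passcode's keyspace is. A minimal sketch, assuming the wipe guard has been disabled (the check function and passcode here are hypothetical; a real iPhone entangles the passcode with a hardware key and enforces escalating delays):

```python
import hashlib
import itertools

def unlock_attempt(candidate: str, stored_hash: str) -> bool:
    """Stand-in for the device's passcode check (hypothetical --
    real devices verify in hardware, not against a bare hash)."""
    return hashlib.sha256(candidate.encode()).hexdigest() == stored_hash

# The "device" holds only a digest of its 4-digit passcode.
secret = "7294"
stored = hashlib.sha256(secret.encode()).hexdigest()

# With the wipe-after-10-failures guard removed, all 10,000
# candidates can be swept in a fraction of a second.
found = next(
    pin
    for pin in ("".join(d) for d in itertools.product("0123456789", repeat=4))
    if unlock_attempt(pin, stored)
)
print(found)  # recovers "7294"
```

Nothing here defeats the encryption itself; it simply exhausts a tiny keyspace once the guard rails are gone – which is exactly why bypassing the wipe counter is all the FBI needs.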
As far as I am concerned, the judge is in the right here. Apple is not being ordered to create a flaw and distribute it to all devices. It is not being prevented from fixing this flaw in future devices. It is being asked to exploit a flaw that currently exists, and for the privacy-conscious this is actually a good thing.
If it is known that Apple's phones can, in fact, be attacked in this fashion then the pressure is on to create devices which cannot be attacked in this fashion. Apple can only turn over the data if it is actually possible to do so. If it isn't, then the judge can make all the orders they want, but they can't alter reality with a gavel.
Survival of the fittest
If the US does one day order manufacturers to create an encryption backdoor and distribute it to all devices, then we absolutely should have this out in the open. It should be clearly stated in law so that everyone, everywhere knows not to buy American. If some US politician wants to evaporate the country's entire tech sector in one moment of blinding law-enforcement lobbying hubris, let them. But make sure this stupidity is out in the open for all to see.
By the same token, if the technologies being developed in the US are vulnerable that should be out in the open too. Marketing claims mean even less than the promises of law enforcement agencies to respect civil liberties. We'll see exactly how well privacy claims match reality when judges start threatening contempt of court for failure to hack devices.
When law enforcement and governments cannot be trusted to respect the privacy and civil liberties of ordinary citizens then we must deny them the ability to do so in the first place. The widespread use of properly implemented encryption and supporting technologies is the only means available to restrain our self-styled rulers from turning our digital data into fodder for the next McCarthyesque witch hunt.
Apple shouldn't be hiding behind technicalities of the law in order to claim that its devices guarantee our privacy. Its devices should not have encryption vulnerable to attack. If these devices are vulnerable to attack, then the judge is within their rights to call on Apple to break that encryption.
Apple – and everyone else that makes anything with a computer in it – should be upping its game if it wants to be able to advertise that it cares about our privacy as much as we do.
In the meantime, if law enforcement and/or politicians really want access to our data, then they can put their thinking caps on and come up with a means to not only rebuild lost trust between the citizens of the world and the people who spy on them, but a means to restrain those who hold the reins of power from abusing the incredible power over individuals that unfettered access to our devices would grant them.
It's time to see some market forces benefiting the little guy for a change. Competition amongst device makers should be building us unassailable devices. Competition amongst lawmakers should be building compromises that meet society's many needs without setting us up for a repeat of history's worst mistakes. I wonder if either group is up to the challenge. ®