Trusted computing: It's BACK, and already in a pocket near you
That bulge could be SOMEONE ELSE's tool
MWC Trusted Computing, the widely derided idea of computing secured for, and against, its users, is back - and the necessary hardware is already in the majority of pockets.
When Intel and Microsoft tried to introduce Trusted Computing under the Palladium brand, they were pilloried for betraying the freedoms that had made desktop computing such a dynamic industry. But the same idea lurks in 90 per cent of the ARM chips used in mobile phones, and the necessary software was demonstrated today: the industry has gained by inches what it failed to achieve by revolution.
That demonstration came from Trusted Logic and Wave Systems, using ARM's TrustZone architecture to create secured storage. Such storage could hold immutable data: a hash of a booted operating system to prevent system alteration, software licence keys to prevent piracy, or even cryptographic identity tokens - the last of which would (gasp) screw over the network operators who currently have a monopoly on that kind of thing.
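The boot-integrity use case boils down to comparing a fresh measurement (hash) of the operating system image against a reference digest held in storage the user cannot rewrite. A minimal sketch of that idea follows - it is illustrative only, not the MTM specification's actual interface, and the "secure storage" here is just a plain variable:

```python
import hashlib

def measure(image_bytes: bytes) -> str:
    """Compute a measurement (SHA-256 digest) of a boot image."""
    return hashlib.sha256(image_bytes).hexdigest()

def verify_boot(image_bytes: bytes, trusted_digest: str) -> bool:
    """Compare the measured digest against the immutable reference.
    In a real MTM the reference lives in tamper-resistant storage and
    boot only proceeds if the values match."""
    return measure(image_bytes) == trusted_digest

# Illustrative values: a known-good image and one that has been altered.
TRUSTED = measure(b"known-good kernel image")
print(verify_boot(b"known-good kernel image", TRUSTED))  # True
print(verify_boot(b"tampered kernel image", TRUSTED))    # False
```

Any change to the image, however small, produces a completely different digest, which is why a single stored hash is enough to detect system alteration.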
The companies are demonstrating a package compatible with the Mobile Trusted Module (MTM) specification, put together by the Trusted Computing Group as a mobile version of its desktop specification. And while the original trusted computing push never gained popular ground, thanks to public outcry, the public has come a long way since then: these days computer users seem happier to accept a few limits on their freedom in exchange for better security.
Anyone who's forgotten Trusted Computing would do well to read this 2003 piece from Ross Anderson at Cambridge University, if only to see how closely his terrifying vision of the future matches the greatly appreciated experience of iPhone users today (ironically, he posits that Mac users will be locked out of the Microsoft-backed Trusted Computing ecosystem). Anderson thought trusted computing would put too much control in Microsoft's hands, but these days users have other concerns, and legislation has demonstrated that governments will step in to prevent many of the scenarios he suggests.
Wave Systems reckons ARM's TrustZone is already embedded in 90 per cent of smartphones, and while the demonstration runs on Android the technology is applicable to any platform, so it will be interesting to see how, or if, Windows-on-ARM takes advantage of it. To be secure, the cryptographic keys and software need to be installed during manufacture, so what's being shown in Barcelona is a proof of concept rather than something one could deploy onto today's phones - but it's a proof of concept which could quickly be integrated if the handset manufacturers wish.
The SIM providers reckon they've nothing to fear either way. They point out that securing a chip within a phone is much harder than securing a separate module with limited (and well-known) interfaces, and as banking moves onto the phone that will become more important. Payment applications will be able to choose where they want to live: in the SIM, in a proprietary secure module (such as that used by Google Wallet), or in the MTM. Network operators are hoping to charge as much as half a euro in annual rent for space on the SIM, so the additional competition isn't going to be welcome.
However one looks at it, our mobile devices are going to get a lot more secure, and as desktop computing subsides that model will, almost inevitably, become the de facto standard. ®
Apart from the basic "signing is not security" quagmire - which this doesn't address - there's something more insidious going on, and it hasn't been addressed either. What riled everyone up back then was that a maker of software (also the byword for shoddy, rushed-out-the-door, fix-it-in-version-three software) planned to take away our control of our own devices, in the name of making up for its deficiencies, through means that don't actually do that. We're still there, sort of.
Personally I don't mind code signing; it could be helpful if we figured out how to deploy it usefully. But what this brings starkly to the fore is the simple question of who owns the device. If some other party retains the keys, I don't really own it.
See, for example, Sony (in another shining example of how not to treat your customers - oh, how the mighty have fallen) and its removal of the "Other OS" option on PS3 consoles. That it was on the wrong path was already clear when it installed a shoddy trojan by way of thanking customers for buying its music. A more benign example is the Smart car, where you don't buy the car but a "transportation service", meaning that if the thing breaks they'll fix it or replace it. Some people hadn't read the small print and were upset to get a replacement back rather than their own car.
So changing the device usage model doesn't need to be a problem as long as everybody knows this to be the case, and understands the implications, but you need to be careful about it. Palladium was anything but subtle. But the point remains: if I don't have the keys, I don't own the device. If it is sold as if I did own it, then I must also receive (all) the keys.
We may have to put that in law. Just to keep the vendors honest.
"[Get your tongue out of their arsehole...
Dogs do that, you're not a dog are you]" Reg?
"These days computer users seem happier to accept a few limits on their freedom in exchange for better security"
Users don't know any different because they've never been allowed to see any different with their mobile phones and fondleslabs. To imply that a conscious choice was made is to completely misrepresent the situation.
Ask users if they're OK with their carriers deciding not to upgrade their software, leaving them with giant gaping security vulnerabilities... or why they're not allowed to remove that NASCAR app the carrier decided *had* to be there.
You might as well say that veal calves love being confined to their tiny little pens - they don't know any different either.
"...in a proprietary secure module (such as used by Google Wallet).."
Hint: "secure" generally implies something that cannot be brute-forced in a matter of seconds and also does not fall open and spill its contents when the device is reset (El Reg articles passim).