
Whistler to include ‘block all unsigned apps’ security mode

Permission needed to run your apps on Windows

Microsoft is to incorporate a "signed application" system in Whistler, the intention being to furnish users with a super-secure mode of operation that just plain stops code executing on the machine. Unsigned code, that is. Speaking about Whistler in London today, Microsoft VP for IT infrastructure and hosting Jim Ewel described this as one of several security modes that can be implemented.

The system doesn't just deal with incoming files - it applies to "every piece of code executing on the machine." There's a list of 40 different kinds of executables, and policies can be set to define which of them can be run, the most secure policy being to run only signed applications.
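As a purely hypothetical sketch (in Python, and nothing to do with Microsoft's actual implementation), the policy side boils down to a table mapping executable types to an allowed level, with "signed only" as the strictest setting and "block" as the sensible default for anything unrecognised:

```python
from enum import Enum

class Policy(Enum):
    ALLOW_ALL = "run anything"
    SIGNED_ONLY = "run only signed code"
    BLOCK = "never run"

# Purely illustrative entries: the real list reportedly covers some 40 types.
POLICY_BY_TYPE = {
    ".exe": Policy.SIGNED_ONLY,
    ".vbs": Policy.BLOCK,
    ".ocx": Policy.SIGNED_ONLY,  # ActiveX controls
}

def may_run(extension: str, is_signed: bool) -> bool:
    """Decide whether a piece of code may execute under the current policy."""
    policy = POLICY_BY_TYPE.get(extension, Policy.BLOCK)  # unknown type: deny
    if policy is Policy.ALLOW_ALL:
        return True
    if policy is Policy.SIGNED_ONLY:
        return is_signed
    return False

# In the most secure mode, an unsigned .exe simply doesn't run:
assert may_run(".exe", is_signed=True)
assert not may_run(".exe", is_signed=False)
```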

But although Ewel spoke of virus defence in practically the same breath, signed applications seem to be a lot more about ownership, responsibility and liability. For corporations it can be a massive comfort blanket, because it's an apparently near-absolute mechanism for stopping dumb users running code they shouldn't. The basics of the system already exist in Windows 2000, but the Whistler version will be more extensive.

At the moment, for example, incoming executables in email and ActiveX controls in IE can be blocked from running. Implementing this in Whistler on a scale that allows system-wide policies to be set also stops staff bringing in their own doobries and installing them, breaking their machines, compromising the network and generally sowing confusion about what is and isn't installed on the network.

Many system managers will be mad-keen on this kind of approach, because it promises to make their lives a lot easier. On the flip side, though, you can see how restrictive it could be from the users' point of view, and - weirdly - how difficult it would have been for Microsoft and the PC business to penetrate corporate networks from the bottom up if this level of control had existed at the time.

Signed applications apparently won't give you direct protection against viruses either. If it works like existing signing systems, you'd get a certificate for an app from an authority like Verisign; alternatively, a corporate customer could get a company-wide certificate, or set up its own internal certificate authority so that internal machines trust anything bearing a certificate it has issued.
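If it does follow that pattern, the trust decision itself is conceptually simple. Here's a rough sketch using Python's third-party cryptography library, assuming RSA keys and a single-level chain (real-world code signing formats are considerably more involved): the machine trusts one CA certificate, and an app is accepted only if its certificate was issued by that CA and its signature over the executable verifies.

```python
from cryptography import x509
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding

def is_trusted(app_bytes: bytes, signature: bytes,
               app_cert: x509.Certificate, ca_cert: x509.Certificate) -> bool:
    """Accept the app only if its certificate chains to our trusted CA and
    the signature over the executable's bytes verifies."""
    try:
        # 1. Was the app's certificate actually signed by the CA we trust?
        ca_cert.public_key().verify(
            app_cert.signature,
            app_cert.tbs_certificate_bytes,
            padding.PKCS1v15(),
            app_cert.signature_hash_algorithm,
        )
        # 2. Does the signature over the executable itself check out?
        app_cert.public_key().verify(
            signature,
            app_bytes,
            padding.PKCS1v15(),
            hashes.SHA256(),
        )
        return True
    except InvalidSignature:
        return False
```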

In part, the key to this is, well, the key. So long as the key behind your signatures is secure, the apps signed with it can be trusted, probably. Even if the key is compromised - stolen by a virus writer and let loose - it ought to be traceable back to source fairly rapidly. You then have to change your key, and everything signed with the old one stops being trusted until it's re-signed. Virus writers could apply for and secure their own signatures (just don't tell them it's a virus, OK?), but theoretically at least the signature should be traceable straight back to them once the code was in the wild.
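The uncomfortable corollary, sketched below under the same assumptions as before, is that the maths doesn't care who is holding the key: a signature made with a stolen key verifies exactly as well as a legitimate one. All the certificate tells you is whose key it was, which is why a compromise is traceable rather than preventable, and why revoking the certificate takes your own apps down along with the attacker's.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

def sign_app(app_bytes: bytes, private_key: rsa.RSAPrivateKey) -> bytes:
    """Whoever holds the private key can sign: publisher and thief alike."""
    return private_key.sign(app_bytes, padding.PKCS1v15(), hashes.SHA256())

publisher_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

good_sig = sign_app(b"legitimate update", publisher_key)
bad_sig = sign_app(b"something rather nastier", publisher_key)  # same key, same trust

# Both signatures verify against the same certificate; revoking it is the
# only remedy, and that invalidates the legitimate releases along with
# whatever the thief signed.
```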

Rather worryingly, Ewel says he does not yet know what mechanisms for distributing signatures for applications are going to be put in place. This might mean Microsoft is rethinking the mechanisms already being used, and if the company leaves it too late there's plenty of potential for confusion.

At the moment the system is only partially implemented in Win2k, and therefore isn't in particularly wide use. If signed apps policies are to become widespread in business, however, then the issuing and auditing systems are going to have to take a much heavier load than they currently do. The mere creation of an industry-standard signed app system is also likely to attract massively increased interest from virus writers, because if you can get yourself signed in the first place, there's an extremely big target that will trust you to scamper around inside it, even if only for a brief period. ActiveX's history in this area does not inspire confidence either; code signing is meant to tell you who published an ActiveX control, not to make it run safely, yet ActiveX has starred in numerous security holes regardless.

And home users? To some extent the implications will depend on how hard Microsoft wants to push the security blanket at them. Standard commercial apps will be signed, and an 'only trust signed apps' pitch would probably play well with many users. But that would favour larger, established software companies, and stop amateurs and enthusiasts getting their code out there. On the other hand, if the certifying net is cast too wide, the whole system could be discredited because it fails to stop viruses getting in, or simply because it lets bad apps slip through.

It might work for corporations, but the notion of having to get permission before you can run your apps on Windows is not the PC industry as we have known it. ®
