We know security and usability are orthogonal - do you?
Our recent article about the fine line between security and usability started some very interesting discussions and drew active criticism, most of it directed at us - suggesting that security and usability do not form a one-or-the-other relationship (or are at least far more independent of each other than dependent).
We already know that, and now you know that.
We chose to represent it that way because many developers, users, and administrators don't see it any other way. To them, improving the security of a piece of software generally means giving up some usability to achieve that goal, and vice versa.
While a greater understanding of secure development practices and security as a part of the design process means that more applications are being created without needing to sacrifice usability for security, many users have been conditioned into thinking that a tightening of security means reduced usability.
If you still think that is not the case, consider two recent examples that have affected both Apple and Microsoft.
In Apple's case, the way its firewall handled incoming connections to various services when "Block all incoming connections" was selected was not very well defined - leading to public criticism and complaints. While the firewall was doing an adequate job, the lack of accurate information presented to the user made this an example where usability trumped security and a problem was perceived. An update to OS X addressed the issue and brought security and usability into closer harmony, without either having to suffer.
In Microsoft's case, the User Account Control (UAC) feature introduced with Vista is a security mechanism that seeks to protect users by warning them when applications or data try to access system components that would otherwise require an administrator-level account.
The only problem is that decades of applications running at administrator-equivalent privilege have resulted in a glut of software that claims to require administrator access to install and operate successfully on Windows. The net result is that the user encounters a usability issue with their shiny new operating system, as a security component seems to be running in overdrive, actually reducing the user's efficiency with seemingly constant interruptions.
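The fix on the application side is for software to declare up front that it does not need elevation. A sketch of a standard Windows application manifest - the element names come from the Windows manifest schema, and `asInvoker` tells Vista the program runs fine with the user's normal token, so UAC has no reason to prompt:

```xml
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
    <security>
      <requestedPrivileges>
        <!-- "asInvoker": run with the launching user's token, no elevation -->
        <requestedExecutionLevel level="asInvoker" uiAccess="false"/>
      </requestedPrivileges>
    </security>
  </trustInfo>
</assembly>
```

Software that genuinely needs elevation can request it explicitly instead of simply assuming administrator rights, which is what generates the flood of prompts described above.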
Even a simple Google search for "UAC Vista" brings up numerous guides and how-tos for disabling UAC, immediately after the Microsoft TechNet entry describing what UAC is. If the added security provided by UAC weren't causing such a usability problem, those guides on disabling it would not appear so prominently, or in such numbers, for a simple, generic "UAC Vista" search.
Other criticism focused on the JET engine vulnerability discussed in that article. Differentiating between "critical" and "not critical" can be a contentious topic.
Some recent comments suggested the vulnerability may not be as bad as initially reported, since it has always been possible to use an .mdb file as a vehicle to execute other code, such as scripts or anything that might wipe out a database. This is true, but if access to JET-dependent software has been properly ACLed, then that only affects the local user and whatever the user has access to. What cocoruder has described is a mechanism that does not rely upon this behaviour - instead it exploits a stack overflow to take control of the system running the JET engine.
This is a critical problem because it allows a lower-privileged user, with nothing more than the ability to operate Access, to take full control of the system - a serious concern in shared environments like web hosts. Normally such users could only go haywire within the context of their own account; this vulnerability offers them the ability to reach outside their account and gain system-wide access.
It would be like discovering that .vbs files could be compromised to run at SYSTEM level through a simple overflow, when it is already well known that they can contain system commands and any number of other potentially harmful directives.
That is why it is considered a critical vulnerability.
Doubters are welcome to argue their position, and all criticism is welcome. Before you do, though, consider whether the arguments raised above already address the position you would argue from.
This article originally appeared on Sûnnet Beskerming.
© 2007 Sûnnet Beskerming Pty. Ltd.
Sûnnet Beskerming is an independent information security firm.
Unsafe file types.
So why is it that an MDB file is an "unsafe file type", blocked by MS applications like Outlook, but an XLS file is not? It would seem that security still runs second to marketing. But that is no excuse for not fixing the JET vulnerability. The problem is that MS decided long ago that JET should be discarded, not fixed.
BTW, I found while testing that VBA (used in Access and Excel, but not the source of the Jet vulnerability) would correct stack corruption errors in the user code, where Open Office would crash on the same code.
It doesn't have to be, but
A good article. Security, beyond good coding practices and the like, does NOT belong in applications. That is asking for disaster.
You have end-point security (usually the user), device security, connection (network) security, security of the other device (the server), application-use security and information-access security (governing how often, when, and what access is allowed or disallowed). Layer that with managed security, NOT with separate products! Everything is there; the user has the normal hassle - ID, password, keycard, challenge, whatever - BUT nothing else. If any piece in the chain is missing: close, disconnect, alert, kill someone, etc. It is no more difficult than a normal login or scheduling a job. The details really don't matter; the protocols are there - the encryption, the key exchange, the ACLs, even the DB views allowed for this user by this application, and so on.
Now - it needs pre-planning at the infrastructure and system level, NOT in applications; by then it is too late.
Wasn't it already the rule a long time ago: manage externally, not in code. Don't hard-code filenames, directories, IP addresses, keys, access rights, and so on in a program. If you do the same (manage externally) with security, it is not very complicated, except in a business (and political) sense.
BUT, as the article almost says, how do you change what was learned when the PC era started, with no security at all in the design?
Wrong approach to security.
The key issue with computer security is the C language, with its lack of control over variable bounds. The majority of exploits involve buffer overflows of one sort or another, and while these may RESULT from programmers' coding mistakes, the underlying CAUSE is the security failings of the language itself.
That, and frivolous 'multimedia' gimmicks in email clients and Web browsers, which provide numerous opportunities for malicious code to get itself run.
Meanwhile we have 'Cloak-and-Dagger Security' in the form of multiple user accounts on what is actually a personal computer, hyper-complex filesystem permissions which no-one outside of Redmond fully understands, forced password complexity and frequent password changes, and now UAC. All this does is infuriate users. In some cases it may actually make for lower security; for example, forcing users to keep changing passwords simply results in the password being put onto a post-it note. You thus have a system with zero security, whereas previously it had some.
In many ways the Win95/98 platform offered better security. Whilst it lacked the labyrinthine user permissions of NT-based systems, it also offered far fewer exploits to the potential intruder.