Original URL: http://www.theregister.co.uk/2010/03/25/developers_not_liable/

Don't blame Willy the Mailboy for software security flaws

In defense of developers

By Matt Stephens

Posted in Developer, 25th March 2010 05:16 GMT

There's a low rasp of a noise being made in the software world. Customers want software vendors to hold programmers responsible if they release code containing security flaws.

Actually, that's not strictly true. Security vendors want customers to start wanting software vendors to hold the programmers responsible.

As we recently reported, the annual Top 25 programming errors announcement urged customers to let software vendors know that they want secure products. This desire is captured and bottled in a draft Application Security Procurement contract provided by security certification vendor SANS. The majority of the contract discusses liability in terms of the vendor. But the occasional clause stands out, like this one:

Developer warrants that the software shall not contain any code that does not support a software requirement and weakens the security of the application...

In other words, when it comes to application security and QA, the buck stops with the developer. And that's in a contract that likely won't even be seen by the developer and will be signed on his behalf by his employer. That arguably renders such a clause unenforceable - so why add it in the first place?

It reminds me of the Dilbert book Bring Me the Head of Willy the Mailboy. No one wants to take responsibility, so the blame is passed down through the ranks in an Ayn Rand-ian shoulder shrug, until the atomic unit in the trenches (the programmer) is reached. The process has failed, management has failed, QA has failed and the customer's blood is boiling. So the answer's obvious: sue the little guy!

That said, no one's saying that programmers should be impervious to blame. Those dilettantes who refuse to adhere to corporate guidelines can still be fired after all. But it's understandable that managers want some formal assurance that their staff have a penny's worth of discipline on the job.

So what's the answer? Certification in some vendor or another's technology stack?

Uncertain certification

There is Sun/Oracle's Java certification track, for example. But certification is of questionable use to the employer, as it rewards rote learning rather than a true understanding of the subject. For programmers, certification is also risky: it dates quickly, and it can be costly. Anyone out there still have a Certified Novell Engineer docket stapled to their yellowing CV?

Then there are more general-purpose, rigor-oriented qualifications for coders: the all-encompassing SWEBOK offers an IEEE-stamped certification. But - more's the pity - employers care little for such things. Invariably, they return to the time-honored qualification - the university or college degree. But even that tends to be a guideline: hands-on experience ends up being more important for many employers.

Nevertheless, security vendors such as Fortify would like to train - and certify - your entire IT workforce to be more aware of SQL and LDAP injection attacks, cross-site scripting, and so forth. Such vendors thrive on the corporate paranoia brought about by high-profile web front-end attacks and customer data-loss snafus. Fortify's 360 product will act as a gateway to your source control system and monitor any code commits for security vulnerabilities.
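To be fair to the scanners, the classic injection flaw they hunt for is real enough. A minimal sketch of the pattern (hypothetical table and column names; the JDBC plumbing is omitted) shows why string concatenation is the problem and a placeholder is the fix:

```java
// Sketch of the flaw a static scanner flags: SQL built by concatenation
// versus SQL with a bound placeholder. Table/column names are made up.
public class InjectionDemo {
    // Vulnerable: attacker-controlled input becomes part of the SQL text.
    static String vulnerableQuery(String username) {
        return "SELECT * FROM users WHERE name = '" + username + "'";
    }

    // Safer pattern: the query text is fixed; the input is bound
    // separately (with JDBC, a PreparedStatement parameter).
    static String parameterizedQuery() {
        return "SELECT * FROM users WHERE name = ?";
    }

    public static void main(String[] args) {
        String attack = "x' OR '1'='1";
        // The injected clause is now executable SQL, not data.
        System.out.println(vulnerableQuery(attack));
        System.out.println(parameterizedQuery());
    }
}
```

With the concatenated version, the classic `' OR '1'='1` payload rewrites the query's logic; with the placeholder, the same payload is just an odd-looking username.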

While I have some reservations about Fortify 360's usefulness, it's a good example of how management should be taking the lead to impose processes on development, rather than blaming the programmers for a breakdown in process. It's good to see that QA - which this really is - has come back into fashion, despite being re-branded in the guise of security and the paranoia of an "unseen enemy that cannot be defined".

The vulnerabilities highlighted by Fortify's Audit Workbench are good and worthy flaws, and there's a good chance that your organization will be more secure if it embraces Fortify wholeheartedly. At the very least, your chief executive will sleep better. But there's a whole category of bugs and rules that look like they really are just there to tuck your CEO in at night. For example, the following code is incredibly insecure and may bring down all of civilization, apparently:

public void login(String username, String password) {
}

First, the password is called "password," so someone scanning the compiled code could find it quite easily. Second, it's held in a String - and since Java Strings are immutable (and string literals are interned in a shared pool), the secret may linger on the heap long after you're done with it. So whatever you do, store your passwords in char arrays, and obfuscate your variable names to throw hackers off the scent - "absolutelyNotAPasswordNoWay," perhaps.

There we have an example that managers will love because it seems profound, and it allows them to impose a process while still blaming programmers. But let's face it: if the user has sufficient access to the JVM to trawl the String table, then he's already pwned your organization and siphoned several billion of your corporate dollars into his suburban Jacuzzi extension project.

A far more insidious problem is insider programmers who hack the system to launder funds - the recent TJX hack being a case in point.

But in such cases, we still have the traditional "after the event" fallback known as "the law," along with preventative, compulsory regulations such as PCI compliance, which instructs management to put barriers in place within their organization to prevent the wrong people from seeing credit card numbers or personal account details.

In most cases, the onus is on management to enforce regulatory procedures, which means there's no need for programmers' heads to be on the block. But as long as there are security consultants and conferences to speak at, developers will continue to be blamed for breakdowns in process and QA. ®

Matt Stephens is co-author of Use Case Driven Object Modeling with UML: Theory and Practice, and the upcoming Design Driven Testing: Test Smarter, Not Harder.