Designing security into software

What can the developer do?

Editor's blog In previous lives I have worked in software QA and in internal control in a major bank, and I am convinced that security must be designed into software from the start.

Bolting on security to an insecure design is fraught with problems (just ask Microsoft):

  • You decide that making things secure needs a major rewrite of the underlying software - very expensive;
  • The added security makes the software hard to use, or too slow;
  • Adding security has a serious impact on delivery dates;
  • In practice, compromises are made and the bolted-on security is largely cosmetic anyway.

But that is a counsel of perfection - what is the poor developer (who'll probably cop the blame after a successful exploit) to do if s/he is faced with writing "secure code" without having had much input to the security design? Is there anything useful the developer can do?

Well, when I talked to him recently, I was interested to hear Mike Armistead (co-founder and vice president of Fortify Software, which sells code analysis software) claim that, yes, there is. At least, that you can address a useful number of the general application-directed threats your software will encounter by making the effort to "write secure code", helped by code analysis tools that detect known security issues. Of course, if the basic software design simply doesn't allow for user authentication, you're rather on your own, security-wise...

So, what is the problem as you see it?

Developers are often under intense pressure to deliver more features on time and under budget, and few get the chance to review their code properly for potential security vulnerabilities. Even when they do get the time, they often have no secure-coding training and lack the automated tools that would catch the vulnerabilities attackers exploit.

Not only this, but how do you predict the future? The attack landscape is changing all the time, and hackers are always finding new and ingenious ways of subverting business logic or getting at the data your app protects - whether it is server-, desktop- or web-based.
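To make this concrete - the example below is mine, not Armistead's, and the table and column names are made up - the classic way of getting at the data a web app protects is SQL injection, where user input is spliced into a query as if it were code. A sketch in Java:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    // Hypothetical lookup code; the "accounts" table is illustrative only.
    public class AccountLookup {

        // VULNERABLE: attacker-controlled 'name' is spliced straight into the
        // query, so input such as  ' OR '1'='1  returns every row in the table.
        public ResultSet findAccountUnsafe(Connection db, String name) throws SQLException {
            Statement stmt = db.createStatement();
            return stmt.executeQuery(
                "SELECT id, balance FROM accounts WHERE owner = '" + name + "'");
        }

        // SAFER: a parameterised query keeps data and SQL apart, so the same
        // input is treated as a literal string rather than as query syntax.
        public ResultSet findAccountSafe(Connection db, String name) throws SQLException {
            PreparedStatement stmt =
                db.prepareStatement("SELECT id, balance FROM accounts WHERE owner = ?");
            stmt.setString(1, name);
            return stmt.executeQuery();
        }
    }

A decent code analysis tool will flag the first version; the second is what the secure-coding textbooks recommend.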

So what can software producers do? In a sense, a big part of the answer is relatively old-fashioned: developers need to be accountable to their employers, and to be provided with incentives, better tools and proper training.

So, it all comes down to effective management?

In most jobs on the planet, employees get regular performance evaluations, typically receiving raises and promotions for good performance, and pointers for growth and improvements as needed.

In the software industry, developers who produce securely coded software should have that information considered as part of their performance reviews. When taken out of context, holding individual developers accountable for the security of their code may sound like a draconian step, but in reality it is little more than saying that developers are responsible for developing robust, innovative and secure code for their employers.

Howard Schmidt, former Microsoft and eBay chief security officer and White House cybersecurity advisor, often tells a story about buying a sports jacket and finding an "inspected by" tag inside. Schmidt goes on to suggest that software developers could be similarly connected to their final work through a quality assurance process that issued a version of such a tag.

Taking this idea further, employers should consider a system of financial rewards for developers who write secure code, as a positive incentive - though such a scheme would naturally need checkpoints along the way. The most successful organisations offer a "stick" with the "carrot": your code doesn't make it into production, or get delivered, if it fails security audits or tests. Many developers take considerable pride in the quality of their code, and they should be held accountable - and compensated - for its quality and security.

And do the employers also benefit from this approach, as well as their developers and customers?

Software vendors also stand to benefit from a financial rewards system because security flaws are typically easier and less costly to fix early in the software development life cycle when developers are initially writing the code. By contrast, plugging vulnerabilities later in the development process or after a product ships is a frustrating and expensive undertaking. Further, clear incentive systems help management communicate the company's security values and offer benchmarks for success.

Why not just use a "fitness for purpose" warranty to encourage vendors to write secure systems?

Well, some have suggested that the way to reduce software vulnerabilities is for customers to sue vendors or take other legal action. This is not something that I would sign up to, as long as we continue to see market forces improve software. Introducing vendor liability to solve security flaws hurts everybody, including employees, shareholders, and customers, because it raises costs and stifles innovation.

So, it's down to the organisation producing the software to get its security act in order then?

Employers have to provide workers with the tools and training they need to succeed. Developers certainly understand this: there are literally thousands of developer tools on the market. Despite this proliferation, until recently application-focused security tools were ponderous and of limited use - their feature sets were thin and they served up numerous false positives.

By contrast, today there is an entire category of source-code analysis products that offer automated security testing capable of checking software code against databases containing thousands of common coding flaws. These products arm developers with powerful tools to fix major problems before they ship, saving considerable time, money, and receding hairlines.
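To give a flavour of the idea - and only a flavour, since this is a grep-level toy of my own devising, not how Fortify or any real analyser works - rule-based scanning boils down to matching source code against a catalogue of known-dangerous constructs:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.List;
    import java.util.Map;
    import java.util.regex.Pattern;

    // A toy, line-by-line illustration of rule-based scanning. The three
    // rules below are illustrative, not a real flaw database.
    public class TinyScanner {

        // pattern -> finding description
        private static final Map<Pattern, String> RULES = Map.of(
            Pattern.compile("executeQuery\\s*\\(\\s*\".*\"\\s*\\+"),
                "possible SQL injection: query built by string concatenation",
            Pattern.compile("Runtime\\.getRuntime\\(\\)\\.exec"),
                "possible command injection: external command execution",
            Pattern.compile("new Random\\s*\\("),
                "weak randomness: java.util.Random is not cryptographically secure");

        public static void scan(Path sourceFile) throws IOException {
            List<String> lines = Files.readAllLines(sourceFile);
            for (int i = 0; i < lines.size(); i++) {
                for (Map.Entry<Pattern, String> rule : RULES.entrySet()) {
                    if (rule.getKey().matcher(lines.get(i)).find()) {
                        System.out.printf("%s:%d: %s%n", sourceFile, i + 1, rule.getValue());
                    }
                }
            }
        }

        public static void main(String[] args) throws IOException {
            scan(Path.of(args[0]));  // e.g. java TinyScanner AccountLookup.java
        }
    }

Real products go much further than single-line pattern matching - tracking tainted data across calls and files, for instance - which is what keeps the false-positive rate tolerable.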

In addition to helping developers, this new generation of security tools available from a number of vendors, including Fortify Software, offers adaptable security metrics that can be used for a number of management purposes. Managers can track the number of high, medium and low-level security vulnerabilities in an individual developer's code as part of an incentive system designed to reduce flaws and train developers in how to avoid making the same errors in the future.
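As a back-of-envelope sketch of that metrics idea (the Finding shape and the names in it are hypothetical, not any vendor's format), tallying scan results per developer by severity is straightforward:

    import java.util.EnumMap;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // Tally scan findings per developer by severity, as raw material for
    // the kind of incentive scheme described above.
    public class SecurityMetrics {

        enum Severity { HIGH, MEDIUM, LOW }

        record Finding(String developer, Severity severity, String description) {}

        public static Map<String, Map<Severity, Long>> tally(List<Finding> findings) {
            Map<String, Map<Severity, Long>> byDev = new HashMap<>();
            for (Finding f : findings) {
                byDev.computeIfAbsent(f.developer(), d -> new EnumMap<>(Severity.class))
                     .merge(f.severity(), 1L, Long::sum);
            }
            return byDev;
        }

        public static void main(String[] args) {
            List<Finding> sample = List.of(
                new Finding("alice", Severity.HIGH, "SQL injection"),
                new Finding("alice", Severity.LOW, "weak randomness"),
                new Finding("bob", Severity.MEDIUM, "command injection"));
            tally(sample).forEach((dev, counts) -> System.out.println(dev + " -> " + counts));
        }
    }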

Lastly, developers need more training in secure coding. Developers have learned their craft in many ways - in tech schools, through self-instruction, or in computer-science classes - but until recently even those who received a formal education in development were rarely taught common best practices in secure coding. Books by Gary McGraw, and by Michael Howard and David LeBlanc, offer some of the best resources on the topic.
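By way of a single example of the sort of practice those books drum in - treat all input as hostile, and prefer allow-lists to block-lists - here is a minimal sketch (mine, with a made-up validation rule):

    import java.util.regex.Pattern;

    // Allow-list input validation: accept only what is known to be safe,
    // rather than trying to enumerate everything that is dangerous.
    public final class InputValidation {

        // Illustrative rule: 1-32 characters of letters, digits, dot, dash, underscore.
        private static final Pattern SAFE_NAME = Pattern.compile("[A-Za-z0-9._-]{1,32}");

        public static String requireSafeName(String input) {
            if (input == null || !SAFE_NAME.matcher(input).matches()) {
                throw new IllegalArgumentException("rejected untrusted input");
            }
            return input;
        }

        public static void main(String[] args) {
            System.out.println(requireSafeName("alice_01"));     // passes
            System.out.println(requireSafeName("' OR '1'='1")); // throws
        }
    }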

So, training; and metrics that can be the basis of process-improving feedback. If security metrics form the basis for a "blame culture", things could get very dysfunctional.

In the end, what security requires is the same attention any business goal needs. Employers should expect their employees to take pride in and own a certain level of responsibility for their work. And employees should expect their employers to provide the tools and training they need to get the job done. With these expectations established and goals agreed on, perhaps the software industry can do a better job of strengthening the security of its products by reducing software vulnerabilities. ®
