Researchers look to predict software flaws

Vulnerability formula

Want to know how many flaws will be in the next version of a software product? Using historical data, researchers at Colorado State University are attempting to build models that predict the number of flaws in a particular operating system or application.

In an analysis to be presented at a secure computing conference in September, three researchers used monthly flaw tallies for the two most popular web servers - the Apache Software Foundation's Apache web server and Microsoft's Internet Information Services (IIS) server - to test their models for predicting the number of vulnerabilities that will be found in a given code base.

The goal is not to help software developers create defect-free software - which may well be impossible - but to give them the tools to determine where they need to concentrate their efforts, said Yashwant Malaiya, professor of computer science at Colorado State University and one of the authors of the paper on the analysis.

"The possible reasons that vulnerabilities arise are much smaller than the reasons for the number of defects, so it should be possible to reduce the number of vulnerabilities," Malaiya said. "It would never be possible to reduce the issues to zero, but it should be possible to reduce it to a much smaller number."

The research could give developers another tool in the fight to improve programmers' security savvy and reduce the number of flaws that open up consumers and companies to attack. While the number of vulnerabilities found annually had leveled off in recent years, flaws in web applications boosted the total found in 2005.

Moreover, the advent of data-breach notification laws has forced companies, universities and government agencies to tell citizens when a security incident has put their information in peril. The resulting picture painted by numerous breach notifications has not been heartening.

The latest research focuses on fitting an S-shaped curve to monthly vulnerability data, positing that a limited installed base and little knowledge of new software limit the discovery of vulnerabilities in a just-released application, while exhaustion of the low-hanging fruit makes finding vulnerabilities in older products more difficult.
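
For illustration, a minimal sketch of that kind of S-curve fit might look like the following Python, using scipy's curve_fit on hypothetical monthly counts; the three-parameter logistic form and all numbers here are assumptions for the example, not figures from the paper.

    # Fit an S-shaped (logistic) curve to cumulative vulnerability counts.
    # The data and parametrisation are illustrative assumptions only.
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, a, b, c):
        """Cumulative vulnerabilities at month t; b is the saturation level."""
        return b / (1.0 + b * c * np.exp(-a * b * t))

    # Hypothetical cumulative counts, one entry per month since release.
    months = np.arange(1, 25)
    counts = np.array([2, 3, 4, 6, 8, 12, 16, 22, 28, 35, 42, 48,
                       54, 58, 62, 64, 66, 67, 68, 69, 69, 69, 70, 70])

    # Fit the curve; p0 is a rough starting guess for (a, b, c).
    (a, b, c), _ = curve_fit(logistic, months, counts, p0=(0.005, 80.0, 0.5))
    print(f"Estimated saturation level (total expected vulnerabilities): {b:.0f}")

The fitted saturation level b is the quantity of interest: it is the total number of vulnerabilities such a model expects ever to be reported for that version of the software.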

The researchers found that the number of vulnerabilities found in Windows 95, Windows NT and Red Hat Linux 7.1 fit their model quite well, as do those found in the Apache and IIS web servers between 1995 and the present. The web server analysis, which will be discussed in the September paper, suggests that IIS has reached a saturation point, with a lower rate of vulnerabilities discovered than Apache. Moreover, that analysis found that the S-curve relationship holds for broad classes of vulnerabilities, such as input validation errors, race conditions, and design errors.

Some software developers believe that such research could allow product managers to make better decisions about when a software program is ready to be shipped and how many vulnerabilities will likely be found.

"There isn't an engineering manager that wouldn't love to know the number of vulnerabilities they should expect to have after pushing out a product," said Ben Chelf, chief technology officer for Coverity, a maker of source-code analysis tools that can be used to detect potential software flaws. "A VP of engineering can, on the release date, say, 'We expect to find 50 more security issues in this code'. That helps mitigate cost and risk."

Yet the researchers' predictions have been hit or miss, even with a large margin of error of 25 per cent. A paper released in January 2006 predicted that the number of flaws found in Windows 98 would saturate at between 45 and 75; at the time, data from the National Vulnerability Database showed that 66 vulnerabilities had been found, but that number has since risen to 91 as of July.

However, the researchers' prediction for Windows 2000 has apparently been accurate: The current number of vulnerabilities for the operating system is 305, just within the 294-to-490 range given in the computer scientists' paper.

Whether the models become more accurate may depend on getting better data on the number of software flaws discovered after development. The models used to predict future vulnerabilities assume that defect density - the number of software flaws per 1,000 lines of code - remains the same between software versions.

It's not an unreasonable assumption: historically, a company's programming teams tend not to get better, making roughly the same number of mistakes in one version of software as in the next, said CSU's Malaiya.
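
As a back-of-the-envelope illustration of that assumption, the expected flaw count for a new version can be scaled from the previous version's defect density; the code sizes and the vulnerability-to-defect ratio below are made-up figures, not values from the study.

    # Sketch of the constant-defect-density assumption: flaws per 1,000 lines
    # of code (KLOC) carry over between versions, so the expected flaw count
    # scales with code size. The vulnerability-to-defect ratio is an
    # illustrative guess, not the researchers' published value.
    def expected_flaws(kloc_new, defects_prev, kloc_prev, vulns_per_defect=0.01):
        defect_density = defects_prev / kloc_prev       # defects per KLOC, assumed stable
        expected_defects = defect_density * kloc_new    # scale to the new code base
        return expected_defects, expected_defects * vulns_per_defect

    # Hypothetical numbers: previous version was 30,000 KLOC with 15,000 known defects.
    defects, vulns = expected_flaws(kloc_new=40_000, defects_prev=15_000, kloc_prev=30_000)
    print(f"Expected defects: {defects:.0f}, of which roughly {vulns:.0f} may be vulnerabilities")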

However, such observations rely on data that predate the increasing use of static code analysis tools and initiatives by software makers, such as Microsoft, to improve the security of their products.
