Buggy spreadsheets: Russian roulette for the corporation
The chasm between common sense and computer programming
How many scenarios can you imagine where a momentary loss of concentration could cost over $1bn? Perhaps a nuclear power station meltdown... or a currency trader hitting a few wrong keys? Well, another possibility is a simple spreadsheet error.
In October 2003, soon after announcing third quarter earnings, Fannie Mae had to restate its unrealised gains, increasing them by $1.2bn. This highly unwelcome outcome was said to stem from "honest mistakes made in a spreadsheet used in the implementation of a new accounting standard".
The really, really bad news is that millions of similar errors are almost certainly being made every year, many of them in business-critical financial spreadsheets. Although they are the quintessential end-user tool, spreadsheets of any complexity are just as hard to write and maintain as any other kind of software - if they are to yield consistently accurate results, anyway.
At Cutter Consortium's inaugural European Summit in March, I took part in a spirited panel discussion about SOA and related matters. At one point the redoubtable Oliver Sims suggested that Excel was the world's most widely-used software development tool - a statement that struck me as probably correct. But if so, how grave are the implications? After all, the overwhelming majority of spreadsheet users are not professional programmers, and it is doubtful whether most of them have even been formally trained to write spreadsheets.
If only organisations realised the harm they are laying themselves open to, they might be inclined to seek advice from a suitably qualified expert. Such as Dr Louise Pryor, an independent software risk consultant who specialises in complex financial models and spreadsheets.
As a Fellow of the Institute of Actuaries, with a PhD in Computer Science and extensive hands-on experience in commercial software development, Pryor is ideally equipped to explain how spreadsheets should be designed, written, and tested. Her website provides detailed and convincing evidence of prevailing spreadsheet error rates, some hair-raising war stories, good advice, and loads of other useful information.
Other valuable resources include the spreadsheet research (SSR) site maintained by Professor Ray Panko of the University of Hawaii, and the European Spreadsheet Risks Interest Group (EuSpRIG). Several books have been written on the subject; for instance Patrick O'Beirne's Spreadsheet Check and Control.
Panko has collected the best available evidence for spreadsheet error rates, based on field audits by organisations such as Coopers and Lybrand, KPMG, and HMCE (the UK's Customs and Excise department). Of 54 spreadsheets audited between 1997 and 2000, no fewer than 49 were found to contain significant errors - a defect rate of 91 per cent. In a more recent exercise, every single one of 30 project financing spreadsheets scrutinised had at least one mistake.
This is not to suggest that spreadsheets are uniquely bug-ridden; there is convincing evidence that virtually all non-trivial software contains defects. According to Panko's Human Error website, "spreadsheet error rates look like error rates in normal program development and in code inspection". He estimates typical human error rates at about 0.5 per cent for simple mechanical tasks like typing, and five per cent for more complex logical activities such as programming.
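Panko's per-task figures help explain why audits find errors in nearly every spreadsheet: even a small chance of a mistake in each formula compounds quickly across hundreds of cells. A back-of-envelope sketch (the cell counts below are hypothetical, and it assumes errors occur independently per formula):

```python
def p_at_least_one_error(per_cell_rate: float, n_cells: int) -> float:
    """Probability that a spreadsheet contains at least one error,
    assuming each of n_cells formulas independently goes wrong with
    probability per_cell_rate."""
    return 1.0 - (1.0 - per_cell_rate) ** n_cells

# Using Panko's ~5 per cent rate for complex logical tasks:
for n in (10, 50, 150):
    print(f"{n:>3} formulas: {p_at_least_one_error(0.05, n):.1%}")
```

At a five per cent per-formula rate, a model with only 50 non-trivial formulas is already more likely than not to contain an error, which is broadly consistent with the audit findings above.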