If software vendors made everything else
Atishoo atishoo, who cares if it falls down
Comment So what is it about quality? We all want it, and we all expect it in the other products we buy, yet a goodly number of us will cut corners on delivering it given half a chance. That, at least, is how it seems from a couple of responses to last week's blog piece, `Unbreakable'? Software? Harr!
I made the silly mistake of suggesting that `unbreakable' software, while a practical improbability, was a good target nonetheless. This was on the basis that the closer you get to it, the better off users will be. Confidence in the reliability of applications is what they want – a bit like knowing the walls of your house are likely to stay standing for a while, so long as nothing unusual happens.
“Actually I think you'll find that house building is heading the way of software these days,” fired back Mark Colby. “Last new house I saw, all the grub screws in all the light switches and sockets (and I do mean *all*) were loose - the wires had just been poked in and the covers put on. And the back boxes were secured to the walls by a single bent nail rather than two screws.”
It is a sad thought, but perhaps he’s right. Software vendors have demonstrated for years that selling half-baked products, where seeing what breaks in the field is the preferred Quality Assurance methodology, is a successful business plan. And it is: they get away with it.
Move that business model to another industry – house building, or better still bridge building – and ponder what might happen. It would be a bit like every bridge still being built like the Tacoma Narrows Bridge, just to see how many user complaints get generated.
Bill Nicholls also got in touch to suggest that, while this was largely true of many software developments, it needn’t be that way. “Two things could dramatically change that situation,” he wrote. “First, instead of using typical C type programming languages, use Ada95 or Ada05. I know this will raise screams from the majority of programmers, but frankly C++ (or similar languages) is not a good choice for reliability.”
Oh it does raise screams, Bill, it does. I got an earful earlier in the year when Reg Developer carried this about Ada. I put it down to the religious bigotry we are all prone to now and then, though being `religious’ in the pursuit of poor quality is an interesting concept.
His second suggestion was that using proper Quality Assurance testing procedures in the first place would be seen as a jolly good idea – maybe not by a company’s Chief Bean Counter, but sure as hell by the customers. He then provided a simple guide to what he meant and gave us permission to quote from it – so I will, extensively. What a nice man.
I can already hear Chief Bean Counters squealing the word `expense', of course, but maybe we have to change the software industry's view of quality before the rest of the world adopts its approach to everything else made, used and eaten. Go that way and we're doomed. So, here is Bill's simple eight-point plan for getting good quality into your software.
1. QA starts with a spec.... Okay, you can stop laughing now.... No, I mean it, stop laughing... Most projects don't have specs worth a damn, and I had to deal with that too.
But the big difference between testing done by programmers and QA done by a separate team is that QA can be very effective because it is an independent group. That's not just important, it's critical. QA must not report to the programming manager, but at least one step, preferably two, up the chain. In my case, the programming managers, the marketing manager and I all reported to the VP, and I had the authority to deny shipment if the product did not pass QA. You would not believe the screaming that caused, but without it, QA would not have succeeded.
2. QA is philosophically different from testing. Testing (usually) says "Let's see if the standard cases work." QA says "Let's see if any input can break the product." Programmers hate this but, judging from results, marketing people love it, because products that pass QA don't screw up in the field.

3. My QA efforts were concentrated on a sophisticated product – a three-level distributed Point Of Sale application with all three levels programmed: cash registers, store systems and corporate-level systems.
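The testing-versus-QA distinction in point 2 can be made concrete with a small sketch. The `parse_price` routine below is invented for illustration – it is not from Bill's actual POS system – but it shows the difference between checking the standard cases and actively trying to break the code with hostile input:

```python
def parse_price(text):
    """Parse a price string like '12.99' into an integer number of cents."""
    text = text.strip()
    if not text:
        raise ValueError("empty price")
    if text.count(".") > 1:
        raise ValueError("malformed price: " + text)
    whole, _, frac = text.partition(".")
    frac = (frac + "00")[:2]  # pad or truncate to two decimal places
    if not (whole.isdigit() and frac.isdigit()):
        raise ValueError("non-numeric price: " + text)
    return int(whole) * 100 + int(frac)

# Testing, as point 2 has it: "do the standard cases work?"
assert parse_price("12.99") == 1299
assert parse_price("5") == 500

# QA: "can any input break the product?" Edge-case and hostile
# inputs must fail loudly, not silently corrupt a till total.
for bad in ["", "   ", "-3.00", "1.2.3", "12,99", "abc"]:
    try:
        parse_price(bad)
        raise AssertionError("accepted bad input: " + repr(bad))
    except ValueError:
        pass  # rejected cleanly, which is what QA demands
```

The "testing" half would pass even if the malformed inputs crashed the till or returned garbage; only the "QA" half would catch that in the lab rather than in the field.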
4. Each QA effort started with nailing the marketing folks to a chair (or some similar effort) until I had extracted all of their customer promises. This was the starting point for the spec. I also checked with the programmers to see what other info they had, and usually found (sometimes considerable) misunderstanding between the two. This in itself was beneficial.
This part (building the spec) should be extended into a signoff document between marketing, programming and QA. This would have reduced a number of problems I ran into later.
5. Given the specs, QA test design started - an exhaustive test of each component starting at the cash register functions, but integrated into the overall test process. The details are extensive, but *this* is the critical step for a good QA.
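Bill doesn't spell out how his test designs were organised, but one plausible shape for points 5 and 6 – component tests keyed by an identifier, so every written-up error carries a test reference the programmers can trace – might look like this (the test IDs and check functions are hypothetical):

```python
def run_suite(cases):
    """Run (test_id, description, check) tuples; return short written
    failure reports with test references, as in point 6."""
    failures = []
    for test_id, description, check in cases:
        try:
            check()
        except AssertionError as err:
            failures.append("%s: %s\n  FAILED: %s" % (test_id, description, err))
    return failures

# Hypothetical cash-register-level checks, the lowest of the three
# levels, each keyed for later reference in the error write-up.
def register_total():
    assert 100 + 250 == 350, "item totalling broke"

def register_void():
    assert max(350 - 350, 0) == 0, "voided sale left a balance"

cases = [
    ("REG-001", "cash register: item totalling", register_total),
    ("REG-002", "cash register: void a sale", register_void),
]
reports = run_suite(cases)  # empty list when every component passes
```

The point of the keyed layout is that a five-page error report stays actionable: each two-or-three-line entry points straight back at the exact test that produced it.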
6. The QA is executed, with all errors noted in writing, with test references, for the programming staff. This was two or three typewritten lines per error, and my first QA ran to five pages!
7. The programmers hated me, except for a mature couple (in their 40s) who thanked me. This is one way to find the best project leaders. Programmers need to be educated as to the value of a working program delivered to a customer – most of the young ones haven't a clue.
8. There is a lot more to be said, but this outlines the major parts of the process. QA can and does work, when properly implemented. It will make a significant difference in customer relations and company profitability. ®