
How to counter premature optimisation

Is optimisation the root of all evil - or, at least, of many bugs?

There was a time in my career, in the 1960s, when optimisation wasn't optional. System memories were measured in kilobytes, instruction times in tens or hundreds of microseconds. We planned our programs around those limited resources.

No longer. In fact, program optimisation is rarely needed these days despite programs that are several orders of magnitude larger. It’s been less expensive to pay for more computer capacity than for more programmer time for at least a decade. Yet programmers still spend time optimising even before there is a demonstrated need for it.

This causes a number of problems - let me count the ways:

  1. It wastes programmer time.
  2. It makes code more complex, leading to maintenance problems.
  3. It introduces subtle bugs.
  4. It increases debugging time.
  5. It delays initial production use of the program.
  6. It often becomes irrelevant after the next processor upgrade.
  7. It makes modification of the optimised parts of the application very difficult, even by the original programmer.
  8. In rare cases, the optimised version discovers previously unknown hardware glitches.
  9. Optimised code can depend on undocumented side effects, failing when the hardware is updated.
  10. Optimised code is often the basis for bragging competitions amongst top programmers, unfortunately spreading the infection.

I’ve personally made most of the mistakes listed above, starting when optimisation was a necessity and continuing well past the point where it was even useful. One personal instance comes to mind:

In 1967 I was working as a systems analyst on a CDC3300, with cards for both source and object code. When I ran a compile, the source code was punched out in a compressed format. The compressed deck was easier to handle, but decompressing it was really slow, so I replaced the Fortran decompressor with assembler, and it ran six times as fast.

Getting it to work, installing it in the system, building a standard command card and bragging to the staff about how fast it was took about two weeks. Three months later, the centre added extra 7.5 MB disks for the programmers. Source was kept on disk, edited in batch or online, and the decompressor was history.

Saving a minute per compile never came close to the hundred hours or so that I had invested. I didn't learn that lesson for another nine years.
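The break-even arithmetic is worth spelling out. Here is a rough back-of-the-envelope sketch in Python, purely for illustration, using only the figures above (about a hundred hours invested, about a minute saved per compile):

    # Rough break-even check using the figures from the story above.
    hours_invested = 100            # time spent on the assembler rewrite
    minutes_saved_per_compile = 1   # speed-up each time a compile was run

    break_even_compiles = hours_invested * 60 / minutes_saved_per_compile
    print(f"Compiles needed to break even: {break_even_compiles:.0f}")
    # Roughly 6,000 compiles - and the disks arrived three months later.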

Optimisation good practice

The following quote is right to the point:

Rules of Optimisation (from M.A. Jackson):

  • Rule 1: Don't do it.
  • Rule 2 (for experts only): Don't do it yet.

When hardware performance accelerated during the 1980s, most expert programmers continued their optimising habits, to the net detriment of overall software delivery and reliability. It only became clear in hindsight that optimisation was, in most cases, no longer a useful practice. But perhaps it was fun…

Today, the key list of development deliverables often doesn't even mention program performance, because the systems available can easily scale to extremely high capacities. Only the very largest loads, such as the IRS and Social Security, still need optimisation techniques, and mostly for peripheral usage rather than the processor.

At the lower end of system size - small and medium business - newer techniques such as virtualisation and replication of software across an array of identical servers can provide the same expansion capability that has existed in the mainframe world for years, at much lower incremental cost.
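To make the idea concrete, here is a minimal sketch - my own illustration, not any particular product - of a stateless service in Python. Because it holds no local state, capacity is added by running identical copies on more servers behind a load balancer rather than by hand-tuning the code on one machine; the REPLICA_ID and PORT environment variables are assumptions made for the example.

    # Minimal stateless HTTP service: run the same program on every server
    # and let a front-end load balancer spread requests across the copies.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    import os

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Every replica answers identically; the reply merely names
            # which copy served the request, handy when testing a cluster.
            body = f"served by replica {os.getenv('REPLICA_ID', '0')}\n".encode()
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("", int(os.getenv("PORT", "8080"))), Handler).serve_forever()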

Optimisation hasn't disappeared, but like the industry, it has enlarged its scope. We now optimise software systems for throughput, hardware for reliability, and clusters for power and air-conditioning efficiency. And, perhaps, to take advantage of multicore processors. But this is usually done for us by the vendors rather than by individual application programmers.

The overall capability of an IT installation is now viewed as a competitive edge in the marketplace, and that includes in-house software costs as well as purchased software, equipment, floor space, power requirements and operational personnel. Optimisation of this whole environment against company performance and profitability is now the usual measure of effectiveness in the executive suite.

Optimisation hasn't gone away; it has grown up.

Bill Nicholls is an IT industry veteran: with Univac from 1964, with Weyerhaeuser until 1985, and since then a software developer and writer for Byte and Byte.com.
