A case for software benchmarking

Wheat and chaff


Comment I do not normally have much truck with benchmarking, at least not with such public tests as TPC-C (the standard for transaction processing) and TPC-H (for data warehousing). These are typically marketing exercises – the tests are artificial and limited, and vendors with big bucks can throw enough money at them to ensure that they do well. Thus, for example, I did not publicise IBM's TPC-C figure for the latest release of DB2 last year, even though it was more than twice as good as any previous figure. No, to my mind the only useful benchmarks are those performed specifically for individual customers, with their own data and their own workload. But these, of course, are expensive to run.

This, at least, was my view until recently, when I found myself discussing benchmarking with one of the discipline's founders, who was at the University of Wisconsin back in the 80s. The point made to me was that in the early days benchmarks really did serve a useful purpose. For example, it turned out that some of the databases tested back then could not handle such things as three-way joins: the optimiser simply thrashed about and produced nothing. The contention, therefore, is that benchmarking can be useful where there is uncertainty, doubt and a great deal of marketing hype about performance – in other words, in the early days of a technology.
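To make the three-way join point concrete, here is a minimal sketch of such a query run against an in-memory SQLite database. The schema and data are hypothetical, purely illustrative of the shape of query that reportedly defeated some early optimisers.

```python
# A three-way join of the kind early optimisers are said to have choked on.
# Tables and data are invented for illustration only.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders    (id INTEGER PRIMARY KEY, customer_id INTEGER);
    CREATE TABLE items     (order_id INTEGER, product TEXT);
    INSERT INTO customers VALUES (1, 'Acme');
    INSERT INTO orders    VALUES (10, 1);
    INSERT INTO items     VALUES (10, 'widget');
""")

# The join itself is trivial for any modern optimiser; the point is that
# a benchmark containing queries like this exposed engines that were not.
rows = cur.execute("""
    SELECT c.name, i.product
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    JOIN items  i ON i.order_id    = o.id
""").fetchall()
print(rows)  # [('Acme', 'widget')]
```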

Is there such a technology today? Yes, there is. Vendors (recent examples include Software AG, Sunopsis and Informatica) are pouring into the EII (enterprise information integration) space to offer federated query capability – that is, support for real-time queries against heterogeneous data sources.
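A federated query engine pushes sub-queries down to each source and joins the results itself. The sketch below, with two SQLite connections standing in for heterogeneous back-ends, shows the idea in miniature; all table and column names are hypothetical.

```python
# A toy "mediator" over two independent data sources.
# Each SQLite connection stands in for a separate back-end system.
import sqlite3

crm = sqlite3.connect(":memory:")   # stand-in for source A (customer data)
crm.execute("CREATE TABLE customers (id INTEGER, region TEXT)")
crm.execute("INSERT INTO customers VALUES (1, 'EMEA'), (2, 'APAC')")

erp = sqlite3.connect(":memory:")   # stand-in for source B (billing data)
erp.execute("CREATE TABLE invoices (customer_id INTEGER, total REAL)")
erp.execute("INSERT INTO invoices VALUES (1, 250.0), (1, 100.0), (2, 40.0)")

# Push each sub-query down to its own source...
regions = dict(crm.execute("SELECT id, region FROM customers"))
totals = dict(erp.execute(
    "SELECT customer_id, SUM(total) FROM invoices GROUP BY customer_id"))

# ...then perform the cross-source join in the mediator itself.
result = {regions[cid]: amount for cid, amount in totals.items()}
print(result)  # {'EMEA': 350.0, 'APAC': 40.0}
```

The performance question the column raises is exactly about this last step: how well the engine plans, pushes down and combines such sub-queries across many sources, not merely whether it can.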

To date, much of the focus in this market has been on how you set up and define these queries, and only a relative handful of vendors (notably Composite Software, Callixa and IBM) have put much emphasis on performance. But performance is a big issue: if you cannot get good performance across a spectrum of queries, then the technology is not worth investing in in the first place. Indeed, I know of one major company (which will remain nameless) that, to its credit, has not introduced a product into this space precisely because it cannot get it to perform.

I am inclined to think that some benchmarking, perhaps based on a distributed TPC-H model, would be of benefit here, at least in the short term. Vendors can currently wave performance figures around willy-nilly without fear of contradiction, and it is pretty much certain that some federated query projects will fail because the selected product fails to perform. While benchmarks are only a snapshot, in this case they may serve as an initial means of sorting the wheat from the chaff (if only because the chaff will not participate). To that extent, I am in favour of them.

© IT-Analysis.com

Related stories

Sun could quell database hunger with Unify buy
IBM benchmark leaves server rivals breathless
IBM and HP take phony benchmark war up several notches
