Benchmarks, bunkum and baloney

Performance figure hype

Comment There are three types of claims made by vendors with respect to benchmarks: pure performance figures, competitive performance comparisons, and TPC figures. In my view, none of these has real marketing value, though the second potentially has some utility for the supplier. Let me explain.

A good example of the over-hyping of pure performance figures can be found in the event streaming market. More or less every vendor in the market puts out figures for its event handling capability, ranging from tens of thousands, through hundreds of thousands, up to claims in the millions of events per second.

Now, what don't these claims encompass? First, a definition of what an event consists of: if my event is 10 bytes and yours is 20 bytes then, all other things being equal (which they aren't), I am going to get better results than you are. Second, even if we agree on an event size, the figures take no account of the fact that some vendors can process deltas and some can't: that is, the ability to process only the changes to the data rather than each event in its entirety.
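To make the first point concrete, here is a minimal sketch, with entirely hypothetical vendors and numbers, of how normalising claims to bytes per second rather than events per second can reverse a comparison:

    # Hypothetical claims: events/second and the event size each one assumes.
    claims = {
        "Vendor A": {"events_per_sec": 1_000_000, "event_bytes": 10},
        "Vendor B": {"events_per_sec": 600_000, "event_bytes": 20},
    }

    for vendor, c in claims.items():
        # Normalise to bytes/second: the raw data rate actually being handled.
        bytes_per_sec = c["events_per_sec"] * c["event_bytes"]
        print(f"{vendor}: {c['events_per_sec']:,} events/s "
              f"= {bytes_per_sec / 1_000_000:.0f} MB/s")

Vendor A "wins" on events per second, yet Vendor B is actually moving more data: 12 MB/s against 10 MB/s.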

Third, there is the processor issue. Unless everybody is running on the same hardware, the relative merits of one performance figure versus another are opaque, to say the least. Moreover, if you are going to have to spend hundreds of thousands of dollars on an implementation, and one product needs a two-CPU system where another needs a single CPU (or even 16 instead of eight), how much difference is the extra cost of hardware really going to make?
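Again purely as a sketch, with made-up figures: dividing a claimed rate by the number of CPUs used, and by the hardware cost, shows how a headline winner can lose once the hardware is accounted for:

    # Hypothetical results: claimed throughput, CPUs used, hardware cost.
    results = {
        "Vendor A": {"events_per_sec": 500_000, "cpus": 2, "hw_cost_usd": 40_000},
        "Vendor B": {"events_per_sec": 350_000, "cpus": 1, "hw_cost_usd": 20_000},
    }

    for vendor, r in results.items():
        per_cpu = r["events_per_sec"] / r["cpus"]            # work done per processor
        per_dollar = r["events_per_sec"] / r["hw_cost_usd"]  # work done per hardware dollar
        print(f"{vendor}: {per_cpu:,.0f} events/s per CPU, "
              f"{per_dollar:,.1f} events/s per dollar")

Vendor A leads on the raw number, but Vendor B does more work per CPU (350,000 against 250,000) and per dollar of hardware.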

And finally, how much throughput do you really need? In equity trading in New York, roughly 150,000 transactions per second is the current rate; for mobile call-rate look-up, on the other hand, less than a tenth of that figure is more than enough. While rates are likely to increase, the performance of all the vendors is more than adequate or, where it is not, you will find that those vendors do not target the very highest throughput requirements, such as algorithmic trading.

Anyway, enough of pure performance and on to competitive performance benchmarks. An example that recently came up was an ETL vendor claiming to have the fastest throughput on the market. Apart from the fact that it had compared itself against just one other vendor (and that was not Ab Initio), as opposed to the 40 or so in the market, it, too, made a big thing of the fact that its figures were achieved on a single-CPU system as opposed to a two-CPU system (see my remarks above).

However, in the ETL space it isn't just about throughput: results also depend on, for example, the transformations you are doing. This vendor ran a number of different tests, and in some of them it came off better than in others.

Naturally, it published only the results most favourable to it (incidentally, I am not naming the company here since, in my experience, every vendor does the same thing), which is why the figures are nothing more than marketing puff. Where there was real value in these comparisons, however, was in allowing the supplier to discover where it did not do so well, so that it could work on those weaknesses in the labs and improve its performance in future.

Finally, as far as TPC figures are concerned: these are artificial, easily suborned, and won by the companies with the most money to throw at them. There is, however, one other class of benchmark that has value: the proofs of concept and benchmarks that users undertake themselves. These have value because they relate directly to the user's requirements and because they give a true feel for the cost of ownership.

To conclude, benchmarks have value when they are conducted for the benefit of individual users and for vendors wanting to identify weaknesses in their product. As for the marketing hype, it's like boys in the locker room claiming "mine's bigger than yours" followed by "no, it isn't - yes, it is" arguments. And it's of about as much value. I could go on with this metaphor but I think I'd better stop.

Copyright © 2006, IT-Analysis.com
