Sun declares war on TPC benchmarks
In for a long campaign?
Sun Microsystems won't be releasing any industry-standard TPC benchmarks for its new Sun Fire 15000 Server.
It's going to take its bucket-and-spade and play in the application-specific sandpits, such as the SAP, Oracle Apps and PeopleSoft benchmarks. At the Sun Fire launch yesterday it cited SAP, SPEC's Java benchmarks and the scientific computing Fluent benchmarks instead.
The news isn't so surprising: Sun has had a downer on TPC tests for some time, arguing that they're a synthetic measure of transaction processing or database performance, and as such don't reflect real-world returns. But Sun is still the first major vendor to shun TPC quite so boldly.
"Nobody gets 200,000 transactions per minute in the real world," Ken Wong, Sun's Group Marketing Manager for Enterprise Servers, told us.
Rivals have been keen to suggest that this was because Sun's former systems - based on its ageing UltraSPARC II processor - were lagging theirs in the performance stakes. But few of the rivals have, under the gentlest interrogation by us, been prepared to argue that the TPC benchmarks are anything other than a synthetic test.
Sun capitalised on this at its launch yesterday, quoting from the IBM book - literally. It cited IBM's Mainframe RedBook, which notes that "most Unix systems cannot achieve the throughputs and process utilisations that reflect TPC-C results." And Sun COO Ed Zander added: "we're optimizing for them less and less every day."
Naturally the Transaction Processing Performance Council sees it differently:
"TPC-C's performance measurement metric, tpm-C, does not just measure a few basic computer or database transactions, but measures how many complete business operations can be processed per minute," it says in its introduction to the latest TPC-C benchmark.
(There are three TPC benchmarks, the OLTP 'C' mark being the most hotly contested, as it best measures the database performance of the big-iron guys.)
However the great advantage of the Transaction Processing Performance Council is that it's a neutral organisation: there are 25 full members, including the leading hardware and software vendors. So it's harder to massage a test to favour any particular hardware vendor. That integrity isn't so clear with the much smaller, commercial application vendors.
In supporting material, Sun boasts that its systems trounce the competition when application-specific benchmark results are compared, and that user-application metrics reflect all aspects of the system's performance (I/O, memory).
"TPC-C really only used by competitors & industry analysts" reads one Sun bullet point. But wishing TPC away isn't going to be that easy, we suspect, if you the customers find that finance directors need to see some kind of objective performance figures - no matter how synthetic - before signing off. ®