Nemesis looms over TPC-C benchmark

Expensive and outdated

TPC-C - that venerable server-cum-database benchmark from the Transaction Processing Performance Council - looks set to be pensioned off at last with the recent announcement of a new OLTP (OnLine Transaction Processing) benchmark specification, TPC-E.

The new benchmark is aimed at moving the touchstone of database and server performance into the world of today rather than yesterday. This is being achieved by shifting the fundamental premise on which the specification is built to a model more fitting to current business needs.

TPC-C was built around a model of a typical database system needed to run a parts warehouse. TPC-E shifts this to a model of a typical brokerage operation, which should give a better simulation of the real-world transactions that now take place over the Internet – including the time delays and fractures that can occur between the distributed parties that contribute to the smooth running of a transaction. In operation, the benchmark goes through the process of interacting with the financial markets to execute customer orders and update the relevant account information files.
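
For illustration only, the sketch below shows the general shape of the kind of trade-order transaction a brokerage model implies: book a customer order and adjust the account in a single atomic unit of work. The schema, table names and the use of Python with SQLite are assumptions made for this example, not details taken from the TPC-E specification.

  import sqlite3

  # Hypothetical, simplified brokerage-style OLTP transaction.
  # Tables and columns are illustrative, not from the TPC-E spec.
  conn = sqlite3.connect(":memory:")
  conn.execute("""CREATE TABLE trades (
                      trade_id INTEGER PRIMARY KEY,
                      account_id INTEGER,
                      symbol TEXT,
                      quantity INTEGER,
                      price REAL)""")
  conn.execute("""CREATE TABLE accounts (
                      account_id INTEGER PRIMARY KEY,
                      balance REAL)""")
  conn.execute("INSERT INTO accounts VALUES (1, 10000.0)")
  conn.commit()

  def execute_order(account_id, symbol, quantity, price):
      # Book the trade and debit the account as one atomic transaction;
      # 'with conn' commits on success and rolls back on any exception.
      with conn:
          conn.execute(
              "INSERT INTO trades (account_id, symbol, quantity, price) "
              "VALUES (?, ?, ?, ?)",
              (account_id, symbol, quantity, price))
          conn.execute(
              "UPDATE accounts SET balance = balance - ? WHERE account_id = ?",
              (quantity * price, account_id))

  execute_order(1, "ACME", 100, 42.50)
  print(conn.execute("SELECT balance FROM accounts WHERE account_id = 1").fetchone())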

The Council has also aimed at getting more reality into the benchmark by creating 'real' names, addresses and business details for the test database from data culled from US census and New York Stock Exchange information.

TPC-E is also scalable, so that the number of customers can be varied. This should expand the benchmark’s usefulness in allowing database and server vendors to pitch at small and mid-range market sectors specifically, and at a price that smaller vendors can afford. The price of conducting benchmark tests has been one of the recent criticisms of TPC-C, so much so that only the major vendors now attempt it.

One reason for this is the suggestion that results can be affected by factors such as the number of disk drives used, rather than the total disk capacity. So more small-capacity disks can mean a better test result. But it also means an impractical installation configured specifically for the test, sometimes with several thousand disk drives and a price tag running well into the millions. Such configurations bear ever less relation to the average business requirement, though the benchmark results are still seen as good guidance.
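
As a back-of-the-envelope illustration of why spindle count rather than capacity can drive the score, assume each hard disk sustains roughly 150 random I/O operations per second (an assumed figure for this example, not one from any specification): the same total capacity spread across ten times as many smaller drives delivers roughly ten times the aggregate I/O throughput.

  # Rough illustration: for random OLTP I/O, aggregate throughput scales with
  # the number of spindles rather than with total capacity.
  IOPS_PER_DRIVE = 150  # assumed per-disk figure, for illustration only

  def aggregate_iops(num_drives, iops_per_drive=IOPS_PER_DRIVE):
      return num_drives * iops_per_drive

  # Two hypothetical ways of providing roughly the same capacity (~7 TB):
  print(aggregate_iops(10))    # 10 x 750 GB drives  -> ~1,500 IOPS
  print(aggregate_iops(100))   # 100 x 73 GB drives  -> ~15,000 IOPS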

TPC-E is currently in draft as Version 1.0.0, and more information, together with the detailed specification, can be found here. ®
