TPC adds power suckage to benchmarks

Performance (per watt) anxiety

If server makers are already anxious about how big their iron is, they'll now also need to start worrying about how cool they are.

The Transaction Processing Performance Council (TPC) is a consortium of server, operating system, and database software makers that steers the development, running, auditing, and reporting of a suite of online transaction processing and data warehousing benchmark tests. The results of these tests are used for one-upmanship by vendors and as part of purchasing decisions by IT departments, and now the TPC is adding another set of metrics for both camps to take into consideration: energy.

The TPC-C OLTP test, which has been in use since 1992 and which is in its fifth revision, is arguably the most popular system-level benchmark test in the history of the computer industry. The TPC-C test is a collection of different workloads that are associated with the data processing needed to run a warehouse - a physical warehouse with forklifts and pallets, not one crammed with data being tickled by SQL. The workload includes order processing, inventory, and other operations, and the TPC-C metric is the number of new orders a database can process per minute while supporting the other work.
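To make the metric concrete, here is a hedged sketch of a TPC-C-style driver loop in Python. The transaction mix weights below follow the proportions commonly cited for the benchmark (New-Order around 45 per cent, Payment around 43 per cent, with the remainder split among status, delivery, and stock checks); treat the exact figures, and all the names here, as illustrative assumptions rather than the audited spec.

```python
import random

# Hypothetical TPC-C-style transaction mix (weights are an assumption,
# not quoted from the official specification).
TX_MIX = {
    "new_order": 45,
    "payment": 43,
    "order_status": 4,
    "delivery": 4,
    "stock_level": 4,
}

def run_interval(total_txns: int, seed: int = 0) -> dict:
    """Simulate one measurement interval, counting transactions by type."""
    rng = random.Random(seed)
    types, weights = zip(*TX_MIX.items())
    counts = dict.fromkeys(types, 0)
    for _ in range(total_txns):
        counts[rng.choices(types, weights=weights)[0]] += 1
    return counts

def tpmc(counts: dict, minutes: float) -> float:
    """The headline metric counts only New-Order transactions per minute,
    even though the system must carry the rest of the mix at the same time."""
    return counts["new_order"] / minutes

counts = run_interval(100_000)
print(f"tpmC over a 10-minute interval: {tpmc(counts, 10):,.0f}")
```

The point of the sketch is the last function: the other four transaction types are load the system must sustain, but only new orders are counted toward the reported number.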

As with all TPC tests, all of the hardware and software used in the TPC-C test has to be itemized, and vendors have to provide list prices for all of the components as well as the discounts that a typical customer would get. The results have to be audited by experts certified by the TPC, and both performance and price/performance metrics are required.

The more modern TPC-E test, which was launched in March 2007, simulates the data processing of an online stock brokerage, and uses real data along with simulated customers based on census data from the United States and Canada.

The TPC-E test was designed to be easier to implement, but harder to game. And as you might expect, it has not exactly been popular with server and systems software makers - even though they designed it by committee themselves, starting in 2005. Only 29 systems have been tested in the three years since its launch, which makes the TPC-E test basically useless.

The other test that is getting an energy component is the TPC-H data warehousing test, which tests how well or poorly a system or a cluster of systems can process ad hoc queries.

The TPC-Energy spec is an optional component of these three tests, not a requirement, says Mike Nikolaiev, who is chairman of the committee that drafted the spec and who gets his paycheck as the manager of the systems performance group at Hewlett-Packard. While the Standard Performance Evaluation Corporation has a larger family of benchmarks, a number of which measure server performance and a few of which have an energy component, the three TPC tests and their energy metric overlay are distinct in that they require pricing on the systems and independent auditing.

"We want to make sure we have a level playing field here," Nikolaiev said. And in a possible good sign, the 22 members of the consortium who were around to vote on the spec (there are 24 members in total, including all the key server and operating system/database players) unanimously ratified the spec. "That has never happened with a TPC benchmark before," according to Nikolaiev.

The TPC-Energy spec shows vendors how they need to attach power meters to the systems under test, and it will not only look at the electricity consumed by the entire system under test, but also examine the idle power of the same system when it is not processing transactions but kept in a state of being able to process the first transaction.
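A rough sketch of the measurement the spec describes: average the watt readings from the attached meter over the loaded run, and again at idle with the system still ready to take its first transaction. All sample values and function names below are invented for illustration.

```python
# Hedged sketch: averaging power-meter samples for the two states the
# TPC-Energy spec cares about. The readings here are made up.
def average_watts(samples: list[float]) -> float:
    """Mean of a series of instantaneous watt readings from the meter."""
    if not samples:
        raise ValueError("no meter samples")
    return sum(samples) / len(samples)

active = [412.0, 418.5, 409.8, 421.2]  # watts, sampled under full load
idle = [188.3, 186.9, 190.1]           # watts, idle but ready for work

print(f"active: {average_watts(active):.1f} W, idle: {average_watts(idle):.1f} W")
```

The idle figure matters because a server that sips power under load but guzzles it while waiting for work still runs up the electric bill.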

To keep vendors from gaming the test, all elements of the system under test (including any funky cooling elements) have to be commercially available, and vendors have to measure air intake on the system racks and keep an ambient intake temperature of 20 degrees Celsius - no super-chilling the data center to allow a machine to do more work per watt.

Energy use is measured in three parts of the system - the application servers, the database servers, and the storage systems - so vendors can show the relative efficiency of different components. The TPC has also come up with a software toolset called the Energy Measuring System that all system testers will use to monitor energy usage and collect data during the test. This means the collection of energy data will be absolutely consistent across different vendors.
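The subsystem breakdown rolls up into a single performance-per-energy figure. A minimal, hypothetical sketch of that arithmetic follows; the assumption here is that the headline energy metric is expressed as watts per unit of performance (for TPC-C, watts per thousand tpmC), and the sample numbers are invented.

```python
# Invented per-subsystem average power draw, in watts, for the three
# measured parts of a system under test.
SUBSYSTEM_WATTS = {
    "application_servers": 1_250.0,
    "database_servers": 3_400.0,
    "storage": 2_150.0,
}

def total_watts(subsystems: dict) -> float:
    """Full-load draw of the whole system under test."""
    return sum(subsystems.values())

def watts_per_ktpmc(subsystems: dict, tpmc: float) -> float:
    """Assumed energy metric: average watts per thousand new-order
    transactions per minute (lower is better)."""
    return total_watts(subsystems) / (tpmc / 1_000.0)

print(f"total draw: {total_watts(SUBSYSTEM_WATTS):,.0f} W")
print(f"energy metric: {watts_per_ktpmc(SUBSYSTEM_WATTS, 850_000):.2f} W/KtpmC")
```

Because the metric divides watts by performance, a vendor can improve its number either by cutting power draw or by pushing more transactions through the same box - which is exactly the arms race the TPC is inviting.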

Nikolaiev is hopeful that the TPC-Energy overlay to the TPC tests will be popular, and with vendors looking for every angle they can find to peddle systems, it seems reasonable that a different kind of arms race will start based on performance per watt instead of just performance.

"The main vendors will jump in right away," says Nikolaiev. "And this will create a lot of peer pressure."

The lack of peer pressure and the closing of loopholes in the TPC-E test have doomed that better benchmark to the back benches after an initial boost of enthusiastic jawboning from the server makers. HP is going to be putting out TPC-Energy results soon, and Nikolaiev says others will start rolling out results in the next couple of months. He expects server makers to run double tests, pitting solid state disks against disk drives within the same server and running the same software stack, to show why SSDs are worth all that extra money. No doubt the first tests will come on systems using the new x64, Itanium, and Power7 processors that are expected in February and March. ®
