Pet vs Pet: .NET ‘trounces’ Java

It's a dog's life

Java expert The Middleware Company has optimized Sun's showcase J2EE Pet Store application, and reckons it still runs like a dog.

Or in some cases, like a dog with a kennel tied to its hind legs: by refusing to function at all. Sun slammed the tests for concealing important information - the testers acknowledge that app servers from different vendors were used, but decline to name them - while conceding that on low-end hardware Wintel will perform faster. Both platforms were tested on Xeon-based Compaqs.

The original Pet Store wasn't designed to be a benchmark, but a demonstration of a range of programming techniques. Version 2.0 of each application added XML-based web services, and distributed database access with rollback, each of which the Middleware Company measured in a separate suite of tests. They also threw in a price/performance metric.

.NET significantly outperformed the J2EE version on 2-, 4- and 8-way machines on all three suites: the web benchmarks, TP and web services. In one case, the J2EE Pet Store couldn't handle the transactions at all.

"There are different app servers in each case," Sun's David Harrah told us. "Why that's even been published, I don't even know."

"It's no surprise to us or our engineers that Windows on Intel is faster: it's their home ground. The first Pet Store comparison, that was widely repudiated, showed a 10x advantage. This one shows a 2x and they've got home field advantage."

"That's not our value proposition - Java runs across a spectrum of devices from cellphones to mainframes."

But wasn't the 4- and 8-way the sweet spot of the market?

"I'd say the larger part of the market is developing client-side software for devices from cell phones to smartcards," was the reply.

The price/performance calculation was based on a J2EE app server system cost totaling $84,990 (to .NET's $36,990). They could just as easily have used Sun's app server, which is now bundled free with Solaris, he pointed out.

The testers explain that the web services part of the application couldn't use the 1.4 version of the Java runtime, which they note was 50 per cent faster than the 1.3 they used, thanks to better garbage collection. That alone, they say, explains why so many connections were refused in the web services test.

But the benchmark throws up the remarkable statistic that the Java version required 14,004 lines of code, while the .NET version featured just 2,096 (and not the other way round, as we originally stated). The benefit of hindsight, or is one of our class libraries missing? ®

Update: There's a forensic examination of the benchmarks here

Related Link

the benchmarks - [PDF, 2.2MB]

Related Stories

Pet vs Pet: MS opens .NET benchmarking wars
Sun shuns MS 'gutter' benchmark challenge
