AMD gases up Bulldozers for Intel push back

Can Intel match 16 cores at 3.5GHz?

The big five ride

None of the top-five server makers hurt themselves getting their Opteron 4100 and 6100 products out the door last year, but Hewlett-Packard and Dell fielded a reasonably wide range, as did server upstart Acer, all of which will be field-upgradable to the Bulldozer chips this year. IBM put out a single four-socket box that crammed all the electronics and lots of memory into a 2U space, but fielded no other machines, be they rack, tower, blade, or tray servers. Oracle stopped making Opteron boxes in the wake of acquiring server maker Sun Microsystems in January 2010.

Rozanovich tells El Reg that the shift from dedicated server hosting to cloudy public infrastructure will work in AMD's favor, as will the increasing use of server virtualization inside the world's data centers.

"Three years ago, when companies outsourced their workloads, they wanted a real physical server," explains Rozanovich. "Now, with virtualized servers, people don't really care if they have a dedicated server. What they care about is a rock-solid service level agreement and the ability to expand and contract their workloads and control their costs." And, the hosters want to set up a more standardized infrastructure stack that lets them achieve efficiencies they could not with dedicated hosting. (All of those unused clock cycles, disk spins, and memory chips devoid of data are wasted money, sitting on the books, making the CFO grumble.)

If the success that AMD has had with the Opteron 6100s in certain hosting and HPC accounts is any indication, then AMD thinks it has a pretty good shot at a revival in its server biz thanks to what Rozanovich calls "straight through computing."

Take server virtualization, for example. At cloud providers, virtualized systems now run at 80 to 90 per cent CPU utilization, according to Rozanovich, much higher than the 5 to 20 per cent utilization a typical x64 server saw running a single workload. "When your CPU is running at that high utilization rate, HyperThreading doesn't work," says Rozanovich. "The system doesn't have the time or the capacity to hyperthread. So now, people running normal virtualized server workloads are turning off HyperThreading on their Xeon-based servers, just like supercomputer shops have been doing for years."
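For the curious, whether SMT (what Intel brands HyperThreading) is active can be read straight out of sysfs on Linux: each core lists its sibling hardware threads. The sketch below is illustrative, not anything AMD or Intel ships; it assumes the standard sysfs topology files found on modern kernels.

```python
# Sketch: detect whether SMT (Hyper-Threading) is active on Linux by
# checking whether cpu0 shares its pipeline with a sibling thread.
# Assumes the standard sysfs topology layout; illustrative only.
from pathlib import Path

def parse_siblings(siblings: str) -> list[int]:
    """Parse a sysfs CPU list like '0,32' or '0-3' into CPU numbers."""
    cpus = []
    for part in siblings.strip().split(","):
        if "-" in part:
            lo, hi = part.split("-")
            cpus.extend(range(int(lo), int(hi) + 1))
        else:
            cpus.append(int(part))
    return cpus

def smt_active(cpu: int = 0) -> bool:
    """True if the given core has more than one hardware thread."""
    path = Path(f"/sys/devices/system/cpu/cpu{cpu}/topology/thread_siblings_list")
    if not path.exists():  # non-Linux, or a kernel too old to expose topology
        return False
    return len(parse_siblings(path.read_text())) > 1
```

A core reporting siblings like `0,32` is hyperthreaded; a core reporting only itself is not.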

For virtualized servers, having more cores in an Opteron box compared to an equivalent Xeon box gives AMD a slight advantage because companies tend to pin a virtual machine to a core, not a thread. So if AMD's server partners can put 48 Opteron 6100 cores in a 2U box compared to 32 cores for a Xeon 7500 machine - both with big wonking memories - AMD wins. (Of course, last year AMD lost the memory capacity war because its Opteron 6100 memory controllers topped out at 512GB compared to 1TB or sometimes 2TB for Xeon 7500 boxes. The Interlagos Opterons have a reworked DDR3 memory controller that will sport terabytes of main memory.)
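The pinning Rozanovich describes boils down to CPU affinity. Hypervisors expose it through their own tooling, but the underlying mechanism is the same one any Linux process can use; a minimal sketch using Python's `os.sched_setaffinity` (Linux-only, and merely an illustration of the idea, not how any particular hypervisor implements it):

```python
# Sketch: pin the current process to a single core -- the same idea a
# hypervisor applies when binding a VM's virtual CPU to a physical core.
# Linux-only; os.sched_setaffinity wraps the sched_setaffinity syscall.
import os

def pin_to_core(core: int) -> set[int]:
    """Restrict this process to one core and return the new affinity set."""
    os.sched_setaffinity(0, {core})  # pid 0 means the calling process
    return os.sched_getaffinity(0)

if __name__ == "__main__":
    # Pick a core we are actually allowed to run on.
    core = min(os.sched_getaffinity(0))
    print(pin_to_core(core))
```

With one VM per core and no thread sharing, the core count of the box maps directly onto the number of VMs it can host, which is why the 48-versus-32 comparison matters.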

The reworking of software to take advantage of more cores and threads is also helping AMD as much as it is helping Intel and suppliers of other processors. To illustrate how the strong core philosophy of AMD's chip design is panning out, Rozanovich uses Monte Carlo simulation, which is used by financial institutions to value their stock and bond portfolios and help them make trades as the markets move.

"The old rule for Monte Carlo was that the fastest frequency and the lowest latency always wins," says Rozanovich. "And so in 2005 and 2006, AMD won the majority of the Monte Carlo deals." Particularly when power was factored in the equation. While data centers can often cope with a rack of super-dense servers that burn 25,000 to 30,000 watts of juice, the data centers near Wall Street, the City of London, and other financial centers can typically only supply 9,000 watts per rack. So every watt and clock really counts.

But with the "Nehalem-EP" Xeon 5500 chip launch back in March 2009, AMD lost the clock and latency edge, thanks to the low-power Nehalem core and the QuickPath Interconnect for linking cores together out to main memory and peripherals. But thanks to the 18 to 24 month upgrade cycle that financial institutions have for their simulation platforms, AMD started to get traction with the dozen-core Magny-Cours chips from the Monte Carlo crowd - and this business is still building.

"Magny-Cours is getting an adoption rate that we haven't see in a while," says Rozanovich.

Part of the reason, he says, is that programmers at financial institutions are learning how to program in parallel. "Some financial institutions running Monte Carlo simulations are now getting much better performance using the slower 12-core Magny-Cours than the faster eight-core versions," Rozanovich tells El Reg. "As developers get experience with parallelization, they are going to start programming to the cores."
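"Programming to the cores" for a Monte Carlo workload means splitting the paths across independent workers, one per core, rather than leaning on single-thread clock speed. The sketch below is a toy illustration of that pattern - a European call option priced by simulation and fanned out with Python's `multiprocessing` - with made-up parameters, not anything from the banks Rozanovich describes:

```python
# Sketch: Monte Carlo valuation of a European call option, split across
# cores with multiprocessing -- the "programming to the cores" pattern.
# All parameters (S=100, K=100, r=0.05, sigma=0.2, T=1) are illustrative.
import math
import random
from multiprocessing import Pool, cpu_count

def price_chunk(seed, n_paths, s0=100.0, k=100.0, r=0.05, sigma=0.2, t=1.0):
    """Average discounted call payoff over n_paths simulated terminal prices."""
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma * sigma) * t   # geometric Brownian motion drift
    vol = sigma * math.sqrt(t)
    total = 0.0
    for _ in range(n_paths):
        s_t = s0 * math.exp(drift + vol * rng.gauss(0.0, 1.0))
        total += max(s_t - k, 0.0)
    return math.exp(-r * t) * total / n_paths

if __name__ == "__main__":
    chunks = cpu_count()
    with Pool(chunks) as pool:
        # One independent seed per worker so the random streams differ.
        results = pool.starmap(price_chunk, [(s, 20_000) for s in range(chunks)])
    print(sum(results) / len(results))  # converges toward ~10.45 (Black-Scholes)
```

The point of the anecdote is visible here: throughput scales with worker count, so a dozen slower cores can beat eight faster ones once the paths are spread evenly.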

At financial institutions working on skinny thermal budgets, power consumption is driving what chips Intel and AMD design, according to Rozanovich. Some big banks, brokerages, and hedge funds are coming to the chip makers with power budgets, and together they are working out how many cores at what clock speed a future chip can deliver. It isn't quite custom silicon, but both chip makers have to offer overclocking for high-speed trading systems and thermally conscious workhorses for Monte Carlo and other simulations. ®
