The Steve Jobs of supercomputers: We remember Seymour Cray

Fast, cool, simple. Repeat

Those who counted, counted on Cray

The CDC 6600 might have been Cray’s greatest achievement, but it was the Cray-1 which cemented the eminence of his business for the next three decades.

The first Cray-1 system went to a skeptical Los Alamos National Laboratory on loan in 1976 – and stayed. The first international sale came a year later, going to the European Centre for Medium-Range Weather Forecasts. The USSR was off limits.

The market was initially small but rich: the military for weapons design, encryption at the NSA and weather forecasting. "No single person has dominated the industry like that," Spicer said. "Through his extensive talent, the Cold War military situation and Cray’s reputation as a genius – he was as close to Steve Jobs among that community as you could get."

Sembrat attributed Cray's popularity to that focus on delivering machines capable of doing billions of calculations – unlike, say, IBM. "That was the reason why everyone in HPC preferred and wanted a Cray," Sembrat told us.

By the 1980s, more supers were selling to scientists running chemical analyses and protein-folding studies, to car giants like Ford for crash testing, to oil-exploration companies, and to engineers doing computational fluid dynamics – but it was still Cray Research and Seymour Cray who dominated the field. Nobody else got a look-in, and his firm sold successors to the Cray-1 that in today's prices landed at $43m.

Without him on board, CDC foundered and the Cray-1 thrashed CDC’s STAR-100. That was the 1970s, but as in so many other fields, things changed in the late 1980s and 1990s.

Cray’s next work was the Cray-3 for his latest firm: Cray Computer Corp. But this time his genius was caught out, tripped by the changing economics of computing. He picked gallium arsenide for his semiconductors, a material that gave up to eight times the performance of silicon in half the size while using less power. But it helped make the Cray-3 more expensive to manufacture, and development of the machine also ran late.

No Cray-3s were ever sold, but one was loaned to the US National Center for Atmospheric Research (NCAR) to run climate and weather models. It encountered problems, and Tom Engel, who worked at Cray Research and went on to work with others on the Cray-3 and Cray-4 at Cray Computer, was sent to stabilise it. Engel told The Reg that NCAR had access to a second Cray-3 that sat in Cray Computer's checkout bay at the firm's HQ in Colorado Springs, and that NCAR used this machine "extensively."

Silicon, meanwhile, had improved as a technology while its price was falling, a fact that rallied the broader industry. A chipmaker called Intel invested billions in plants and R&D on silicon and the early 1990s saw the rise of the personal computer using Intel hardware. Server makers such as Fujitsu, NEC and Hitachi picked up on this and began adopting silicon, moving up the food chain into supercomputers.

Cray is criticised by history for picking gallium arsenide, but Sembrat reckons it was a consistent strategy, looking towards the Cray-5 and Cray-6 and pushing towards systems that were ever smaller and faster, thanks to the use of the material.

Seymour Cray (photo: Cray)

Gallium arsenide chips, Fluorinert cooling, fibre-optic interconnects – the world is still catching up to Cray's vision

Early work had begun in the 1960s on lasers using gallium arsenide and it is possible one reason Cray selected this material was that he envisaged the possibility of ditching copper for a future of fast transmissions based on fibre optics. He might have conceived a future in which lasers were embedded in the gallium arsenide chips.

Whatever the possible future developments may have been, reality intruded. The rise of cheap silicon, with its backing by Intel, coincided with the end of the Cold War. Cray's big, cash-rich audiences in the military and espionage communities, who'd loved his expensive supercomputers, saw their defence-related budgets drastically cut.

Then came IBM's Roadrunner.

Roadrunner was the first supercomputer to break the petaflop barrier. It used 6,563 dual-core AMD chips, each linked to a graphics processor, cost $120m, and took the top slot in the Top 500 supercomputer rankings three times, the first in June 2008.

Roadrunner was used by Los Alamos to run nuclear weapons simulations.

The Roadrunner supercomputer at Los Alamos

IBM's Roadrunner victory was sweet at Los Alamos

Today, the headlines are about the US and China facing off as supercomputing powers. Meanwhile, there’s competition from general-purpose systems: massive clusters running the web-scale operations of the likes of Twitter, Google and Facebook.

But supers are still in business. The UK’s Met Office is rolling out a £97m, 16-petaflop Cray for advanced weather modelling and prediction, replacing an IBM machine in the process. The attraction is the ability to run massively parallel, highly critical workloads reliably.

The concepts established by Cray remain guiding principles: speed, density and simplicity, with – of course – the by-product of heat management. Freon and Fluorinert have gone, in favour of cheaper and more environmentally friendly water-based cooling. Energy efficiency, to reduce the trademark gargantuan power consumption, is also becoming increasingly important.

The answer is silicon, coupled with faster interconnects and GPUs from the likes of AMD and Nvidia – chips previously more at home in gaming and media environments. Intel is firing back with its Knights Landing Xeon Phi – now also banned from going to China – which combines a multi-core architecture with more on-board memory to avoid offloading data.

You can thank Seymour Cray for that. ®
