
Nvidia gets biological with life sciences nerds

Better shampoos through GPUs


Nvidia has a substantial lead over rivals Intel, Advanced Micro Devices, and IBM when it comes to peddling graphics co-processors, and it wants to keep that lead and extend it, if possible. That means doing boring old stuff that server and operating system makers have to do, such as lining up application software vendors so they can take full advantage of the Tesla family of GPU co-processors.

To that end, Nvidia has corralled a dozen popular life sciences applications vendors and made sure their code has been ported to the CUDA programming environment and can leverage the substantial number-crunching power of Tesla co-processors.
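The article doesn't go into what any particular vendor's CUDA port looks like, but the number-crunching core of these packages - the all-pairs force loop at the heart of a molecular dynamics timestep - maps naturally onto a GPU, with one thread per atom accumulating the forces exerted on it by every other atom. Here is a minimal, hypothetical CUDA sketch of that pattern (a simplified inverse-square interaction, no cutoffs or neighbour lists, all names invented for illustration):

// Toy CUDA kernel: one thread per atom accumulates a softened inverse-square
// pairwise force from every other atom. Production MD codes add neighbour
// lists, cutoff radii and shared-memory tiling; this only shows the
// parallel pattern that makes GPUs attractive for the workload.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

__global__ void pairwise_forces(const float3 *pos, float3 *force, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    float3 pi = pos[i];
    float3 f = make_float3(0.0f, 0.0f, 0.0f);

    for (int j = 0; j < n; ++j) {
        if (j == i) continue;
        float dx = pos[j].x - pi.x;
        float dy = pos[j].y - pi.y;
        float dz = pos[j].z - pi.z;
        float r2 = dx * dx + dy * dy + dz * dz + 1e-6f;  // softened to avoid divide-by-zero
        float inv_r = rsqrtf(r2);
        float s = inv_r * inv_r * inv_r;                 // magnitude ~ 1/r^2 along (dx, dy, dz)
        f.x += dx * s;
        f.y += dy * s;
        f.z += dz * s;
    }
    force[i] = f;
}

int main()
{
    const int n = 3000;                                  // an aprotinin-sized system, per the article
    std::vector<float3> h_pos(n), h_force(n);
    for (int i = 0; i < n; ++i)                          // arbitrary demo coordinates
        h_pos[i] = make_float3((float)i, (float)(i % 7), (float)(i % 13));

    float3 *d_pos, *d_force;
    cudaMalloc((void **)&d_pos, n * sizeof(float3));
    cudaMalloc((void **)&d_force, n * sizeof(float3));
    cudaMemcpy(d_pos, h_pos.data(), n * sizeof(float3), cudaMemcpyHostToDevice);

    const int threads = 256;
    pairwise_forces<<<(n + threads - 1) / threads, threads>>>(d_pos, d_force, n);
    cudaMemcpy(h_force.data(), d_force, n * sizeof(float3), cudaMemcpyDeviceToHost);

    printf("force on atom 0: (%g, %g, %g)\n", h_force[0].x, h_force[0].y, h_force[0].z);
    cudaFree(d_pos);
    cudaFree(d_force);
    return 0;
}

Real ports typically spend most of their effort cutting that O(n^2) inner loop down with neighbour lists and cutoffs, and on keeping the GPU's memory hierarchy fed - presumably the bulk of the work behind each vendor's CUDA-enabled release.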

According to Sumit Gupta, senior product manager of the Tesla line at Nvidia, more than 500,000 scientists worldwide use the computational methods employed in these applications. The applications simulate molecular compounds and rudimentary organisms (or subsets of them), and the researchers running them mostly rely on either beefed-up x64 workstations and lots of time, or lots of iron on a supercomputer but only a small slice of time, to do their simulations.

With the Tesla Bio Workbench launched today, Nvidia and its software and hardware partners want life sciences researchers to get greedy and crave more powerful workstations - machines that pair x64 processors with GPU co-processors, so researchers won't have to share iron to run their simulations locally. Nvidia also wants GPUs added to supercomputer clusters, either to simulate more complex molecules and organisms or to run longer simulations than are possible on the workstations.

There is a direct relationship between the flops in a box and the complexity or duration of a simulation that box can run, and life sciences applications are no exception. Back in 1982, a top-of-the-line supercomputer with one gigaflops of performance could simulate the 3,000 atoms in the protein aprotinin, also known as bovine pancreatic trypsin inhibitor.

By 1997, a supercomputer with hundreds of gigaflops could simulate the 36,000 atoms in an estrogen receptor, and by 2003, a teraflops-class super could model the 327,000 atoms in the F1 portion of ATP synthase, which is cool because it is a molecular rotor powered by proton gradients inside the cell.

Huge progress has been made in recent years - the 2.7 million atoms in a ribosome being simulated in 2006, for example. The downer is that it took eight months on a massively parallel supercomputer with many hundreds of teraflops to simulate a mere 2 nanoseconds of the ribosome's behaviour.

A petaflops supercomputer will be able to simulate the 50 million atoms in a chromatophore - pigment cells found in fish, lizards, amphibians, and other animals, often used for camouflage. An exaflops super, by contrast - that's 1,000 petaflops, a performance level we might hit in two, three, or four years depending on who you ask - should be able to simulate a whole bacterium, with billions of atoms.

But for these simulations to be useful, the simulated timespan needs to increase while the time it takes to run the simulation decreases. Gupta says that researchers need to be able to simulate somewhere between 1 and 100 microseconds - sometimes milliseconds - of activity to do useful modelling of molecular interactions that might, for instance, show how a drug interacts with a cell.
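For perspective, classical molecular dynamics typically advances in timesteps of a femtosecond or two (a standard figure for such codes, not one quoted in the article), so even the short end of Gupta's range works out to hundreds of millions of timesteps, each requiring a full force calculation over every atom. A back-of-envelope sketch:

// Rough arithmetic only: how many integration steps does a 1-microsecond
// simulation take, assuming a typical 2-femtosecond MD timestep? (The
// timestep is an assumption; the article gives no figure.)
#include <stdio.h>

int main(void)
{
    const double timestep_fs = 2.0;    // assumed classical MD timestep
    const double target_us   = 1.0;    // low end of Gupta's 1-100 microsecond range
    const double fs_per_us   = 1.0e9;  // 1 microsecond = 1e9 femtoseconds
    const double steps = target_us * fs_per_us / timestep_fs;
    printf("~%.1e timesteps for %.0f microsecond(s)\n", steps, target_us);
    return 0;
}

That is roughly 500 million force evaluations for a single microsecond of simulated time, which is why shaving the cost of each evaluation with GPU co-processors matters so much.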

Such speedups are going to require GPU co-processors, says Nvidia, and lots of them. And there are plenty of HPC researchers who are not sure even this will be enough, given the daunting power and cooling issues supercomputer designers face as they try to push up to the exaflops performance level.

The Tesla Bio Workbench is not just about complex molecular and bacterial simulations at the largest supercomputer centres, but about doing practical things - discovering new drugs or designing a better shampoo or detergent - more quickly than can currently be done.
