Grid Computing for real

Things your PC can't do (yet)

For the first time, supercomputers in the UK and the US have been linked to carry out a larger-than-life scientific experiment.

During the Supercomputing 2003 conference last week in Phoenix, Arizona, three of the most powerful computing resources in the world worked in parallel with each other, using optical bandwidth from BT, to carry out complex lattice Boltzmann calculations. An experiment on this scale has never been done before, let alone in such a short time span (72 hours).

Grids are to supercomputers what the World Wide Web is to documents. They connect thousands of computers to divide and farm out complex calculations. Linking multiple high-performance computers through the internet allows scientists to extend their knowledge of small (microscopic) systems to much larger real-world situations.
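The divide-and-farm-out idea can be sketched in miniature. The following is an illustrative sketch only: a real grid schedules jobs across machines over a network, whereas this toy splits one calculation across local worker processes (the chunk sizes and the squared-sum workload are arbitrary assumptions).

```python
# Toy illustration of divide-and-farm-out: split a big calculation into
# chunks, hand each chunk to a worker, then combine the partial results.
from multiprocessing import Pool

def partial_sum(chunk):
    """One worker's share of the overall calculation (a hypothetical workload)."""
    lo, hi = chunk
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    # Divide the range [0, 1_000_000) into four jobs.
    chunks = [(i * 250_000, (i + 1) * 250_000) for i in range(4)]
    with Pool(processes=4) as pool:
        parts = pool.map(partial_sum, chunks)  # farm the jobs out to workers
    total = sum(parts)  # combine the partial results
    print(total)
```

The same pattern scales up: the sum over the whole range equals the sum of the per-chunk sums, so the work can be distributed freely.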

True: the term ‘grid’ confuses a lot of people these days. It is also used for other types of computing, including utility or on-demand computing, and for populist experiments such as the SETI@home screensaver, which makes use of the spare capacity and processing power of home PCs to search for alien life.

Although networks of dormant home or office PCs can be used to solve scientific problems, scientists often argue that for serious calculations only supercomputers can do the job properly.

The lattice Boltzmann method in particular is extremely resource-intensive. Running these simulations (for modelling self-assembly and fluid flow) on smaller computers is impractical because of limited memory and long processing times.
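A minimal sketch hints at why. In a two-dimensional (D2Q9) lattice Boltzmann model, every lattice site stores nine distribution values, and every time step streams and collides all of them; the memory and compute cost grows with the lattice volume. The grid size, relaxation time and initial density bump below are arbitrary assumptions, not anything from the TeraGyroid runs.

```python
import numpy as np

# Minimal D2Q9 lattice Boltzmann sketch: streaming plus BGK collision on a
# periodic lattice. Nine populations per site is what makes the method
# memory-hungry at scale.
W = np.array([4/9] + [1/9] * 4 + [1/36] * 4)         # lattice weights
C = np.array([(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
              (1, 1), (-1, 1), (-1, -1), (1, -1)])   # lattice velocities

def equilibrium(rho, ux, uy):
    """Local equilibrium, truncated to second order in the velocity."""
    cu = 3.0 * (C[:, 0, None, None] * ux + C[:, 1, None, None] * uy)
    usq = 1.5 * (ux**2 + uy**2)
    return W[:, None, None] * rho * (1.0 + cu + 0.5 * cu**2 - usq)

def step(f, tau=0.6):
    """One time step: stream each population, then relax toward equilibrium."""
    for i, (cx, cy) in enumerate(C):
        # Periodic streaming: each population moves one site along its velocity.
        f[i] = np.roll(np.roll(f[i], cx, axis=1), cy, axis=0)
    rho = f.sum(axis=0)                                # macroscopic density
    ux = (f * C[:, 0, None, None]).sum(axis=0) / rho   # macroscopic velocity
    uy = (f * C[:, 1, None, None]).sum(axis=0) / rho
    f += (equilibrium(rho, ux, uy) - f) / tau          # BGK collision
    return f

# A tiny 32x32 lattice at rest with a small density bump in the middle.
ny, nx = 32, 32
rho0 = np.ones((ny, nx))
rho0[16, 16] = 1.1
f = equilibrium(rho0, np.zeros((ny, nx)), np.zeros((ny, nx)))
for _ in range(50):
    f = step(f)
print(f.sum())  # total mass is conserved by both streaming and collision
```

Even this toy keeps 9 floats per site in memory at all times; a production-scale 3D run multiplies that by billions of sites, which is why such simulations need supercomputer-class memory.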

The TeraGyroid experiment was jointly funded by the UK’s Engineering and Physical Sciences Research Council (EPSRC) and the National Science Foundation in the USA (NSF) as part of the e-Science pilot project, RealityGrid. ®

Related stories

Sun baits Dell as OracleWorld focuses on grid
HP sends services team into the grid
