Internet, schminternet: Boffins propose private 100Gbps HYPERNET

Pacific Research Platform will perform the internet's original purpose on own fat pipes

Hypernet: Boffins reckon their proposed private network could reach 100Gbps throughput

A "hypernet" to be known as the Pacific Research Platform, shuffling data at up to 100Gb/s, will be established between US West Coast laboratories and several supercomputers thanks to a $5m grant from the National Science Foundation (NSF).

The grant will span five years, reports the New York Times, and will connect the NSF's existing infrastructure, which represents more than $500m of investment in the internal networking of roughly 100 US university campuses.

The platform will allow boffins to share gargantuan data sets between institutions, in data-heavy fields such as physics, astronomy, and genetics.

The platform "will also serve as a model for future computer networks in the same way the original NSFnet, created in 1985 to link research institutions, eventually became part of the backbone for the net," Larry Smarr, an astrophysicist who is director of Calit2 and the principal investigator for the new project, told the NYT.

"I believe this infrastructure will be, for decades to come, the kind of architecture by which you use peta-scale and exa-scale computers," Smarr told the NYT, referencing the enormous amounts of data currently being captured by boffins.

Insulating the new network from the peasants' internet is key, as big data transmission is often laboriously slow over traditional network connections.
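To get a feel for why the fat pipes matter, here's a back-of-the-envelope sketch (our numbers, not the project's) of how long it takes to shift a petabyte-scale data set over a typical 1Gb/s campus link versus a dedicated 100Gb/s one, ignoring protocol overhead and congestion:

```python
def transfer_time_seconds(num_bytes: float, link_gbps: float) -> float:
    """Idealised transfer time: bytes * 8 bits, divided by link rate in bits/s."""
    bits = num_bytes * 8
    return bits / (link_gbps * 1e9)

ONE_PETABYTE = 1e15  # bytes (decimal petabyte)

slow = transfer_time_seconds(ONE_PETABYTE, 1)    # ordinary 1Gb/s link
fast = transfer_time_seconds(ONE_PETABYTE, 100)  # the platform's 100Gb/s

print(f"1 PB at   1 Gb/s: {slow / 86400:.1f} days")   # ~92.6 days
print(f"1 PB at 100 Gb/s: {fast / 3600:.1f} hours")   # ~22.2 hours
```

In other words, a data set that would tie up an ordinary link for three months moves in under a day on the proposed network, before even accounting for contention from everyone else's traffic on the public internet.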

Smarr adds that one internet-connected server at the University of California recently received 35,000 false login attempts in a single day. Keeping the Pacific Research Platform off the plebnet is a large part of its hardware security design.

The network will additionally remove the necessity for location-based studies of data sets, making "new kinds of distributed computing for scientific applications possible".

Referencing the data collection at the Large Hadron Collider, Frank Wuerthwein, a University of California at San Diego physicist, notes that duplicates of the data sets were held globally for different researchers to analyse.

He told the NYT that increasingly fast networking allows experimental data to be retained in a single location, while boffins run their analysis programs from remote locations, at significant cost savings. ®


Biting the hand that feeds IT © 1998–2017