Deutsche Telekom shatters data-transfer speed record

734 kilometers, one second, 77 CDs

Researchers at Deutsche Telekom's T-Labs have blasted bits at impressive velocity down a single optical fiber, breaking the previous long-distance data-transfer record by more than a factor of two.

The bit boffins achieved a 512 gigabits-per-second transmission rate over a single optical fiber from Berlin to Hanover and back, a distance of 734 kilometers. After subtracting the error-correction overhead, the usable bandwidth was 400Gb/s – enough, T-Labs points out, to transmit a stream of data equivalent to 77 music CDs in one second.
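As a quick sanity check on that claim – assuming a standard audio CD holds roughly 650MB, a figure T-Labs doesn't actually quote – a spot of back-of-the-envelope Python lands almost exactly on the 400Gb/s usable rate:

    # Sanity-check the "77 CDs per second" claim, assuming ~650MB per
    # audio CD (our assumption; the announcement gives no CD size).
    cd_megabytes = 650          # assumed capacity of one audio CD, in MB
    cds_per_second = 77

    payload_gbps = cds_per_second * cd_megabytes * 8 / 1000
    print(f"{payload_gbps:.1f} Gb/s")   # -> 400.4 Gb/s, matching the usable 400Gb/s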

Just last December, a team of Canadian and US researchers managed to sustain a computer-to-computer data transfer totalling 186Gb/s between the University of Victoria Computer Centre and the SuperComputing 2011 convention in Seattle – 88Gb/s in one direction and 98Gb/s in the other.

T-Labs popped all their bits down a single 100GHz-wide optical channel at just over five bits per cycle.

"This tremendous transmission performance was reached using innovative transmission technology with two carrier frequencies, two polarization planes, 16-QAM quadrature amplitude modulation and digital offline signal processing for the equalization of fiber influences with soft-FEC forward error correction decoding in the receiver," T-Labs explains.

This new technology, they say, would enable a standard 48-channel, 100GHz optical-fiber transmission/reception setup to achieve a total throughput of 24.6 terabits per second. A quick bit of multiplication – shown below – has such a system squirting 3,696 CDs' worth of data every second.
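For the curious, that quick bit of multiplication, using only the figures above:

    # The multiplication behind the 24.6Tb/s and 3,696-CD figures.
    channels = 48
    gross_per_channel_gbps = 512   # gross rate per 100GHz channel
    cds_per_channel = 77           # usable payload per channel, in CDs per second

    total_tbps = channels * gross_per_channel_gbps / 1000
    total_cds = channels * cds_per_channel
    print(f"{total_tbps:.1f} Tb/s, {total_cds} CDs per second")  # 24.6 Tb/s, 3696 CDs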

There was no need to replace the fiber itself. As T-Labs notes, the channel was in place, and the modifications were made to the transmitters and receivers. As such, the improvements in data-transmission rates could be achieved without the expense of laying new fiber.

"We are very proud of having attained this tremendous transmission performance over the Internet under real-world conditions," said T-Labs manager Heinrich Arnold.

In The Reg's humble opinion, if these rates are stable, repeatable, and relatively easily and inexpensively deployable, Herr Arnold and his team have much to be proud of. ®
