Deutsche Telekom shatters data-transfer speed record

734 kilometers, one second, 77 CDs

Researchers at Deutsche Telekom's T-Labs have blasted bits at impressive velocity down a single optical fiber, breaking the previous long-distance data-transfer record by more than a factor of two.

The bit boffins achieved a transmission rate of 512 gigabits per second over a single optical fiber from Berlin to Hanover and back, a distance of 734 kilometers. Subtracting the error-correction overhead, the usable bandwidth was 400Gb/s – enough, T-Labs points out, to transmit a stream of data equivalent to 77 music CDs in one second.
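That CD figure checks out. Here's a quick sanity check; the 650MB audio-CD capacity is our assumption, not a number T-Labs cites:

```python
# Sanity-check T-Labs' "77 CDs per second" claim.
# The 650 MB audio-CD capacity is our assumption, not a T-Labs figure.
usable_rate_gbps = 400                          # usable throughput after FEC
bytes_per_second = usable_rate_gbps * 1e9 / 8   # 50 GB/s
cd_bytes = 650e6                                # ~650 MB per audio CD (assumed)
print(f"{bytes_per_second / cd_bytes:.0f} CDs per second")  # ~77
```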

Just last December, a team of Canadian and US researchers managed to sustain a computer-to-computer data transfer totaling 186Gb/s between the University of Victoria Computer Centre and the SuperComputing 2011 convention in Seattle – 88Gb/s in one direction and 98Gb/s in the other.

T-Labs popped all its bits down a single 100GHz channel – a spectral efficiency of just over five bits per second per hertz.

"This tremendous transmission performance was reached using innovative transmission technology with two carrier frequencies, two polarization planes, 16-QAM quadrature amplitude modulation and digital offline signal processing for the equalization of fiber influences with soft-FEC forward error correction decoding in the receiver," T-Labs explains.

This new technology, they say, would enable a standard 48-channel, 100GHz optical-fiber transmission/reception setup to achieve a total throughput of 24.6 terabits per second. A quick bit of multiplication shows that such a system could squirt 3,696 CDs' worth of data in one second.
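That multiplication, built only from the per-channel figures above:

```python
# The 48-channel aggregate, using T-Labs' per-channel figures.
channels = 48
raw_per_channel_gbps = 512
cds_per_channel = 77    # T-Labs' per-channel, per-second CD count

print(f"{channels * raw_per_channel_gbps / 1000:.1f} Tb/s")  # 24.6
print(f"{channels * cds_per_channel} CDs per second")        # 3696
```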

There was no need to replace the fiber itself: as T-Labs notes, the channel was already in place, and only the transmitters and receivers were modified. The improved data-transmission rates could therefore be achieved without the expense of laying new fiber.

"We are very proud of having attained this tremendous transmission performance over the Internet under real-world conditions," said T-Labs manager Heinrich Arnold.

In The Reg's humble opinion, if these rates are stable, repeatable, and relatively easily and inexpensively deployable, Herr Arnold and his team have much to be proud of. ®
