Deutsche Telekom shatters data-transfer speed record

734 kilometers, one second, 77 CDs

Researchers at Deutsche Telekom's T-Labs have blasted bits at impressive velocity down a single optical fiber, breaking the previous long-distance data-transfer record by more than a factor of two.

The bit boffins achieved a 512 gigabits-per-second transmission rate over a single optical fiber from Berlin to Hanover and back, a distance of 734 kilometers. After subtracting the error-correction overhead, the usable bandwidth was 400Gb/s – enough, T-Labs points out, to transmit a stream of data equivalent to 77 music CDs in one second.
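For the curious, that CD figure checks out with a bit of back-of-the-envelope Python – assuming the roughly 650MB capacity of a standard audio CD, a detail T-Labs doesn't spell out:

```python
# Back-of-the-envelope check of the "77 CDs per second" claim.
# Assumption: a standard audio CD holds roughly 650MB - T-Labs doesn't say.

usable_rate_gbps = 400                            # usable rate after FEC overhead
bytes_per_second = usable_rate_gbps * 1e9 / 8     # bits to bytes: 50 GB/s
cd_capacity_bytes = 650e6                         # ~650MB per CD (assumed)

print(round(bytes_per_second / cd_capacity_bytes))   # -> 77
```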

Just last December, a team of Canadian and US researchers managed to sustain a computer-to-computer data transfer of a combined 186Gb/s between the University of Victoria Computer Centre and the SuperComputing 2011 convention in Seattle – 88Gb/s in one direction and 98Gb/s in the other.

T-Labs popped all their bits down a single 100GHz optical channel, at just over five bits per cycle of bandwidth.

"This tremendous transmission performance was reached using innovative transmission technology with two carrier frequencies, two polarization planes, 16-QAM quadrature amplitude modulation and digital offline signal processing for the equalization of fiber influences with soft-FEC forward error correction decoding in the receiver," T-Labs explains.

This new technology, they say, would enable a standard 48-channel, 100GHz-grid optical-fiber transmission/reception setup to achieve a total throughput of 24.6 terabits per second. A quick bit of multiplication shows that such a system could squirt 3,696 CDs' worth of data in one second.
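For those keeping score, here's that quick bit of multiplication:

```python
# Scaling the single-channel result to a standard 48-channel system.
channels = 48
per_channel_gbps = 512       # raw rate per 100GHz channel
per_channel_cds = 77         # usable payload per channel, in CDs per second

print(channels * per_channel_gbps / 1000)   # -> 24.576, i.e. ~24.6 Tb/s
print(channels * per_channel_cds)           # -> 3696 CDs per second
```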

There was no need to replace the fiber itself: as T-Labs notes, the link was already in place, and the modifications were confined to the transmitters and receivers – meaning the improved data-transmission rates come without the expense of laying new fiber.

"We are very proud of having attained this tremendous transmission performance over the Internet under real-world conditions," said T-Labs manager Heinrich Arnold.

In The Reg's humble opinion, if these rates are stable, repeatable, and relatively easily and inexpensively deployable, Herr Arnold and his team have much to be proud of. ®
