Feeds

What's most important? Bandwidth over kilo-miles, or milli-watts?

Big Blue boffins, AT&T brainboxes beg to differ


AT&T boffins reckon they can fling 400Gb/sec down 12,000km of fibre using a new modulation technique. Meanwhile, IBM's bods say they managed 25Gb/sec over just a few millimetres - but using just 24 milliwatts.

Both teams will present their research at next week's OFC-NFOEC conference in Anaheim, California, where the future of optical communications will be discussed by 12,000 or so delegates. The mega-corps' technologies lie at two extremes of optical chatter: AT&T is focussed on getting data around the world while IBM just wants to reach the other side of the computer.

Getting data around a circuit board may seem trivial, what with the copper tracks all over the place, but engineers are increasingly looking at radio and optical links between components that can't be connected easily across a packed motherboard. The team from IBM, funded by everyone's favourite boffinry bankrollers DARPA, is looking to fit the technology into the "exascale" computers it expects to be developed by 2020.

To that end, the Big Blue team has created a laser that consumes only 24 milliwatts while transmitting data at 25Gb/sec, smashing previous records (we're told) thanks to a collection of technologies too obscure to have decent acronyms: silicon-on-insulator complementary metal-oxide-semiconductor (SOI CMOS) combined with advanced vertical cavity surface emitting lasers (VCSELs), according to Science Daily.
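For a sense of scale, dividing the quoted power draw by the quoted bit rate gives an energy-per-bit figure of roughly a picojoule. The back-of-the-envelope sum below is our own illustration of that arithmetic, using only the numbers reported above, not anything lifted from IBM's paper:

# Rough energy-per-bit estimate from the figures quoted in the article.
power_w = 24e-3          # laser power draw: 24 milliwatts
bit_rate_bps = 25e9      # data rate: 25 Gb/sec

energy_per_bit_j = power_w / bit_rate_bps
print(f"{energy_per_bit_j * 1e12:.2f} pJ per bit")   # ~0.96 pJ/bit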

Telecoms giant AT&T, meanwhile, is more interested in getting data around the world, and has developed a new encoding technique that can apparently throw 400Gb/sec down 12,000km of fibre by running eight separate signals, 100GHz apart, multiplexed by wavelength to pack more data into the transmission.
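Taking the quoted figures at face value, eight wavelength channels carrying 400Gb/sec between them works out at 50Gb/sec apiece, squeezed into an 800GHz slice of spectrum. The sketch below is just that division, on the simplifying assumption that each channel occupies its full 100GHz slot; it is not AT&T's stated breakdown:

# Rough channel arithmetic for the AT&T demo, from the numbers quoted above.
aggregate_bps = 400e9        # 400 Gb/sec total
channels = 8                 # eight wavelength-multiplexed signals
spacing_hz = 100e9           # channels spaced 100 GHz apart

per_channel_bps = aggregate_bps / channels
occupied_band_hz = channels * spacing_hz   # assumes each channel fills its slot
spectral_efficiency = aggregate_bps / occupied_band_hz

print(f"{per_channel_bps / 1e9:.0f} Gb/sec per channel")      # 50 Gb/sec
print(f"{occupied_band_hz / 1e9:.0f} GHz of occupied band")   # 800 GHz
print(f"{spectral_efficiency:.2f} bit/s per Hz")              # 0.50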

Disappointingly, the transmitter and receiver weren't actually 12,000km apart: instead the signal was looped over the same 100km length of fibre again and again until it degraded, suggesting it could have survived the 12,000km trip had it needed to.

Radio communications today use only about 100GHz of spectrum in total, right at the bottom of the dial, but we've spent decades squeezing more data into each megahertz of bandwidth. Visible light, way up in the hundreds-of-terahertz range, can't go through walls or bounce off the ionosphere, but there are plenty more frequencies available below it and we've a lot to learn about how to make the most of them. ®
