What's most important? Bandwidth over kilo-miles, or milli-watts?
Big Blue boffins, AT&T brainboxes beg to differ
AT&T boffins reckon they can fling 400Gb/sec down 12,000km of fibre using a new modulation technique. Meanwhile, IBM's bods say they managed 25Gb/sec over just a few millimetres - but using just 24 milliwatts.
Both teams will present their research at next week's OFC-NFOEC conference in Anaheim, California, where the future of optical communications will be discussed by 12,000 or so delegates. The mega-corps' technologies lie at two extremes of optical chatter: AT&T is focussed on getting data around the world while IBM just wants to reach the other side of the computer.
Getting data around a circuit board may seem trivial, what with the copper tracks all over the place, but engineers are increasingly looking at radio and optical links between components that can't be connected easily across a packed motherboard. The team from IBM, funded by everyone's favourite boffinry bankrollers DARPA, is looking to fit the technology into "exascale" computers it expects to be developed by 2020.
To that end, the Big Blue team has created a laser that consumes only 24 milliwatts while transmitting data at 25Gb/sec, smashing previous records (we're told) thanks to a collection of technologies too obscure to have decent acronyms: silicon-on-insulator complementary metal-oxide-semiconductor (SOI CMOS) combined with advanced vertical cavity surface emitting lasers (VCSELs), according to Science Daily.
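Back-of-the-envelope, those two figures imply a pleasingly round energy-per-bit number. The power and bit-rate below come from the article; the picojoule-per-bit framing is our own arithmetic, not a figure IBM quotes:

```python
# Energy-per-bit for IBM's reported laser link. The 24mW and 25Gb/sec
# figures are from the article; dividing one by the other is our own
# back-of-the-envelope arithmetic.
power_w = 24e-3        # 24 milliwatts
bitrate_bps = 25e9     # 25 Gb/sec

energy_per_bit_j = power_w / bitrate_bps
print(f"{energy_per_bit_j * 1e12:.2f} pJ/bit")  # prints "0.96 pJ/bit"
```

Roughly one picojoule per bit, which is the sort of budget exascale designers lose sleep over.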
Telecoms giant AT&T, meanwhile, is more interested in getting data around the world, and has developed a new encoding technique that can apparently throw 400Gb/sec down 12,000km of fibre by wavelength-multiplexing eight separate signals, spaced 100GHz apart, to pack more data into the transmission.
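The channel count and spacing above imply the per-channel rate and total grid occupancy. A minimal sketch; the 400Gb/sec total, eight channels and 100GHz spacing are from the article, while the per-channel figure is simple division on our part:

```python
# AT&T's reported wavelength-multiplexed link: eight channels totalling
# 400Gb/sec on a 100GHz grid. The per-channel rate and occupied band
# are implied arithmetic, not quoted figures.
total_bps = 400e9      # aggregate throughput
channels = 8           # wavelength-multiplexed signals
spacing_hz = 100e9     # grid spacing between channels

per_channel_bps = total_bps / channels
occupied_band_hz = channels * spacing_hz
print(f"{per_channel_bps / 1e9:.0f} Gb/s per channel "
      f"across {occupied_band_hz / 1e12:.1f} THz of grid")
# prints "50 Gb/s per channel across 0.8 THz of grid"
```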
Disappointingly, the transmitter and receiver weren't actually 12,000km apart: the signal was instead sent repeatedly through the same 100km lengths of fibre until it degraded, suggesting it would have survived the full 12,000km trip had it needed to.
Radio communications today uses only about 100GHz of spectrum in total, right at the bottom of the dial, but we've spent decades squeezing more data into each megahertz of bandwidth. Visible light, way up in the terahertz range, can neither pass through walls nor bounce off the ionosphere, but there are plenty more frequencies available below it and we've a lot to learn about how to make the most of them. ®
I am massively impressed by fibre
I'm still astonished at how they can create glass that's so transparent, light can actually go that distance.
For distance you have to use fibre, because free-space radio power falls off with the square of distance (the inverse-square law), and this limits you to light frequencies, by no means all of which are currently being used.
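A quick sketch of the contrast over a single amplifier-span distance. The 0.2 dB/km fibre attenuation and the 1-metre free-space reference distance are assumed illustrative figures, not numbers from the article or the comment:

```python
import math

# Rough comparison of signal loss over a 100km span: guided fibre
# attenuation vs free-space inverse-square spreading.
# Assumed figures: 0.2 dB/km fibre loss (typical for modern single-mode
# glass), an isotropic radio source measured against a 1m reference.
span_km = 100.0

fibre_loss_db = 0.2 * span_km                        # linear dB/km in glass
spread_loss_db = 20 * math.log10(span_km * 1000 / 1.0)  # inverse-square, re 1m

print(f"fibre: {fibre_loss_db:.0f} dB, "
      f"free-space spreading: {spread_loss_db:.0f} dB")
# prints "fibre: 20 dB, free-space spreading: 100 dB"
```

Over span lengths like this the guided medium loses far less, which is why submarine cables need only periodic repeaters rather than line-of-sight towers across an ocean.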
Radio propagation is well understood and, while there are still plenty of bands available (the ITU doesn't carve up spectrum in the most efficient way), you have to trade bandwidth for propagation.
The developments are complementary and impressive in their own right. Squeezing more out of an existing underwater cable is cheaper than laying a new one. Reducing power consumption while boosting data transmission will be welcome along the chain: in the server, but also in the switches and the various NICs.
High frequencies are attenuated, but can be used up to ~300GHz for high-bandwidth directional or short-range applications.
The Americans are also using 95GHz for their favourite method of communications: http://en.wikipedia.org/wiki/Active_Denial_System