Do we really want 100Gig Ethernet?

When do we want it?

Remember when Ethernet networks were invented? Probably not: it was over 30 years ago, after all, and chances are you are too young.

Even if you are not, you have probably dismissed from memory the woefully inadequate 10 meg of bandwidth on offer at the time – less than you get with most broadband services these days.

Still, the world struggled on with 10Mbps for well over a decade before first Fast (100Mbps), then Gigabit Ethernet became available.

Now we are firmly into 10Gig territory, with even faster 40Gig products starting to appear and 100Gig likely to make it into the mainstream next year.

Do we really need technology about 100 times quicker than Gigabit Ethernet or, to put it into perspective, 10,000 times faster than the original Ethernet standard?

There is no easy answer, unless you are talking about the desktop, where 100Gig is way beyond what anyone needs right now, or maybe ever – although perhaps we should never say never.

Speed freaks

Similar pontifications were heard when Gigabit Ethernet was introduced and now, just a few years later, it is standard on most desktop and notebook PCs. There is even talk of Gigabit Wi-Fi fairly soon.

Like the past, however, the data centre is a different country. If there's any demand at all for 100Gig, it is here that you are most likely to find it, for two main reasons.

First is the need to support increasingly bandwidth-greedy applications, such as video streaming and private cloud computing.

The second, and arguably more pressing, reason is the growth in server virtualisation and the hardware consolidation that goes with it.

It’s not rocket science. The more virtual machines a physical server has to host, the smaller the share of the available network bandwidth each is likely to get, even when it is sliced up and dynamically allocated using sophisticated virtual networking software.
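
To make that arithmetic concrete, here is a minimal sketch in Python. The NIC speed and VM counts are illustrative assumptions, not vendor figures, and a real virtual switch would weight the shares rather than split them evenly – but the aggregate ceiling is the same either way.

```python
# Minimal sketch: how a fixed NIC divides up as VM density grows.
# The 10Gbps figure and VM counts are illustrative, not vendor data.

NIC_BANDWIDTH_GBPS = 10.0  # a single on-board 10Gig interface

for vm_count in (10, 50, 100, 200):
    # Even split for simplicity; dynamic allocation changes who gets
    # what, but not the total on offer.
    share_mbps = NIC_BANDWIDTH_GBPS * 1000 / vm_count
    print(f"{vm_count:>3} VMs -> ~{share_mbps:,.0f} Mbps each (even split)")
```

At 200 virtual machines, each is down to roughly 50Mbps – half the speed of the Fast Ethernet desktops were running 15 years earlier.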

First in line

Bandwidth shortages really put the brakes on virtual machine scalability, a problem some server vendors address by building 10Gig interfaces onto the motherboards in high-end boxes.

However, with the arrival of ever newer, faster processors with rocketing core counts, those 10Gig interfaces are already starting to look inadequate. Hence the move to develop 40Gig and 100Gig replacements.

Not everyone wants to deploy highly scalable servers hosting hundreds of virtual machines. Specialist service providers and hosting companies are likely to be the first takers, particularly those looking to make an impact in the cloud market, followed closely by larger corporates with pockets deep enough for the hefty costs involved.

Should I stay or should I go?

There are questions too as to whether it is better to go for 40Gig now or wait around for 100Gig products. The second option is undeniably more future-proof but calls for a much bigger investment in the supporting fibre infrastructure.

At the same time there is bound to be a demand for both 40Gig and 100Gig in the backbone network, as more and more servers with multiple 10Gig interfaces come online and aggregate bandwidth in the supporting infrastructure starts to get used up.

Indeed, it is more or less a rule of thumb that when you start to connect servers with a bandwidth equal to that of the fastest uplink, it is time to consider something a lot meatier.
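
As a back-of-the-envelope illustration of that rule of thumb, here is a hedged sketch in Python. The port counts and speeds are made-up examples, not measurements from any particular switch.

```python
# Sketch of the uplink rule of thumb: once the aggregate server-facing
# bandwidth feeding a switch rivals its fastest uplink, start planning
# the upgrade. Port counts and speeds below are illustrative assumptions.

def oversubscription(server_ports: int, port_gbps: float, uplink_gbps: float) -> float:
    """Ratio of aggregate downstream bandwidth to uplink capacity."""
    return (server_ports * port_gbps) / uplink_gbps

# e.g. 24 servers on 10Gig ports sharing a single 40Gig uplink
ratio = oversubscription(server_ports=24, port_gbps=10, uplink_gbps=40)
print(f"Oversubscription: {ratio:.1f}:1")  # 6.0:1 -- time for something meatier
```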

Lower down the food chain, however, there could be much more inertia. Many mid-range organisations are still ripping out Gigabit hardware and replacing it with 10Gig technology.

It will take a long time to recoup that investment, and even longer before those companies are ready to consider yet another bandwidth-busting exercise.

So, do we need 100Gig Ethernet? The answer is a confident yes, but maybe not as soon as some vendors might like.

Is it the ultimate solution? Definitely not. Demand for bandwidth continues to grow ahead of supply and already there is discussion as to what is next, with some confidently putting down a Terabit Ethernet marker some time around 2015.

Will we need that as well? It looks like we will. ®
