Google hints at the End of Net Neutrality

This cache makes perfect sense

Updated Network Neutrality, the public policy unicorn that's been the rallying cry for so many on the American left for the last three years, took a body blow on Sunday with the Wall Street Journal's disclosure that the movement's sugar-daddy has been playing both sides of the fence.

The Journal reports that Google "has approached major cable and phone companies that carry Internet traffic with a proposal to create a fast lane for its own content."

Google claims that it's doing nothing wrong, and predictably accuses the Journal of writing a hyperbolic piece that gets the facts all wrong. Google is essentially correct: it's doing nothing that Akamai doesn't already do, and nothing that the ISPs and carriers don't plan to do themselves to reduce the load that P2P puts on their transit connections.

Caching data close to consumers is sound network engineering practice, beneficial to users and network operators alike because it increases network efficiency. More people are downloading HDTV files from Internet sources these days, and these transactions are highly repetitive. While broadcast TV can deliver a single copy of “Survivor” to millions of viewers at a time, Internet delivery requires millions of distinct file transfers across crowded pipes to accomplish the same end: this is the vaunted end-to-end principle at work.
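
To make the engineering argument concrete, here's a minimal sketch of the idea (not Google's or Akamai's actual system; the class name, URL, and viewer count are illustrative assumptions): an edge cache pays for one transfer across the backbone, then serves every later local request for the same file from storage inside the ISP's network.

```python
class EdgeCache:
    """Toy edge cache: fetch each object from the origin once,
    then serve all later requests from local storage."""

    def __init__(self, origin_fetch):
        self.origin_fetch = origin_fetch   # callable: url -> bytes
        self.store = {}                    # url -> cached bytes
        self.origin_transfers = 0          # transfers crossing the backbone

    def get(self, url):
        if url not in self.store:
            self.origin_transfers += 1     # cache miss: one wide-area transfer
            self.store[url] = self.origin_fetch(url)
        return self.store[url]             # cache hit: served locally


# Stand-in origin server; a real cache would make HTTP requests instead.
def origin(url):
    return b"video bytes for " + url.encode()

cache = EdgeCache(origin)
for _ in range(1_000_000):                 # a million viewers, same episode
    cache.get("http://video.example/survivor")
print(cache.origin_transfers)              # -> 1, not 1,000,000
```

One upstream transfer instead of a million is the whole engineering case for moving popular content inside the ISP's network.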

There's nothing wrong with Google's proposed arrangement, and quite a lot right with it. The main beneficiary is YouTube, which accounts for some 20 per cent of the Internet's video traffic and was recently upgraded to a quasi-HD level of service. Taking YouTube off the public Internet and moving it directly onto each ISP's private network frees up bandwidth on the public Internet. Google's not the only one doing this; in fact, so many companies are escaping the public Internet that researchers who measure Internet traffic at public peering points, such as Andrew Odlyzko, are scratching their heads in wonderment that the traffic they can measure increases at only 50 per cent a year. Researchers who study private network behavior see growth rates closer to 100 per cent per year, and caching systems like Google's and Akamai's make this kind of traffic distribution possible.
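
For a sense of how far apart those two measurements drift, here's a back-of-the-envelope compounding calculation (a sketch using only the 50 and 100 per cent growth rates quoted above; the five-year horizon is an arbitrary choice):

```python
# Compound the two growth rates cited above over five years.
public_rate, private_rate = 0.50, 1.00

for year in range(6):
    public = (1 + public_rate) ** year
    private = (1 + private_rate) ** year
    print(f"year {year}: public x{public:6.2f}, private x{private:6.2f}")

# year 5: public x 7.59, private x 32.00 -- most of the growth
# never crosses the public peering points being measured.
```

If private networks really are doubling annually, traffic counted at public exchanges captures less than a quarter of the growth after five years, which goes some way toward explaining the head-scratching.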

While there's nothing to see here of a technical nature, the political impact of this revelation is a study in contrasts.
