Verizon makes nice with P2P

We can help ISPs turn internet into big TV set

From an ISP’s point of view, P2P traffic can look exceptionally daunting. If ISPs choose to block it, as some have accused almost all of the major US ISPs of doing, their networks become ghost networks, with virtually no traffic in sight. But if they embrace it, their networks turn into fast-moving, chaotic places, where suppliers have to sprint just to keep them alive.

So what’s it to be? Well, Verizon appears, at least, to be considering a middle road: one where, instead of working against P2P or just putting up with its traffic costs, it will offer protocols that co-operate with P2P networks to deliver entertainment, by letting them better understand the conditions of the network their traffic is traveling over. That really IS open.

The initiative began last July under the auspices of a Distributed Computing Industry Association (DCIA) working group called P4P, which stands for Proactive network Provider Participation for P2P. The two founder members and chairs come from Pando Networks and Verizon Communications. Pando is one of the new breed of P2P companies trying to eke out a living in legal P2P file delivery.

This is really a club for ISPs and P2P suppliers in which they can work out their differences, and it is a far more positive approach than whining about network traffic and investing purely in “traffic shaping”.

Statements from this workgroup claim that software already being tested can improve download speeds by between 200 per cent and 600 per cent, purely by offering up a set of network APIs which let a P2P application know which parts of a network are busy, and by using this to decide intelligently which P2P nodes should be uploading in support of a file or stream delivery. It’s not rocket science, and a CompSci grad student given the problem could have come up with the same answer, but it is how the question is phrased that is interesting.
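To make that concrete, here is a minimal sketch in Python of what such network-aware peer selection could look like. It is an illustration only: the NetworkConditionAPI class, its link_cost call and the zone names are invented stand-ins for whatever hints the P4P APIs actually expose, and are not taken from the working group’s specifications.

```python
# Illustrative sketch only. NetworkConditionAPI, link_cost and the zone
# labels are hypothetical stand-ins for ISP-provided congestion hints;
# they are not the P4P working group's actual interfaces.
from dataclasses import dataclass


@dataclass
class Peer:
    peer_id: str
    zone: str  # the region/PoP of the ISP network the peer sits behind


class NetworkConditionAPI:
    """Stand-in for an ISP hint service: a relative cost for moving
    traffic between two zones of the network, lower meaning less busy."""

    def __init__(self, costs):
        self._costs = costs

    def link_cost(self, from_zone, to_zone):
        # Unknown paths are assumed expensive, so the client stays cautious.
        return self._costs.get((from_zone, to_zone), 10)


def pick_uploaders(candidates, downloader_zone, api, limit=4):
    """Choose which peers should upload a piece of a file or stream.

    Classic P2P picks uploaders more or less at random; here they are
    ranked by the ISP-reported cost of the path to the downloader, so
    traffic prefers quiet, local links.
    """
    ranked = sorted(candidates, key=lambda p: api.link_cost(p.zone, downloader_zone))
    return ranked[:limit]


if __name__ == "__main__":
    api = NetworkConditionAPI({
        ("east-pop-1", "east-pop-1"): 1,  # same PoP: cheap
        ("east-pop-2", "east-pop-1"): 3,  # neighbouring PoP: moderate
        ("transit",    "east-pop-1"): 8,  # crosses a paid transit link: dear
    })
    swarm = [Peer("a", "transit"), Peer("b", "east-pop-1"), Peer("c", "east-pop-2")]
    for peer in pick_uploaders(swarm, "east-pop-1", api, limit=2):
        print(peer.peer_id, peer.zone)  # prints b, then c
```

The design point is simply that the client keeps its normal swarm logic and only swaps a random choice of uploaders for one ordered by the ISP’s view of link cost.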

If the question was “How do we get traffic zinging around the internet, for nothing, without the help of the ISP and despite its best efforts to stop us,” then that is definitely the wrong question. If that student were simply told “you have a network and multiple copies of large files distributed around that network; how do you build a rapid file delivery mechanism?”, then the DCIA answer follows naturally.

It is the history of ISPs and P2P suppliers being at each other’s throats for so long that makes it hard to see how this might ever have come about.

In fact, what needed to happen was for the livelihood of ISPs to be threatened: the average customer expecting more and more from the ISP, while the average monthly price for ISP service went down and down, and traffic on their networks went up and up, forcing more and more investment. At that point, P2P traffic is taken as a fact of life, not something the ISP looks to the US Supreme Court to make illegal.

ISPs cannot block all P2P activity, because VeriSign’s Kontiki P2P client, which is now used to deliver millions of hours of TV services around the world from respectable broadcasters, as well as Skype, Joost and Babelgum, is not breaking any laws. Even Kazaa and BitTorrent may now be carrying more legal than illegal traffic, or if not yet, they should lean that way over time.

If we look beyond this simple set of proposals, we can see more and more that might be done. By bringing ISPs and P2P suppliers closer, perhaps the handshakes for this type of co-operative routing might also include some form of legitimate traffic audit. We would then perhaps reach a point where, if the P2P traffic from your software passes some kind of “threshold” test of mostly sending legitimate files (something that deep packet inspection might still be needed for), then the APIs to sense the condition of the network are open to your client software, and it is pushed higher up the food chain in terms of the priority attached to its traffic.

If mostly copyrighted material appears to be traveling across the network, then perhaps that API co-operation would be refused by the network nodes, and the resulting traffic packets treated as low priority. That would create an underclass and an upperclass of P2P clients, each with a signature that would trigger the appropriate treatment by ISPs.
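Again purely as a sketch, that two-tier idea might boil down to something as simple as the classification below. Nothing like this exists in the P4P proposals; the threshold value, the ClientRecord fields and classify_client are all assumptions made for illustration.

```python
# Hypothetical sketch of the "threshold" idea above; not part of any P4P
# proposal. The threshold, ClientRecord and classify_client are invented.
from dataclasses import dataclass

LEGITIMACY_THRESHOLD = 0.8  # assumed share of audited transfers that must be legitimate


@dataclass
class ClientRecord:
    client_signature: str      # identifies the P2P client software on the wire
    audited_transfers: int     # transfers sampled, e.g. via deep packet inspection
    legitimate_transfers: int  # of those, how many carried licensed content


def classify_client(record):
    """Return (api_access, priority) for a given P2P client signature.

    Clients that mostly move legitimate files get the network-condition
    APIs and normal priority; the rest become the best-effort underclass.
    """
    if record.audited_transfers == 0:
        return (False, "low")  # nothing audited yet, so no API access
    share = record.legitimate_transfers / record.audited_transfers
    return (True, "high") if share >= LEGITIMACY_THRESHOLD else (False, "low")


if __name__ == "__main__":
    print(classify_client(ClientRecord("kontiki-tv", 1000, 990)))  # (True, 'high')
    print(classify_client(ClientRecord("anon-swarm", 1000, 120)))  # (False, 'low')
```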
