It's time to presume the web is guilty

Trevor's got a plan to save the internet

Sysadmin blog The security defenses available to us are clumsy and inadequate. Anti-malware applications are grand at dealing with well-known threats, but worthless against emerging ones. Software vendors are too entrenched in politics, feasibility studies and bad attempts at public relations to patch their software properly and promptly.

Meanwhile our economy becomes ever more dependent on the interconnectivity of computer systems: we have come too far to go back. Governments know this and see the failure of academia, corporations and private citizens to mitigate the threats. If we, as corporations and individuals, want the internet to remain as free and open as it is today, then we have to solve these problems before the governments of the world try to do it for us.

The internet was built on the presumption of innocence. Basic protocols such as email don’t inherently contain a way to verify that the sender is legitimate. We all know how well that has worked out. Peer-to-peer protocols have many legitimate uses, but their nature lends them to illegal uses and so the vast majority of peer-to-peer traffic infringes copyright. Even the venerable Domain Name System is under attack: most new domain registrations are malicious.

It could be that the only way to preserve the freedom of the internet is to do away with the presumption of innocence. I believe that, if we do not do this in the next ten years, we will lose control of the internet to government and we will never get it back.

Look at email. Currently we rely on blacklists (such as Spamhaus) to tell us which email domains exist only to send spam. As noble as these projects are, this is completely backwards. A series of central registries with which operators of legitimate email servers can (freely) register is the only way to make spam go away. If you are caught spamming, you fall off the planetary whitelist, and getting back on should not be easy.
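Mechanically, a whitelist-first mail policy could reuse the same DNS lookup convention that blacklists like Spamhaus already use, just with the meaning inverted: a listing means "accept", and NXDOMAIN means "reject". Here is a minimal sketch; the zone name `whitelist.example` is a hypothetical placeholder for the kind of registry the column is proposing, not a real service.

```python
import socket

# Hypothetical DNSWL-style zone operated by the proposed central registry.
WHITELIST_ZONE = "whitelist.example"

def query_name(ip: str, zone: str = WHITELIST_ZONE) -> str:
    """Build the reversed-octet lookup name, per DNSBL/DNSWL convention.

    192.0.2.25 -> 25.2.0.192.whitelist.example
    """
    octets = ip.split(".")
    return ".".join(reversed(octets)) + "." + zone

def sender_is_whitelisted(ip: str) -> bool:
    """Whitelist-first policy: only registered senders pass.

    An NXDOMAIN answer (socket.gaierror here) means the sender is NOT
    on the whitelist, so the mail is presumed guilty and rejected.
    """
    try:
        socket.gethostbyname(query_name(ip))
        return True
    except socket.gaierror:
        return False
```

Note the inversion of the usual DNSBL logic: with a blacklist, a lookup failure is the good case; here it is grounds for rejection.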

Similarly, peer-to-peer technologies could benefit from exactly the same concept. I rely on peer-to-peer to get access to things like Linux ISOs that are vital for my work. At the same time, however, I do not want to allow peer-to-peer traffic on my corporate network, in case copyright infringement is traced to my corporate IP. The ability to tell my firewall “deny all peer-to-peer traffic except that which has been registered with this whitelist as legitimate” would solve the problem. But short of assembling that list myself, there currently exists no such beast.
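The firewall rule described above — "deny all peer-to-peer traffic except that which has been registered with this whitelist as legitimate" — is a default-deny filter keyed on content identity. A sketch of that policy logic, with placeholder torrent hashes standing in for entries on the hypothetical subscribed whitelist:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    protocol: str    # e.g. "bittorrent", "https"
    content_id: str  # e.g. a torrent info-hash identifying what is shared

# Hypothetical subscribed whitelist of registered-legitimate content.
# Real entries would be full info-hashes for things like Linux ISOs.
LEGIT_CONTENT = {
    "hash-of-ubuntu-iso",
    "hash-of-debian-iso",
}

def allow(flow: Flow) -> bool:
    """Default deny for P2P: traffic passes only if its content is registered."""
    if flow.protocol != "bittorrent":
        return True  # non-P2P traffic is outside this rule's scope
    return flow.content_id in LEGIT_CONTENT
```

The point of the subscription model is that `LEGIT_CONTENT` is maintained by a registry, not assembled by each admin by hand.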

The same is becoming true of the DNS system itself. DNS blacklists are a fantastic first step, but they don’t go far enough. The day has come to start building confidence ranking into the DNS system itself. This is starting to take shape now with the controversial concept of DNS reputation.
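A confidence ranking for DNS could work as a score-gated resolver policy: each domain carries a reputation score from a subscribed feed, and resolution is refused below a threshold, with unknown domains defaulting low. The scores, threshold, and domain names below are all illustrative assumptions:

```python
# Hypothetical reputation feed: domain -> confidence score in [0.0, 1.0].
REPUTATION = {
    "example.org": 0.95,              # long-established, clean history
    "newly-registered.example": 0.10, # fresh registration, no track record
}

MIN_SCORE = 0.5  # policy threshold, an assumption for illustration

def resolve_allowed(domain: str) -> bool:
    """Gate resolution on confidence; unknown domains default to 0.0."""
    return REPUTATION.get(domain, 0.0) >= MIN_SCORE
```

Defaulting unknown domains to a low score is exactly the presumption-of-guilt stance the column argues for: a new registration has to earn its way onto the network.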

If I had the time and capital to start a tech company out of my basement, I would be pursuing all of these ideas. Assembling blacklists is a losing battle, but there is money to be made in assembling whitelists. Individuals and corporations who prefer to experience the web in its raw form should have the option to do so, but as someone who has several networks under my care, I know that I would prefer a whitelisting approach.

We are rapidly approaching the point where due diligence means presuming all traffic to be malicious unless it can be proven otherwise. It makes no sense for each company and individual in the world to independently build and maintain their own whitelists of legitimate sources of traffic. The market is wide open for the creation of a handful of whitelists to which we could subscribe.

Building protocol whitelists certainly won't solve all our problems, but it would be more secure than what we are doing now. Human nature is what it is, so securing the internet means the end of presumed innocence. ®

