It's time to presume the web is guilty

Trevor's got a plan to save the internet

Sysadmin blog The security defenses available to us are clumsy and inadequate. Anti-malware applications are grand at dealing with well-known threats, but pathetic and worthless at dealing with emerging ones. Software vendors are too entrenched in politics, feasibility studies and bad attempts at public relations to bother patching their software properly and expediently.

Meanwhile our economy becomes ever more dependent on the interconnectivity of computer systems: we have come too far to go back. Governments know this and see the failure of academia, corporations and private citizens to mitigate the threats. If we, as corporations and individuals, want the internet to remain as free and open as it is today, then we have to solve these problems before the governments of the world try to do it for us.

The internet was built on the presumption of innocence. Basic protocols such as email don’t inherently contain a way to verify that the sender is legitimate. We all know how well that has worked out. Peer-to-peer protocols have many legitimate uses, but their nature lends them to illegal uses and so the vast majority of peer-to-peer traffic infringes copyright. Even the venerable Domain Name System is under attack: most new domain registrations are malicious.

It could be that the only way to preserve the freedom of the internet is to do away with the presumption of innocence. I believe that, if we do not do this in the next ten years, we will lose control of the internet to government and we will never get it back.

Look at email. Currently we rely on blacklists (such as Spamhaus) to tell us which email domains exist only to send spam. As noble as these projects are, this is completely backwards. A series of central registries with which operators of legitimate email servers can (freely) register is the only way to make spam go away. If you are caught spamming, you fall off the planetary whitelist, and getting back on should not be easy.
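The whitelist-first model described above can be sketched in a few lines. Everything here is hypothetical: no such planetary registry exists, and the names (`WHITELIST`, `should_accept`) are invented for illustration.

```python
# Illustrative sketch of whitelist-first mail filtering: accept only
# from domains registered with the (hypothetical) central registry.

WHITELIST = {"example.com", "example.org"}  # domains registered as legitimate senders


def sender_domain(address: str) -> str:
    """Extract the domain part of an email address."""
    return address.rsplit("@", 1)[-1].lower()


def should_accept(address: str) -> bool:
    """Presume guilt: accept mail only from whitelisted domains."""
    return sender_domain(address) in WHITELIST
```

Under this model `should_accept("alice@example.com")` passes, while mail from any unregistered domain is refused by default, which is the inverse of today's blacklist approach.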

Similarly, peer-to-peer technologies could benefit from exactly the same concept. I rely on peer-to-peer to get access to things like Linux ISOs that are vital for my work. At the same time, however, I do not want to allow peer-to-peer traffic on my corporate network, in case copyright infringement is traced to my corporate IP. The ability to tell my firewall “deny all peer-to-peer traffic except that which has been registered with this whitelist as legitimate” would solve the problem. But short of assembling that list myself, there currently exists no such beast.
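The firewall rule wished for above, "deny all peer-to-peer traffic except that which has been registered as legitimate", amounts to a default-deny policy with a whitelist exception. A minimal sketch, assuming the firewall can already classify a flow as peer-to-peer and identify its source host (both invented names here):

```python
# Hypothetical default-deny policy for peer-to-peer traffic: p2p flows
# are dropped unless their source appears on a subscribed whitelist.

P2P_WHITELIST = {"releases.ubuntu.com", "torrent.fedoraproject.org"}


def allow_flow(protocol: str, source_host: str) -> bool:
    """Non-p2p traffic falls through to other rules; p2p is deny-by-default."""
    if protocol != "p2p":
        return True
    return source_host in P2P_WHITELIST
```

This lets the Linux ISO traffic through while blocking unregistered swarms, without the operator having to assemble the list themselves.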

The same is becoming true of the DNS system itself. DNS blacklists are a fantastic first step, but they don’t go far enough. The day has come to start building confidence ranking into the DNS system itself. This is starting to take shape now with the controversial concept of DNS reputation.

If I had the time and capital to start a tech company out of my basement, I would be pursuing all of these ideas. Assembling blacklists is a losing battle, but there is money to be made in assembling whitelists. Individuals and corporations who prefer to experience the web in its raw form should have the option to do so, but as someone who has several networks under my care, I know that I would prefer a whitelisting approach.

We are rapidly approaching the point where due diligence means presuming all traffic to be malicious unless it can be proven otherwise. It makes no sense for each company and individual in the world to independently build and maintain their own whitelists of legitimate sources of traffic. The market is wide open for the creation of a handful of whitelists to which we could subscribe.
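Subscribing to a handful of commercial whitelists rather than curating one locally is, mechanically, just merging feeds into a single allow set. A sketch with invented feed contents:

```python
# Sketch: merge several subscribed whitelist feeds into one local
# allow set, instead of each operator building their own from scratch.

def merge_whitelists(*feeds: set) -> set:
    """Union of all feeds: any subscribed provider vouching for a host admits it."""
    merged: set = set()
    for feed in feeds:
        merged |= feed
    return merged
```

Union is the permissive choice (trust every subscribed provider); a stricter operator could take the intersection instead, admitting only hosts that every provider vouches for.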

Building protocol whitelists certainly won't solve all our problems, but it would be more secure than what we are doing now. Human nature is what it is: securing the internet means the end of presumed innocence. ®
