Web browsers face crisis of security confidence
Good enough for Donald Rumsfeld. But not for you
But while the browser makers and web application developers continue to add improvements around the edges, the most glaring and menacing vulnerabilities remain untouched. For more than a decade, for example, browsers have made it easy for unauthorized people to access corporate intranets and other off-limits areas by linking public IP addresses with those cordoned off behind a firewall. This opens up all kinds of nasty possibilities, including intranet port scanning, which can reveal weaknesses in corporate networks, and drive-by pharming, in which attackers use a victim's browser to change crucial home router settings. Browsers have also long been able to contact a PC's loopback address, known as localhost, an ability that's essential for programs like Google Desktop to work, but also one that unnecessarily exposes huge amounts of a machine's most sensitive innards.
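The scanning half of the attack is simple enough to sketch. What follows is a hypothetical illustration, written in Python rather than the JavaScript a hostile page would actually use, of the logic such a page drives from inside a victim's browser: attempt connections to addresses behind the firewall and infer which ports are open from whether anything answers. The function names are ours, not from any real exploit.

```python
# Hypothetical sketch of the port-scanning logic a malicious page can drive
# from a victim's browser: probe addresses on the private network and infer
# which ports are open from whether the connection attempt succeeds.
import socket
import socketserver
import threading

def probe(host, port, timeout=0.25):
    """Return True if host:port accepts a TCP connection within the timeout."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

def scan(host, ports):
    """Return the subset of ports that appear open on host."""
    return [p for p in ports if probe(host, p)]

if __name__ == "__main__":
    # Stand-in for an intranet service: a throwaway server on localhost,
    # bound to an ephemeral port chosen by the OS.
    server = socketserver.TCPServer(("127.0.0.1", 0),
                                    socketserver.BaseRequestHandler)
    open_port = server.server_address[1]
    threading.Thread(target=server.serve_forever, daemon=True).start()

    print(scan("127.0.0.1", [open_port]) == [open_port])  # True
    server.shutdown()
```

A browser-based scanner works with cruder signals, timing differences and error types on cross-origin requests, but the inference is the same.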
At fault is the net's lack of what's known as zone separation: a mechanism for classifying some addresses as public and others as private. The designers of the Arpanet never built such a mechanism into their creation, and no one has bothered to add one since.
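The missing check is easy to state in code. Python's standard ipaddress module already knows the RFC 1918 private ranges; the policy function below is a hypothetical sketch of the zone-separation rule browsers never enforced, namely that a page served from a public address shouldn't be allowed to reach into the private zone.

```python
# A sketch of zone separation: classify addresses as public or private,
# and refuse to let content loaded from the public zone make requests
# into the private zone. The policy function is illustrative only.
import ipaddress

def zone(addr):
    """Classify an IP address as 'private' or 'public'."""
    ip = ipaddress.ip_address(addr)
    return "private" if ip.is_private else "public"

def request_allowed(page_addr, target_addr):
    """Block pages served from public addresses from reaching private ones."""
    return not (zone(page_addr) == "public" and zone(target_addr) == "private")

print(zone("192.168.1.1"))                        # private
print(request_allowed("8.8.8.8", "10.0.0.1"))     # False: public page, private target
print(request_allowed("10.0.0.2", "10.0.0.1"))    # True: intranet talking to intranet
```

The rule itself is a few lines; the hard part, as the article notes, is that a decade of deployed applications assumes no such wall exists.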
The net also lacks a robust way to authenticate users or reliably establish secure communications channels between trusted websites and end users. As a result, each bank, ecommerce site and online broker offers a different mishmash of authentication cookies, session log-outs and one-time tokens to establish their customers' identities. And Secure Sockets Layer, while proving surprisingly versatile in preventing man-in-the-middle attacks, has its own Achilles' heel that net architects have ignored for years.
Don't count on many of these flaws getting repaired anytime soon. Despite its lack of a foundation, the net has proved adept at supporting a dizzying number of frameworks, many of them designed to work on top of the old, buggy ways of doing things. Fixing many of these long-standing flaws would have the unintended consequence of breaking major parts of the internet. It's a little bit like trying to dig a cellar under a 10-story building that's already standing.
Tackling the problem will also require representatives from hundreds of companies to come together to forge new standards to replace the old ones. So far, no one at Microsoft, eBay, Mozilla, Cisco or anywhere else we're aware of is showing much leadership in marshaling their peers toward an industry-driven solution, so the old, broken methods persist year after year.
From the Department of Big Lunches
About the only hopeful sign we've seen is a highly experimental set of security protections that Firefox developers have been tinkering with. First reported here, the technologies insulate users against two broad classes of attacks. One would minimize exposure to XSS and cross-site request forgery attacks by allowing site developers to define which domains are allowed to initiate or answer cross-site requests for code, cookies and other site resources. A second would guard against so-called DNS rebinding attacks like those researcher Dan Kaminsky has demonstrated.
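The first of those protections amounts to an origin allowlist. The sketch below is a hypothetical rendering of the idea on the server side, not Mozilla's actual design or syntax: the site declares which domains may make cross-site requests for its resources, and requests arriving from anywhere else are refused.

```python
# Illustrative sketch of a cross-site request policy: the site declares
# which origins may request its resources across site boundaries.
# The allowlist and function names are hypothetical, not Mozilla's spec.

ALLOWED_ORIGINS = {"https://partner.example.com"}  # hypothetical trusted origin

def allow_cross_site(origin, same_site=False):
    """Return True if a request from `origin` may read this site's resources."""
    if same_site:
        return True  # requests from the site's own pages are always fine
    return origin in ALLOWED_ORIGINS

print(allow_cross_site("https://partner.example.com"))  # True
print(allow_cross_site("https://evil.example.net"))     # False
```

The point of making the policy an open specification, as the developers suggest, is that any participating browser could enforce the same declaration, rather than leaving each site to improvise its own defenses.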
For now, the protections will largely be implemented as a Firefox extension that will serve as a proof of concept. Depending on how they're received, they could blossom into open specifications that website developers could use to enforce policies across any participating browser.
Apart from Mozilla's noble experiment, browser makers, web app developers, network providers and others shaping the direction of the internet have shown woefully few signs they're ready to confront the significant challenges they've inherited. Instead, they make incremental security improvements and pass them off as achievements on par with the cure for polio.
And then they make excuses for not doing more. Eventually, the world's net users will grow tired of this course of action, and the net's architects will run out of time. Just ask Donald Rumsfeld. ®