IETF plans to NSA-proof all future internet protocols

Standards boffins promise bloody fight for those who seek to sniff private data

The IETF has taken the next small step down the long, long road of protecting user traffic from spooks, snoops and attackers, setting down the basic architectural principle that new protocols should resist monitoring.

It's not going to be a trivial undertaking: practically every layer of the Internet protocol stack has its origins in a more innocent era.

The new document, RFC 7258, formalises the decision reached at the IETF plenary in Vancouver last November that pervasive monitoring is an attack on Internet users (and, in fact, “Pervasive Monitoring Is an Attack” is the title of the RFC).

In contrast to the blithe assurances from law enforcement agencies around the world that metadata collection is innocuous, the RFC explicitly includes metadata collection in its list of threats to Internet users, along with the collection of protocol artefacts and application content, active and passive wiretapping, traffic analysis and cryptographic subversion.

The stated aim of the new RFC is to record “the IETF community's consensus” and establish “the technical nature of PM” (the document's shorthand for pervasive monitoring).

However, the RFC also concedes that the spooks and snoops will never be shut out entirely, because traffic necessarily traverses public networks and reveals where it's from and where it's going: “In all cases, there will remain some privacy-relevant information that is inevitably disclosed by protocols.”
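
That inevitability is easy to demonstrate. The minimal sketch below (Python; the hostname is purely illustrative) shows that even a fully TLS-protected connection still hands a passive observer the destination address and, absent newer extensions such as Encrypted Client Hello, the server name carried unencrypted in the handshake:

```python
import socket
import ssl

# A minimal sketch of inevitable disclosure; "example.com" is purely
# illustrative. Even with TLS, a passive observer on the path sees the
# destination IP and port, and (without Encrypted Client Hello) the
# server name sent unencrypted in the ClientHello's SNI extension.
host = "example.com"

ctx = ssl.create_default_context()
with socket.create_connection((host, 443)) as raw:
    with ctx.wrap_socket(raw, server_hostname=host) as tls:
        # Only application data after the handshake is encrypted; the
        # endpoint address and SNI value printed here were already on
        # the wire in cleartext.
        print("endpoint visible to observer:", tls.getpeername())
        print("SNI sent in the clear:", tls.server_hostname)
```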

Instead, the document states, protocol design in the future should “significantly increase the cost of attacking, force what was covert to be overt, or make the attack more likely to be detected”.
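
Opportunistic encryption is a textbook example of that cost-raising approach. The hedged sketch below uses the Python standard library's smtplib (the server name is a placeholder) to upgrade a plaintext SMTP session to TLS whenever the server offers STARTTLS:

```python
import smtplib
import ssl

# A hedged sketch of opportunistic encryption via the standard
# library's smtplib; "mail.example.com" is a placeholder server.
with smtplib.SMTP("mail.example.com", 587, timeout=10) as smtp:
    smtp.ehlo()
    if smtp.has_extn("starttls"):
        # Upgrade the plaintext session in place: from here on, a
        # passive tap sees only ciphertext.
        smtp.starttls(context=ssl.create_default_context())
        smtp.ehlo()  # re-issue EHLO over the now-encrypted channel
        print("session upgraded to TLS")
    else:
        # Mail still flows, but the downgrade is overt and detectable
        # rather than covert collection.
        print("no STARTTLS on offer; session remains in cleartext")
```

An attacker can still strip the STARTTLS offer from the server's response, but doing so is active, overt tampering with the session rather than silent collection, which is exactly the trade the RFC asks protocol designers to force.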

The RFC puts the onus on protocol developers to think about whether they're creating a new risk (or replicating an old one) early in the process: “adequate, early review of architectural decisions including whether appropriate mitigation of PM can be made is important”, because fixing mistakes late in the process is expensive.

The authors also note that the practice of reusing existing technology, while normal developer behaviour, can “significantly impact” how easy it is to monitor traffic in a new protocol. ®
