IETF plans to NSA-proof all future internet protocols

Standards boffins promise bloody fight for those who seek to sniff private data

The IETF has taken the next small step down the long, long road of protecting user traffic from spooks, snoops and attackers, setting down the basic architectural principle that new protocols should resist monitoring.

It's not going to be a trivial undertaking: practically every layer of the Internet protocol stack has its origins in a more innocent era.

The new document, RFC 7258, formalises the decision reached at the Vancouver IETF plenary in November that pervasive monitoring is an attack on Internet users (and, in fact, “Pervasive Monitoring is an Attack” is the title of the RFC).

Unlike the blithe statements from law enforcement around the world that metadata collection is innocuous, the RFC explicitly includes metadata collection in its list of threats to Internet users, along with the collection of protocol artefacts, application content, active and passive wiretaps, traffic analysis and cryptographic subversion.

The aim of the new RFC, it says, is to record “the IETF community's consensus” and establish “the technical nature of PM.”

However, the RFC also concedes that the spooks and snoops can never be shut out entirely, because traffic necessarily traverses public networks and reveals where it's from and where it's going. “In all cases, there will remain some privacy-relevant information that is inevitably disclosed by protocols.”

Instead, the document states, protocol design in the future should “significantly increase the cost of attacking, force what was covert to be overt, or make the attack more likely to be detected”.

The RFC puts the onus on protocol developers to think about whether they're creating a new risk (or replicating an old one) early in the process: “adequate, early review of architectural decisions including whether appropriate mitigation of PM can be made is important”, because fixing mistakes late in the process is expensive.

The authors also note that the practice of reusing existing technology, while normal developer behaviour, can “significantly impact” how easy it is to monitor traffic in a new protocol. ®
