IETF plans to NSA-proof all future internet protocols

Standards boffins promise bloody fight for those who seek to sniff private data

The IETF has taken the next small step down the long, long road of protecting user traffic from spooks, snoops and attackers, setting down the basic architectural principle that new protocols should resist monitoring.

It's not going to be a trivial undertaking: practically every layer of the Internet protocol stack has its origins in a more innocent era.

The new document, RFC 7258, formalises the decision reached at the Vancouver IETF plenary in November that pervasive monitoring is an attack on Internet users (and, in fact, “Pervasive Monitoring is an Attack” is the title of the RFC).

Unlike the blithe statements from law enforcement around the world that metadata collection is innocuous, the RFC explicitly includes metadata collection in its list of threats to Internet users, along with the collection of protocol artefacts, application content, active and passive wiretaps, traffic analysis and cryptographic subversion.

The aim of the new RFC, it says, is to record “the IETF community's consensus” and establish “the technical nature of PM.”

However, the RFC also makes the admission that we're never going to beat the spooks and snoops, because traffic necessarily traverses public networks and reveals where it's from and where it's going. “In all cases, there will remain some privacy-relevant information that is inevitably disclosed by protocols.”

Instead, the document states, protocol design in the future should “significantly increase the cost of attacking, force what was covert to be overt, or make the attack more likely to be detected”.
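The trade-off the RFC describes can be sketched in a few lines: encryption hides *what* was said, but an on-path observer still sees who talked to whom and how much data moved, which is why the document talks about raising the cost of traffic analysis rather than eliminating it. The toy XOR keystream below is a stand-in for a real cipher, purely for illustration:

```python
import hashlib
from itertools import count

def keystream(key: bytes):
    """Toy keystream: SHA-256 of key || counter. NOT a real cipher."""
    for i in count():
        yield from hashlib.sha256(key + i.to_bytes(8, "big")).digest()

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR the plaintext against the keystream (decryption is identical).
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key)))

key = b"shared-secret"
msg = b"GET /private/report HTTP/1.1"
ct = encrypt(key, msg)

# The observer cannot read the content...
assert ct != msg
assert encrypt(key, ct) == msg
# ...but still learns the exact message length (traffic analysis),
# on top of the source and destination addresses on the wire.
assert len(ct) == len(msg)

# Padding every message up to a size bucket is one way a protocol
# designer can make that analysis more expensive, per the RFC.
padded = ct + b"\x00" * (-len(ct) % 64)
assert len(padded) % 64 == 0
```

Padding, cover traffic and connection reuse all follow the same logic: they don't make the metadata disappear, they just make it cost more to interpret.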

The RFC puts the onus on protocol developers to think about whether they're creating a new risk (or replicating an old one) early in the process: “adequate, early review of architectural decisions including whether appropriate mitigation of PM can be made is important”, because fixing mistakes late in the process is expensive.

The authors also note that the practice of reusing existing technology, while normal developer behaviour, can “significantly impact” how easy it is to monitor traffic in a new protocol. ®

