Private browsing isn't: Boffins say smut-mode can't hide your tracks

MIT researchers want web devs and sites to protect you. Good luck with that, chaps

A group of boffins working at MIT's Computer Science and Artificial Intelligence Laboratory believe that “private” browsing modes aren't private, so have given developers a framework to fix it.

The problem, wrote Frank Wang with his thesis advisors (Nickolai Zeldovich and luminary James Mickens), is that even if you're using “private” or “incognito” mode on standard browsers, several leak vectors remain: the file system, the browser cache, the DNS cache, and even “on-disk reflections of RAM such as the swap file”.

In a paper (PDF) delivered last week at the Network and Distributed Systems Security Symposium, the three presented the fix: a framework called "Veil" that puts an onus on site operators to stop the leaks. Developers, Wang writes in the paper, do control what's sent to browsers, and which servers deliver the content.

Of course, there are plenty of sites either indifferent or hostile to user privacy. The Veil framework won't change their minds, but Wang says sites that want to protect privacy but “lack the technical skill”, and those who are “actively invested” in protecting users need help.

Encryption is at the heart of Veil, but it's used differently from something like an HTTPS deployment. Here's how Veil works:

  • Compilation – HTML and CSS files are passed through the Veil compiler, which uses encryption to create a URL that can't be linked to the original. These are called “blinded references”, and a runtime library injected into each page also forces dynamic requests to be blinded;
  • Servers – the compiler sends Web page objects to Veil's “blinding servers”, which send content to users, and which also mutate the content (HTML, CSS and JavaScript) to protect users' client-side memory artefacts. The result is that different users get a unique client-side representation of the page;
  • Client-side management – Veil's runtime periodically touches sensitive RAM pages so the operating system's “least-recently-used” page-replacement algorithm keeps them in memory, meaning they're never swapped out to disk;
  • Document Object Model (DOM) hiding mode – the highest privacy level, this treats the browser as a dumb graphics terminal – no executable code is ever sent to the user. Instead, pages are rendered server-side and only the image is sent, so there's no chance of a privacy leak from the browser;
  • State encryption – Veil can store private, persistent state by encrypting it under a blinded reference that the user, not the site, generates.
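The “blinded reference” idea in the compilation step above can be sketched with a keyed hash: a per-user secret maps each content URL to an opaque name that an observer of the cache or network can't link back to the original object, and that differs from user to user. This is an illustrative sketch only, not the paper's actual scheme — `blind_reference` and the key handling are assumptions for demonstration.

```python
import base64
import hashlib
import hmac


def blind_reference(user_key: bytes, original_url: str) -> str:
    """Derive an unlinkable, URL-safe name for a page object.

    Without the key, the blinded name reveals nothing about the
    original URL; with a different key, the same object gets a
    completely different name, so cached entries can't be
    correlated across users.
    """
    digest = hmac.new(user_key, original_url.encode(), hashlib.sha256).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode()


# Two users fetching the same object leave unrelated names behind.
name_a = blind_reference(b"key-for-user-a", "/static/app.js")
name_b = blind_reference(b"key-for-user-b", "/static/app.js")
assert name_a != name_b
assert "app.js" not in name_a
```

The same keyed mapping can be applied by the injected runtime library to dynamic requests, which is how every fetch — not just the initial page load — ends up blinded.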

As MIT explains here, even “DOM hiding mode” doesn't stop a user from interacting with a site. The browser records the location of a click and sends it to the Veil server, which responds with an image of the new page.

As MIT's announcement notes, Veil imposes an extra infrastructure requirement on website operators. Apart from adopting the Veil framework, they also need to be willing to host the extra server infrastructure.

That makes Veil more likely to be of interest to sites that stake their reputations on privacy-protected services.

Wang also believes the performance penalty is bearable, writing: “Experiments show that Veil’s overheads are moderate: 1.25x–3.25x for Veil with encrypted client-side storage, mutated DOM content, and heap walking; and 1.2x–2.1x for Veil in DOM hiding mode.” ®

