Google's HTTP Archive merges with Internet Archive

One records pages. The other records speed

Velocity The HTTP Archive – a fledgling effort to record the performance of sites across the interwebs – has merged with the Internet Archive, whose Wayback Machine has long kept a similar record of internet content.

Google's Steve Souders – who founded the HTTP Archive and will continue to run it – announced the merger this morning at the O'Reilly Velocity conference in Santa Clara, California. The ultimate goal of the project is to improve the overall performance of the web by exposing its bottlenecks.

"I've had the idea of doing this for the past four or five years, where I saw that a large number of websites – even the most popular ones – weren't tracking very critical statistics about performance, like size of JavaScript or the number of script requests," Souders said.

"I thought [the project] had a lot of synergy with what the Internet Archive was doing. They were kind of two sides of the same coin. The Internet Archive – the Wayback Machine – is tracking the content of the web, whereas the HTTP Archive is tracking how that content is built and served."

Essentially, the HTTP Archive is now a sub-project of the Internet Archive, a not-for-profit based in San Francisco.

Souders also announced that in merging with the Internet Archive, the project has attracted several big name sponsors, including Google and Mozilla as well as New Relic and Strangeloop. New Relic offers an online service for measuring site performance, while Strangeloop provides a service for accelerating website load times.

Souders founded the HTTP Archive this past fall. Using the Webpagetest.org tool created by Google's Patrick Meenan, the project originally crawled about a thousand URLs, and a month later, it expanded to roughly 18,000. With those sponsors behind the project, the new goal is to track the performance of a million of the top sites.

Steve Souders

Basically, the project runs sites through the Webpagetest batch API, and the results are shuttled into a MySQL database available to world+dog. The tests track not only how fast pages are, but how they serve their content and how much data is downloaded.

During a lightning demonstration at today's conference – where Souders is co-chair – he compared the performance and makeup of the top 100 websites (by traffic) with the top 1000. With the top 100, for instance, the average page size is about 437KB, while the top 1000 sites average 690KB. In the top 100, 26 per cent of resource requests fail to use caching headers, compared to 40 per cent in the top 1000. And, predictably enough, the top 100 also use significantly less Flash (36 per cent versus 50 per cent).
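The kind of aggregate stats Souders demoed can be computed straightforwardly once the per-page test results are in hand. A minimal sketch, assuming hypothetical field names (the HTTP Archive's actual MySQL schema will differ):

```python
# Sketch: aggregate average page weight and the share of requests lacking
# caching headers, from per-page records as the HTTP Archive might store them.
# Field names ("bytes_total", "has_cache_headers") are invented for illustration.

pages = [
    {"url": "http://example.com", "bytes_total": 437_000, "requests": [
        {"has_cache_headers": True},
        {"has_cache_headers": False},
    ]},
    {"url": "http://example.org", "bytes_total": 690_000, "requests": [
        {"has_cache_headers": True},
        {"has_cache_headers": True},
    ]},
]

# Average page size in KB across the sampled pages.
avg_kb = sum(p["bytes_total"] for p in pages) / len(pages) / 1000

# Percentage of all resource requests that fail to use caching headers.
all_reqs = [r for p in pages for r in p["requests"]]
uncached_pct = 100 * sum(not r["has_cache_headers"] for r in all_reqs) / len(all_reqs)

print(f"average page size: {avg_kb:.1f}KB")
print(f"requests without caching headers: {uncached_pct:.0f}%")
```

Against the real archive you would run the equivalent query over the published MySQL dump rather than in-memory records.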

Meenan, who also spoke at today's conference, created Webpagetest while at AOL, but this fall, he was hired away by Google, which put him to work full-time on Webpagetest and beefed up the project with additional engineering resources. Both Webpagetest and the HTTP Archive are open source projects: Webpagetest is under a BSD license, and the HTTP Archive is under an Apache license.

The HTTP Archive has not yet accepted patches, but it has about six contributors at this point, including Souders and Meenan. You can browse the data at HTTPArchive.org or download it as a MySQL dump.

One of Google's core missions is to improve the speed of the web, from one end to the other. Souders has long been at the forefront of the company's efforts to improve site load times. Previously, he was chief of performance at Yahoo!, where he built the company's YSlow performance tool. ®
