Use of web archive was not hacking, says US court

Though bypassing its protection measures could be

The use of the web archive the Wayback Machine did not constitute hacking in the case of a law firm which used the archive to see pages that their owners did not want it to see, a US court has ruled.

The deliberate bypassing or evasion of the archive's protection measures could still be deemed hacking, though, said Judge Robert Kelly, the judge in the Eastern District of Pennsylvania. In this case, protection mechanisms put in place by the page owners had failed.

In a dispute over intellectual property, patient advocacy group Healthcare Advocates sued Health Advocate Inc. The company being sued was represented by law firm Harding Earley Follmer & Frailey.

Law firm Harding viewed a number of Healthcare Advocates' archived web pages on the Wayback Machine on 9 July 2003. On 7 or 8 July, Healthcare Advocates' president, Kevin Flynn, had put a robots.txt file on the company's site which should have barred the Wayback Machine from serving its pages. But lawyers at Harding were able to view the pages because of a malfunction at the Wayback Machine.

"Plaintiffs' expert, Gideon Lenkey, has testified that the Harding firm was able to view archived screenshots of Healthcare Advocates' website because the servers at Internet Archive were not respecting robots.txt files," said Kelly's ruling. "Mr Lenkey also testified that the Harding firm did not engage in 'hacking'."

Circumventing an electronic protective measure breaks federal law in the US under the Digital Millennium Copyright Act (DMCA), and Healthcare Advocates brought a lawsuit against Harding.

Kelly ruled, though, that because Healthcare Advocates' protections malfunctioned, there was no protection to break or bypass.

"When the Harding firm accessed Internet Archive’s database on 9 July, 2003, and 14 July, 2003, it was as though the protective measure was not present," he wrote. "Charles Riddle and Kimber Titus simply made requests through the Wayback Machine that were filled. They received the images they requested only because the servers processing the requests disregarded the robots.txt file present on Healthcare Advocates' website.

"As far as the Harding firm knew, no protective measures were in place in regard to the archived screenshots they were able to view. They could not avoid or bypass any protective measure, because nothing stood in the way of them viewing these screenshots. The Harding firm did not alter code language to render the robots.txt file void like the defendant in Corley did with the encryption," said Kelly.

"They did not 'pick the lock' and avoid or bypass the protective measure, because there was no lock to pick. The facts show that the Harding firm received the archived images solely because of a malfunction in the servers processing the requests."

Healthcare Advocates also claimed that Harding had breached copyright law in its viewing and use of the web pages, but Kelly ruled that the law firm's activity constituted fair use of the material.

The company also claimed that the activity broke the Computer Fraud and Abuse Act, a claim Kelly also rejected.

Kelly granted summary judgment in Harding's favour. He said in his ruling: "It would be an absurd result if an attorney defending a client against charges of trademark and copyright infringement was not allowed to view and copy publicly available material, especially material that his client was alleged to have infringed."

The ruling said that in this case the placing of a robots.txt file, which is most often used to give instructions to search engine "robots" on what pages of a website should not be indexed, constitutes a "technological measure" within the DMCA.
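The mechanism at issue is simple: robots.txt is a plain-text file of rules that compliant crawlers are expected to consult before fetching pages, and the malfunction in this case was precisely that the archive's servers stopped consulting it. As a minimal sketch (using a hypothetical example.com URL and the Internet Archive's well-known "ia_archiver" user-agent string), Python's standard library can show how a well-behaved robot checks such a file:

```python
# Sketch of how a compliant crawler consults robots.txt before fetching.
# The rules and URL below are illustrative, not from the case record.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Parse rules directly rather than fetching over the network. These two
# lines bar every robot ("*") from every path on the site ("/") -- the
# blanket exclusion a site owner would use to keep archivers out.
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

# A compliant archiver asks can_fetch() before retrieving a page.
allowed = rp.can_fetch("ia_archiver", "http://example.com/index.html")
print(allowed)  # False: the robots.txt denies access
```

A server that "respects" robots.txt refuses the request when this check fails; the ruling turned on the fact that, for a period, the Internet Archive's servers filled requests without applying the file at all.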

That ruling will have limited relevance in other cases, though. No court in the US has yet said that such a file constitutes a technological measure in every case, and Kelly warned against interpreting his specific ruling in that way.

"The only way to gain access would be for Healthcare Advocates to remove the robots.txt file from its website, and only the website owner can remove the robots.txt file. Thus, in this situation, the robots.txt file qualifies as a technological measure effectively controlling access to the archived copyrighted images of Healthcare Advocates," he said. "This finding should not be interpreted as a finding that a robots.txt file universally qualifies as a technological measure that controls access to copyrighted works under the DMCA."

Copyright © 2007, OUT-LAW.com

OUT-LAW.COM is part of international law firm Pinsent Masons.
