Anti-hacking laws 'can hobble net security'

Good Samaritans discouraged by threat of prosecution

The working group's report, available from the Computer Security Institute (registration required), includes four case studies, among them that of Eric McCarty.

In June 2005, McCarty, a prospective student at the University of Southern California, found a flaw in the school's online application system and notified SecurityFocus of the issue.

At McCarty's request, SecurityFocus relayed the information to USC, which initially disputed the seriousness of the issue but acknowledged the vulnerability after McCarty produced four records he had copied from the database. In April 2006, federal prosecutors leveled a single charge of computer intrusion against McCarty, who pleaded guilty last September.

As part of its policy, SecurityFocus did not publish an article on the issue until USC had secured its database.

While CSI's Peters believes that good Samaritans should be given some leeway, a few of the comments found on McCarty's computer by the FBI - and repeated in court documents - suggested that vengeance was a motive. For that reason, Peters suggests that security researchers who decide to look for vulnerabilities in websites use discretion in dealing with site owners.

"You can't let anyone run wild and hack into websites indiscriminately," Peters said. "If you publicly disclose a vulnerability in a website you are pointing a big red arrow at a single site, so there needs to be some discretion."

The working group also concluded that the web is becoming increasingly complex as more sites share information and increase interactivity, characteristics of what is referred to as Web 2.0. Earlier this year, security researchers warned that Asynchronous JavaScript and XML (AJAX), a technology many sites use to add Web 2.0 features, poses additional challenges for security researchers and vulnerability analysts.

"AJAX is not necessarily adding more vulnerabilities to the landscape, it is making it more difficult for the scanner vendors to find the vulnerabilities," said WhiteHat Security's Grossman, who is also a member of the working group. "The sites still have vulnerabilities, but they are harder to find."

Independent researchers finding vulnerabilities in websites could put pressure on site owners to secure their part of the internet. However, the working group could not agree on whether the law should be changed to allow for good Samaritans.

That likely leaves liability as the best stick, said Grossman, who argued that website owners should be held liable to some extent for any consumer data lost through a vulnerability in their sites.

"I think the motivation has to monetary," he said. "Right now, the website owners are the ones that have to pay for the security, but the consumer is the one bearing all the costs of failure."

Such an equation, he said, is unlikely to add up to better security.

This article originally appeared in SecurityFocus.

Copyright © 2007, SecurityFocus
