Anti-hacking laws 'can hobble net security'

Good Samaritans discouraged by threat of prosecution

The working group's report, available from the Computer Security Institute (registration required), includes four case studies, among them that of Eric McCarty.

In June 2005, McCarty, a prospective student at the University of Southern California, found a flaw in the school's online application system and notified SecurityFocus of the issue.

At McCarty's request, SecurityFocus contacted USC and relayed the details of the flaw. The school initially denied the seriousness of the issue but acknowledged the vulnerability after McCarty produced four records he had copied from the database. In April 2006, federal prosecutors leveled a single charge of computer intrusion against McCarty, who accepted the charge last September.

As part of its policy, SecurityFocus did not publish an article on the issue until USC had secured its database.

While CSI's Peters believes that good Samaritans should be given some leeway, a few of the comments found on McCarty's computer by the FBI - and repeated in court documents - suggested that vengeance was a motive. For that reason, Peters suggests that security researchers who decide to look for vulnerabilities in websites use discretion in dealing with site owners.

"You can't let anyone run wild and hack into websites indiscriminately," Peters said. "If you publicly disclose a vulnerability in a website you are pointing a big red arrow at a single site, so there needs to be some discretion."

The working group also concluded that the web is becoming increasingly complex as more sites share information and increase interactivity, characteristics of what is referred to as Web 2.0. Earlier this year, security researchers warned that Asynchronous JavaScript and XML (AJAX), a technology that many sites use to add Web 2.0 features, brings additional risks to the table for security researchers and vulnerability analysts.

"AJAX is not necessarily adding more vulnerabilities to the landscape, it is making it more difficult for the scanner vendors to find the vulnerabilities," said WhiteHat Security's Grossman, who is also a member of the working group. "The sites still have vulnerabilities, but they are harder to find."

Independent researchers finding vulnerabilities in websites could put pressure on site owners to secure their part of the internet. However, the working group could not agree on whether the law should be changed to allow for good Samaritans.

That likely leaves liability as the best stick, said Grossman, who believes website owners should be held liable to some extent for any consumer data lost due to a vulnerability in their sites.

"I think the motivation has to monetary," he said. "Right now, the website owners are the ones that have to pay for the security, but the consumer is the one bearing all the costs of failure."

Such an equation, he said, is unlikely to add up to better security.

This article originally appeared in SecurityFocus.

Copyright © 2007, SecurityFocus
