US vuln info-sharing plan draws fire
More harm than good?
A long-anticipated program meant to encourage companies to provide the federal government with confidential information about vulnerabilities in critical systems took effect Friday, but critics worry that it may do more harm than good.
The so-called Protected Critical Infrastructure Information (PCII) program allows corporations that run key elements of U.S. infrastructure - energy firms, telecommunications carriers, financial institutions, etc. - to submit details about their physical and cyber vulnerabilities to a newly formed office within the Department of Homeland Security, with legally binding assurances that the information will not be used against them or released to the public.
The program implements controversial legislation that bounced around Capitol Hill for years before Congress passed it in the wake of the September 11 attacks as part of the Homeland Security Act of 2002. Security agencies have long sought information about vulnerabilities and likely attack points in critical infrastructures, but have found the private sector reluctant to share, for fear that sensitive or embarrassing information would be released through the Freedom of Information Act (FOIA).
As of Friday, federal law now protects that vulnerability information from disclosure through FOIA, and makes it illegal for government workers to leak it, provided companies follow certain procedures and submit the data to the new PCII office.
It's been a long time in coming, says former White House cyber security advisor Howard Schmidt. "For a long time there has been informal reporting - sort of the off-the-record discussion, but nothing anyone can document," says Schmidt. "Under this you can get much more detailed information and do some good analysis [of] whether an incident is run-of-the-mill hacking activity, or something that requires government action."
But some public interest advocates worry about the law. One of their arguments is that the PCII program takes exactly the opposite approach from the one it should: it deprives the government of any means to compel companies to fix vulnerabilities, whether by fiat or by bringing public pressure to bear.
"Basically, the information goes into government, and that's the dead end," says Sean Moulton, a senior policy analyst at OMB Watch. "Aside from encouraging the companies to do something, as far as my reading of the statute, they don't have much authority at all, and they can't warn the public."
Moulton says a more effective approach would compel companies to report vulnerabilities to the government, and give the government the power to enforce reforms, or, alternatively, warn the public. "I think the companies shouldn't even have the option of not reporting, if it's a real critical infrastructure vulnerability," says Moulton.
Shadow versus Sunshine
But critics don't just worry that the PCII program may be ineffective: they say the new regulations might even provide companies with legal cover for their own negligence - resulting in less security, not more.
A key provision of the law bars the government from using the vulnerability information in any enforcement action against the company, or from using it as the basis for proposing new legislation or regulations on industry. And if the information does somehow leak out, it cannot be used in court against the company.
"You're going to have scenarios where the company discloses negligent maintenance of a system to the government, and the government can't disclose it to anybody," says David Sobel, an attorney with the Electronic Privacy Information Center, who testified against the proposal before Congress. "Then some disaster occurs... and we'll have a hard time holding the company accountable because they'll say, 'We told the Department of Homeland Security about this, and the law says we're immune from any liability.'"
Of course, the law wasn't intended as a shield for corporate negligence: information that comes to the government independently of the PCII reporting is still fair game. If the EPA inspects a chemical plant and finds a leak of toxic chemicals, it can still take action against the company, even if the polluter reported the rusty pipes to the DHS as a "vulnerability." Likewise, someone injured as a result of a company's negligence can still sue the company - they just can't get any PCII information from the government.
But Sobel dreads the new range of arguments that corporate lawyers could raise under the law. A company facing an enforcement action could claim that government watchdogs were tipped off by the DHS, forcing the EPA, for example, to prove that it developed its information independently of the PCII program.
Or the plaintiff in a lawsuit could face additional hurdles in subpoenaing documents that prove a company had knowledge of a vulnerability, because the company could argue that it only compiled the information to provide it to DHS under the PCII program, and that it should therefore be considered privileged. "There's going to be this burden on the victims of this disaster who are trying to hold the company accountable," says Sobel.
These arguments might find purchase in the courts, says Sobel. Moulton agrees. "If you're a company and you think there's an inspection coming, why not take a chance: 'Let's report it to DHS and see if we can cover ourselves,'" says Moulton. "It creates whole new evidentiary hurdles."
'Good Faith Effort'
The same doubts about the PCII program can be found within the security community. "I think it's a good faith effort by government to overcome a historical industry reluctance to report information," says Richard Forno, a security consultant and author. "But I think this could be a gift to big business, because they could lump a lot of things in under the guise of critical infrastructure information... and keep it away from environmentalists and lobbyists."
But without the PCII program, or something like it, Schmidt says the only way infrastructure vulnerability information gets to the right people is through word of mouth: a company security officer mentions an incident at an Infragard meeting, "a couple security people talk to each other, one talks to somebody else, and so on." By the time the nugget gets to someone in the government capable of putting it in an overall national security context, "it becomes anecdotal information nobody can really act on."
"There's got to be correlation done with these things," says Schmidt, who also rejects the notion that public disclosure is the best cure for festering vulnerabilities. "There are things that are just too critical... In most cases, this stuff will need to be dealt with in a very classified manner until the remediation plans can take place, in the interest of public safety."
In the end, the divide seems to be over how one views corporate America in the Enron Age. Schmidt, a veteran of top cyber security jobs in both government service and the private sector, says the critics are overly cynical.
"When I meet with CEOs, I've yet to meet with somebody where I get a bad feeling that they're going to use this as a cover and walk away," he says. Since September 11, "they really do understand that this is an issue, and they really will work to remediate these things if possible." And if they don't? "With the current environment... nobody's going to let somebody drop a vulnerability on the table and not take some action, if indeed it's dramatic."
"This is not a game," adds Schmidt. "Everybody's looking to do the right thing."