White House CyberSecurity ignores bad (MS) software

Clarke's office more bureaucrat gravy train than think tank


Diane Frank's December 5 article Bills Aim at Cyber R&D in Federal Computer Week reports that the Cyber Security Research and Development Act, introduced by US Representative Sherwood Boehlert (Republican, New York), seeks to provide nearly half a billion dollars in funding for information security research and education. The proposed legislation provides a $233 million check to the National Science Foundation for research into what Frank calls "basic cybersecurity issues."

In a related policy-driven attempt to shore up information security, White House Cybersecurity Coordinator Richard Clarke announced recently that his office would create a national map of the information grid (networks, power grids, and related infrastructures) to provide for research and plan for future problems. This will be accomplished through a National Center for Infrastructure Simulation and Analysis to be established in 2002.

Both of these proposals mean more money, more jobs, and more research on long-term security issues. While this is admirable -- not to mention much needed -- it neglects to address the immediate, real-world problems plaguing the Internet. The proposed bill would give $90 million for colleges to develop graduate degree programs in cybersecurity, as if earning a degree confirms that its bearer is any wiser in the ways of information security than someone with twenty years of hands-on experience. Wisdom comes through trial, error, and experience over a significant period of time. You can't create an expert overnight, or in two years, or with millions of dollars. An academic degree or professional certification doesn't necessarily mean the bearer is any more competent or experienced, and it shouldn't be the determining factor in hiring security folks.

At a recent IT summit in Washington, Clarke stated: "We need to decide that IT security functionality will be built into what we do. It's not an afterthought anymore." True, but where was Clarke (who was, after all, computer security "Czar" in the Clinton Administration) for the past ten years while critical information infrastructures were designed without the appropriate and necessary security processes? Why did he and his government cronies not step forward previously to ensure security was an integral part of all aspects of IT infrastructure, including software?

The federal government could have used its legislative force to hold software vendors liable for producing and distributing insecure products. Furthermore, it could have thrown its considerable economic weight around and refused to repeatedly purchase buggy software. Now that such bug-ridden products are the unfortunate rule rather than the exception, Clarke is proposing that software vendors provide automatic updates to their products when problems are discovered. This would save users from having to perform such updates themselves, and would typically be accomplished by placing trusted vendor back doors in the software.

This is laughably ironic: planting back doors in programs in order to provide security updates unfortunately means putting in place a vulnerable path for intruders to exploit. Such a strategy violates the first rule of network security, which is to deny all traffic and accept only known, trusted connections. Besides, vendors already provide update services, such as Windows Update, Red Hat Patch Updater, and Apple Software Update. Yet power users (and most security folks I know) usually disable such remote features in order to maintain full control over their systems.
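The "deny all, accept only trusted" rule invoked above can be illustrated with a minimal sketch. This is not any particular firewall's API; the allowlist and function names are hypothetical, purely to show the default-deny posture that a vendor-installed back door would bypass.

```python
# Minimal sketch of a default-deny policy: everything is refused
# unless the source appears on an explicit allowlist.

ALLOWED_HOSTS = {"10.0.0.5", "10.0.0.7"}  # hypothetical known, trusted peers

def accept_connection(source_ip: str) -> bool:
    """Deny all traffic by default; accept only explicitly trusted sources."""
    return source_ip in ALLOWED_HOSTS

print(accept_connection("10.0.0.5"))     # trusted peer: accepted
print(accept_connection("203.0.113.9"))  # unknown host: denied
```

A vendor update channel inverts this posture: it is a standing inbound path trusted by default, which is exactly why the power users mentioned above disable it.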

It does not take much foresight to anticipate the day when such a vendor update system is manipulated by a cyber-varmint, putting us right back to square one. What will be Clarke’s answer then? Will the vendor whose update features were compromised be held accountable for the situation? Or will it simply be dismissed as the latest 'cost of doing business' on the Internet? Instead of "pushing updates down the throats of users", as Clarke said, why not take active steps to ensure the software isn’t so damn buggy and exploitable in the first place? That would make more sense, don’t you think?

What Clarke and Co. Still Don't Get

Clarke et al. are on the right track -- at least they're beginning to recognize the enormity of the computer security issue. Research is a long-term investment, and something we certainly need; but it shouldn’t be seen as a substitute for remedying immediate problems. Rather than waste taxpayer dollars on corporate welfare, government jobs programs, and more research, the federal government should focus on two critical areas of IT.

First, they need to consult less with CEOs and marketers and more with CTOs, CIOs, and line officers in corporate IT departments. These are people who understand the nature of the problem and can provide advice that contributes to the development of an effective national information assurance program. Hiring Microsoft Security Advisor Howard Schmidt is a good first step -- I know Schmidt personally -- he's been in the IT trenches for many years and can provide operational guidance on the issue from personal experience and not just media hype. (I just hope he didn't drink too much of the Kool-Aid during his tenure in Redmond.)

Second, part of the half-billion or more dollars being proposed for various long-term cyber-security initiatives should be spent on an objective design review of our critical information infrastructures. This should include holding vendors accountable for failing to provide appropriate security and availability guidance in the infrastructure design process. We should not have to pay them to fix mistakes caused by their profit-driven shortsightedness. It should also include a line-by-line software code assessment of any Microsoft product being used in a critical system. I'd even suggest some of that half-billion be used to foster open-source software development to give enterprise users a choice in their IT infrastructures. As the Irish famine illustrated, it's bad karma to rely on a single crop; so why do the same with software, especially a product as disease-prone as Windows?

© 2001 SecurityFocus.com, all rights reserved.
