NASA's cloud strategy panned by NASA auditors

Houston, you have a problem

NASA's cloud strategy has been panned by its own auditors, proving that even technically competent organizations can sputter when trying to soar into the tech stratosphere.

In a lengthy report released on Monday by NASA's Office of Audits, investigators slammed the agency's cloud governance, risk management, and security policies.

"We found that weaknesses in NASA's IT governance and risk management practices have impeded the Agency from fully realizing the benefits of cloud computing and potentially put NASA systems and data stored in the cloud at risk," the OA report says.

As of mid-2013, NASA was spending a piddling $10m of its $1.5bn annual IT budget on public cloud services, but the agency expects that 75 percent of its new IT projects in the next five years will be all cloudy, and that 40 percent of legacy systems will be migrated.

For this reason, "as NASA moves more of its systems and data to the cloud, it is imperative that the Agency strengthen its governance and risk management practices to safeguard its data while effectively spending its IT funds," the OA writes.

The problems began when the auditors asked NASA for the number of cloud services and associated service providers it was using. They found that NASA's Office of the Chief Information Officer (OCIO) "was unaware of two of the eight companies providing cloud services to NASA," and found that only three of NASA's 15 organizations believed they needed to coordinate with the agency when bringing in new cloud projects.

A further problem: of the five cloud-computing contracts reviewed during the investigation, none "came close to meeting recommended best practices" for security and management. The OA found that the contracts rarely defined roles and responsibilities for the provider or guaranteed any level of system availability. None of the contracts satisfied data-privacy requirements, nor did they include data retention and destruction policies.

The most severe problem identified by the auditors was lax security policies within NASA.

"We found that NASA's internal and external portal, which includes more than 100 websites, was operating without system security or contingency plans and with an operating authorization that expired in 2010," the OA wrote. "Even more troubling, a test of security controls on the IT services provided by the NASA Portal had never been undertaken to determine whether the system's controls were implemented correctly, operating as intended, and producing the desired results of securing the system and its data."

The team made six recommendations to NASA to help it tighten up its security and governance models for cloud projects, and NASA is in the process of implementing those, the report says.

Changes will include:

- Establishing a dedicated cloud-computing program management office
- Making sure NASA organizations use contracts that mitigate risks and meet the standards of FedRAMP (a certification that assures federal buyers of cloud compliance)
- Closely monitoring migrations of moderate-to-high-impact NASA systems to public clouds, and making sure these conform with federal requirements
- Having all NASA CIOs look closely at FedRAMP and ensure that existing contracts can conform to it
- Mandating the development of NIST-compliant security and contingency plans with cloud providers and brokers
- Ensuring that security officers review all IT security documentation for projects

Though these changes seem like common sense, the fact that the auditors needed to recommend them highlights the gulf between cloud nirvana and cloud reality. Even a technically advanced organization like NASA can have trouble effectively putting clouds in place, proving that cloud computing for the public sector is not rocket science – it's harder. ®
