Japan weather bods rain on using cloud for tsunami warning data-crunching

Don't call us, we'll warn you

The agency which predicts tsunamis and earthquakes in Japan has poured cold water on the idea of using the cloud to underpin its supercomputing operations.

While cloud vendors have been touting supercomputing on tap for a few years now, Tatsuya Kimura, head of the office of international affairs at the Japan Meteorological Agency, questioned their suitability for the crucial predictions his agency has to make.

In the event of a major earthquake, the agency has to make a call in minutes as to whether to issue a tsunami alert. As well as providing Japan's weather services including tracking typhoons, the agency also issues earthquake warnings for the “Tokai” area, where the tectonic plates are particularly well understood.

“It’s a time-critical service,” he told journalists at the agency's Tokyo HQ today. “We can’t say the warning was late because of the cloud service... I think it’s a little unlikely to move to the cloud.”

JMA’s current supercomputer is an 847-teraflop beast supplied by Hitachi and housed in Tokyo - itself somewhat quake-prone - while Fujitsu provides comms and other ICT services. Kimura said the supercomputer has no redundant backup: if it were knocked out, the agency would initially have to rely on weather data from other agencies, such as the UK’s Met Office, for its predictions.

The agency’s tsunami warnings are decided by humans, who rely on a previously compiled database of models covering different magnitudes and depths of quake across key locations. Japan can experience up to 1,000 quakes a day.

The system for tsunami warnings was overhauled in the wake of the devastating 2011 quake, which triggered a tsunami that killed more than 10,000 people.

Kimura said that quake was off the scale - the agency’s seismometers were “saturated” and initially could not give a reading of its magnitude, leading to an underestimation of the tsunami danger.

Kimura said that under the agency’s new protocol, if a tsunami of more than 1m in height is expected, it issues an immediate evacuation notice for the areas likely to be hit.
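The precomputed-database approach described above - match the quake against the closest stored scenario, then apply the 1m evacuation threshold - can be sketched in a few lines of Python. Everything here is invented for illustration: the scenario keys, wave heights and helper names are hypothetical, and the JMA's real database covers far more locations, magnitudes and depths, with humans making the final call.

```python
# Hypothetical sketch of a precomputed tsunami-scenario lookup.
# All data and names are illustrative, not the JMA's actual system.

# (magnitude, depth_km, location) -> expected wave height in metres
SCENARIO_DB = {
    (7.0, 10, "Tokai"): 0.8,
    (8.0, 10, "Tokai"): 2.5,
    (9.0, 20, "Tohoku"): 10.0,
}

EVACUATION_THRESHOLD_M = 1.0  # per the post-2011 protocol described above


def nearest_scenario(magnitude, depth_km, location):
    """Pick the closest precomputed scenario for the given quake."""
    candidates = [k for k in SCENARIO_DB if k[2] == location]
    if not candidates:
        return None
    return min(candidates,
               key=lambda k: abs(k[0] - magnitude) + abs(k[1] - depth_km))


def advise(magnitude, depth_km, location):
    """Return an advisory string; in reality, humans decide."""
    key = nearest_scenario(magnitude, depth_km, location)
    if key is None:
        return "no model available"
    height = SCENARIO_DB[key]
    if height > EVACUATION_THRESHOLD_M:
        return f"EVACUATE: ~{height}m tsunami expected"
    return f"watch: ~{height}m expected"


print(advise(8.1, 12, "Tokai"))  # matches the (8.0, 10, "Tokai") scenario
```

Because the models are computed in advance, the time-critical step is a cheap table lookup rather than a fresh simulation - which is precisely why latency from an external cloud service is unwelcome in the warning path.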

While questioning cloud providers' suitability for underpinning its warning system, Kimura did say the agency uses cloud services for disseminating information, and will do so with imaging data from its upcoming new weather satellite, due to launch in October.

However, cloud vendors are unlikely to change the agency’s mind any time soon. It upgrades its supercomputer every five years, and has just assembled the advisory team for the next refresh, due in four years' time. Outsourcing the service is not on the agenda.®
