Google adds storage buckets to plug cloud gap

Plonk your data where you like

Google has introduced a feature that lets administrators locate their cloud data next to their rented servers to reduce latency, bringing the Chocolate Factory's cloud closer to its rivals in terms of capability.

The Regional Buckets feature was announced by Google on Monday along with Object Lifecycle Management and Automatic Parallel Composite Uploads, as the Chocolate Factory tries to give administrators greater control of their IT resources within Google's monolithic software stack.

"With a tiny bit of upfront configuration, you can take advantage of these improvements with no changes to your application code," Google developer programs engineer Brian Dorsey wrote on Monday.

No re-coding is likely required because Google's cloud is a layer pasted over the Chocolate Factory's notoriously advanced monolithic software system: the changes can be made by introducing new API commands for users that dovetail into the secret underlying infrastructure.

With Regional Buckets, developers can co-locate data stored in Google's low-cost storage service (Durable Reduced Availability storage) inside the same region as their Google Compute Engine instances. Letting the data and compute sit on the same network fabric in this way can lead to more predictable performance, and Google says this is "particularly appropriate for data-intensive computations."
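As an illustration of how that co-location works in practice, a regional bucket can be created with gsutil by specifying the DRA storage class and a region at bucket-creation time (the bucket name and region here are placeholders, not from the announcement):

```shell
# Hypothetical example: create a Durable Reduced Availability bucket
# pinned to the same region as your Compute Engine instances.
gsutil mb -c DRA -l US-CENTRAL1 gs://example-regional-bucket
```

Once the bucket exists in the right region, Compute Engine instances there read and write it over the local network fabric with no application changes.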

The new feature lets you choose from among five data centers to locate your data, whereas previously you could only express a preference for "US" or "Europe" facilities.

Google also announced a new feature named Object Lifecycle Management, which lets admins create data retention and deletion policies to reduce the cost of storing data. "For example, you could configure a bucket so objects older than 365 days are deleted, or only keep the 3 most recent versions of objects in a versioned bucket," Dorsey writes.
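The policies Dorsey describes are expressed as a JSON lifecycle configuration applied to a bucket. A sketch of what his two examples might look like (bucket name is a placeholder):

```shell
# Hypothetical lifecycle policy combining Dorsey's two examples:
# delete objects older than 365 days, and in a versioned bucket
# delete any version with 3 or more newer versions (keeping the 3 newest).
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {"action": {"type": "Delete"}, "condition": {"age": 365}},
    {"action": {"type": "Delete"}, "condition": {"numNewerVersions": 3}}
  ]
}
EOF
gsutil lifecycle set lifecycle.json gs://example-bucket
```

Deletion then happens server-side on Google's schedule, so admins pay for neither the stale data nor a cron job to clean it up.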

A new tech for Google's gsutil command line cloud controller called Automatic Parallel Composite Uploads also reduces the time it takes to load data into the Google cloud by splitting large files into component pieces that are uploaded in parallel across available network connections.
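The idea behind parallel composite uploads can be sketched in a few lines: split the file into components, push the pieces concurrently, then compose them back in order. This is a conceptual illustration only, not gsutil's actual implementation; the chunk size and the stand-in `upload_chunk` function are invented for the example:

```python
import concurrent.futures

CHUNK_SIZE = 4  # bytes per component; gsutil's real threshold is far larger


def split_into_chunks(data: bytes, chunk_size: int) -> list:
    """Split a blob into fixed-size component pieces."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]


def upload_chunk(index: int, chunk: bytes):
    """Stand-in for uploading one component object; returns what the
    service would store so the pieces can later be composed."""
    return index, chunk


def parallel_composite_upload(data: bytes) -> bytes:
    """Upload components in parallel, then compose them in index order."""
    chunks = split_into_chunks(data, CHUNK_SIZE)
    with concurrent.futures.ThreadPoolExecutor() as pool:
        results = list(pool.map(lambda pair: upload_chunk(*pair),
                                enumerate(chunks)))
    # Compose step: reassemble components in their original order,
    # as the storage service's compose operation would.
    results.sort(key=lambda pair: pair[0])
    return b"".join(chunk for _, chunk in results)
```

Because each component travels independently, the pieces can saturate multiple network connections at once, which is where the upload-time saving comes from.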

The three features bring Google's cloud closer in capability to its key competitors Amazon and Microsoft, both of which have offered versioning, lifecycle management, and granular regional selection for data for some time. ®
