Google forges BigTable-based NoSQL datastore

Takes out BigTable, thwacks Amazon DynamoDB over the head

Google I/O If you're Google, building cloud services for the public must be frustrating – after a decade spent crafting and stitching together software systems for internal use, selling them to the outside world means unpicking them from one another.

It seems more like butchery than creation, but that's the name of the cloud game, and so on Wednesday Google further fragmented its services by ripping a scalable NoSQL datastore away from Google App Engine (GAE) and making it into a standalone service named Google Cloud Datastore.

This strategy of designing integrated products and fragmenting them for the general public runs throughout Google's cloud history. For example, its cloud portfolio started out with platform services via GAE, but after Amazon started raking in vast amounts of cash from IaaS services on AWS, Google separated out basic VM services into the Google Compute Engine infrastructure cloud.

With Datastore, Google has taken another bit of App Engine and stuck it on its own plinth. The service is a columnar datastore which supports ACID transactions, has high availability via multi-data center replication, and offers SQL-like queries.

It will compete with Amazon's DynamoDB NoSQL row-based datastore. Though roughly equivalent in terms of capability, the two services have some architectural differences that influence how they work: BigTable-based Datastore has strong consistency for reads and eventual consistency for queries, whereas DynamoDB offers people a choice of eventual or strong consistency, depending on pricing. Both systems are heavily optimized for writes.
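The strong-versus-eventual distinction above can be illustrated with a toy model: a store with several replicas where writes reach the secondaries asynchronously. This is a conceptual sketch only – the class names and the 50 per cent lag probability are invented for illustration, and it does not reflect how BigTable or DynamoDB are actually implemented.

```python
import random

class ReplicatedStore:
    """Toy model of a replicated datastore, showing why an eventually
    consistent read can return stale data while a strong read cannot."""

    def __init__(self, replicas=3):
        # Each replica holds its own copy of the key/value data.
        self.replicas = [dict() for _ in range(replicas)]

    def write(self, key, value):
        # The write is acknowledged once the primary has it; secondaries
        # catch up asynchronously (simulated by a 50% chance that a given
        # replica has not yet applied the write).
        self.replicas[0][key] = value
        for replica in self.replicas[1:]:
            if random.random() < 0.5:
                replica[key] = value

    def read(self, key, strong=False):
        if strong:
            # Strongly consistent read: served by the primary, so it
            # always reflects the latest acknowledged write.
            return self.replicas[0].get(key)
        # Eventually consistent read: served by an arbitrary replica,
        # which may return a stale (here, missing) value.
        return random.choice(self.replicas).get(key)

store = ReplicatedStore()
store.write("task", "done")
print(store.read("task", strong=True))  # always "done"
print(store.read("task"))               # "done" or None, depending on replica
```

In DynamoDB the choice is made per read request; in Datastore it falls out of the kind of query you issue.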

The systems' storage substrates also differ. DynamoDB uses an SSD-backed set of hardware, but Google indicated its Datastore may use both flash and disk. "We do use them [SSDs], we sort of use them behind the scenes," Greg DeMichillie, a Google Cloud product manager, told The Register. "Frankly we think what people really want is a certain performance level but they really couldn't care whether it's this technology or that behind it. We don't surface inside the storage stack where we happen to be using SSDs and where we don't."

The base cost for Google storage is $0.24 per gigabyte per month, with writes charged at $0.10 per 100,000 operations and reads at $0.07 per 100,000. This compares favorably with DynamoDB, which costs $0.25 per GB per month, plus $0.0065 per hour for every 10 units of write capacity, or $0.0065 per hour for every 50 units of read capacity. Directly comparing the two is tricky, though, thanks to the labyrinthine price structure Amazon uses.
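To make the two schemes concrete, here is a back-of-the-envelope calculation using the published rates above. The workload figures (10GB stored, one million writes and five million reads a month) and the assumption that one block of provisioned write capacity and one of read capacity would cover it are illustrative, not from either vendor.

```python
# Hypothetical monthly workload (illustrative assumptions).
GB_STORED = 10
WRITES = 1_000_000
READS = 5_000_000
HOURS_PER_MONTH = 730  # average hours in a month

# Google Cloud Datastore: flat per-operation pricing.
google_cost = (
    GB_STORED * 0.24
    + (WRITES / 100_000) * 0.10
    + (READS / 100_000) * 0.07
)

# DynamoDB: provisioned-capacity pricing. Assume one block of 10 write
# units and one block of 50 read units, provisioned all month.
dynamo_cost = (
    GB_STORED * 0.25
    + 0.0065 * HOURS_PER_MONTH  # write capacity
    + 0.0065 * HOURS_PER_MONTH  # read capacity
)

print(f"Datastore: ${google_cost:.2f}/month")  # Datastore: $6.90/month
print(f"DynamoDB:  ${dynamo_cost:.2f}/month")  # DynamoDB:  $11.99/month
```

The comparison flips, of course, if the provisioned capacity is sized differently or the workload is bursty – which is exactly the labyrinth in question.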

For both services, transferring data in is free, but moving it out to other storage or services can sting you: Google charges $0.12 per gigabyte of outgoing bandwidth, while Amazon charges on a sliding scale from $0.12 down to $0.05 – or even lower, if you shift a ton of data.
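As a rough sketch of the egress rates, assuming a flat rate at each end of Amazon's published scale (the real tiers blend rates across volume bands, and 500GB is an invented figure):

```python
GB_OUT = 500  # illustrative monthly egress volume

google_egress = GB_OUT * 0.12  # flat $0.12/GB
amazon_high = GB_OUT * 0.12    # top of Amazon's sliding scale
amazon_low = GB_OUT * 0.05     # bottom of the published scale

print(f"Google: ${google_egress:.2f}  Amazon: ${amazon_high:.2f}-${amazon_low:.2f}")
# Google: $60.00  Amazon: $60.00-$25.00
```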

"With Datastore we certainly will continue to evolve over time onto latest and greatest versions," DeMichillie said. "It's really just a matter of timing and sequencing." ®
