Google inflates BigQuery AaaS
Bigger, faster, more queries
Google has beefed up its BigQuery analytics-as-a-service product with price cuts, new features, and support for more sophisticated queries.
The upgrades, announced by Google on Tuesday, let BigQuery output query results larger than 128MB and handle more sophisticated queries.
Prices have also been cut: data-storage costs fall from $0.12 per gigabyte per month to $0.08, and further pricing plans are coming for "high-volume users," Google developer programs engineer Felipe Hoffa wrote in a blog post.
"These updates make BigQuery a faster, smarter, and even more affordable solution for ad hoc analysis of extremely large datasets," Hoffa wrote.
BigQuery is Google's remote analytics service, and is the external implementation of the Chocolate Factory's proprietary "Dremel" data analysis tool.
Dremel can scan 35 billion rows without an index "in tens of seconds," Google says, because of its use of columnar data storage and a tree architecture for dispatching queries across thousands of machines.
Now, BigQuery can return result sets larger than 128MB of compressed data via the --allow_large_results flag, which writes the output to a destination table.
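With the bq command-line tool, a large-results query might look something like the following sketch; the dataset and table names are placeholders, not taken from the announcement:

```shell
# Illustrative only: run a query whose results may exceed 128MB,
# writing the output to a destination table instead of returning
# it inline. "mydataset.big_output" is a placeholder name.
bq query \
  --allow_large_results \
  --destination_table=mydataset.big_output \
  "SELECT word, corpus FROM [publicdata:samples.shakespeare]"
```

Because oversized results land in a table rather than streaming back to the client, they can then be queried or exported like any other BigQuery table.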
It has also gained more analytical tricks via the "window functions" feature, which lets users rank results and work with distributions and percentiles.
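As a sketch of what window functions enable, a query along these lines could rank rows within groups; the table is one of BigQuery's public sample datasets, and the exact query is illustrative rather than drawn from Google's announcement:

```sql
-- Hypothetical example: rank each word by frequency within its
-- corpus using the RANK() window function, partitioning the
-- ranking per corpus rather than across the whole table.
SELECT
  corpus,
  word,
  word_count,
  RANK() OVER (PARTITION BY corpus ORDER BY word_count DESC) AS word_rank
FROM [publicdata:samples.shakespeare]
```

Previously this kind of per-group ranking required self-joins or client-side post-processing; window functions express it in a single pass.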
These changes follow Google making the service more attractive to traditional developers in March via an expanded range of SQL commands. BigQuery will now cache results on a per-user basis for up to 24 hours as well, Google said.
To make sure developers don't break the bank on expensive or inefficient queries, Google rolled out a tool that can validate a query and estimate its cost before it runs, allowing analysts to make changes before they spend their cash.
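One way to get this kind of estimate from the bq command-line tool is its --dry_run flag, which validates a query and reports how much data it would scan without actually executing it; the query below is illustrative:

```shell
# Illustrative only: validate the query and report the bytes it
# would process, without running it or incurring query charges.
bq query --dry_run \
  "SELECT word FROM [publicdata:samples.shakespeare] WHERE word = 'king'"
```

Since BigQuery query pricing is driven by the amount of data scanned, the reported byte count translates directly into an upper bound on what the query will cost.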
The features are being rolled out on Tuesday, and the new prices will take effect in July.