Google's BigQuery gets data slurping capability

'Streaming' feature lets it suck rows rather than nosh batches

Instead of having to upload stuff into the Google cloud and then feed it through to BigQuery, cloud wranglers can now stream data directly into the Chocolate Factory's analytics-as-a-service tech.

Google announced the "streaming data" update to BigQuery on Wednesday and it means admins can pour data into the service and start analysing it quickly, rather than waiting to upload it into the Google cloud storage layer in bulk.
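What that looks like in practice: below is a minimal sketch of a streaming insert against BigQuery's tabledata.insertAll REST endpoint. The project, dataset, table and row fields are invented for illustration, and OAuth 2.0 token handling is assumed to happen elsewhere.

```python
import json
import requests

# Hypothetical names, purely for illustration.
PROJECT, DATASET, TABLE = "my-project", "my_dataset", "events"
URL = ("https://www.googleapis.com/bigquery/v2/projects/%s/datasets/%s"
       "/tables/%s/insertAll" % (PROJECT, DATASET, TABLE))

def stream_row(access_token, row):
    """Push one row straight into the table: no staging upload to
    Google Cloud Storage, no batch load job."""
    resp = requests.post(
        URL,
        headers={"Authorization": "Bearer " + access_token,
                 "Content-Type": "application/json"},
        data=json.dumps({"rows": [{"json": row}]}))
    resp.raise_for_status()
    return resp.json()

# e.g. stream_row(token, {"severity": "ERROR", "message": "disk full"})
```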

Doing this speeds access to data and allows real-time insertions, Google says, but brings the risks of "possible data duplication" and "possible connection drops."
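The duplication risk is why the API lets each row carry an insertId, which BigQuery uses for best-effort de-duplication: if a connection drops mid-insert, the client can resend the same payload with the same ID without (usually) landing the row twice. A sketch, continuing the hypothetical example above:

```python
import time
import uuid

def stream_row_with_retry(access_token, row, attempts=3):
    # Generate the insertId once, outside the retry loop, so a resend
    # after a dropped connection carries the same ID and can be
    # de-duplicated server-side.
    body = json.dumps({"rows": [{"insertId": str(uuid.uuid4()),
                                 "json": row}]})
    for attempt in range(attempts):
        try:
            resp = requests.post(
                URL,  # as defined in the sketch above
                headers={"Authorization": "Bearer " + access_token,
                         "Content-Type": "application/json"},
                data=body)
            resp.raise_for_status()
            return resp.json()
        except requests.exceptions.ConnectionError:
            time.sleep(2 ** attempt)  # back off, then resend same insertId
    raise RuntimeError("streaming insert failed after %d attempts" % attempts)
```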

Appropriate applications for the service include ones that constantly generate data needing quick-turnaround analysis, such as dashboards.

"The new data is available for querying instantaneously. This feature is great for time sensitive use cases like log analysis and alerts generation," Felipe Hoffa, a Google developer programs engineer, said.

However, limits apply for streaming: the system can ingest 100 rows per second, per table, though Google will permit "occasional bursts of up to 1,000 rows per second." The maximum row size is 100KB and the maximum data size per streaming insert is 1MB.
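Staying inside those limits is the client's job. A crude pacing sketch, reusing the hypothetical names and imports from the earlier snippets: at most 100 rows per request, at most one request a second, which for small rows also keeps each insert comfortably under the 1MB ceiling.

```python
def stream_rows_paced(access_token, rows, batch_size=100):
    headers = {"Authorization": "Bearer " + access_token,
               "Content-Type": "application/json"}
    for i in range(0, len(rows), batch_size):
        batch = rows[i:i + batch_size]
        body = {"rows": [{"insertId": str(uuid.uuid4()), "json": r}
                         for r in batch]}
        resp = requests.post(URL, headers=headers, data=json.dumps(body))
        resp.raise_for_status()
        time.sleep(1.0)  # keep the per-table rate at ~100 rows/second
```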

The upgrade follows Google's introduction of SQL-like commands such as JOIN to the technology in March, and a cut in data storage prices from $0.12 per gigabyte to $0.08 in June.
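By way of illustration, the sort of statement the March change made possible, joining a streamed events table against a reference table, might look like this (both tables invented):

```python
# Hypothetical legacy-SQL JOIN across two made-up tables, runnable via
# the count_errors-style jobs.query call sketched above.
sql = ("SELECT u.name, COUNT(e.user_id) AS events "
       "FROM my_dataset.events AS e "
       "JOIN my_dataset.users AS u ON e.user_id = u.user_id "
       "GROUP BY u.name")
```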

Streaming data will be free until January 1, 2014, at which point it will cost 1 cent per 10,000 rows inserted, Google wrote. At that rate, a table fed at the full 100 rows per second would run to roughly $260 over a 30-day month. ®
