Amazon: S3 cloud contains two trillion objects

'We've doubled our big number in a year'

Amazon Web Services now has over two trillion objects in its S3 storage cloud, less than a year after Bezos & Co. smashed through the one-trillion ceiling.

Each S3 object, Amazon says, can "range from zero to 5 TB in size," although the company does not disclose the size distribution of stored objects. An object consists of a key, a version ID, a value, metadata, subresources, and access control information.
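By way of illustration, here is a minimal sketch of what storing and reading back one such object looks like through the S3 API, using the boto3 Python SDK; the bucket name, key, and metadata values are hypothetical, and the VersionId field only comes back if versioning is enabled on the bucket.

```python
# Minimal sketch (hypothetical bucket/key names): writing and reading one S3 object,
# which pairs a key with a value (the body) plus user-defined metadata.
import boto3

s3 = boto3.client("s3")

# Write an object: the key identifies it, the body is its value (0 bytes to 5 TB),
# and Metadata attaches user-defined key/value pairs alongside it.
put_resp = s3.put_object(
    Bucket="example-bucket",          # hypothetical bucket name
    Key="reports/2013/april.txt",     # hypothetical object key
    Body=b"two trillion and counting",
    Metadata={"source": "demo"},
)
# VersionId is returned only when versioning is enabled on the bucket.
print(put_resp.get("VersionId"), put_resp["ETag"])

# Read it back: the response carries the value plus the stored user metadata.
get_resp = s3.get_object(Bucket="example-bucket", Key="reports/2013/april.txt")
print(get_resp["Body"].read(), get_resp["Metadata"])
```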

"It took us six years to grow to one trillion stored objects, and less than a year to double that number," Amazon wrote in a blog post announcing the milestone on Thursday. "Our universe is about 13.6 billion years old. If you added one S3 object every 60 hours starting at the Big Bang, you'd have accumulated almost two trillion of them by now."

Alternatively, if each object were worth one dollar, AWS S3 would represent the net worth of 30 Bill Gateses or 133 Steve Ballmers.
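Again as a rough check – and assuming the roughly $67bn and $15bn net worths attributed to Gates and Ballmer at the time – the division works out:

```python
# Back-of-envelope check of the net-worth comparison; the dollar figures below
# are assumptions based on the Forbes estimates reported around that time.
OBJECTS = 2e12                 # two trillion objects at $1 apiece
GATES_NET_WORTH = 67e9         # assumed ~$67bn
BALLMER_NET_WORTH = 15e9       # assumed ~$15bn

print(OBJECTS / GATES_NET_WORTH)    # ~30 Bill Gateses
print(OBJECTS / BALLMER_NET_WORTH)  # ~133 Steve Ballmers
```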

S3 is now regularly peaking at 1.1 million requests per second, up from 835,000 requests per second in Q3 2012 – a rise of roughly a third – the company wrote. This indicates that, along with requiring more storage, the applications built on top of AWS are getting chattier.

These figures may not reflect the actual size of Amazon's cloud, as they do not factor in Elastic Block Storage – a service used by a very large proportion of EC2 instances. Nor does Amazon give figures for the amount of data stored in its flash-backed DynamoDB NoSQL service – another significant piece of business for the company, and one which AWS chief Andy Jassy described in November as the fastest-growing service in AWS history.

As with all things cloud, Amazon's figure is hard to stack up against rival storage clouds: mid-level operators such as Joyent and Rackspace don't break out storage figures, and neither does Google.

[Chart: Amazon S3 now stores two trillion objects. Not pictured: the proportion that are Tumblr users' cat pictures]

Microsoft, meanwhile, said in July 2012 that its Azure cloud stored 4.03 trillion objects, with a peak request rate of 880,000 per second (versus Amazon's 1.1 million across two trillion objects). That gap, along with the lack of a detailed breakdown of what goes into Azure storage, suggests that setting Azure's 4.03 trillion against Amazon's two trillion is not necessarily a like-for-like comparison. At the time of writing, Microsoft had not responded to our request for updated Azure storage figures.

With the whopping two trillion for S3, and no information yet on DynamoDB or EBS, it's clear that Amazon Web Services is still growing at a great rate. With each new scrap of information, Bezos & Co. seem more and more like the dragon Smaug, the plus-sized star of JRR Tolkien's The Hobbit: a fearsome creature lurking in a secret mountain, guarding its vast pile of data gold. ®
