
Amazon invites 5 terabyte mondo-files into the heavens

Time to stream your genome sequencer

Amazon has increased the maximum object size on its S3 online storage service to 5 terabytes. Previously, S3 users were forced to store large files in chunks no larger than about 5 gigabytes.

"When a customer wanted to access a large file or share it with others, they would either have to use several URLs in Amazon S3 or stitch the file back together using an intermediate server or within an application," Amazon said in a Friday blog post. "No more. We've raised the limit by three orders of magnitude."

Each S3 object can now range from one byte to 5 terabytes, letting you store extremely large files – including scientific or medical data, high-resolution video, and backup files – as single objects.

You can upload these larger objects using the relatively new Multipart Upload API, which splits beefy files into parts, pushes each part separately, and then stitches them together into a single S3 object.
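For the curious, here is roughly what that flow looks like – a minimal sketch using the AWS SDK for Python (boto3), which post-dates this article; the bucket, key, local filename and 100MB part size are illustrative assumptions, not anything Amazon prescribes.

# A minimal multipart-upload sketch with boto3; names and sizes are assumptions.
import boto3

s3 = boto3.client("s3")
bucket, key = "example-bucket", "scans/genome.bam"  # hypothetical names
part_size = 100 * 1024 * 1024  # parts must be at least 5MB, except the last

# Start the multipart upload and push each chunk as a numbered part.
upload = s3.create_multipart_upload(Bucket=bucket, Key=key)
parts = []
with open("genome.bam", "rb") as f:
    part_number = 1
    while True:
        chunk = f.read(part_size)
        if not chunk:
            break
        resp = s3.upload_part(Bucket=bucket, Key=key,
                              UploadId=upload["UploadId"],
                              PartNumber=part_number, Body=chunk)
        parts.append({"PartNumber": part_number, "ETag": resp["ETag"]})
        part_number += 1

# Stitch the parts into one S3 object -- no intermediate server required.
s3.complete_multipart_upload(Bucket=bucket, Key=key,
                             UploadId=upload["UploadId"],
                             MultipartUpload={"Parts": parts})

Parts can be uploaded in parallel and retried individually, which is the point of the API for files this size.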

OpenStack – the (truly) open source project that lets you mimic Amazon's S3 and EC2 services inside your own data center – says that it's working on larger-object support as well. OpenStack community manager Bret Piatt of Rackspace tells us that this will arrive early next year with OpenStack's "Bexar" release and that it too will expand sizes to 5 terabytes.
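Swift's large-object support, as it eventually shipped, works by uploading segments and then writing a small manifest object that the service concatenates on read. Here is a hedged sketch of that "dynamic large object" pattern using the python-swiftclient library; the auth endpoint, credentials, container names and 1GB segment size are illustrative assumptions, and the client shown post-dates the Bexar-era code discussed here.

# A hedged sketch of Swift's segments-plus-manifest scheme; all names are assumptions.
from swiftclient import client

conn = client.Connection(authurl="https://swift.example.com/auth/v1.0",  # hypothetical endpoint
                         user="account:user", key="secret")

conn.put_container("scans")
conn.put_container("scans_segments")

segment_size = 1024 * 1024 * 1024  # 1GB per segment (assumed)
with open("genome.bam", "rb") as f:
    index = 0
    while True:
        chunk = f.read(segment_size)
        if not chunk:
            break
        # Each segment is an ordinary object with a predictable name.
        conn.put_object("scans_segments", "genome.bam/%08d" % index, contents=chunk)
        index += 1

# A zero-byte manifest object tells Swift to concatenate the segments on GET.
conn.put_object("scans", "genome.bam", contents=b"",
                headers={"X-Object-Manifest": "scans_segments/genome.bam/"})

Reads against the manifest then stream the segments back as one logical object.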

OpenStack was founded by Rackspace and NASA after both outfits struggled to scale up their infrastructure clouds. OpenStack is based on Nova, a cloud fabric controller designed by NASA, and Cloud Files, the storage service built by Rackspace. The storage platform is known as Swift, and Rackspace says it now has a "mature" Swift codebase running in a production environment.

Incidentally, Bexar is named for a county in Texas, Rackspace's home state. We're told it's pronounced "bear." We would make fun of this, but we're also told that messin' with Texas is verboten. ®
