Amazon invites 5 terabyte mondo-files into the heavens

Time to stream your genome sequencer

Amazon has increased the maximum object size on its S3 online storage service to 5 terabytes. Previously, S3 users were forced to store large files in chunks no larger than about 5 gigabytes.

"When a customer wanted to access a large file or share it with others, they would either have to use several URLs in Amazon S3 or stitch the file back together using an intermediate server or within an application," Amazon said in a Friday blog post. "No more. We've raised the limit by three orders of magnitude."

Each S3 object can now range from one byte to 5 terabytes, letting you store extremely large files – including scientific or medical data, high-resolution video, and backup files – as single objects.

You upload these larger objects with the relatively new Multipart Upload API, which breaks a single object into independently transferred parts – the same mechanism previously used to upload beefy files piecemeal.
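As a rough illustration of how a multipart upload is driven, here is a minimal sketch using the boto3 SDK. The bucket name, key, and 100 MB part size are illustrative assumptions, not details from Amazon's announcement; the three-call flow (create, upload parts, complete) is the shape of the Multipart Upload API.

```python
# Sketch of S3's Multipart Upload API via boto3 (assumed SDK, not from the article).
import io

PART_SIZE = 100 * 1024 * 1024  # illustrative 100 MB parts; S3 requires >= 5 MB except the last


def iter_parts(fileobj, part_size=PART_SIZE):
    """Yield (part_number, data) chunks from a file-like object; numbering starts at 1."""
    part_number = 1
    while True:
        data = fileobj.read(part_size)
        if not data:
            break
        yield part_number, data
        part_number += 1


def multipart_upload(s3, bucket, key, fileobj):
    """Upload fileobj to s3://bucket/key as one object, part by part."""
    upload = s3.create_multipart_upload(Bucket=bucket, Key=key)
    parts = []
    for num, data in iter_parts(fileobj):
        resp = s3.upload_part(Bucket=bucket, Key=key, PartNumber=num,
                              UploadId=upload['UploadId'], Body=data)
        parts.append({'PartNumber': num, 'ETag': resp['ETag']})
    # S3 stitches the parts into a single object server-side.
    s3.complete_multipart_upload(Bucket=bucket, Key=key,
                                 UploadId=upload['UploadId'],
                                 MultipartUpload={'Parts': parts})
```

Because the stitching happens on Amazon's side, the client never needs an intermediate server to reassemble the file.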

OpenStack – the (truly) open source project that lets you mimic Amazon's S3 and EC2 services inside your own data center – says that it's working on larger-object support as well. OpenStack community manager Bret Piatt of Rackspace tells us that this will arrive early next year with OpenStack's "Bexar" release and that it too will expand sizes to 5 terabytes.

OpenStack was founded by Rackspace and NASA after both outfits struggled to scale up their infrastructure clouds. OpenStack is based on Nova, a cloud fabric controller designed by NASA, and Cloud Files, a storage controller built by Rackspace. The storage platform is known as Swift, and Rackspace says that it now has a "mature" Swift codebase running in a production environment.

Incidentally, Bexar is named for a county in Texas, Rackspace's home state. We're told it's pronounced "bear." We would make fun of this, but we're also told that messin' with Texas is verboten. ®
