Amazon invites 5 terabyte mondo-files into the heavens

Time to stream your genome sequencer

Amazon has increased the maximum object size on its S3 online storage service to 5 terabytes. Previously, S3 users were forced to store large files in chunks no larger than about 5 gigabytes.

"When a customer wanted to access a large file or share it with others, they would either have to use several URLs in Amazon S3 or stitch the file back together using an intermediate server or within an application," Amazon said in a Friday blog post. "No more. We've raised the limit by three orders of magnitude."

Each S3 object can now range from one byte to 5 terabytes, letting you store extremely large files – including scientific or medical data, high-resolution video, and backup files – as single objects.

You can upload these larger objects using the relatively new Multipart Upload API, which breaks a beefy file into parts, uploads them independently, and reassembles them into a single object.
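As a rough sketch of the arithmetic a client has to do (illustrative only, not official AWS code): S3's documented multipart limits of a 5 MiB minimum part size and a 10,000-part cap per upload dictate how a 5 TB object must be carved up.

```python
import math

MIN_PART = 5 * 1024**2   # S3 multipart minimum part size: 5 MiB
MAX_PARTS = 10_000       # S3 multipart cap on parts per upload

def plan_parts(object_size: int) -> tuple[int, int]:
    """Pick a part size and part count that satisfy both limits."""
    # Smallest part size that keeps us under the 10,000-part cap,
    # but never below the 5 MiB floor.
    part_size = max(MIN_PART, math.ceil(object_size / MAX_PARTS))
    part_count = math.ceil(object_size / part_size)
    return part_size, part_count

size_5tb = 5 * 1024**4               # the new 5 TB ceiling
part_size, parts = plan_parts(size_5tb)
# A full-size object needs parts of roughly 525 MB each.
```

At the full 5 TB, each part ends up around half a gigabyte; for anything under about 50 GB the 5 MiB floor dominates instead.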

OpenStack – the (truly) open source project that lets you mimic Amazon's S3 and EC2 services inside your own data center – says that it's working on larger-object support as well. OpenStack community manager Bret Piatt of Rackspace tells us that this will arrive early next year with OpenStack's "Bexar" release and that it too will expand sizes to 5 terabytes.

OpenStack was founded by Rackspace and NASA after both outfits struggled to scale up their infrastructure clouds. OpenStack is based on Nova, a cloud fabric controller designed by NASA, and Cloud Files, a storage controller built by Rackspace. The storage platform is known as Swift, and Rackspace says that it now has a "mature" Swift codebase running in a production environment.

Incidentally, Bexar is named for a county in Texas, Rackspace's home state. We're told it's pronounced "bear." We would make fun of this, but we're also told that messin' with Texas is verboten. ®
