Amazon invites 5 terabyte mondo-files into the heavens

Time to stream your genome sequencer

Amazon has increased the maximum object size on its S3 online storage service to 5 terabytes. Previously, S3 users were forced to store large files in chunks no larger than about 5 gigabytes.

"When a customer wanted to access a large file or share it with others, they would either have to use several URLs in Amazon S3 or stitch the file back together using an intermediate server or within an application," Amazon said in a Friday blog post. "No more. We've raised the limit by three orders of magnitude."

Each S3 object can now range from one byte to 5 terabytes, letting you store extremely large files – including scientific or medical data, high-resolution video, and backup files – as single objects.

You upload these larger objects with the relatively new Multipart Upload API, which breaks a beefy file into parts, pushes the parts up independently – in parallel, if you like – and then has S3 stitch them back into a single object.
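
For the curious, the dance looks something like this – a minimal sketch using Amazon's present-day Python SDK, boto3 (which postdates this article), with the bucket name, object key, file name, and part size all illustrative assumptions rather than details from Amazon's announcement:

import boto3

BUCKET = "example-bucket"        # hypothetical bucket name
KEY = "video/mondo-file.mp4"     # hypothetical object key
PART_SIZE = 100 * 1024 * 1024    # 100 MB parts (S3's minimum part size is 5 MB)

s3 = boto3.client("s3")

# Step 1: start the multipart upload and get an upload ID.
upload_id = s3.create_multipart_upload(Bucket=BUCKET, Key=KEY)["UploadId"]

# Step 2: upload the file part by part; S3 returns an ETag for each part.
parts = []
with open("mondo-file.mp4", "rb") as f:
    part_number = 1
    while True:
        chunk = f.read(PART_SIZE)
        if not chunk:
            break
        resp = s3.upload_part(Bucket=BUCKET, Key=KEY, UploadId=upload_id,
                              PartNumber=part_number, Body=chunk)
        parts.append({"PartNumber": part_number, "ETag": resp["ETag"]})
        part_number += 1

# Step 3: tell S3 to stitch the parts back into a single object.
s3.complete_multipart_upload(Bucket=BUCKET, Key=KEY, UploadId=upload_id,
                             MultipartUpload={"Parts": parts})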

OpenStack – the (truly) open source project that lets you mimic Amazon's S3 and EC2 services inside your own data center – says that it's working on larger-object support as well. OpenStack community manager Bret Piatt of Rackspace tells us that this will arrive early next year with OpenStack's "Bexar" release and that it too will expand sizes to 5 terabytes.

OpenStack was founded by Rackspace and NASA when both outfits were struggling to scale up their infrastructure clouds. Its compute fabric controller, Nova, was designed by NASA, while its storage platform, Swift, grew out of Cloud Files, the storage system Rackspace built for its own cloud. Rackspace says it now has a "mature" Swift codebase running in a production environment.
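
For a flavor of how Swift can fake a single enormous object, here's a rough sketch of its segmented-storage trick over plain HTTP: upload the pieces as ordinary objects under a shared prefix, then write a zero-byte "manifest" object whose X-Object-Manifest header points at that prefix. This is our own illustration, not anything Rackspace has committed to for Bexar, and the endpoint, token, container, and object names are all invented:

import requests

STORAGE_URL = "https://swift.example.com/v1/AUTH_demo"  # hypothetical endpoint
TOKEN = "example-auth-token"                            # hypothetical token
auth = {"X-Auth-Token": TOKEN}

# Step 1: upload each segment as an ordinary object under a shared prefix.
# Zero-padded names keep the segments in the right lexical order.
for i, segment in enumerate([b"first chunk...", b"second chunk..."]):
    requests.put(f"{STORAGE_URL}/bigfiles/mondo.bin/{i:08d}",
                 headers=auth, data=segment)

# Step 2: write a zero-byte manifest pointing at the segment prefix.
# A GET on the manifest streams the segments back, in order, as one object.
requests.put(f"{STORAGE_URL}/bigfiles/mondo.bin",
             headers={**auth, "X-Object-Manifest": "bigfiles/mondo.bin/"},
             data=b"")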

Incidentally, Bexar is named for a county in Texas, Rackspace's home state. We're told it's pronounced "bear." We would make fun of this, but we're also told that messin' with Texas is verboten. ®
