Putting a mule on a cloud: one man's battle with Amazon S3

Are we there yet?


We wanted to use S3 to store the Mule Enterprise Edition download, which runs about 58MB. With each release of Mule EE there is a build and release cycle that results in a file uploaded from the development servers to our product download site. The resulting "blessed" file is migrated by a member of the development team.

In contrast to Mule Enterprise Edition, our Community version is on the bloody edge with a continuous integration server managing a broad file structure with multiple dependencies. It's not really worth it for us to go through the scripting hassles of moving all the dev files onto S3, so we maintain a server specifically for those releases and all the associated parts.

We wanted a non-technical person to be able to update the website and let the engineers keep writing code. To upload files to S3 you can use FTP, SFTP and the like, or one of the newer, hipper tools such as S3browse.com that let you browse S3 as if it were a file system. Another problem solved. On to the time-bombed URLs.

One business requirement was time-bombed expiration URLs, so we could count the exact number of downloads. The Amazon API lets us create a user-specific URL that expires, so you can work out exactly how many URLs are generated versus how many are actually downloaded.

Our website is primarily LAMP based, and we use the REST API to make the calls to S3. S3 groups files into objects and buckets. An object corresponds to a stored file; each object has an identifier, an owner and permissions, and is addressed by a URL, such as http://s3.amazonaws.com/TheRegister/MyFileDownload

We're using the GET request via PHP to grab the file. Easy as pie.

Here is a code sample of how it works:

// Create a time-limited URL for an Amazon S3 download.
// The bucketName is the "directory", the objectName is the file.

require_once 'Crypt/HMAC.php'; // PEAR Crypt_HMAC package

// Convert a hex digest to the base64 form S3 expects
function hex2b64($str)
{
    return base64_encode(pack("H*", $str));
}

function getS3Redirect($bucketName, $objectName)
{
    // Unique credentials for our S3 account
    $keyId = "YOUR_AWS_ACCESS_KEY_ID";
    $secretKey = "YOUR_AWS_SECRET_KEY";
    $S3_URL = "http://s3.amazonaws.com";

    // Configure the link timeout: one hour from now
    $expires = time() + 3600;

    // Create the string to sign for the GET request
    $bucketName = "/" . $bucketName . "/";
    $stringToSign = "GET\n\n\n$expires\n$bucketName$objectName";
    $hasher = new Crypt_HMAC($secretKey, "sha1");
    $sig = urlencode(hex2b64($hasher->hash($stringToSign)));

    // We now have a lovely URL to return
    return "$S3_URL$bucketName$objectName?AWSAccessKeyId=$keyId&Expires=$expires&Signature=$sig";
}

// Draw the link
<a href="<? echo getS3Redirect("MyFileToDownload", "myfiletodownload-installer.jar"); ?>" target="_blank">Download My File (58 MB)</a>
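The signing scheme itself is language-independent, so for readers outside the PHP world here is the same idea sketched in Python. The bucket and object names are hypothetical, and the credentials are placeholders; the string-to-sign format (GET, blank headers, expiry, path) is the one used in the PHP sample above.

```python
import base64
import hashlib
import hmac
import time
from urllib.parse import quote_plus

S3_URL = "http://s3.amazonaws.com"

def get_s3_redirect(bucket_name, object_name, key_id, secret_key, lifetime=3600):
    """Build a time-limited S3 download URL (query-string authentication)."""
    # The link dies `lifetime` seconds from now
    expires = int(time.time()) + lifetime
    path = "/%s/%s" % (bucket_name, object_name)

    # Same string-to-sign as the PHP version: GET, two blank header
    # lines, the expiry timestamp, then the resource path
    string_to_sign = "GET\n\n\n%d\n%s" % (expires, path)

    # HMAC-SHA1 over the string-to-sign, base64-encoded, then URL-escaped
    digest = hmac.new(secret_key.encode(), string_to_sign.encode(),
                      hashlib.sha1).digest()
    sig = quote_plus(base64.b64encode(digest).decode())

    return "%s%s?AWSAccessKeyId=%s&Expires=%d&Signature=%s" % (
        S3_URL, path, key_id, expires, sig)
```

Anyone holding the URL can fetch the file until the expiry timestamp passes, after which S3 refuses the request - which is exactly what makes the generated-versus-downloaded counting possible.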

Services like S3 are great for file serving and for running VM (virtual machine) images (we do that too - more in a separate article), but not great if you are dealing with a complicated build system or if you are really paranoid. It's also tough to do things that are programmatically complicated or diverse unless you are using VM images. For example, Google App Engine only runs Python, so if you need Java you can't run your app there - and since Google doesn't support VMs, you would be SOL.

There are many risks and annoyances that have to be balanced against the (theoretical) rewards. For example, users have little to no control over the security measures the Cloud vendor takes with their files. You also have to trust that the provider is doing enough in the way of disaster recovery, backups and so on, so that if the Cloud gets blown away your files, pictures and code aren't completely lost. While we expect very high levels of uptime, history has shown that the vendors aren't completely there yet.

That said, I think that we're using S3 in the right way. That is, our business doesn't fail if Amazon goes down, and the cost-benefits definitely outweigh the occasional downtime, especially since we run our own mirrors and could make a switch in a matter of minutes.

It's still early days in the Clouds, but there are some advantages to be had right away. Minimal administration, unlimited storage and plenty of bandwidth mean that you too can make your downloads ready for scale. ®
