Putting a mule on a cloud: one man's battle with Amazon S3

Are we there yet?

We wanted to use S3 to store the Mule Enterprise Edition download, which runs about 58MB. With each release of Mule EE there is a build and release cycle that results in a file uploaded from the development servers to our product download site. The resulting "blessed" file is migrated by a member of the development team.

In contrast to Mule Enterprise Edition, our Community version is on the bleeding edge, with a continuous integration server managing a broad file structure with multiple dependencies. It's not really worth it for us to go through the scripting hassle of moving all the dev files onto S3, so we maintain a server specifically for those releases and all the associated parts.

We wanted a non-technical person to be able to update the website and let the engineers keep writing code. To upload files to S3, you can use FTP, SFTP, etc. You can also use some of the new hip things like S3browse.com that let you browse S3 like a file system. Another problem solved. On to the time-bombed URLs.

One business requirement was to make time-bombed, expiring URLs so we can analyze the exact number of downloads. The Amazon API allows us to create a user-specific URL that expires, so we can compare exactly how many URLs are generated versus actually downloaded.

Our website is primarily LAMP based, and we use the REST API to make the calls to S3. S3 groups files into buckets and objects. An object corresponds to a stored file; each object has an identifier, an owner, and permissions, and is addressed by a URL, such as http://s3.amazonaws.com/TheRegister/MyFileDownload

We're using the GET request via PHP to grab the file. Easy as pie.

Here is a code sample of how it works:

// Create a unique URL for an Amazon S3 download.
// The bucketName is the directory, the objectName is the file.
function getS3Redirect($bucketName, $objectName)
{
    // Unique credentials for our S3 account
    $S3_URL = "http://s3.amazonaws.com";
    $keyId = "XXXXXXXXXXXXYYYYYYYYYYYYYYY";
    $secretKey = "ZZZZZZZZZZZZZZZZZZZZZZZZZZZZ";

    // Configure the link timeout (one hour from now)
    $expires = time() + 3600;

    // Create the GET request string to sign
    $bucketName = "/" . $bucketName . "/";
    $stringToSign = "GET\n\n\n$expires\n$bucketName$objectName";

    // Sign it with HMAC-SHA1 (Crypt_HMAC comes from PEAR)
    $hasher = new Crypt_HMAC($secretKey, "sha1");
    $sig = urlencode(hex2b64($hasher->hash($stringToSign)));

    // We now have a lovely URL to return
    return "$S3_URL$bucketName$objectName?AWSAccessKeyId=$keyId&Expires=$expires&Signature=$sig";
}

// Helper: convert Crypt_HMAC's hex digest into the base64 form S3 expects
function hex2b64($str)
{
    return base64_encode(pack("H*", $str));
}

// Draw the link
<a href="<? echo getS3Redirect("MyFileToDownload", "myfiletodownload-installer.jar"); ?>" target="_blank">Download My File (58 MB)</a>
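For readers outside the LAMP stack, the same query-string signing scheme is easy to port to other languages. Here is a rough stand-alone sketch in Python; the function name, bucket, object, and credentials are placeholders for illustration, not our real setup:

```python
import base64
import hashlib
import hmac
import time
from urllib.parse import quote_plus

def get_s3_redirect(bucket_name, object_name, key_id, secret_key, ttl=3600):
    """Build a time-limited S3 download URL (legacy query-string auth)."""
    expires = int(time.time()) + ttl
    resource = "/%s/%s" % (bucket_name, object_name)
    # The string to sign covers the HTTP verb, the expiry time, and the resource
    string_to_sign = "GET\n\n\n%d\n%s" % (expires, resource)
    digest = hmac.new(secret_key.encode(), string_to_sign.encode(),
                      hashlib.sha1).digest()
    # S3 expects the raw HMAC-SHA1 digest base64-encoded, then URL-encoded
    sig = quote_plus(base64.b64encode(digest).decode())
    return ("http://s3.amazonaws.com%s?AWSAccessKeyId=%s&Expires=%d&Signature=%s"
            % (resource, key_id, expires, sig))

print(get_s3_redirect("MyFileToDownload", "myfiletodownload-installer.jar",
                      "AKIAEXAMPLE", "not-a-real-secret"))
```

The important property is that the signature covers the Expires timestamp: once that time passes, S3 rejects the URL, so a would-be deep-linker has to come back and generate a fresh one.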

Services like S3 are great for file serving and for running VM (virtual machine) images (we do that too - more in a separate article), but not great if you are dealing with a complicated build system or if you are really paranoid. It's also tough to do things that are programmatically complicated or diverse unless you are using VM images. For example, Google App Engine only runs Python, so if you need Java you can't run your app there. And since Google doesn't support VMs, you would be SOL.

There are many risks and annoyances that have to be balanced against the (theoretical) rewards. For example, users have little to no control over the security measures that the Cloud vendor takes with their files. You also have to trust that the provider is doing enough in the way of disaster recovery, backups etc., so that if the Cloud gets blown away your files/pictures/code aren't completely lost. While we expect very high levels of uptime, history has shown that the vendors aren't completely there yet.

That said, I think that we're using S3 in the right way. That is, our business doesn't fail if Amazon goes down, and the cost-benefits definitely outweigh the occasional downtime, especially since we run our own mirrors and could make a switch in a matter of minutes.

It's still early days in the Clouds, but there are some advantages to be had right away. Minimal administration, unlimited storage and plenty of bandwidth mean that you too can make your downloads ready for scale. ®
