Good news: Unsecured Amazon Web Services S3 bucket discovery just got easier

Oh, that's not good news, is it?

By Richard Chirgwin


If you thought the business of discovering unsecured Amazon Web Services S3 buckets was for the pros, think again: like all things, the process can be automated, and the code to automate it posted to GitHub.

It's not a new discipline: a quick search of GitHub for S3 bucket enumeration tools turns up more than 1,000 results. However, these latest projects use data made public in certificate transparency logs, rather than the lists of likely words common in previous approaches.

Over the weekend, one project in particular came to this author's attention: Bucket Stream, which scans certificate transparency logs with a simple bit of Python.

“This tool simply listens to various certificate transparency logs (via Certstream) and attempts to find public S3 buckets from permutations of the certificate's domain name,” author Paul Price (@darkp0rt on Twitter) wrote.
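For readers curious how that approach works in practice, here's a minimal sketch of the idea, assuming the certstream Python package's listen_for_events interface and the public S3 REST endpoint. The suffix list and the candidate_buckets and probe_bucket helpers are illustrative inventions for this sketch, not Bucket Stream's actual code.

```python
# A minimal sketch of the certificate-transparency approach -- not Bucket Stream itself.
# Assumes: pip install certstream requests
import certstream
import requests

SUFFIXES = ["", "-backup", "-dev", "-assets"]  # illustrative permutations only


def candidate_buckets(domain):
    """Turn a certificate's domain name into plausible S3 bucket names."""
    base = domain.lstrip("*.").split(".")[0]   # e.g. "*.example.com" -> "example"
    return [base + suffix for suffix in SUFFIXES]


def probe_bucket(name):
    """Probe the public S3 endpoint: 404 = no such bucket,
    403 = bucket exists but access is denied, 200 = listing is publicly readable."""
    try:
        return requests.get(f"http://{name}.s3.amazonaws.com", timeout=5).status_code
    except requests.RequestException:
        return None


def on_cert(message, context):
    """Callback invoked by certstream for every new certificate seen in the logs."""
    if message["message_type"] != "certificate_update":
        return
    for domain in message["data"]["leaf_cert"]["all_domains"]:
        for name in candidate_buckets(domain):
            status = probe_bucket(name)
            if status == 200:
                print(f"[open]    {name}")
            elif status == 403:
                print(f"[private] {name}")


# Stream newly issued certificates from the public CertStream aggregator.
certstream.listen_for_events(on_cert, url="wss://certstream.calidog.io/")
```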

Price added a few useful tips for anybody using S3: randomise bucket names so they don't identify your company; set permissions and keep an eye on them; use separate buckets for private and public data; audit who can access the data (suppliers, for example); and use Amazon Macie to classify and secure sensitive data.
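On the "set permissions and keep an eye on them" point, a basic audit of your own buckets might look something like the following boto3 sketch. It only checks ACL grants to the well-known public grantee groups; it's a starting point under those assumptions, not Price's tooling.

```python
# Rough audit of your own buckets' ACLs with boto3 -- a sketch, not a complete check.
# Assumes AWS credentials are already configured for boto3.
import boto3

PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    acl = s3.get_bucket_acl(Bucket=name)
    # Collect any grants made to the "everyone" or "any AWS user" groups.
    public_grants = [
        grant["Permission"]
        for grant in acl["Grants"]
        if grant["Grantee"].get("URI") in PUBLIC_GRANTEES
    ]
    if public_grants:
        print(f"{name}: PUBLIC grants -> {public_grants}")
    else:
        print(f"{name}: no public ACL grants")
```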

Fast and furious

A second project, Slurp, builds on the Certstream idea, but re-implements it in Go, which author “bbb31” says is faster and avoids Python dependencies.

It also adds a bit of UI goodness, like colour-coding to identify S3 buckets that are secured versus those that are public.

Those are just the newest of the enumeration projects, of course, but using certificate information to get bucket names is the new wrinkle.

The older Bucketeers, for example, requires AWS credentials to run its script, something neither Bucket Stream nor Slurp needs.

The much older AWSBucketDump from Jordan Potti demands a lot more work from the user: as Potti describes it, it's a “brute forcer” driven by word-lists.
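For contrast, that older word-list style boils down to something like the sketch below, which reuses the hypothetical probe_bucket helper from the earlier example. The file name and suffixes are illustrative; this is the spirit of the approach, not Potti's actual code.

```python
# Sketch of word-list-driven bucket hunting -- the older approach, not the new wrinkle.
# Reuses probe_bucket() from the earlier certstream sketch.
with open("wordlist.txt") as f:            # illustrative file of "interesting words"
    words = [line.strip() for line in f if line.strip()]

for word in words:
    for name in (word, f"{word}-backup", f"{word}-files"):
        status = probe_bucket(name)        # 200 = open, 403 = exists, 404 = nothing
        if status in (200, 403):
            print(f"{status} {name}")
```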

Of course, pros like UpGuard's Chris Vickery and Kromtech's researchers probably have their own toolkits, but the big thing to remember is: if you secure your AWS bucket, bad actors will have to find another way to steal your secrets. ®
