'Amazon can't do what we do': Twitter-miner's BYO data centre heresy
DataSift strains with Hadoop
Sometimes floating on somebody else’s cloud isn’t enough. Sometimes you just have to float alone – no matter how young you are. DataSift, the five-year-old big data company mining billions of tweets and Wikipedia edits, reckons it’s just one year away from building its own data centre.
DataSift sucks down 2TB of data from Twitter each day, and has two-and-a-half years' worth of Twitter data – 90 billion tweets – sitting on Hadoop servers. DataSift has also launched Wikistats, which tracks trends on Jimmy "Jimbo" Wales’ crowd-sourced site. Wikistats records edits, which peak at up to 100 a second.
Nick Halstead, DataSift's founder and chief technology officer, reckons the cost and complexity of his current co-located and mixed set-up mean a data centre is on the cards – and soon. He has ruled out a move to a public cloud on performance and cost grounds.
“You can’t run what we run on Amazon from a cost and performance perspective,” he told The Reg during an interview.
DataSift wouldn’t be the first company working at what’s called “web scale” to build its own data centre, but it is possibly the youngest, the smallest (30 employees) and probably the only tech venture in today’s environment doing so with the potential assistance of venture capital.
Facebook was founded in 2004 and has just spent hundreds of millions of dollars building its own data centres in Oregon, North Carolina and Sweden, although it still uses third parties in California and Virginia. Twitter, founded in 2006, last year picked Utah for its first data centre. eBay, hailing from the dot-com era, is building a $287m data centre, also in Utah.
But why would they do this, when those pushing public clouds – such as Salesforce – are so emphatic that in this era of cheap and (ahem, Amazon) reliable data centres, building your own no longer makes financial or organisational sense?
Owning your own can mean lower costs in the long run with access to cheaper power, custom designed cooling and servers, and abundant capacity for expansion.
In DataSift’s case, it also means consolidation and sanity, with a potentially simpler network infrastructure that comes at a lower cost.
DataSift has 10 of its own Hewlett-Packard racks plus 240 Dell servers run by Pulsant at two data centres in Reading, near Microsoft. The servers have 936 CPU cores between them, and the data-filtering nodes can process up to 10,000 unique streams to keep up with what’s being said and deliver results.
Halstead has additional racks in reserve, ready to deploy, but reckons he already spends “a lot” of money on hardware. The real problem, Halstead says, isn’t the cost of rack space but what he calls “very complex” networking. DataSift uses the open-source, Java-based Hadoop framework to process and serve terabytes of tweets and Wiki updates across its distributed, clustered servers. Hadoop means speed, but it has never been a pushover to install and administer, as its creator Doug Cutting has told us.
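To give a flavour of the map-reduce pattern Hadoop is built around, here is a minimal single-machine sketch: counting hashtag mentions across a batch of tweets. The sample tweets and the local "shuffle" step are illustrative assumptions – DataSift's actual jobs are not public, and a real deployment would submit mapper and reducer code to a Hadoop cluster rather than run them in-process like this.

```python
# Toy map-reduce sketch: count hashtags in tweets. Hypothetical example,
# not DataSift's actual pipeline; a real job would run on a Hadoop cluster.
from itertools import groupby

def mapper(tweet):
    """Emit (hashtag, 1) for every hashtag in a tweet's text."""
    for word in tweet.split():
        if word.startswith("#"):
            yield word.lower(), 1

def run_job(tweets):
    """Simulate map -> shuffle/sort -> reduce on a single machine."""
    # Map phase: flatten every tweet into (key, count) pairs, then sort by
    # key, which is what Hadoop's shuffle does across the cluster.
    pairs = sorted(kv for t in tweets for kv in mapper(t))
    # Reduce phase: sum the counts for each distinct hashtag.
    return {
        tag: sum(c for _, c in group)
        for tag, group in groupby(pairs, key=lambda kv: kv[0])
    }

tweets = [
    "Loving #hadoop at scale",
    "#Hadoop makes speed possible",
    "Edits peaking at 100 a second #wikistats",
]
print(run_job(tweets))  # {'#hadoop': 2, '#wikistats': 1}
```

The appeal of the model is that the mapper and reducer stay this simple while the framework handles distributing the data and the shuffle across hundreds of cores – which is exactly where the networking complexity Halstead complains about comes in.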