SAS schemes $70m biz analytics cloud
Business-analytics software developer SAS Institute is taking to the clouds. But rather than stake the future of its hosted-application business on existing clouds such as Amazon AWS, SAS has decided to shell out $70m (£48m) to build its own cloud-computing facility.
The Cary, North Carolina software house has seen exponential growth since its founding in 1976. Although privately held and thus not obligated to provide any financials, the company posted sales of $2.26bn (£1.56bn) in 2008 and invested a staggering 22 per cent of revenue in research and development.
The only time sales at SAS slowed was in the dot-bomb years of 2001 and 2002 - and even then the company managed to grow a tiny bit. With over 11,000 employees, SAS is the world's largest privately held packaged-software provider, known for being simultaneously conservative and forward-thinking.
Keith Collins, who led the creation of the SAS 9 business-analytics suite and who runs the company's research and development operations, also serves as the company's chief technology officer. According to Collins, SAS has a data center packed with Sun Sparc/Solaris servers and NetApp and EMC storage arrays that is used to host versions of its applications for customers. Collins says that SAS doesn't even advertise this hosted-application business, but customers found out about it anyway and business is growing "in excess of 30 per cent per year."
The SAS OnDemand apps include applications to detect money laundering, to drive drug discovery (in the lab, not at the airport), and to perform marketing analysis on various fronts.
SAS thinks it's on to something, but it wants to build a more cloud-like facility than its current Solaris farm. Collins says that for the past four years the development side of SAS has been a heavy user of server-virtualization tools from VMware, using the ESX Server hypervisor and related staging tools for development and test environments.
"We expect the hardware to shift significantly over the next few years," he says, adding that SAS hasn't yet decided on the iron and software it will use to build its cloud. Collins did, however, hint strongly that it will use x64 servers with fast local storage backed up by a storage area network, and will very likely use virtualization tools from VMware.
The new cloud facility will easily be an order of magnitude larger - in terms of compute and storage capacity - than the current facilities used to host the SAS Solutions OnDemand apps.
The 38,000 square-foot facility that SAS is building on its Cary campus will have two 10,000 square-foot server farms. The first farm is expected to come online in mid-2010 and should support growth in the hosted-application business over the next three to five years. The second will be fitted out with servers and storage when the first farm hits 80 per cent capacity.
The construction of the data-center facility and related office space will account for between $20m and $22m (£13.8m and £15.2m) of the $70m budgeted for the cloud, with the remainder going for servers, storage, and software. SAS is keeping 60 per cent of the construction and equipment spending in North Carolina and says that construction will provide about 1,000 jobs.
SAS may not have a choice but to build its own cloud. Given the sensitive nature of the data its customers analyze, moving that data out to a public cloud such as the Amazon EC2 and S3 combo is just not going to happen.
And even if rugged security could make customers comfortable with that idea, moving large data sets into clouds (as Sun Microsystems discovered with the Sun Grid) is problematic. Even if you can parallelize the uploads of large data sets, it takes time.
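To put a rough number on that, here's a back-of-the-envelope sketch. The data-set size, link speed, and efficiency figures are illustrative assumptions, not numbers from SAS or any cloud provider:

```python
def transfer_days(dataset_tb: float, link_mbps: float,
                  efficiency: float = 0.8) -> float:
    """Days needed to move dataset_tb terabytes over a link_mbps link,
    assuming the stated fraction of the link is sustained in practice."""
    bits = dataset_tb * 1e12 * 8                    # terabytes -> bits
    seconds = bits / (link_mbps * 1e6 * efficiency)  # bits / (bits per second)
    return seconds / 86400                           # seconds -> days

# A hypothetical 10TB data set over a 100Mbit/s link at 80% efficiency:
print(f"{transfer_days(10, 100):.1f} days")  # roughly 11.6 days
```

Even parallel uploads only help until the customer's own pipe is saturated, which is why keeping the data next to the applications - as SAS plans to - sidesteps the problem entirely.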
But if you run the applications locally in the SAS cloud, then doing further analysis on that data is no big deal. It's all on the same SAN anyway, locked down locally just as you would do in your own data center. ®