
Cloud pricing begins to take hold inside the firewall

Say goodbye to predictable pricing, and hello to (cheap) uncertainty

Analysis The growth in cloud computing is starting to affect the way technology companies charge and license their on-premises products, which could save IT pros money but also make budget planning much more difficult.

Though public clouds are not the right model for many companies, the interest the technology has drawn from everyone from techies and journalists to CFOs has put the cloud way of doing things at the forefront of people's minds, and led to a subtle but important change in how software and hardware are sold.

We're talking here about capacity-based licensing, and the related fracturing of monolithic software suites into individual components that can be bought and sold separately.

Asigra's announcement on Wednesday that it is splitting its recovery licensing away from its backup licensing and charging companies according to usage is symbolic of a larger shift in the industry, as vendors slice and dice their products into discrete packages that can be sold on to punters.

Though these pricing models have been available for decades, the rise of software-as-a-service pricing for cloud applications has made the approach more influential, broadened the categories it can appear in, and trained IT buyers to balk less at the confusion it can introduce.

Microsoft has been doing this by making its System Center on-premises software more modular and selling a larger range of add-ons. Similarly, HP's "CloudSystem" appliances have adopted a pay-per-use model for storage: you can buy a large appliance but only pay for the storage capacity you use, then "burst" into more drives at the click of a button and pay for the privilege.

The benefits of this type of model are that it lets companies shrink their costs in the short term while retaining the ability to scale up capability or usage in the future. It also lets them select (and pay for) only the software components they actually use, and then bills them according to how much of their own resources they consume.

The problem is that it makes budgetary planning a nightmare: if software costs x this year, but its price can be swung by factors outside your control – a customer needing to perform more restores (Asigra), more data to back up (Dell NetVault), or an uptick in capacity from gaining new clients (CloudSystem, and others) – then working out future expenditure gets tricky.
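To make the arithmetic concrete, here's a minimal back-of-the-envelope sketch in Python – the line items, function name, and prices are all made-up assumptions for illustration, not any vendor's actual rate card:

# Hypothetical illustration of usage-based licensing: the bill is a fixed fee
# plus terms driven by usage you don't fully control. All figures are made up.
def projected_cost(base_subscription, restore_jobs, price_per_restore,
                   backed_up_tb, price_per_tb):
    """Annual cost = fixed subscription + charges driven by actual usage."""
    return (base_subscription
            + restore_jobs * price_per_restore
            + backed_up_tb * price_per_tb)

# A quiet year versus one with an unplanned recovery effort and data growth.
quiet = projected_cost(10_000, restore_jobs=5, price_per_restore=200,
                       backed_up_tb=50, price_per_tb=40)
busy = projected_cost(10_000, restore_jobs=40, price_per_restore=200,
                      backed_up_tb=120, price_per_tb=40)
print(quiet, busy)  # 13000 vs 22800 -- same contract, very different bill

The same contract produces a very different bill depending on how the year goes, which is exactly what makes the next budget cycle hard to forecast.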

Though it allows businesses to potentially reduce the stuff they classify as a capital outlay, it can do weird things to rolling operational expenditure, which can prompt awkward questions from the bosses when you tell them the cost of restores has doubled for the next year due to some unforeseen usage this quarter.

It also means IT pros could have a harder time getting a big enough budget for their needs: their typical costs hover at a low level, but they'll need to ask for a larger budget to cover any unforeseen need to scale up their use of the tech. To which the financial controllers may ask: "Why don't we just give you the smaller budget and allocate more resources if you overspend?" That would put IT pros in the awkward position of having less budget than before, and make them look bad by forcing them to go back and ask for money whenever a customer or another business unit unexpectedly needs more kit.

With companies such as Amazon Web Services driving a wedge between traditional channel sellers and punters, and more companies butting up against capacity- or usage-based pricing, attitudes among the beancounters are likely to change – but we're not there yet.

One sysadmin for a small Canadian IT consultancy told us at this week's Asigra summit in Canada that losing predictable pricing put the firm in a tough position with customers. Another IT bod from a St Louis consultancy said his engineers billed by the hour for restores, and that his company would lose margin if it had to absorb variable pricing.

The best way to protect budgets from this trend is to carefully model your expected operating usage of your software or gear, and then whack on a margin to allow for unexpected events – roughly as sketched below. But for overworked and under-appreciated IT pros, that prospect may seem like a bitter pill to swallow. ®
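As a minimal illustration of that approach, here is a rough Python sketch – the usage estimates, unit price, and 25 per cent buffer are assumptions invented for the example, not figures from anyone quoted above:

# Hypothetical sketch of "model expected usage, then whack on a margin".
# All numbers are illustrative assumptions, not vendor or customer figures.
def budget_with_margin(monthly_usage_estimates, unit_price, margin=0.25):
    """Sum the expected usage-based spend, then pad it for unplanned events."""
    expected = sum(monthly_usage_estimates) * unit_price
    return expected * (1 + margin)

# Twelve months of estimated billable units (restores, TB backed up, and so on).
estimates = [30, 32, 35, 33, 31, 34, 36, 40, 38, 35, 33, 31]
print(f"Budget to ask for: ${budget_with_margin(estimates, unit_price=45):,.2f}")

The size of that margin is the judgement call: too small and you're back asking for money mid-year, too big and the beancounters will push back.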
