Licensed to BILL: How much should you cough for software licences?

Nobody really knows - so we all get screwed

Storagebod

"Yet another change to a licensing model. You can bet it's not going to work out any cheaper for me," was the first thought that flickered through my mind during a presentation on GPFS 4.1 at the GPFS User Group (UG) meeting in London.

That set off another train of thought: in this new world of software-defined storage, how should the software be licensed? And how should its value be reflected in the price?

Should we be moving to a capacity-based model? Should I get charged per terabyte of storage being “managed”? Or perhaps per server that has this software-defined storage presented to it? Perhaps per socket? Per core? But what if I’m running at hyperscale?

And if I fully embrace a programmatic provisioning model that dynamically changes the storage configuration, does any model make any sense apart from some kind of flat-fee, all-you-can-eat model?
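To make the trade-off concrete, here is a minimal sketch in Python of how capacity-based, socket-based and flat-fee pricing diverge as a deployment grows. Every figure in it (the price per terabyte, the price per socket, the flat fee and the two cluster sizes) is invented purely for illustration and comes from no vendor's actual price list.

# Hypothetical comparison of software-defined storage licensing models.
# All prices and capacities are invented for illustration only.

def per_terabyte(capacity_tb, price_per_tb=150):
    """Capacity-based: pay for every terabyte under management."""
    return capacity_tb * price_per_tb

def per_socket(sockets, price_per_socket=2500):
    """Socket-based: pay per CPU socket in the storage cluster."""
    return sockets * price_per_socket

def flat_fee(annual_fee=250_000):
    """All-you-can-eat: one fee regardless of scale or churn."""
    return annual_fee

# A modest cluster versus a hyperscale deployment
for name, tb, sockets in [("modest", 500, 8), ("hyperscale", 50_000, 400)]:
    print(f"{name}: per-TB £{per_terabyte(tb):,}, "
          f"per-socket £{per_socket(sockets):,}, "
          f"flat £{flat_fee():,}")

With these invented numbers the hyperscale per-terabyte bill comes out an order of magnitude above the flat fee, while for the modest cluster the flat fee is the most expensive option, which is exactly why no single model looks reasonable to everyone.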

Chatting to a few people, it seems that no one really has any idea what the licensing model should look like. Funnily enough, it is exactly this sort of thing that could derail ServerSAN and software-defined storage: the challenge is not technical, but if the licensing model becomes too complex, too hard to manage and simply too costly, the whole approach will fail.

Of course, inevitably someone is going to pop up and mention open source … and I will simply point out that Red Hat makes quite a lot of money out of open source: you still pay for support based on some kind of model. The cost of acquisition is only one part of IT infrastructure spend.

So what is a reasonable price? Anyone?

Bootnote

If you are a GPFS user in the UK, you should attend the GPFS UG meeting next time it's on. It's probably the best UG meeting I’ve been at for a long time.
