NetApp's dangerous dance with the killer brontosaurus - Amazon

One wrong step and you're paste

Blocks+Files: What is NetApp up to, partnering with Amazon, a cloud IT provider that's positioned long-term to try to annihilate NetApp's business?

Let's start from the position that you are a NetApp customer with your own FAS array and you want to use Amazon Web Services, the AWS cloud computing facility, on some data in that array. How do you get it up into the AWS cloud?

One way is to move that data into the cloud and leave it there, front-ending it with a cloud storage gateway in your own data centre so you get fast local access to its cached cloud-stored data. But that means committing data storage to the cloud, and you may not want to do that. There's also the problem of loading potentially tens, even hundreds, of terabytes into the cloud in the first place. Sending a truckful of disks is often the most effective way of doing that.
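The truck-of-disks point is easy to check with back-of-envelope arithmetic. The sketch below estimates how long a bulk upload takes; the link speed and the 80 per cent protocol-efficiency figure are illustrative assumptions, not measured values:

```python
def upload_days(terabytes: float, link_mbps: float, efficiency: float = 0.8) -> float:
    """Days needed to push `terabytes` up a WAN link running at
    `link_mbps` megabits per second with the given protocol efficiency.
    The 0.8 efficiency default is an assumed figure for illustration."""
    bits = terabytes * 8e12                        # 1 TB = 8 x 10^12 bits (decimal TB)
    seconds = bits / (link_mbps * 1e6 * efficiency)
    return seconds / 86400

# 100 TB over a dedicated 1 Gbps link at 80 per cent efficiency:
print(round(upload_days(100, 1000), 1))            # roughly 11.6 days
```

At nearly two weeks for 100TB on a full gigabit link, a van full of disk drives starts to look very competitive.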

If you wish to keep your master data on your own premises, where it is constantly being updated, and have AWS work on up-to-date versions of that data, then you have a problem. NetApp has an answer which involves an Amazon co-location facility, a second FAS array, and replication of data from your own FAS to the colo FAS. It's called NetApp Private Storage for AWS.

First, a second FAS array is placed in a co-location (colo) facility which has a Direct Connect link to AWS. Then data is replicated, using SnapMirror and SnapVault, from your on-premise FAS array to the colo FAS, and AWS EC2 and so forth gets to process it using the Direct Connect link. Updated data gets replicated back to the on-premise FAS. There's a NetApp blog about it here.

So, to take advantage of pay-as-you-go, on-demand and elastic AWS, you have to have a second FAS array in a colo centre and buy the SnapMirror and SnapVault licences, and pay for the network link to the colo centre. Logically you are paying for and placing a non-cloud FAS into the AWS infrastructure as an AWS edge device.
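As a rough way to see how those costs stack up, here is a toy tally of the components just listed. Every figure is a hypothetical placeholder for illustration, not NetApp or AWS pricing:

```python
# Hypothetical monthly cost model for the colo-FAS approach.
# All dollar figures are invented placeholders, not vendor prices.
monthly_costs = {
    "colo FAS array (lease/depreciation)": 8000,
    "colo rack space and power":           2000,
    "SnapMirror/SnapVault licences":       1500,
    "on-premise-to-colo network link":     1200,
    "AWS Direct Connect port":              300,
}

total = sum(monthly_costs.values())
print(f"Illustrative monthly cost before any AWS compute: ${total:,}")
```

Whatever the real numbers, the shape of the bill is the point: you pay for an entire second array and its plumbing before a single EC2 instance touches your data.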

Why? Why do you need this? Sure, you get a business continuity/disaster recovery FAS array and AWS compute stack, and if that's what you want, fine. But as a way of getting AWS compute to work on your FAS data it looks expensive. Also, it's only available in America right now, with geographic region expansion coming.

Direct Connect

Looking at Amazon Direct Connect we read:-

AWS Direct Connect makes it easy to establish a dedicated network connection from your premise to AWS. Using AWS Direct Connect, you can establish private connectivity between AWS and your datacenter, office, or colocation environment, which in many cases can reduce your network costs, increase bandwidth throughput, and provide a more consistent network experience than Internet-based connections.

AWS Direct Connect lets you establish a dedicated network connection between your network and one of the AWS Direct Connect locations.

A Direct Connect FAQ states:-

Q. Can I use AWS Direct Connect if my network is not present at an AWS Direct Connect location?

Yes. APN Partners supporting AWS Direct Connect can help you extend your pre-existing data centre or office network to an AWS Direct Connect location …

Why not then cut out the colo FAS and replication software and have Direct Connect link directly to your on-premise FAS?

There may be a faster network link between the AWS Direct Connect-equipped colo and AWS than between AWS and your own data centre. If so, the costs and speeds of the two approaches need comparing to see which suits you best.
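One way to frame that comparison: if a day's worth of changed data can't cross the link in under 24 hours, the replica falls ever further behind the master. A sketch, with illustrative change rates and link speeds:

```python
def replication_lag_hours(daily_change_tb: float, link_mbps: float) -> float:
    """Hours needed to ship one day's changed data across a link at
    `link_mbps` megabits per second. Above 24, replication never catches up.
    Inputs are illustrative assumptions, not measured workloads."""
    bits = daily_change_tb * 8e12                  # 1 TB = 8 x 10^12 bits
    return bits / (link_mbps * 1e6) / 3600

# 2 TB/day of changes: a 100 Mbps link direct to your data centre
# versus a 1 Gbps link from the colo site.
print(round(replication_lag_hours(2, 100), 1))     # can't keep up
print(round(replication_lag_hours(2, 1000), 1))    # comfortable headroom
```

On these assumed numbers the slower direct link would need more than 24 hours per day to keep pace, which is exactly the case where paying for the colo hop makes sense.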

Endgames

We might imagine Amazon is talking to EMC, Dell, HDS, HP and IBM about similar partnering arrangements, and why not? Every mainstream storage supplier partnership increases the credibility of AWS. The endgame for Amazon is surely the storage of all enterprise data in its cloud, and NetApp is dancing with the enemy - yet dancing very well. Its endgame is NetApp's survival, and growth, as a continuing and relevant storage supplier to enterprises of all sizes.

Is there room in the storage market for the cloud storage suppliers and the mainstream storage array suppliers? Cloudy storage folk like Amazon, Google, Microsoft, Rackspace and others want to store data that's sitting in storage arrays right now and have no end to their ambition, which could take ten, fifteen or more years to fulfil. Every byte they store is a byte not stored on a mainstream array.

There are smaller, second-tier cloud service providers who do not have the ability to build their own storage as Amazon and Google do, and they will buy mainstream vendors' storage; PeakColo of Colorado, for example, has petabytes of NetApp kit. These providers will exist in Amazon and Google's shadow.

All the mainstream storage suppliers have the same problem as NetApp. They face their total addressable market potentially (let's stress "potentially") shrinking as Amazon and Google-class cloud storage suppliers take more and more of it away from them. How do they maintain and grow their businesses in the face of Brontosaurus-sized behemoths like Amazon and Google invading their feeding grounds?

NetApp could, for example, turn its V-Series into a generic cloud storage gateway, as well as supplying kit to second tier CSPs. Partnering is one way to get to know your enemy and buy time to work out a strategy, and that strategy is crucial, dealing as it does with a threat that could annihilate your business. ®
