Aah, that warm sharing feeling. Just don't let the cloud rain on your firm

How to get biz-level file sharing right the first time, every time

Increase your connection speed? Sure thing, boss, just hand us a billion or two

The only downside to sharing files between locations is the performance hit it entails. If you have two offices in London, for example, there is no problem at all: you pay next to nothing for a multi-megabit metro connection between them. But if you are split between London and Paris, the economics change dramatically, because long-haul international bandwidth costs many times more than metro connectivity.

Network access

The first option is to see whether you can make a slow link into a fast one somehow, preferably without just upgrading the speed of the connection as that can be hugely costly. WAN optimisation – devices or software appliances that do statistical analysis on data streams and condense them based on frequently transferred sequences of bytes – is the obvious solution.
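The core trick behind that byte-sequence condensing can be sketched in a few lines. This is an illustrative toy, not any vendor's implementation: it splits a stream into fixed-size chunks, hashes each one, and sends a short reference instead of the payload whenever the far end has already seen that chunk.

```python
# Toy sketch of WAN-optimiser deduplication: replace repeated byte
# sequences with short references to chunks the peer already holds.
# Chunk size and token format are arbitrary choices for this sketch.
import hashlib

CHUNK = 4096  # fixed-size chunking keeps the example simple

def encode(stream: bytes, seen: dict) -> list:
    """Turn a byte stream into ('raw', bytes) and ('ref', digest) tokens."""
    tokens = []
    for i in range(0, len(stream), CHUNK):
        chunk = stream[i:i + CHUNK]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest in seen:
            tokens.append(("ref", digest))   # peer holds these bytes already
        else:
            seen[digest] = chunk
            tokens.append(("raw", chunk))    # first sighting: send in full
    return tokens

def decode(tokens: list, seen: dict) -> bytes:
    """Rebuild the original stream from raw chunks and references."""
    out = []
    for kind, payload in tokens:
        if kind == "ref":
            chunk = seen[payload]
        else:
            chunk = payload
            seen[hashlib.sha256(chunk).hexdigest()] = chunk
        out.append(chunk)
    return b"".join(out)
```

The second time similar data crosses the link, most tokens are references a few dozen bytes long rather than 4KB chunks, which is where the bandwidth saving comes from. Real appliances use content-defined chunking and persistent dictionaries at both ends, but the principle is the same.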

Optimiser equipment from the likes of F5, Silver Peak and Cisco isn't cheap, but it is a lot less pricey than an equivalent link speed upgrade.

For file-sharing applications in particular, you can generally expect average data reduction of above 80 per cent. That means a file that would take a minute to download over the raw link will arrive in about 12 seconds when running through the optimisers.
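The arithmetic behind that figure is straightforward: if 80 per cent of the bytes never cross the link, the transfer time scales down by the same factor.

```python
# An 80 per cent reduction means only 20 per cent of the bytes are sent,
# so a 60-second raw transfer takes roughly a fifth of the time.
raw_seconds = 60
reduction_pct = 80
optimised_seconds = raw_seconds * (100 - reduction_pct) // 100
print(optimised_seconds)  # 12
```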

Another alternative is to look at replicating data between sites – something that is best done invisibly rather than relying on humans to choose which files to replicate. The trouble is that once you have two copies of a file, a conflict arises whenever the versions at the two locations are edited and saved at the same time.

There is an obvious way round this problem, of course: don't allow editing of data at any location other than its home filestore. While this sounds restrictive, it has plenty of advantages. Disaster recovery comes almost for free, since you are already maintaining off-site copies of files in near real time.

And having a read-only copy of a fileset (or more usefully a database) can enable you to run reports at a location far from the source without the delays of the interconnect – and without your potentially complex reporting queries adding to the load of the primary server.
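The one-way, home-filestore-only model above can be sketched as a minimal sync routine. The directory layout and the polling approach are assumptions for illustration; real deployments use filesystem- or storage-level replication rather than a script like this.

```python
# Hedged sketch of one-way replication: push new or newer files from the
# home filestore to a remote read-only copy, never the other way round,
# so there is no possibility of conflicting edits at the remote site.
import os
import shutil

def replicate(src_dir: str, dst_dir: str) -> list:
    """Copy files that are new or newer at src into dst; return names copied."""
    copied = []
    for name in os.listdir(src_dir):
        src = os.path.join(src_dir, name)
        dst = os.path.join(dst_dir, name)
        if not os.path.isfile(src):
            continue
        # copy2 preserves mtimes, so unchanged files are skipped on later runs
        if not os.path.exists(dst) or os.path.getmtime(src) > os.path.getmtime(dst):
            shutil.copy2(src, dst)
            copied.append(name)
    return copied
```

Because data only ever flows from the home filestore outwards, the remote copy can safely be exposed read-only for reporting without any conflict handling at all.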

Look at the cloud

Thus far we have talked about how you can share files across your business by making the various locations' filestores available as required. But why not look at moving your files outside your network completely and putting them in a managed filestore that can be accessed by the various locations?

Of course, there is the common fear that cloud equals “insecure”, but that is utter rot. Any data store is insecure if you don't protect it properly, whether it is hosted on a server in your garage or on Microsoft Azure or Amazon's AWS.

Connectivity between your premises and the cloud will generally be via industry-standard VPN protocols that are accepted by even the pickiest security auditor. And because you can extend your internal directory service to your cloud or hosted solution using LDAP, ADFS or the like, there is no need to add any onerous further layers to the access control function.

Even better, you still have the opportunity to use WAN optimisation because the larger cloud providers support the popular optimisers. You can plonk one at each of your sites and have your traffic streams optimised just as you would with an in-house setup.

