Aah, that warm sharing feeling. Just don't let the cloud rain on your firm

How to get biz-level file sharing right the first time, every time

Increase your connection speed? Sure thing, boss, just hand us a billion or two

The only downside to sharing files between locations is the performance hit it entails. If you have two offices in London, for example, there is no problem at all: you pay next to nothing for a multi-megabit metro connection between them. But if you are split between London and Paris, bandwidth on that scale comes at a very different price.

Network access

The first option is to see whether you can make a slow link into a fast one somehow, preferably without just upgrading the speed of the connection as that can be hugely costly. WAN optimisation – devices or software appliances that do statistical analysis on data streams and condense them based on frequently transferred sequences of bytes – is the obvious solution.
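To see what that byte-sequence trickery looks like in practice, here is a minimal Python sketch of dictionary-style deduplication. The fixed chunk size and the in-memory cache are our own simplifying assumptions – real appliances are far cleverer – but the principle of swapping repeated chunks for tiny references is the same.

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunks for illustration; real appliances use smarter, variable-size chunking


def dedupe_stream(data: bytes, seen: dict) -> list:
    """Replace byte chunks the far end has already seen with short references."""
    out = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest in seen:
            out.append(("ref", digest))   # a few dozen bytes instead of the whole chunk
        else:
            seen[digest] = chunk
            out.append(("raw", chunk))    # first sighting: send the data and remember it
    return out


# The second transfer of the same file costs almost nothing on the wire
cache = {}
first = dedupe_stream(b"quarterly-report contents " * 1000, cache)
second = dedupe_stream(b"quarterly-report contents " * 1000, cache)
print(sum(len(c) for kind, c in first if kind == "raw"))   # bytes actually sent the first time
print(sum(len(c) for kind, c in second if kind == "raw"))  # 0 bytes the second time around
```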

Optimiser equipment from the likes of F5, Silver Peak and Cisco isn't cheap, but it is a lot less pricey than an equivalent link speed upgrade.

For file-sharing applications in particular, you can generally expect an average data reduction of more than 80 per cent. That means a file that would take a minute to download over the raw link will arrive in around 12 seconds when running through the optimisers.
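If you want to sanity-check that figure yourself, the arithmetic is simply:

```python
raw_time = 60                # seconds to fetch the file over the unoptimised link
reduction = 0.80             # 80 per cent of the traffic never has to cross the WAN
optimised_time = raw_time * (1 - reduction)
print(optimised_time)        # 12.0 seconds
```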

Another alternative is to look at replicating data between sites – something that is best done invisibly rather than relying on humans to choose which files to replicate. The trouble is that once you have two copies of a file, you run into problems when the versions at the two locations are edited and saved at the same time.

There is an obvious way round this problem, of course: don't allow editing of data at any location other than its home filestore. While this sounds restrictive, it has plenty of advantages. Disaster recovery becomes possible, as you can maintain off-site copies of files in near real time.
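As a rough sketch of that "edit only at home" model, the Python below mirrors a home filestore to a remote copy and strips write permission from the replica. The paths are made up and this is no substitute for a proper replication product – it just illustrates the one-way, read-only idea.

```python
import shutil
import stat
from pathlib import Path

HOME = Path("/srv/files/london")     # writable master filestore (hypothetical path)
REPLICA = Path("/srv/files/paris")   # read-only mirror at the remote site (hypothetical path)


def mirror_once() -> None:
    """One-way sync: the replica always mirrors the home filestore and is never edited in place."""
    for src in HOME.rglob("*"):
        dst = REPLICA / src.relative_to(HOME)
        if src.is_dir():
            dst.mkdir(parents=True, exist_ok=True)
            continue
        dst.parent.mkdir(parents=True, exist_ok=True)
        if dst.exists():
            # Make the old copy writable for a moment so it can be overwritten
            dst.chmod(stat.S_IMODE(dst.stat().st_mode) | stat.S_IWUSR)
        shutil.copy2(src, dst)
        # Strip write permission so nobody edits the copy and creates a conflicting version
        dst.chmod(stat.S_IMODE(dst.stat().st_mode) & ~(stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH))


if __name__ == "__main__":
    mirror_once()  # run from a scheduler for near-real-time off-site copies
```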

And having a read-only copy of a fileset (or more usefully a database) can enable you to run reports at a location far from the source without the delays of the interconnect – and without your potentially complex reporting queries adding to the load of the primary server.
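The reporting case works the same way: point the queries at the local read-only copy. The sketch below uses SQLite purely for illustration (the database file and the orders table are invented), but the pattern holds for a read replica of any database.

```python
import sqlite3

# Open the locally replicated copy read-only: heavy report queries run here,
# not against the primary server on the other side of the WAN.
replica = sqlite3.connect("file:/srv/replicas/sales.db?mode=ro", uri=True)

for region, total in replica.execute(
        "SELECT region, SUM(amount) FROM orders GROUP BY region"):
    print(region, total)

replica.close()
```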

Look at the cloud

Thus far we have talked about how you can share files across your business by making the various locations' filestores available as required. But why not look at moving your files outside your network completely and putting them in a managed filestore that can be accessed by the various locations?

Of course, there is the common fear that cloud equals “insecure”, but that is utter rot. Any data store is insecure if you don't protect it properly, whether it is hosted on a server in your garage or on Microsoft Azure or Amazon's AWS.

Connectivity between your premises and the cloud will generally be via industry-standard VPN protocols that are accepted by even the pickiest security auditor. And because you can extend your internal directory service to your cloud or hosted solution using LDAP, ADFS or the like, there is no need to add onerous extra layers to the access control function.
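By way of illustration, the sketch below uses the third-party ldap3 library to bind to a directory over LDAPS and check a user's group membership before letting them at the hosted filestore. The hostname, service account and group are all hypothetical – the point is that the cloud end can lean on the directory you already run.

```python
from ldap3 import Server, Connection, ALL  # pip install ldap3

# Hypothetical directory endpoint reachable over the site-to-cloud VPN
server = Server("ldaps://dc01.example.internal", get_info=ALL)
conn = Connection(
    server,
    user="CN=filestore-svc,OU=Service Accounts,DC=example,DC=internal",  # hypothetical service account
    password="do-not-hardcode-this",
    auto_bind=True,
)

# Is the user in the group that is allowed at the cloud filestore?
conn.search(
    search_base="DC=example,DC=internal",
    search_filter="(sAMAccountName=jbloggs)",
    attributes=["memberOf"],
)
groups = conn.entries[0].memberOf if conn.entries else []
allowed = any("CN=CloudFileUsers" in str(g) for g in groups)
print("access granted" if allowed else "access denied")
conn.unbind()
```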

Even better, you still have the opportunity to use WAN optimisation because the larger cloud providers support the popular optimisers. You can plonk one at each of your sites and have your traffic streams optimised just as you would with an in-house setup.
