Data loss and security: Prevention is better than the cure, says Synology

Why IT departments need a fresh approach to info security and management requirements

Sponsored On the face of it, demand for enterprise data loss prevention (DLP) systems that are intended to safeguard sensitive information against damaging and expensive incidents of data loss is growing. The Radicati Group estimates that the market for DLP products and services will expand from $1.3bn in 2019 to $2.2bn by 2022, for example.

But new thinking suggests traditional approaches to DLP have failed to keep pace with sweeping changes in the way IT departments have evolved to store and secure mission-critical data and workloads. Gartner has gone so far as to suggest that the market should be entirely redefined to reflect the fact that security and risk management processes must be brought into the information life cycle.

So what has changed, and why do IT departments need a fresh approach to their data security and management requirements? A survey of 906 IT professionals working for small to medium enterprises (SMEs) by Spiceworks provides the answer. It discovered that their data is now scattered across an increasingly complex and disparate network of storage infrastructure spread across different locations as companies migrate more information and applications from physical, on-premises hardware to virtual cloud-based systems.

It makes a lot of sense for some workloads to be kept in flexible, lower-cost public cloud infrastructure, but others are either so sensitive – or have such specific performance requirements in terms of latency and access – that they must be hosted in on-premise private clouds or hybrid platforms that span both sides of the company firewall. Failing to pick the right option in either scenario could be seriously detrimental to productivity if workers are not able to quickly access the information and applications they need to keep the business ticking over.

How multi-cloud changes the DLP challenge

While being flexible about how and where data is hosted delivers significant business benefits, there is no getting away from the fact that the more cloud services the organisation engages (with or without the tacit approval of the IT department) the greater the number of off-premise systems and repositories data tends to inhabit. And that increases the vulnerability of the information being stored and transmitted to security risks which occupy multiple points in cloud-oriented infrastructure platforms and connectivity chains.

“Ensuring the confidentiality, integrity and availability of data is the key objective of most security programs in organisations today,” IDC analysts Pete Lindstrom and Robert Westervelt said in the company’s IDC MarketScape: Worldwide Data Loss Protection 2018 Vendor Assessment report. “The data proliferates throughout data centres, partners, and clouds at a rate driving productivity for some but also increasing risk when sensitive data is involved.”

Information is potentially at risk of being mishandled by third-party cloud providers, while malicious hacks of their infrastructure or compromises of user accounts can also lead to data leakage, for example. Inevitably, IT departments worry about their ability to control and protect what they cannot see – trusting the integrity of sensitive data to a cloud provider is often a leap of faith.

If it is not encrypted at source and destination – or is transmitted between on- and off-premise systems via the public internet rather than a virtual private network (VPN) tunnel – sensitive information transferred beyond the company firewall can also be vulnerable to interception by hackers.
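Whichever encryption and transport mechanism is chosen, it is also worth confirming at the destination that data arrived intact. A minimal sketch (Python standard library; the payload is illustrative) that compares SHA-256 digests computed at source and destination – real deployments would additionally encrypt the payload with a vetted library before it leaves the network:

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Return the hex SHA-256 digest of a payload."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical payload about to leave the corporate network.
payload = b"quarterly-payroll-export"

# Computed at source and sent alongside the (encrypted) payload...
digest_at_source = sha256_digest(payload)

# ...then recomputed at the destination after transfer.
digest_at_destination = sha256_digest(payload)

# Any mismatch indicates corruption or tampering in transit.
assert digest_at_source == digest_at_destination
```

The digest only proves integrity, not confidentiality; encryption and a VPN tunnel remain necessary for the latter.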

IT departments also have to think about how they ensure data is properly deleted from remote systems when cloud provisioning contracts come to an end, and how to retrieve large files from distributed architecture quickly. They must do so not only to meet performance requirements but also in response to subject access requests (SARs) mandated by data-protection regulation and to satisfy internal/external auditing requirements.

As Spiceworks’ survey discovered, there are still data sets that businesses prefer to keep on-premise, largely due to the fear of being non-compliant with local or regional data protection laws like the European Union (EU) General Data Protection Regulation (GDPR) or the UK Data Protection Act 2018, or industry governance rules like the Payment Card Industry Data Security Standard (PCI DSS), which regulates large parts of the financial services industry. That data often includes financial, employee, customer and personally identifiable information (PII), much of which is not judged ideal for cloud-hosted storage.

Prevention is better than cure for SMEs

One of the simplest defences against data loss and any unnecessary regulatory attention is to make sure that data is properly backed up in the first place and can be quickly retrieved from backup sets and archives as and when it is needed in response to requests from internal stakeholders, customers or auditors.
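In practice, "properly backed up and quickly retrievable" comes down to keeping an index of what was backed up, where and when. A minimal sketch (Python standard library; the directory names and file contents are illustrative) that copies files to a backup location and records a manifest for later lookup and verification:

```python
import hashlib
import json
import shutil
import tempfile
from pathlib import Path

def back_up(source_dir: Path, backup_dir: Path) -> dict:
    """Copy every file under source_dir to backup_dir and return a
    manifest mapping relative path -> SHA-256 digest of the contents."""
    manifest = {}
    for path in source_dir.rglob("*"):
        if path.is_file():
            rel = path.relative_to(source_dir)
            dest = backup_dir / rel
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(path, dest)
            manifest[str(rel)] = hashlib.sha256(path.read_bytes()).hexdigest()
    # Store the manifest alongside the backup so files can be located
    # and verified quickly, e.g. in response to a subject access request.
    (backup_dir / "manifest.json").write_text(json.dumps(manifest, indent=2))
    return manifest

# Illustrative run against throwaway directories.
source = Path(tempfile.mkdtemp())
backup = Path(tempfile.mkdtemp())
(source / "customers.csv").write_text("id,name\n1,Alice\n")
manifest = back_up(source, backup)
```

Commercial backup products add scheduling, deduplication and retention policies on top, but the manifest idea – knowing exactly what is in each backup set – is what makes fast retrieval possible.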

But backing up data spread across the various distributed repositories, hosting platforms and storage media that make up hybrid cloud platforms – many of which are outside the direct visibility and control of the storage managers – can also be complicated. The situation is exacerbated in businesses operating from multiple locations which use different backup solutions to support cross organisation disaster recovery initiatives, with management complexity and lack of in-house expertise in small branch offices further contributing to the problem.

A wealth of backup and disaster recovery systems are available, often with embedded DLP tools that help to identify, track and manage sensitive data according to internally set policies and regulatory frameworks alike. But these can be expensive for smaller businesses that want simple set-up and configuration, assisted updates and ready access to troubleshooting resources and support if backup jobs fail. Having one number to call – the proverbial “one throat to choke” – is important to SMEs without the in-house knowledge and expertise to troubleshoot their own storage security issues when they arise.

Many smaller organisations also tend to favour platforms that focus on ease of use. Here a single interface, such as a self-service recovery portal, would allow staff to manage data protection and archiving while also offering search, discovery, backup and auditing. Replacing multiple software tools with a single, centralised solution can consolidate DLP and backup software licensing and subscription fees, and give staff the ability to replicate multiple physical, virtual and cloud-hosted applications and workloads to a safe off-site location. Having an integrated DLP, backup and disaster recovery platform which is cost-competitive and maintained by a third-party provider is often the best approach for companies with limited budgets and on-site staff resources to manage that overhead themselves.

Risk assessment prior to engagement

With no deceleration in cloud migration in sight, it is imperative that both DLP and backup systems evolve to safeguard sensitive data stored in on-premise systems and cloud-hosted environments alike, and to protect data as it moves between customers’ own file servers, direct-attached (DAS) and network-attached storage (NAS) systems and those of different cloud providers.

Before rolling out such a DLP or backup system, and before engaging with any providers, IT departments have to work out their requirements and identify the files and workloads that need to be protected – and how they are going to be protected. This means deciding, for example, whether you want a network-based DLP system installed at the corporate gateway to monitor data in transit as it leaves the safety of the corporate firewall. Such systems typically scan network traffic – email, instant messaging (IM), FTP, web-based HTTP/HTTPS and peer-to-peer applications – for leaks of sensitive information. You could also implement host-based, or endpoint, DLP systems on desktops, laptops, mobile devices, file servers and other types of data repository and storage infrastructure. A good integrated backup appliance for Windows and Linux here would provide data discovery and classification with deduplication tools that help manage the backup and archival processes.
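To illustrate the content-scanning part of that picture: whether at the gateway or on the endpoint, a DLP engine pattern-matches outbound text against known shapes of sensitive data. A deliberately simplified sketch (Python standard library; the two patterns are examples only – production detectors combine keyword dictionaries, checksum validation such as Luhn, and document fingerprints):

```python
import re

# Example patterns only; not production-grade detectors.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "card_number": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

def scan(text: str) -> list[str]:
    """Return the names of the sensitive-data patterns found in text."""
    return [name for name, pattern in PATTERNS.items() if pattern.search(text)]

# A message a gateway or endpoint agent might inspect before release.
hits = scan("Please bill card 4111 1111 1111 1111, receipt to jo@example.com")
```

A real agent would act on the hits – block the transfer, quarantine the file or alert an administrator – according to the internally set policy.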

Either way, you should conduct a thorough evaluation of your existing tools and practices. IDC recommends IT departments liaise closely with others inside the organisation to identify the data that needs to be protected. That means talking to those outside of IT, including HR and legal teams, for example. You should also conduct some form of risk assessment and data discovery exercise to identify the type and sensitivity of information spread across the IT environment, including both on- and off-premise storage infrastructure and cloud hosting environments. That process should ideally determine how the data is used, by whom and for what purpose, to give storage managers a better idea of where it is best hosted to meet application performance and employee productivity requirements.
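A first pass at such a discovery exercise can be as simple as walking the storage estate and bucketing files into rough sensitivity tiers. A minimal sketch (Python standard library; the filename hints and throwaway directory are illustrative – a real exercise inspects content, ownership and access patterns, not just names):

```python
import tempfile
from pathlib import Path

# Illustrative classification rules only.
SENSITIVE_HINTS = ("payroll", "customer", "hr", "pii")

def discover(root: Path) -> dict:
    """Walk a storage root and bucket files into rough sensitivity tiers."""
    inventory = {"sensitive": [], "general": []}
    for path in root.rglob("*"):
        if path.is_file():
            name = path.name.lower()
            tier = "sensitive" if any(h in name for h in SENSITIVE_HINTS) else "general"
            inventory[tier].append(str(path.relative_to(root)))
    return inventory

# Throwaway tree standing in for a file server share.
root = Path(tempfile.mkdtemp())
(root / "payroll_2019.xlsx").write_text("")
(root / "lunch_menu.txt").write_text("")
inventory = discover(root)
```

The resulting inventory gives storage managers a starting map of where the sensitive material actually lives before deciding what stays on-premise and what can move to the cloud.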

Cloud is not minimising the risks associated with data; it is helping to amplify them. In a world where preventing data loss is better than dealing with the aftermath, it pays to have a well-thought-out plan and a centralised means of executing it. With an explosion in workloads, data and virtual images, that means prioritisation based on knowing what’s important. Do that and you will be well on the way to a DLP and backup system that avoids the damage associated with lost data.

Sponsored by Synology.


