Controlling Application Access
A network security and QoS checkpoint
The pressure is growing on the corporate network and the systems it supports.
When asked about their network and application access infrastructure in a recent research study, the 404 respondents who participated highlighted a range of escalating pressures. Organisations of all sizes are seeing greater demands as a result of core business growth and a general increase in the use of technology and information within the business. Overlaid on this is the additional pressure arising from home and mobile working, BYOD, and greater access to IT systems by customers, partners and suppliers.
Difficulties keeping up are negatively impacting the business.
With no sign of the growth in traffic or the appetite for broader access diminishing, keeping up with the trends is proving hard. Challenges reported include poor or unpredictable application performance and unplanned outages, which undermine productivity and even interrupt the business. Changing access patterns, a range of external threats, and the unintended consequences of virtualisation are all hampering the management of security in particular. Many also call out excessive costs to the business.
Many problems stem from shortcomings in the access infrastructure.
The trends being highlighted shine a spotlight on the corporate access infrastructure and the components within it for dealing with performance, availability and security. Piecemeal growth means that many are struggling with a complex and disjointed network environment containing a lot of old technology that requires an excessive amount of manual administration to keep it running. Furthermore, the majority report significant shortfalls in capability in relation to a range of specific performance and security management functions.
Moving forward successfully has a philosophical as well as a practical dimension.
From a philosophical perspective, two mind-set shifts are required. The first is from an infrastructure to a services view of performance and availability management, and the second is to focus less on the ‘network perimeter’ approach to security and think more in terms of multi-layered protection accompanied by effective analytics. From a practical perspective, there is then a need to analyse and prioritise requirements more objectively, and make sure that the right knowledge, skills and tooling are applied appropriately.
A concerted effort must be made to break the ‘reactive investment’ habit.
Most investment intentions currently revolve around tactical requirements such as the replacement of obsolete equipment or implementation of new applications. When making architectural and technology decisions a conscious effort must be made to think of the bigger picture, and move progressively and proactively towards a more coherent access infrastructure capable of dealing with future needs.
IT and business people are often guilty of applying dual standards when it comes to the use of technology. Encryption of business data on smartphones and tablets is regarded as critical, while laptop users have carried around sensitive information with no such protection for years. Meanwhile everyone throws their arms up in horror when a cloud service provider experiences an outage or security attack, even though frequent downtime, wild fluctuations in performance, and the occasional leakage of confidential information are simply accepted as facts of life in relation to internal systems and data.
Such dual standards are generally not the result of deliberate doublethink. New solutions are always considered in the context of the requirements current at the time, and are implemented using the technology and techniques then available. The trouble is that both requirements and technology evolve continuously, and we often forget (or don’t have the time) to revisit older systems and infrastructure to make sure they are still current from a technology perspective and are still meeting business needs effectively. The result is that systems gradually fall behind in terms of fitness for purpose and operational efficiency.
This phenomenon is frequently observed in relation to the corporate access infrastructure that deals with many of the important aspects of application performance, availability and security. The chances are that in your organisation, the fundamentals of this were designed into your network five, ten or even fifteen years ago. Since then, modifications and extensions are likely to have been implemented in a piecemeal manner to deal with individual application requirements on a case-by-case basis. It has probably been a while, however, since you stood back, looked at the state of your access infrastructure as a whole, and asked yourself not just how well it is coping with requirements today, but how ready it is for the future.
With application and network loads increasing, access patterns evolving, and new security threats emerging almost daily, these are highly pertinent questions. In this report, with the help of input gathered from over 400 participants in a recent research study, we therefore examine the way in which requirements and expectations are changing in relation to application access and how well infrastructures are coping. Along the way, we also take a look at some of the imperatives for future proofing your environment.