BYOD: A bigger headache for IT bosses than Windows Metro?
Your survival guide to giving users what they want
Nothing elicits passionate debate quite like the suggestion that consumer technology is dictating workplace IT - with the exception of arguments over the Windows 8 Metro desktop, perhaps.
The debate on the consumerisation of IT is packed with business, legal and human resources headaches. Individual prejudices and experience colour approaches to the topic.
Like "cloud" and "virtualisation" before them, the concepts of the consumerisation of IT and BYOD are somewhat intermingled. The traditionally accepted definition of the consumerisation of IT is a disruptive technology - think smartphones, social media and the like - that takes root in the consumer market first and then migrates into businesses.
This is a "bottom up" approach to technological adoption and it causes angst among managers; in preceding decades, new technologies typically started off in the business sphere and moved out to consumers.
The cheerleaders of consumerisation would have us believe that the rise of the iPhone means the complete upheaval of the more traditional model; a frequent bogus claim is that consumer technology will completely supplant tech targeted at the enterprise. In truth, plenty of technologies are still developed for business use first before migrating "down" to consumers. Precious few consumers have a deduplicating NAS, Big Data rig, or an IaaS setup at home.
Still, consumer technologies are infiltrating the workplace.
Considering how many new technologies are out there, consumerisation can cover everything from software to hardware to cloud services. BYOD is generally used to describe a narrower subset of consumerisation: users purchasing their own devices and using them for work purposes - with or without the sanction of IT. I include notebooks and even desktops within BYOD as, in my experience, demands by users strongly influence the deployment of these devices. For the purposes of this article, I will use BYOD to mean any user-selected device.
Another trend lumped under the consumerisation of IT is Apple in the enterprise. This is a growing trend; IBM alone has more than 30,000 MacBooks deployed. Companies ranging from my own consultancy to behemoths like NASA, Google, Intel and even schools have adopted heterogeneous network policies.
How the consumerisation of IT can affect your network
The fundamental issue presented by the consumerisation of IT is the expansion of platforms present in your company. It could be as simple as device models not being vetted by IT. It could mean hardware, software or services from companies other than preferred suppliers. If not handled properly, the result is IT upon which users depend but over which the IT department has no visibility, potentially no training and likely no spares in stock.
On the software side, we are talking about a potential explosion in software used by everyday users. The corporately mandated browser might be quietly replaced with Firefox or Chrome. With the adoption of new browsers comes the use of plug-ins and extensions.
Cloud services such as TeamViewer - designed to work around corporate firewalls - may also start showing up. Social networks, instant messaging, alternative office packages and user experience improvement tools (such as UBitMenu and Classic Shell) are all things that IT departments may have to contend with.
How will this impact me?
Talking to vendors or asking IT practitioners about the scope of consumerisation within their organisations does not give you a true appreciation of what is in fact going on. Here you are seeing only the results from companies that have noticed and acknowledged the consumerisation of IT. Attempting to gauge the scope of the problem at a macro level is difficult: how do you take into account clandestine IT deployments you haven't become aware of?
The effect that consumerisation has on your organisation will depend on how powerful the push from users is, and whether or not you choose to embrace it. If nobody is pushing for anything different from what is already on the table, then everybody wins and life is good. If, however, there is deep-rooted dissatisfaction with the hardware, software and services provided by the IT department, then clandestine deployments of unauthorised computing will inevitably start to appear.
Anecdotally, a significant percentage of clandestine IT is actually created by IT department workers themselves. Everything from stealth installations of Spiceworks to the use of remote connectivity tools to access a home computer or test lab is a fairly common occurrence. In general, most organisations quietly ignore this, but when users start getting in on the game, suddenly there's a problem.
There are a lot more users than there are members of IT. Unlike IT, users probably don't have a true appreciation for the scope of the issues that their decisions can create. User-driven clandestine IT is also frequently invisible. The rest of IT knows about the secret Spiceworks server, even if they don't mention it to other departments. Users fearing reprisal are not going to speak out about using their iPad for work unless caught.
Be honest: Who is in control of your workplace IT? You or the user?
The severity of these risks varies with the business and the laws governing the jurisdictions in which they operate. Most clandestine IT deployments are - at their heart - an effort by employees to bypass company policies and procedures they feel are too restrictive. In some cases (I'd argue in most cases) it is simply a desire on the workers' part to feel that they have a sense of control. This can too easily get mixed up with a feeling that IT policies exist only so that some management types can justify their existence.
For others, it is the belief - accurate or not - that the ability to use their preferred service, software or device will make their job easier. As soon as this happens, corporate information moves beyond corporate control. The lost employee iPhone could contain millions of credit card numbers; the lost Surface tablet, an entire province's medical records.
Vulnerability to malware or even physical access attacks is greatly increased with a personal device compared with corporately controlled units. The effectiveness of centralised antivirus and mandated unlock passwords evaporates if users start walking around with unmanaged endpoints.
Beyond the obvious headline-grabbing security and privacy risks lurk far more mundane threats. Document retention legislation could result in serious fines if critical business communication begins occurring outside of corporate retention mechanisms. Regulations such as Sarbanes-Oxley and certifications such as PCI compliance require that certain levels of corporate security be applied to various bits of corporate information.
Industrial espionage is a lot harder to prove if information access occurs outside audited systems. Gaffes and mistakes can present an unprofessional corporate image. Is your salesman communicating with clients from his personal Gmail account? Is such communication purposeful, or did he mean to select the corporate account in his mail client but forgot?
Four ways to grab the reins
There are three basic approaches to dealing with the consumerisation of IT. The first (and often the instinctual) response is the Fortress IT approach: introducing or reinforcing policies forbidding the use of non-approved devices, software, or services. This is typically followed by a vicious crackdown, "education" of users and firings. The odd legal action against an employee for emphasis is periodically employed.
Digital natives don't seem to take particularly kindly to this approach. Unless there is an exceptionally good reason to be working at your company, Fortress IT will result in nothing except the assurance that you will not attract tomorrow's best and brightest.
The second approach, diametrically opposed to the first, is embracing the chaos. Far more popular among small businesses than large, this approach relies on training and trust in employees to treat corporate data with respect. It is typically enacted with fixed allowances for devices and paired with complete freedom for the employee to choose what they wish to use. If the user wants a better device than the corporate allowance covers, they must stump up the extra. Disputes over who owns the device if the employee leaves before the three-year mark are common.
While companies using this approach are generally less freewheeling about software and services, they still have far more open policies than those taking other approaches. The focus of companies embracing the chaos is giving employees the tools they need to do their job. This comes with an implicit acknowledgement that the subject matter experts - the employees - are in a better position to determine what those tools are than management or IT.
Embracing the chaos requires understanding and clearly communicating the risks that data loss poses to the company. Above all it requires honesty and trust from both the employer and the employee.
The third approach to coping with the consumerisation of IT is "Deploy Desired Devices" (DDD). The DDD approach relies on accepting that in any instance where we are dealing with employees who are not completely interchangeable, the "least cost" approach to IT has been a failure.
Instead of deploying rickety $400 Acer specials whose performance hasn't changed since the line was introduced six years ago, the DDD approach would require deploying hardware, software and services that people actually want to use. Instead of forcing a Blackberry or a Windows Phone on an employee, options are expanded to include desirable devices such as Android phones or iPads.
The DDD approach requires that companies talk to their staff, find out what people want to use, and why. It requires moving from supporting the smallest possible range of devices, software, and services to embracing "the new" even if that includes Apple in the enterprise.
There is also an implicit fourth approach: pretending that the consumerisation of IT isn't happening. Doing nothing is its own approach, and it is even potentially valid if clandestine IT has not yet started its inevitable infiltration. In the long run, however, doing nothing will fail.
Each approach will bear a cost. Clamping down on your staff will result in either high turnover or in having to provide alternative incentives for staff retention (pension plans, etc). Embracing the chaos requires true jack-of-all-trades systems administrators with a wide range of experience. And those are expensive.
DDD can be accomplished with existing support staff, but requires a corporate attitude of not skimping on the digital tools of the trade. Doing nothing is a gamble that will lead inevitably to data loss and possibly expensive legal concerns.
Still, it's not as bad as all that
Controlling a heterogeneous environment or testing new software and services has traditionally been difficult and expensive. Vendors are aware of this. As demand for new technology has increased, so too has the ease of meeting those demands.
Mobile device management software has come a long way. Popular enterprise endpoint management software now regularly supports Apple and even Linux. Virtualisation allows the creation and destruction of test environments with ease. RDS, VDI and ThinApp-style technologies allow the delivery of corporate applications or entire managed environments to unmanaged devices. Cloud-aware inventory software is increasingly capable of tracking and monitoring clandestine IT.
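At its core, the "tracking and monitoring clandestine IT" job is a diff between what inventory agents report and what policy approves. A minimal sketch of that idea (the package names and approved list here are invented for illustration; a real deployment would pull its installed-software list from an inventory agent, an MDM export, or something like `dpkg -l` output):

```python
# Toy sketch: flag software that is installed but not on the approved list.
# APPROVED and the sample inventory below are hypothetical examples.

APPROVED = {"firefox", "libreoffice", "7zip"}

def flag_unapproved(installed, approved=APPROVED):
    """Return the set of installed packages that are not on the approved list."""
    return set(installed) - set(approved)

if __name__ == "__main__":
    # In practice this list would come from an endpoint's inventory agent.
    inventory = ["firefox", "teamviewer", "dropbox", "7zip"]
    print(sorted(flag_unapproved(inventory)))  # flags teamviewer and dropbox
```

The point of the sketch is that the hard part isn't the comparison - it's getting truthful inventory data off endpoints you may not manage in the first place.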
The tools and resources necessary to support a broader range of devices, software, and services are becoming commonplace, even if the manageability of some devices lags behind.
Barring exceptional circumstances, "it's too hard" is no longer a valid excuse for failing to set policy on the consumerisation of IT. We have the technologies required to deal with this issue. What remains is choosing whether or not to acknowledge the reality of it, and what approach your business will take. ®