Why I'm sick of the new 'digital divide' between SMEs and the big boys

IT design by white paper is a stick to beat the little guy

Vendors say buy

For many in our industry, "competence" is synonymous with "design by white paper". The phrase is wielded by CCNAs (Cisco Certified Network Associates) foisting $3,000 Cisco routers on SMBs with $60,000 in yearly revenue, and by armchair systems architects who cannot disconnect from a lifetime's immersion in Tier 1 services and solutions.

"Competence" in our industry has become synonymous with regurgitating corporate messaging and deploying the newest solutions instead of those most fit for purpose.

It is both word and weapon: a sword oft drawn to mock and ridicule those who cannot afford to spend three years' gross income on the white paper solution to a problem.

I have had technology journalists at other publications belittle me for talking about such things. They believe that edge cases get too much press as it is.

We should focus on the middle of the curve, I am told; it is our ethical duty to use our privileged position of textual influence to spread "proper" administration and design principles.

This cry has been echoed by many a comment-poster as well: push the "right" way to do things, talk about the "right" vendors and only discuss the "best" technologies, means of application and most common use cases.

First, we assume a spherical cow

I acknowledge that this is probably a logical fallacy of some variety, but I can't get past the concept that more people would focus on "design by white paper" if it were such a great thing that worked for everyone.

The fact that discussions about all these edge cases, startup vendors, open source, cheaper alternatives and novel design approaches keep getting page views suggests to me that there is real demand for that information.

That this demand exists is anathema to many. In their eyes, failing to champion the new simply because it is new is holding back progress. We are to eat what's put on our plate, pay what we're told to and accept what works in overly simplified simulations of what our use cases "should be".

I posit that a new digital divide is growing. This one is a gap of abstraction. Those with large enough budgets can assume a spherical cow exists and proceed from there to design their entire farm.

Where the world doesn't meet their models they can just toss some more money at it and the problem goes away; the entire nature of their design process allows it.

Virtual networks and virtual storage running virtual desktops and automatically deployed applications in virtual data centres running on clouds of hypervisors managed by statistical modelling and predictive analytics tools. Everything can be solved with a few more nodes or a few more instances.

None of that works in a world where the cost of a single Enterprise CAL (client access licence) is a major expense, to say nothing of the server licence or the cost of the actual hardware.
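To make the arithmetic behind that point concrete, here is a minimal sketch. The $60,000 revenue figure comes from the column; the per-CAL and server licence prices are hypothetical, illustrative numbers, not real vendor pricing.

```python
def licensing_share_of_revenue(seats, cal_price, server_licence, yearly_revenue):
    """Return licensing spend as a fraction of yearly revenue.

    A white paper treats licensing as a line item; for a small shop
    it can be a meaningful slice of the whole business.
    """
    total = seats * cal_price + server_licence
    return total / yearly_revenue


# A 10-seat SMB with the column's $60,000 yearly revenue, assuming
# (hypothetically) $100 per CAL and a $700 server licence:
share = licensing_share_of_revenue(10, 100, 700, 60_000)
print(f"{share:.1%} of revenue")  # prints "2.8% of revenue"
```

Even before hardware, that is a few percent of gross revenue on licences alone, which is exactly the sort of line item the spherical-cow design process rounds to zero.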

So what is "proper" IT design? Is it an addiction to off-the-shelf technologies and white papers? Is it compiling everything from source and knowing how to write machine code? Or is it something far more complicated than either?

I believe that "proper" IT is about matching the solution to the problem and that it rarely ends up being the same exact solution twice. Your mileage may vary. Debate, as always, in the comments. ®
