
Why I'm sick of the new 'digital divide' between SMEs and the big boys

IT design by white paper is a stick to beat the little guy

Vendors say buy

For many in our industry, "competence" is synonymous with "design by white paper". The word is wielded by CCNAs (Cisco Certified Network Associates) foisting $3,000 Cisco routers on SMBs with $60,000 in yearly revenue, and by armchair systems architects who cannot disconnect from their lifetime's immersion in Tier 1 services and solutions.

"Competence" in our industry has become synonymous with regurgitating corporate messaging and deploying the newest solutions instead of those most fit for purpose.

It is both word and weapon: a sword oft drawn to mock and ridicule those who cannot afford to spend three years' gross income on the white paper solution to a problem.

I have had technology journalists at other publications belittle me for talking about such things. They believe that edge cases get too much press as it is.

We should focus on the middle of the curve, I am told; it is our ethical duty to use our privileged position of textual influence to spread "proper" administration and design principles.

This cry has been echoed by many a comment-poster as well: push the "right" way to do things, talk about the "right" vendors and only discuss the "best" technologies, means of application and most common use cases.

First, we assume a spherical cow

I acknowledge that this is probably a logical fallacy of some variety, but I can't get past the concept that more people would focus on "design by white paper" if it were such a great thing that worked for everyone.

The fact that discussions about all these edge cases, startup vendors, open source, cheaper alternatives and novel design approaches keep getting page views suggests to me that there is real demand for that information.

That this demand exists is anathema to many. In their view, to not champion the new simply because it is new is to hold back progress. We are to eat what's put on our plate, pay what we're told to and accept what works in overly simplified simulations of what our use cases "should be".

I posit that a new digital divide is growing. This one is a gap of abstraction. Those with large enough budgets can assume a spherical cow exists and proceed from there to design their entire farm.

Where the world doesn't meet their models they can just toss some more money at it and the problem goes away; the entire nature of their design process allows it.

Virtual networks and virtual storage running virtual desktops and automatically deployed applications in virtual data centres running on clouds of hypervisors managed by statistical modelling and predictive analytics tools. Everything can be solved with a few more nodes or a few more instances.

None of that works in a world where the cost of a single Enterprise CAL (client access licence) is a major expense, to say nothing of the server licence or the cost of the actual hardware.

So what is "proper" IT design? Is it an addiction to off-the-shelf technologies and white papers? Is it compiling everything from source and knowing how to write machine code? Or is it far more complicated than either?

I believe that "proper" IT is about matching the solution to the problem and that it rarely ends up being the same exact solution twice. Your mileage may vary. Debate, as always, in the comments. ®
