Coders are creatives too: Where's our love?

The underappreciated class

Opinion Developers conjure something from nothing, all day, every day; to my mind this is creation in its purest sense. Okay, we’re not knocking up a universe in seven days or putting the final touches to The Scream – although it may often feel like that – but building order, fun, even beauty, from a tapestry of ones and noughts is surely a creative endeavour. But are coders considered creatives? Nope. My experience of business sees that title handed to graphic designers above all else; a troupe of ne’er-do-wells who pinch from a Google image search and spend the rest of their day colouring-in.

Users' experience of super-simple, user-friendly 'apps' gives the illusion that not much complexity is involved in their creation.

How did this happen? How did a person whose greatest educational achievement is crayoning without going over the lines get termed “a creative”, when the people who built our world are dismissed as geeks and bottom feeders?

The simplistic explanation is that society always looked destined to divide, as H. G. Wells foresaw, into Eloi and Morlocks. There are those who do – a class that now incorporates programmers, who in Karl Marx’s analysis control the means of production, but forever toil unappreciated in the shadows – while the gilded Eloi dance thoughtlessly in the sunshine. My sympathies were always with the Morlocks. It’s compelling as an image, but somewhat lacking in explanatory power. Instead, I believe this division owes less to Marx and Darwin, and more to software designers’ own brilliance.

When an industry matures, those at the cutting edge eventually become so distant from the end product that a certain alienation builds up – workers who don’t follow their wares to market become disassociated from their work, both in their own minds and in their customers’. That much of Marx’s analysis was correct, for software as for all mature industries, even if he was bang wrong on everything else.

But this doesn’t explain the meteoric rise of the faux creative industries, the doodlers and plagiarists, the hordes of MacBook-toting dullards who mask their lack of originality in design and business with opaque marketing-speak and the insouciant arrogance of terminal idiocy. This, while genuine technological innovation is dismissed. What’s happening is that software, hardware, and the all-important user interface are becoming so slick, so intuitive, so reliable, that the notion of a creator behind the wizardry is slipping from the consumer’s mind. Like our sewers and roads, like all the fundamentals of industrialised society that become invisible to Joe Public through familiarity, fine software development is fading from mind.

The rise of the App, a tool that does its simple job without instructions, is emblematic of this. It’s almost a perfect inversion of the religious argument for Intelligent Design: there, the existence of a perfect watch presupposes a perfect watchmaker – the complexity of life demands that something built that complexity. The situation in our less-than-pious realm is the flipside, because the complexity just isn’t perceived. Fluid and simple software suggests – to the ignorant – a fluid and simple backend.

Does this matter? Well, in practical terms, I believe it explains the eagerness with which even the most law-abiding will pirate software – they simply don’t grasp that any effort goes into it, so why should the builders be rewarded? However, it’s the broader implications that interest me most.

We know that happiness and health are linked to status in the workplace; perceived status, above all else. It’s not just about reward – self-perception of status rarely hinges on salary. Rather, we’re happier, and saner, if we feel appreciated and valued. Today, in the outside world, only journalists and politicians are viewed less favourably than those bundled into the pit of IT, so is it any wonder that many coders at their peak seem to struggle with depression, drink or substance abuse? Granted, in the UK today it's hard to find many under-forties who wouldn't be considered alcoholics in the US, but even so, on anecdotal evidence at least, these problems seem more acute among the IT community.

The orgy of adulation for Steve Jobs – undeniably an astute businessman, but a man whose main personal innovation appears to have been making some previously matte products shiny – certainly drove me to drink. The media’s over-emphasis on design, on form over function, is just what we would expect, though, as our understanding of what lies beneath dwindles. A society that celebrates the superficial, from cars to celebrities, that refuses to engage with nuts and bolts but delights in froth, will naturally fail to see the innovation in a particularly deft clause of C#, because it simply isn’t in its universe.

When the public does engage with IT, it’s generally in conflict – and then they’ll cast around for help from those they consider expert. I’m sure I’m not the only developer who dreads family Christmases for the inevitable slew of requests for help with wireless niggles and errant popups. Why ask a developer to configure a network? Because outside the coding cabal, all they know is that we do something with computers. But are graphic designers asked to help with the decorating?

It’s about comprehension. Design can be grasped without analysis. Our society is moving from one in which the educated took pride in knowing how the world worked to a more blinkered place; today, it’s about how you feel. It’s about how a mountain, a motorcycle or a fridge-freezer makes you feel. Brand managers are full of it. The focus is on sensory input, not intellectual understanding of mechanisms. From school to showroom, empathy is the buzzword; sensation is the medium for conveying understanding.

Software can’t be understood with your eyes, ears or nose. But you can see the results of it, so that’s what registers. Looking ahead, it’s only going to get worse. I bore anyone who’ll listen with my prediction for the ‘next big thing’ – the application of a Kinect-style interface to a smartphone. Sat on a pub table in front of you, projecting onto a nearby wall, your fingers will dance in the air and the pretty pictures will respond accordingly – a pinch, a twirl, a snap – the delightful complexity of human hand gestures, captured and translated, fluid, flawless. Who will snap their fingers and think of the coders? No, Clarke’s third law will apply, I think: any sufficiently advanced technology is indistinguishable from magic. To the world, this isn’t an interface: it’s sorcery. We tell the end consumer: "Look what it does", and because it does it so well, they don't ask: "How does it work?"

As interfaces develop, whichever way that might go, the natural path is towards a more human method of communication – be that gesture or speech – and the natural outcome is to see the product as less of a distinct app, and more of an extension to our own heads, or even as a friend. Siri's already hitting a solid B+ on the Turing test.

The irony is that as developers engineer a convincing ghost in the machine, they simultaneously erase themselves from the minds of end users; the fabulous creation shunts aside its creators. The solutions to this aren’t promising. Schools think teaching PowerPoint is teaching computing. Girls in bars are not charmed by explanations of object-oriented programming – believe me, I’ve tried. A sonnet to SQL just won’t get the message over, and if there’s a haiku that conveys the initial buzz of a Hello World app, then I haven’t read it.

I can see only one possible route out, a way to alert end users to the poor saps doing all the work... Get sloppy. Bubble up some static. Introduce a few cracks, and the façade will fade. It’s not a bug, it’s a cry for attention. ®

Frank Fisher is a monkey see, monkey do, jack-of-all-trades developer, occasional blogger and full-time troublemaker.
