Original URL: https://www.theregister.com/2008/01/18/mashup_reaction/

Mashups haunted by past experience

User-generated IT support

By Aubry Thonon

Posted in Channel, 18th January 2008 19:57 GMT

I have lived through re-orgs, outsourcing and off-the-shelf applications being shoehorned into niche markets by over-zealous management. The latest trend in software, though, is for user-generated mashups.

Recently Serena Software announced its user-friendly mashup tool. According to Serena, its tools will "let non-IT staff take care of tedious, line-of-business Office applications". I screamed. Serena is not the only IT vendor to make such noises.

Initially, this sounds like a good idea - why shouldn't users be allowed to take charge of the "day to day" hassles and short-circuit the development process, leaving the IT staff free to tackle the really big projects?

The problem is that these pieces of code will make their way around an organization. That can be good in some cases, but inevitably unmonitored, unapproved code will be passed from user to user with disastrous consequences. And if - or rather when - something does go wrong, it will be the IT staff who have to go in and fix it. So much for easing the burden on IT.

Whether the mashup camp knows it or not, it still needs the skills and support of experienced IT staff. Knowing how to drive a car doesn't make you a racing driver, and being able to do technical drawing doesn't qualify you as an architect.

To illustrate my point, let me share two real examples from my own IT experience.

Let's go back to the late 80s, when a public-sector teacher got his hands on a programming manual for the language behind the school-automation system deployed in his area's public schools.

Hearing grumblings about the lack of decent library software in the school system, this person read the manual and wrote a piece of software - a DBQ database running on 486 PCs under DOS - that ultimately served the needs of his school. Admirable. But now the problems began. His school librarian began to think of changes to the software, and the teacher implemented them. Hearing of the system, librarians at other schools acquired copies of the software and began using it.

Pretty soon, the software had made its way around the state - more than 1,000 primary and secondary schools. And then someone phoned the Department of Education's IT help-line for help with the software. Frantic enquiries were made up and down the chain of command trying to figure out where the software had come from, and how it had spread to so many schools.

Eventually, the author was found, questions were answered and a compromise was reached: the teacher was pulled from his teaching duties and moved to head office to maintain the library software. The teacher was now folded into the IT team, which meant the maintenance of the software could be properly managed and documented should he ever leave or retire. It did mean, though, that this one teacher never taught again.

Could this have been avoided? Very likely - the librarian could have contacted the IT department, the teacher could have passed his code to IT staff after it became apparent that it was useful - the points of recovery were there. But because the software was allowed to reach critical mass unchecked, it was too late to rein it in.

Skip forward almost a decade to another school system - a nationwide music academy responsible for overseeing the examinations of musicians and for awarding certificates.

With six major offices around the country, countless test centers and hundreds of people on staff, this was not a small organization. A panicked call for IT help was received one day and I - as the resident expert in both the hardware and software language they were using - was sent to investigate.

What I found was a system, written in an SQL-based language, that required users at various points to break out of the program and run ad-hoc SQL queries by hand to populate work tables with data for the program to process. And by users, I mean examiners with a keyboard under one hand and an SQL manual gripped in the other.
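To make the anti-pattern concrete, here is a minimal sketch of what that workflow amounts to, using Python's standard-library sqlite3 module. The real system's language, schema and table names are unknown to me; `exam_results` and `work_certificates` and everything else here are invented purely for illustration of the shape of the problem - an application that cannot fill its own work table, so a human must do it by typing SQL.

```python
import sqlite3

# Hypothetical recreation of the anti-pattern: the application assumes a
# pre-populated work table, so a user must run ad-hoc SQL by hand first.
# All table and column names are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE exam_results (candidate TEXT, grade INTEGER, year INTEGER);
    CREATE TABLE work_certificates (candidate TEXT, grade INTEGER);
    INSERT INTO exam_results VALUES
        ('A. Smith', 5, 1997), ('B. Jones', 3, 1996), ('C. Brown', 8, 1997);
""")

# The step the program could not do for itself: an examiner, SQL manual in
# hand, "breaks out" and fills the work table with this year's results.
conn.execute("""
    INSERT INTO work_certificates
    SELECT candidate, grade FROM exam_results WHERE year = 1997
""")

# Only now can the application's own processing loop run.
for candidate, grade in conn.execute(
        "SELECT candidate, grade FROM work_certificates ORDER BY candidate"):
    print(f"Certificate: {candidate}, grade {grade}")
```

The point is not the SQL itself, which is trivial, but that a mistyped `WHERE` clause at that manual step would silently corrupt the run - and the person typing it is an examiner, not a developer.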

The system had been written by a staff member who, by the time I arrived, had already retired. Everything was so badly written that, before leaving, the author had been forced to perform various "housecleaning" tasks by hand on an almost daily basis just to keep the system up and running.

We declined to maintain the system and advised the academy to purchase a commercial product - which it did six months later. It meant that they were using a closed-source system, but the software had been properly designed, documented and supported by a company that knew what it was doing.

Some might say this was the bad old days before web services and "open standards". But could any of this have been lessened using the philosophies of Web 2.0 or mashups?

In my opinion: no. For every success story in the free and open source software arena there are a multitude of projects that died through - let's be honest - incompetent coding practices. It's just that you never hear of them, so it's easy to forget these failures exist.

Simply applying a label like "mashup" or Web 2.0 does not suddenly make those involved into super programmers who can deliver the goods on time. A project under any coding philosophy is only as strong as its weakest development team member, and mashups allow everybody to be a member.

There will, of course, be those who do know what they are doing and will work in a responsible manner. They may even leave comments on the code that they pass around, and leave behind enough documentation to make maintenance possible. I am not holding my breath, though. The majority will, I fear, fall squarely into the "know enough to be dangerous" category.

Register reader Aubry Thonon is a senior analyst with more than 20 years' experience as a tester, coder, specification writer, team lead, designer and analyst. A regular commenter on Reg stories, Aubry is based in Australia and studying to become a systems architect.