Marketing dept to blame for website crashes: official

Nice to be proven right, eh?

It is as you have always suspected: the marketing department is responsible for around a quarter of the overloads and crashes your poor company website suffers.

Website testing firm SciVisum spoke to marketing types at 100 UK-based companies and found that 26 per cent never mention planned online promotions to the guys and gals in the tech boiler room.

More than half admit they forget to provide a warning at least some of the time, and nearly two thirds of marketing bods confess to having no idea how many user transactions their website can support, despite an average transaction value of £50 to £100.

The consequence of this communications gap is not surprising: 73 per cent of companies reported website failures during marketing campaigns. Presumably the surviving few include the 22 per cent of companies who say they always talk to the tech team about such things.

Deri Jones, SciVisum CEO, says that while some of the gap can be attributed to the two groups traditionally not liking each other much, he thinks the problem really starts because marketing and IT approach things from such different angles.

"Marketing people have a tendency to blame the tech department when a campaign doesn't go as well as it should, but from a technology perspective, if the server hasn't crashed, everything is working well. IT measures server load, while marketing is looking for completed transactions."

Jones advises companies to consider the so-called user journey through the site, and says that it is essential that marketing and IT come together at the planning stages of any campaign to map out what this journey is likely to look like.

"The IT department needs to be able to plot this against the website design to ensure there are no hidden barriers to performance. Often, with knowledge of the journeys and the likely load levels, sensible code refactoring and configuration tweaking can give order-of-magnitude throughput gains at the critical bottlenecks," he says.

He argues that a site can appear to be working just fine under normal conditions, but oddities in the back end can trip users up when the site is under pressure. "If you look at the journey a user makes through a site, and try to follow that journey while simulating a high traffic load, you can find log jams in unexpected places," he concludes. ®
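Jones's approach can be sketched in code. The following is a minimal illustration, not SciVisum's actual tooling: it replays a named user journey concurrently and reports the median time spent in each step, so the slowest link stands out. The step functions here are hypothetical stand-ins (simulated with sleeps); in a real test each would issue an HTTP request against the site.

```python
import time
import statistics
from concurrent.futures import ThreadPoolExecutor

# Hypothetical journey steps; in practice each would make a real request.
# "checkout" is deliberately slower to stand in for a back-end bottleneck.
def browse():
    time.sleep(0.001)

def add_to_basket():
    time.sleep(0.001)

def checkout():
    time.sleep(0.005)

JOURNEY = [("browse", browse), ("add_to_basket", add_to_basket),
           ("checkout", checkout)]

def run_journey():
    """Walk one user through every step, timing each one."""
    timings = {}
    for name, step in JOURNEY:
        start = time.perf_counter()
        step()
        timings[name] = time.perf_counter() - start
    return timings

def load_test(concurrent_users=20):
    """Run many journeys in parallel and return median latency per step."""
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        results = list(pool.map(lambda _: run_journey(),
                                range(concurrent_users)))
    return {name: statistics.median(r[name] for r in results)
            for name, _ in JOURNEY}

if __name__ == "__main__":
    medians = load_test()
    bottleneck = max(medians, key=medians.get)
    print(f"Slowest step under load: {bottleneck}")
```

Run against a live site, the same structure would surface which step of the journey degrades first as concurrency rises, which is the "log jam in an unexpected place" Jones describes.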

Related stories

ID theft fears prompt ecommerce boycott
Home delivery is the deal-breaker in ecommerce
Firefox users turned away from 10% of top UK sites
