Virgin Media downed by Manchester arsonists
Halloween spooks cable operator
A suspected arson attack at a Manchester electricity substation left thousands of Virgin Media customers in the North West without TV and broadband for several hours overnight.
The Halloween attack in the Miles Platting area left 60,000 homes in the centre of the city without power at about 7.30pm. They were all back on the grid by 9pm.
Virgin's customers in the area also experienced a longer blackout of TV, internet and on-demand service.
A spokesman for the firm said today that most of its users were back online by 2am. He was not able to confirm suggestions from Reg readers in the area that its problems were caused by the air conditioning for its servers being knocked out.
The Virgin Media status page for the incident is here, though it hasn't been updated since 8.29pm yesterday.
There's more on the arson from the Manchester Evening News here. ®
Your last word sums up the problem: investment. The company I work for (a very, very large American telco) has many data centres, PoP sites, switch sites and so on around the country which have great monitoring facilities but are sadly lacking any investment in infrastructure, so unless something goes bang in a big way we are purely reactive. I have one site in Manchester where the aircon can't cope on a mild day. The solution? Get some fans in: not even portable aircons, just fans.
What we really need to do is replace the two ailing, ageing aircon units which, when installed, were just about adequate for the equipment in there at the time. Several years of shoehorning more and more kit into any available space has left the room just about ready to burst into flames (perhaps when it does we can have a new ACU?).
I am guessing that the company I work for is not unique in its investment policy: if it (just) works, then there is no need to replace it.
"Not always possible when hosting 24-7-365 apps and services"
Sorry, the above should be amended to:
"The whole system ‘should’ also be tested with a full <POWER> shutdown (not simulated) at least every six months."
We do this for both of the (24/7) hosting data centres we own (as well as a genset-only test under load every three months) and all the equipment keeps working, as it is designed to. Because it is a controlled shutdown, if anything doesn't work as it should you can restore power and sort it out, so that when/if there is a real power cut everything 'should' work.
The downside of the extra testing is that, without the proper equipment, there is a higher risk that equipment which has been taken off load and put back on again will fail when the power is restored, so it can be a difficult balance to strike.
Comfort cooling air con units draw a lot of power when they first fire up, so if they all started at once they could trip the main fuse (depending on the type, as some are designed to ignore very short spikes). It is therefore common either to set them to start up a few at a time or to set a trip that leaves some of them off when the power returns. The problem with the latter is that it relies on someone being there and remembering to reset them.
If the first time anyone knows the air con is not working is when the servers start turning themselves off, much of the damage is already done. IP-based single-temperature monitors that send reports at high and critical levels cost less than £300, so they can be a good investment.
> "As a minimum personally I would not consider it a proper data centre without full UPS/generator back up."
You would think so wouldn't you? I was thinking along the same lines.
>"The whole system ‘should’ also be tested with a full shutdown (not simulated) at least every six months."
Not always possible when hosting 24-7-365 apps and services.
The other site with the air con problems was a normal office building, and the AC was ordinary office air conditioning controlled from a wall-mounted unit. These have a habit of not coming back on after a power cut, and the first sign that they haven't is usually when servers start turning themselves off and someone goes to investigate.