Conficker botnet wake up call only pinged zombie minority

Resident evil

The effective size of the Conficker botnet might be far smaller than previously thought.

Last week, machines infected with the latest variant of Conficker began to download additional components via the worm's built-in P2P update mechanism - files associated with the rogue anti-malware application SpywareProtect2009 and the notorious Waledac botnet client.

Security researchers at Kaspersky Lab have developed an application that analyses the P2P network communications associated with the malware. Over a 24-hour observation period, Kaspersky analysts spotted 200,652 unique IP addresses participating in the network, far fewer than initial estimates of infected Conficker hosts, which ran into the millions.
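
At its core, that figure is a tally of distinct peer IP addresses seen during a fixed observation window. The Python sketch below illustrates the idea under stated assumptions - the log file name (peer_log.txt), its one-record-per-line timestamp/IP format, and the 24-hour cut-off are hypothetical and do not describe Kaspersky's actual tool.

# Hypothetical sketch: tally unique peer IPs seen in a 24-hour capture window.
# Assumes a plain-text log ("peer_log.txt") with one "ISO-timestamp IP" pair per
# line; the file name and format are illustrative, not Kaspersky's tooling.
from datetime import datetime, timedelta

WINDOW = timedelta(hours=24)

def unique_peers(log_path: str) -> int:
    seen = set()
    window_start = None
    with open(log_path) as log:
        for line in log:
            if not line.strip():
                continue  # skip blank lines
            ts_text, ip = line.split()
            ts = datetime.fromisoformat(ts_text)
            if window_start is None:
                window_start = ts  # first record anchors the observation window
            if ts - window_start > WINDOW:
                break  # stop once the 24-hour observation period has elapsed
            seen.add(ip)
    return len(seen)

if __name__ == "__main__":
    print(f"Unique peer IPs observed: {unique_peers('peer_log.txt')}")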

However, Kaspersky notes that the low figure is explained by the fact that only the latest variants of the worm communicate over the monitored P2P network, and only a minority of the nodes infected with earlier variants have been updated to the latest version.

A more detailed analysis, including geographical breakdown of compromised hosts, can be found on Kaspersky's blog here. ®
