Boffins devise early-warning bot spotter

Conficker's Achilles Heel

Researchers have devised a way to easily detect internet domain names generated by so-called domain-fluxing botnets, a method that could provide an early-warning system of sorts, alerting admins to infections on their networks.

Botnets including Conficker, Kraken and Torpig use domain fluxing to make it harder for security researchers to disrupt command and control channels. Malware instructs infected machines to report to dozens, or even tens of thousands, of algorithmically generated domains each day to find out if new instructions or updates are available. The botnet operators need to own only a few of the addresses in order to stay in control of the zombies. White hats effectively must own all of them.
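
To make the mechanics concrete, here is a minimal sketch of a date-seeded domain generation algorithm. The hash scheme, label length, and .org suffix are illustrative assumptions, not Conficker's or any real botnet's recipe; the point is that bot and botmaster can derive the same candidate list independently.

```python
import hashlib
from datetime import date

def generate_domains(day: date, count: int = 50) -> list[str]:
    """Toy domain-generation algorithm: hash the date plus a counter
    so infected machines and their operator independently agree on
    the day's candidate rendezvous domains. Parameters are illustrative."""
    domains = []
    for i in range(count):
        digest = hashlib.sha256(f"{day.isoformat()}-{i}".encode()).digest()
        # Map the first ten hash bytes onto lowercase letters.
        label = "".join(chr(ord("a") + b % 26) for b in digest[:10])
        domains.append(label + ".org")
    return domains

# The operator registers only a handful of these; bots try them all.
print(generate_domains(date(2010, 11, 5))[:3])
```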

It's a clever architecture, but it has an Achilles Heel: the botnet-generated domain names – which include names such as joftvvtvmx.org, ejfjyd.mooo.com, and mnkzof.dyndns.org – exhibit tell-tale signs that they were picked by an algorithm rather than a human being. By analyzing a network's DNS, or domain name system, traffic, the method can quickly pinpoint infections so that admins can disrupt them.

“In this regards, our proposed methodology can point to the presence of bots within a network and the network administrator can disconnect bots from their C&C server by filtering out DNS queries to such algorithmically generated domain names,” the researchers wrote in a paper that was presented this week at the ACM Internet Measurement Conference in Australia.
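
In practice, that filtering step could be as simple as the sketch below, which refuses resolver queries for names the detector has flagged. The query-handling hook and the REFUSED/RESOLVE actions are assumptions made for illustration, not part of the paper.

```python
# Hypothetical hook in a resolver's query path: once a set of
# algorithmically generated names has been flagged, queries for them
# are refused, cutting infected hosts off from their C&C server.
flagged = {"joftvvtvmx.org", "ejfjyd.mooo.com", "mnkzof.dyndns.org"}

def handle_query(qname: str) -> str:
    """Return an (assumed) resolver action for a single DNS query."""
    if qname.lower().rstrip(".") in flagged:
        return "REFUSED"  # also worth logging the source as likely infected
    return "RESOLVE"

print(handle_query("ejfjyd.mooo.com."))  # -> REFUSED
```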

The method uses techniques from signal detection theory and statistical learning to detect domain names generated by a variety of algorithms, including those based on pseudo-random strings, dictionary words, and words that are pronounceable but not found in any dictionary. It achieves a 100 percent detection rate with no false positives when 500 domains are generated per top-level domain. When only 50 domains are mapped to the same TLD, the detection rate remains 100 percent, but false positives jump to 15 percent.
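
Those measures work on character distributions, so the flavour of the idea can be shown with one plausible instance: the Kullback-Leibler divergence between a suspect group's character frequencies and a benign baseline. The baseline corpus, smoothing floor, and any alert threshold below are assumptions, not the authors' published parameters.

```python
import math
from collections import Counter

def char_distribution(labels):
    """Character unigram frequencies over a group of domain labels
    (second-level names only; the caller strips TLDs)."""
    counts = Counter(c for label in labels for c in label if c.isalnum())
    total = sum(counts.values())
    return {c: n / total for c, n in counts.items()}

def kl_divergence(p, q, floor=1e-6):
    """D(P || Q): how far suspect distribution P drifts from benign
    baseline Q; the floor stands in for characters unseen in Q."""
    return sum(pv * math.log(pv / q.get(c, floor)) for c, pv in p.items())

# Baseline from known-good names; a real deployment would use a large corpus.
benign = char_distribution(["google", "amazon", "wikipedia", "register"])
suspect = char_distribution(["joftvvtvmx", "ejfjyd", "mnkzof"])

# A group whose divergence exceeds a tuned threshold is flagged as DGA-like.
print(round(kl_divergence(suspect, benign), 3))
```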

The technique was developed by Sandeep Yadav, Ashwath K.K. Reddy, and A.L. Narasimha Reddy of Texas A&M's Electrical and Computer Engineering department, and Supranamaya Ranjan of Sunnyvale, California-based Narus. A PDF of their paper is here. ®
