
Is AV product testing corrupt?

Who can you trust?


I had a conversation a month or two ago with someone high up in one of the IT security companies. He was bemoaning the fact that his company's AV product had performed poorly in tests run by AV-Test.org. He was deeply suspicious of the results anyway because his company actually provides its AV engine to another company that had performed better in the test. He didn't see how that could be, unless a mistake had been made in running the tests.

As it happens, a few AV vendors are less than impressed with "independent AV tests". The published rankings influence buying decisions, but quite a few vendors believe they don't reflect product capability.

I was sent a well-written essay on the topic, which I'm reproducing here. It explains the problem better than I can…

The Need for a Standard Malware Competitive Comparison

When I go to buy something, the first thing I do is check out the competitive reviews. If it's a household appliance, I'll look at Consumer Reports. If it's a car, I may look at Car and Driver® or Edmunds. What about when you're looking at security for your home PC, who can you trust to give you the honest review?

The average consumer is being pummeled by competitive comparisons of the performance of anti-virus and anti-spyware. The comparisons include the large and the small anti-malware vendors, and they provide amazingly discordant results. Can I place my computer's health and safety in a free, online product? Which of the major companies have the best performance? Major magazines report comparison statistics, but which do you trust?

One of my favorite quotes is attributed to Benjamin Disraeli and popularised in the US by Mark Twain: "There are three kinds of lies: lies, damned lies, and statistics." He went on to explain that a person can use statistics to support an argument or even twist the statistics based on how the numbers are manipulated. This is a key issue with many of the product comparisons in the media today. Depending upon who paid for, supported or endorsed the test, the bias may change wildly.

I was just reading an article that really hit the nail on the head. Jeremy Kirk, in an article called "Security vendors question accuracy of AV tests" published in InfoWorld, talked about how this debate is finally being noticed by the public. The people he quotes are absolutely correct in their opinions that the current tests aren't truly reflective of the capabilities of today's anti-malware solutions.

In the article, John Hawes, a technical consultant for Virus Bulletin, said the signature-based tests are "not enormously representative of the way things are in the real world". That is an understatement in my opinion.
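To see why a purely signature-based test can be unrepresentative, consider a toy sketch (not any vendor's actual engine, and the sample data is invented for illustration): a pure signature scanner reduces to a set-membership check against hashes of known samples, so a test built from the same fixed sample set measures only coverage of those samples, not detection of anything novel.

```python
import hashlib

# Hypothetical digests standing in for a tester's fixed sample set.
KNOWN_MALWARE_HASHES = {
    hashlib.sha256(b"eicar-style sample A").hexdigest(),
    hashlib.sha256(b"eicar-style sample B").hexdigest(),
}

def signature_scan(file_bytes: bytes) -> bool:
    """Flag a file only if its exact hash matches a known sample."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_MALWARE_HASHES

# A test drawn solely from the same sample set scores this scanner
# perfectly, while even a one-byte variant slips through undetected:
print(signature_scan(b"eicar-style sample A"))   # -> True  (known sample)
print(signature_scan(b"eicar-style sample A!"))  # -> False (trivial variant)
```

The one-byte variant illustrates the essay's point: a sample-set test rewards memorising the test corpus, whereas real-world protection depends on behaviour the test never exercises.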

In almost any industry today, the accepted practice for an evaluation is to publish the criteria and methods used. This gives evaluations a clear, easily understood structure, so that both the methods and the underlying criteria can be examined publicly and objectively. In the malware comparison market, such practices are not the norm, which is concerning because the results are often grossly misinterpreted.

As an analogy, if I were looking to build a system to detect cancer, I'd build one that detects every kind of cancer out there. I surely wouldn't allow individual drug companies to supply me with samples of some "special" kind of cancer that only their drug works on; that would be silly. And what good would that do the public, or those who would lose their lives for lack of detection?

This is analogous to testing practices in the anti-malware industry with respect to the detection of virus samples. It is widely known that individual companies are allowed to supply specific "samples" to several "independent testing companies" so that their product will rate much higher than the competition's. This is not only unfair and technologically flawed; it also crosses a clear ethical line. It is potentially harmful to all of us as consumers and individuals on the Internet, and to the Internet as a whole.


