
Is AV product testing corrupt?

Who can you trust?

I had a conversation a month or two ago with someone high up in one of the IT security companies. He was bemoaning the fact that his company's AV product had performed poorly in tests run by AV-Test.org. He was deeply suspicious of the results anyway because his company actually provides its AV engine to another company that had performed better in the test. He didn't see how that could be, unless a mistake had been made in running the tests.

As it happens, there are a few AV vendors who are less than impressed with "independent" AV tests. The published rankings influence buying decisions, but quite a few vendors believe they don't reflect product capability.

I was sent a well-written essay on the topic, which I'm reproducing here. It explains the problem better than I can…

The Need for a Standard Malware Competitive Comparison

When I go to buy something, the first thing I do is check out the competitive reviews. If it's a household appliance, I'll look at Consumer Reports. If it's a car, I may look at Car and Driver® or Edmunds. But what about when you're looking at security for your home PC? Who can you trust to give you an honest review?

The average consumer is being pummeled by competitive comparisons of anti-virus and anti-spyware performance. The comparisons include both large and small anti-malware vendors, and they provide amazingly discordant results. Can I place my computer's health and safety in a free, online product? Which of the major companies has the best performance? Major magazines report comparison statistics, but which do you trust?

One of my favorite quotes is attributed to Benjamin Disraeli and popularised in the US by Mark Twain: "There are three kinds of lies: lies, damned lies, and statistics." The point is that a person can use statistics to support an argument, or twist them entirely, depending on how the numbers are manipulated. This is a key issue with many of the product comparisons in the media today. Depending upon who paid for, supported or endorsed the test, the bias may change wildly.

I was just reading a piece that really hit the nail on the head: Jeremy Kirk's article "Security vendors question accuracy of AV tests", published in InfoWorld, which discussed how this debate is finally being noticed by the public. The people he quotes are absolutely correct in their opinions that the current tests aren't truly reflective of the capabilities of today's anti-malware solutions.

In the article, John Hawes, a technical consultant for Virus Bulletin, said the signature-based tests are "not enormously representative of the way things are in the real world". That is an understatement in my opinion.

In almost any industry today, the acknowledged correct practice for an evaluation is to publish the criteria and methods used. This gives evaluators a clear and easily understood direction, so that both their methods and their internal criteria can be carefully examined publicly and objectively. In the malware comparison market, such practices are not the norm, and this is concerning since results are often grossly misinterpreted.

As an analogy, if I were looking to build a system to detect cancer, I'd build one that detects every kind of cancer that's out there. I surely wouldn't allow individual drug companies to supply me with samples of some "special" kind of cancer that only their drug works on; that would be silly. And what good would that do the public, or those who would lose their lives because their cancer went undetected?

This is analogous to testing practices in the anti-malware industry with respect to the detection of virus samples. It is widely known that individual companies are allowed to supply specific "samples" to several "independent testing companies" so that their products rate much higher than the competition's. This is not only unfair and technologically flawed, but also crosses the line of ethically appropriate behavior. It is potentially harmful to all of us as consumers and individuals on the Internet, and to the Internet as a whole.
