
Is AV product testing corrupt?

Who can you trust?

I had a conversation a month or two ago with someone high up in one of the IT security companies. He was bemoaning the fact that his company's AV product had performed poorly in tests run by AV-Test.org. He was deeply suspicious of the results anyway because his company actually provides its AV engine to another company that had performed better in the test. He didn't see how that could be, unless a mistake had been made in running the tests.

As it happens, quite a few AV vendors are less than impressed with "independent" AV tests. The published rankings influence buying decisions, but many vendors believe they don't reflect real product capability.

I was sent a well-written essay on the topic, which I'm reproducing here. It explains the problem better than I can…

The Need for a Standard Malware Competitive Comparison

When I go to buy something, the first thing I do is check out the competitive reviews. If it's a household appliance, I'll look at Consumer Reports. If it's a car, I may look at Car and Driver® or Edmunds. What about when you're looking at security for your home PC, who can you trust to give you the honest review?

The average consumer is being pummeled by competitive comparisons of anti-virus and anti-spyware performance. The comparisons include both large and small anti-malware vendors, and they produce amazingly discordant results. Can I place my computer's health and safety in a free, online product? Which of the major companies has the best performance? Major magazines report comparison statistics, but which do you trust?

One of my favorite quotes is attributed to Benjamin Disraeli and popularised in the US by Mark Twain: "There are three kinds of lies: lies, damned lies, and statistics." He went on to explain that a person can use statistics to support an argument or even twist the statistics based on how the numbers are manipulated. This is a key issue with many of the product comparisons in the media today. Depending upon who paid for, supported or endorsed the test, the bias may change wildly.

I was just reading an article that really hit the nail on the head. Jeremy Kirk, in an article called "Security vendors question accuracy of AV tests" published in InfoWorld, talked about how this debate is finally being noticed by the public. The people he quotes are absolutely correct in their opinions that the current tests aren't truly reflective of the capabilities of today's anti-malware solutions.

In the article, John Hawes, a technical consultant for Virus Bulletin, said the signature-based tests are "not enormously representative of the way things are in the real world". That is an understatement in my opinion.

In almost any industry today, the acknowledged correct practice for an evaluation is to publish the criteria and methods used. This gives evaluations a clear and easily understood direction, so that both the methods and the internal criteria can be examined publicly and objectively. In the malware comparison market, such practices are not the norm, which is concerning because the results are often grossly misinterpreted.

As an analogy, if I were building a system to detect cancer, I'd build one that detects every kind of cancer out there. I surely wouldn't let individual drug companies supply me with samples of some "special" kind of cancer that only their drug works on; that would be silly. What good would that do the public, or those who would lose their lives for lack of detection?

This is analogous to testing practices in the anti-malware industry with respect to the detection of virus samples. It is widely known that individual companies are allowed to supply specific "samples" to several "independent testing companies" so that their product will rate much higher than the competition's. This is not only unfair and technologically flawed, it crosses a clear ethical line. It is potentially harmful to all of us as consumers and individuals on the Internet, and to the Internet as a whole.

