McAfee Labs

Are Comparative Tests of AV Products Useful?

Jun 16, 2010

For a comparative review of anti-malware products to be useful to you, it has to be correct, comprehensive, and objective.

Unfortunately, producing a good test is not a simple task. You may think it is, but it is not. It is like with cars: some are more reliable, some drink more petrol, some handle turns more confidently. But all cars normally do what they are sold for: transport you from point A to point B. Do you think that testing cars is easy? There are myriad things to consider: safety, engine power, reliability, capacity, fuel consumption, color, size, seat shape, brand name, extras, trim quality. I think you are getting the point. But surely any AV product is much simpler than a car! Mmm, I am not so sure. There are many ways a virus can get from point A to point B, you know. So there are myriad technologies used to block malware propagation:

  • static scanning for known malware
  • generic family recognition for unknown variants of known malware
  • behavioural techniques
  • domain and URL blacklisting and reputation
  • cloud-based protection
  • frequent updating
  • anti-spam
  • virtualization and emulation
  • programs’ reputation
  • sandboxing
  • blacklisting
  • whitelisting
  • security policies
  • intrusion-prevention technologies

For many years AV products did fairly well with just three or four of these techniques. Today we are seeing a lot more malware (which is partly due to how well AV products detect it: when old malware is blocked, the bad guys are forced to create new malware), and AV companies have to use more techniques to counter this tsunami. As a result, it has become significantly harder to test AV products. Firstly, products are generally more complex now. Secondly, different products use different combinations of technologies, and it is sometimes impossible to test them in the same way. Take, for example, retrospective testing, which requires “freezing” a product and testing it against threats that appeared after the freeze point. Such tests have been done for years, but you cannot apply this methodology to a product that relies on an active cloud-based protection database: the cloud cannot be frozen because it is not under the tester’s control.

It will take time and an investment of resources for tests to adapt to the new complexity of the products. Returning to our car analogy: you really have to build a crash-test lab to perform crash tests. The same goes for AV products: testers need to take entire AV suites and check how well they protect the PC from a virus “crashing” into it, so to speak.

Fortunately, there is now some help available for testers (and for users who want to understand how better comparative tests can be set up). AMTSO (www.amtso.org), a non-profit organization made up of experts in testing (AV testers, AV vendors, publishers, and academia), has published two documents on its Web site:

  1. “AMTSO Whole Product Testing Guidelines”Â This paper suggests a holistic approach to verifying whether a product succeeded in blocking malware propagation – from point A (malware source) to point B (your computer).  The only important thing is whether malware is stopped at some stage – any contributing technology is allowed (even new and unknown ones!).
  2. “AMTSO Performance Testing Guidelines” – There are many elephant traps on the route to a good test of an AV product’s speed. This paper describes these pitfalls in detail.
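To make the “whole product” idea concrete, here is a minimal, hypothetical sketch in Python. The layer names, detection rules, and sample data are invented purely for illustration and are not part of the AMTSO guidelines; the point is simply that a sample counts as blocked if any protection layer stops it at any stage between point A and point B.

```python
# Hypothetical sketch of whole-product testing: a sample is "blocked" if ANY
# protection layer stops it at ANY stage. Layer names and rules are invented.
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple
from urllib.parse import urlparse


@dataclass
class Sample:
    url: str        # where the malware is hosted (point A)
    payload: bytes  # what would land on the test machine (point B)


# Each "layer" is a predicate: does this stage block the sample?
Layer = Callable[[Sample], bool]


def url_reputation(sample: Sample) -> bool:
    # Stand-in for a URL/domain blacklist lookup.
    host = urlparse(sample.url).hostname or ""
    return host.endswith(".badexample.test")


def static_scan(sample: Sample) -> bool:
    # Stand-in for a signature match on the downloaded file.
    return b"EICAR" in sample.payload


def behaviour_monitor(sample: Sample) -> bool:
    # Stand-in for runtime behavioural detection (nothing flagged here).
    return False


LAYERS: List[Tuple[str, Layer]] = [
    ("URL reputation", url_reputation),
    ("Static scan", static_scan),
    ("Behaviour monitor", behaviour_monitor),
]


def first_blocking_layer(sample: Sample) -> Optional[str]:
    """Return the name of the first layer that blocks the sample, or None."""
    for name, layer in LAYERS:
        if layer(sample):
            return name
    return None


if __name__ == "__main__":
    samples = [
        Sample("http://downloads.badexample.test/x.exe", b"..."),
        Sample("http://cdn.okexample.test/y.exe", b"EICAR test body"),
        Sample("http://cdn.okexample.test/z.exe", b"..."),
    ]
    blocked = 0
    for s in samples:
        stage = first_blocking_layer(s)
        if stage:
            blocked += 1
            print(f"{s.url}: blocked by {stage}")
        else:
            print(f"{s.url}: MISSED")
    print(f"Protection rate: {blocked}/{len(samples)}")
```

In this toy run the first sample is stopped by URL reputation, the second by the static scan, and the third is missed, so the protection rate is 2/3. A whole-product test credits a product for a block no matter which layer delivered it, which is exactly why retrospective or single-technology tests can undervalue modern layered suites.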

It is probably impossible to execute a perfect test of AV products, but that does not mean we should not strive to improve the tests we have. I am sure that these new documents from AMTSO will provide useful information and inch all of us a little closer to a good, dependable test.

