For a comparative review of anti-malware products to be useful to you, it has to be correct, comprehensive, and objective.
Unfortunately, producing a good test is not a simple task. It may look easy, but it is not. It is like with cars – some are more reliable, some drink more petrol, some handle corners more confidently. But all cars normally do what they are sold for – transport you from point A to point B. Do you think that testing cars is easy? There are myriad things to consider: safety, engine power, reliability, capacity, fuel consumption, colour, size, seat shape, brand name, extras, trim quality. I think you are getting the point. But surely any AV product is much simpler than a car! Mmm, I am not so sure. There are many ways for a virus to get from point A to point B, you know. So there are myriad technologies used to block malware propagation:
- static scanning for known malware
- generic family recognition for unknown variants of known malware
- behavioural techniques
- domain and URL blacklisting and reputation
- cloud-based protection
- frequent updating
- virtualization and emulation
- programs’ reputation
- security policies
- intrusion-prevention technologies
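To make the first technique on the list concrete, here is a toy sketch of static scanning against a blacklist of file hashes. All names here are hypothetical, and real engines use far richer signatures (byte patterns, heuristics) than whole-file hashes – this only illustrates the idea:

```python
import hashlib

def sha256(data: bytes) -> str:
    """Hex digest used as the file's identity for blacklist lookup."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical sample "known" to the scanner; a real blacklist would be
# shipped as signature updates, not built from the samples themselves.
KNOWN_SAMPLE = b"...malicious payload..."
BLACKLIST = {sha256(KNOWN_SAMPLE)}

def is_known_malware(data: bytes) -> bool:
    """Static scan: flag a file whose hash matches the blacklist."""
    return sha256(data) in BLACKLIST

print(is_known_malware(KNOWN_SAMPLE))  # True
print(is_known_malware(b"harmless"))   # False
```

Note that this simplest approach catches only exact known files – which is exactly why the other techniques on the list (generic family detection, behavioural analysis, reputation) exist.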
For many years AV products did fairly well with just three or four of these techniques. Today we are seeing much more malware (which is actually partly due to how well AV products detect it – when old malware is blocked, that forces the bad guys to create new malware), and AV companies have to use more techniques to counter this tsunami. As a result, it has become significantly harder to test AV products. Firstly, products are generally more complex now. Secondly, different products use different combinations of technologies, and it is sometimes impossible to test them all in the same way. Take, for example, retrospective testing – it requires “freezing” a product to test it against threats which appeared after the “freeze point”. Such tests have been done for years, but you cannot apply this methodology to a product based on an active cloud-based protection database (the cloud cannot be frozen, as it is not under the tester’s control).
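The “freeze point” selection at the heart of a retrospective test can be sketched in a few lines. The sample identifiers and dates below are invented purely for illustration:

```python
from datetime import date

# Hypothetical sample set: (sample_id, date the sample was first seen).
samples = [
    ("sample_a", date(2010, 5, 1)),
    ("sample_b", date(2010, 6, 15)),
    ("sample_c", date(2010, 7, 3)),
]

# The product's signatures are frozen on this date.
FREEZE_POINT = date(2010, 6, 1)

# A retrospective test only scores samples that appeared *after* the
# freeze, so any detection must come from proactive (generic,
# heuristic, behavioural) technology rather than a fresh signature.
retrospective_set = [s for s, first_seen in samples if first_seen > FREEZE_POINT]
print(retrospective_set)  # ['sample_b', 'sample_c']
```

As the text notes, this scheme breaks down for cloud-backed products: the lookup database keeps changing after the freeze point, outside the tester’s control.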
It will take time and investment of resources for tests to adapt to the new complexity of the products. Returning to our analogy with cars – you really have to build a crash-test lab to perform crash tests. The same goes for AV products – testers need to take entire AV suites and check how well the whole suite protects the PC from a virus “crashing” into it, so to speak.
Fortunately, there is now some help available for testers (and for users who want to understand how better comparative tests can be set up). AMTSO (www.amtso.org), a non-profit organization comprised of experts in testing matters (AV testers, AV vendors, publishers, academia), has published two documents on its Web site:
- “AMTSO Whole Product Testing Guidelines”. This paper suggests a holistic approach to verifying whether a product succeeded in blocking malware propagation – from point A (the malware source) to point B (your computer). The only important thing is whether the malware is stopped at some stage – any contributing technology is allowed (even new and unknown ones!).
- “AMTSO Performance Testing Guidelines”. There are many elephant traps on the route to a good test of the speed offered by an AV product. This paper describes these pitfalls in detail.
It is probably impossible to execute a perfect test of AV products, but that does not mean we should not strive to improve our tests. I am sure these new documents from AMTSO will provide useful information and inch all of us a little closer to a good, dependable comparative test.