Some additional commentary about the Antimalware Testing Group

You may have seen the recent articles on a new antimalware testing effort being launched. We're part of this effort and very glad to see that it's happening. New, standardized testing methodologies are vital in this day and age.

Andreas Marx, who is spearheading the new testing group, emailed some excellent remarks on the effort, and with his permission, I’m posting some of his comments:

As you know, the number of new (unique) malware files per day is increasing rapidly — so far, we're getting around 2,000 to 2,500 samples per hour from various sources. The average lifespan of a malware file (used with criminal intent) is, however, only seven hours, according to Symantec.

Current AV software tests still focus mainly on some kind of "detection score", but testing the software against millions of inactive, outdated and thus "dead" files can no longer be considered useful. The results of such tests are not only less meaningful, they actively mislead the average user.

Most of the currently used tests were developed 15 to 20 years ago, but the attack vectors and the protection offered have changed a lot.

Take the example of cars: we no longer have only safety belts as protection, but also different kinds of airbags, ABS, stability control, crumple zones and so on.

Nobody would think that a review focused only on safety belts would give a reader an idea of how well a car's protection works. With anti-malware products, the situation is very similar. Therefore, it's not only important but essential that all parts of the products are tested properly. A single safety belt check... erm, a test of detection scores alone is not enough.

This includes, but is not limited to, testing the behavior-based detection mechanisms (also called "Dynamic Detection") which are now used in more and more products. Proper detection and removal of actively running rootkits, as well as of other malware and ad-/spyware, are also points to consider. Testing the "real world" experience would be the way to go. But how do you do this?

During the International Antivirus Testing Workshop in Reykjavik (Iceland), the industry started to discuss how anti-malware products could be tested better than they are today. At the end of the workshop, representatives from Symantec, Kaspersky, F-Secure and Panda, together with AV-Test.org, formed a plan to create a working group. This entity would not only publish guidelines and papers on AV testing on a regular basis, but also educate users and other testers, so that the tests, and thus ultimately the protection products, can be improved. When tests focus only on "outdated" aspects, developers have to concentrate on the "wrong" points of their products instead of improving the really important parts. Consequently, users end up buying products which are well-tuned for tests but which do not offer adequate protection in the "real world".

During the Virus Bulletin 2007 Conference in Vienna (Austria), we set up another meeting (focused on the "Dynamic Testing" aspects) where representatives of Avira, PC Tools and Trend Micro joined the initiative. The first publication on the topic, "Analysts Work on Improved Antivirus Software Test", can be found here.

Last week, AV-Test.org's Maik Morgenstern and Andreas Marx attended the AVAR 2007 Conference, where we spoke on the topic of "Dynamic Testing" — a paper which was a joint effort of team members from AV and AS companies, including Sunbelt, Kaspersky, Eset, Webroot, IBM ISS, PC Tools, Symantec, Sana Security, F-Secure, Panda, Trend Micro and Sophos, as well as the testing organizations Virus Bulletin and AV-Test.org, of course. (The paper and the PPT will be available on our webpage by next week.)

As the project around the behavioral testing paper worked out very well, members of this team proposed founding an "Anti-Malware Testing Working Group" by the beginning of next year, to work on future projects and similar testing-related topics, and to create new standards that reflect the capabilities of security products more accurately.