A lab test that challenges antivirus utilities to protect against well-known viruses, Trojans, and other sorts of malware has at least some value. If a particular tool can’t handle malware that’s been around for months, that doesn’t say much for its effectiveness. However, in the real world a virus that came out yesterday is a lot more likely to give you trouble than one that’s been around since last year.
Even so, many antivirus tests still use thousands of well-known malware samples because it’s just plain difficult to do otherwise. Gathering brand new malware samples is a challenging task, as is presenting them to different antivirus products in a way that’s fair and balanced.
Fortunately, the expert researchers at AV-Comparatives are up to the task. Even they can’t do it alone; the Whole-Product Dynamic “Real-World” Protection test operates with help from the University of Innsbruck and some funding from the Austrian government.
Every day, AV-Comparatives researchers collect a set of malicious URLs, trying to keep a balance between direct malware downloads and drive-by attacks. After verifying that the URLs are indeed dangerous, the researchers feed them into an automated system that checks each product’s response. If the malware attack is completely foiled the product gets credit for blocking it; if the system is compromised the product gets no credit.
Some antivirus products pop up a query asking the user whether to allow or block less risky items. The AV-Comparatives automated system always chooses “Allow” in these cases. Sometimes the antivirus prevents the attack even though the user chose Allow, in which case it gets full credit. If a wrong click by the user can subvert protection, the antivirus gets half credit.
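The crediting scheme described above boils down to a simple mapping from test outcomes to scores. Here's a minimal sketch of it in Python; the outcome labels and function name are my own, not AV-Comparatives' actual terminology or harness:

```python
# Illustrative sketch of the per-URL crediting scheme described above.
# Outcome labels are invented for this example; the 1 / 0.5 / 0 values
# follow the article's description of the test.

def credit(outcome: str) -> float:
    """Map one malicious-URL test outcome to protection credit."""
    scores = {
        "blocked": 1.0,                  # attack completely foiled
        "blocked_after_allow": 1.0,      # user chose Allow, product still protected
        "compromised_after_allow": 0.5,  # a wrong click subverts protection
        "compromised": 0.0,              # system compromised, no credit
    }
    return scores[outcome]

# A product's protection score is just the sum of credits over all URLs tested.
results = ["blocked", "blocked", "compromised_after_allow", "compromised"]
total = sum(credit(r) for r in results)
print(total)  # 2.5
```

Summing the credits over the full URL set, then comparing totals across products, is what produces the rankings discussed below.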
You can view an interactive chart of monthly results on the AV-Comparatives website. Every few months, the results get rolled up into a full report. In the latest such report, which has just been released, Trend Micro and Bitdefender are at the top in terms of detection. Trend Micro Titanium Internet Security missed just one of the over 2,000 malicious URLs, while Bitdefender Internet Security missed just two.
AV-Comparatives uses a clustering system to assign similar ratings to products with similar scores. Initially Trend Micro and Bitdefender received ADVANCED+, the top ranking. However, Trend Micro dropped one level due to a higher-than-average rate of false positives (valid websites identified as malicious). Four other products lost a level due to false positives, leaving Bitdefender, G Data, Qihoo, Kaspersky, and BullGuard the only programs rated ADVANCED+.
All products achieving an ADVANCED or STANDARD rating are considered successful. AV-Comparatives lists those that didn’t reach the minimum standard as simply TESTED. GFI Vipre and McAfee fell into this category due to too many false positives. Fortinet, Webroot, and AhnLab landed here based solely on protection scores.
What about Norton?
Norton Internet Security is conspicuous by its absence from this list. AV-Comparatives requires all vendors that sign up for testing to participate in all tests, but Symantec objects to the methodology of one particular test. This year, they chose not to participate at all. I hope next year they’ll just bite the bullet and offer a counter-argument to the test they don’t like.
It’s also worth further discussing Webroot SecureAnywhere Complete, which rated merely as TESTED. That’s because Webroot’s method for handling brand new threats isn’t a good fit for this test. When it encounters a brand new application that looks suspicious, Webroot puts limits on its ability to modify the system, submits it for analysis, and starts journaling every action taken by the program. If the program turns out to be malicious, Webroot can undo every action and delete the program. However, that often doesn’t happen in the five-minute timeframe allowed by this test.
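A journal-and-roll-back scheme of the sort the article attributes to Webroot can be sketched roughly like this. All class, function, and variable names here are invented for illustration; this is not Webroot's implementation:

```python
# Rough sketch of journaling a monitored program's actions so they can be
# undone if the verdict comes back malicious. Names are illustrative only.

class ActionJournal:
    def __init__(self):
        self._undo_stack = []  # (description, undo-callback) pairs, newest last

    def record(self, description, undo):
        """Log one action taken by the monitored program plus how to undo it."""
        self._undo_stack.append((description, undo))

    def roll_back(self):
        """Verdict: malicious. Undo every journaled action, newest first."""
        while self._undo_stack:
            description, undo = self._undo_stack.pop()
            undo()

# Toy "file system" the suspicious program modifies while being monitored.
files = {}
journal = ActionJournal()

def monitored_write(name, data):
    old = files.get(name)
    files[name] = data
    # The undo closure restores the prior state (deletes the file if it was new).
    # Default arguments bind name/old at definition time, avoiding a late-binding bug.
    journal.record(
        f"write {name}",
        lambda n=name, o=old: files.pop(n) if o is None else files.__setitem__(n, o),
    )

monitored_write("dropper.exe", b"payload")
monitored_write("config.sys", b"tampered")
journal.roll_back()
print(files)  # {} -- both changes undone
```

The key design point, and the reason a fixed five-minute window penalizes this approach, is that the undo step only fires once analysis returns a malicious verdict, which can take longer than the test allows.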
Webroot and AV-Comparatives are negotiating a separate test, one that could confirm whether Webroot’s technology really does protect against unknowns and roll back malware activity if given a little more time. I really want to see the results of that test!
The AV-Comparatives Whole-Product Dynamic “Real-World” Protection test really does come close to testing virus protection as a user would experience it. It definitely takes a lot of resources, and I’m glad AV-Comparatives has found the help and funding needed to keep this testing going.