Results of the first public security suite test from Dennis Technology Labs

When evaluating an antivirus product, I always perform hands-on tests, checking how well it wipes out viruses, Trojans, and other malware from an infested PC, and also testing its ability to defend a clean PC against attack. I also look at test results from five major independent labs: AV-Comparatives, AV-Test, ICSA Labs, Virus Bulletin, and West Coast Labs.

My selection of these five isn't a value judgment; I'm not necessarily saying they're the best. The big virtue shared by these five labs is that they regularly publish test results that are freely available for anyone to view. In most cases, the antivirus vendors each pay part of the cost of the test – which is something we noted in this recent article on the subject of mobile security.

Other labs work on a different model. Tests at NSS Labs, for example, are generally paid for by large corporations looking to make a decision about security software. London-based Dennis Technology Labs contracts to perform in-depth testing, but doesn't typically make the results public. This week, for the first time, Dennis Labs has released a public report on their testing of eight significant security products – and the results are quite interesting.

Test methodology

Dennis Labs researchers aimed to recreate as closely as possible the real-world situation in which a user falls victim to a web-based malware attack. By introducing the sample threats from real-world Internet locations, they allow the security products to bring all levels of protection into play. They do let the vendors know about any failures, but only after the test has concluded.

Unlike many tests, this one is entirely unsponsored. According to Simon Edwards, Technical Director at Dennis Labs: "We decide which products are tested and no vendor paid to be included."

Edwards went on to note: "The way we test falls well within the guidelines published by the Anti-Malware Testing Standards Organisation (AMTSO)."

Protection ratings

Researchers rated each product on how well it protected against real-world malware attacks. A product earned three points for each attack it thoroughly blocked and one point for each threat that was merely neutralised. If the attack managed to compromise the test system, the security product had five points deducted. With 100 samples used, possible scores range from 300 to negative 500.
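The scoring arithmetic described above can be sketched in a few lines. This is an illustrative reading of the article's description, not code from the report; the function name and parameters are my own.

```python
def protection_score(defended: int, neutralised: int, compromised: int) -> int:
    """Protection rating as described in the article:
    +3 per attack thoroughly blocked, +1 per threat merely
    neutralised, -5 per successful compromise."""
    return 3 * defended + 1 * neutralised - 5 * compromised

# With 100 samples, the extremes match the article's stated range:
print(protection_score(100, 0, 0))   # all attacks blocked -> 300
print(protection_score(0, 0, 100))   # all compromises -> -500
```

A middling product that blocked 90 attacks, neutralised 5, and was compromised 5 times would score 270 + 5 - 25 = 250 under this scheme.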

Kaspersky Internet Security 2012 earned a near-perfect 297 points in this test (see our review of the latest 2013 version of the suite). Norton Internet Security 2012 did fairly well, with 284 points, and ESET Smart Security 5 got 280 points. Microsoft Security Essentials brought up the rear with 124 points.

False positive ratings

There's a joke among antivirus geeks that anybody can write a perfect antivirus that will block all malicious programs. You just write it to block every program. Clearly nobody wants that. False positives – legitimate applications tarred with the virus-brush – are a serious problem. For this test, Dennis Labs carefully examined each product's reaction to a collection of 50 legitimate applications.

Sometimes the antivirus actively blocks a legitimate application, or places it in quarantine. Other times it merely displays a warning and asks the user to decide. For testing purposes, blocking was considered more serious than warning. In addition, researchers weighted the results based on the prevalence of each legitimate file. Killing off a file found on 100 million computers is far more significant than blocking one found on only 100 computers.

Each false positive deducts an appropriately weighted amount from a starting score of 100. None of the tested programs earned a perfect 100 points, though Bitdefender Internet Security 2013 came very close with 99.75.

The lowest possible score, assuming every sample file had maximum prevalence, would be minus 500. The imaginary "block every program" antivirus would earn that score. In this test, AVG Internet Security 2012 and Trend Micro Internet Security 2012 vied for last place, with 76.6 and 76.5 points, respectively. McAfee Internet Security 2012 wasn't much better, with 81.9 points.
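The weighted deduction scheme can be sketched as follows. The report's actual prevalence weights and block/warn ratio aren't given in the article, so the tiers and factors below are made-up assumptions; the only figure grounded in the text is the maximum deduction per file, which is implied by 50 files spanning a 600-point range (from +100 down to -500), i.e. 12 points each.

```python
# Maximum deduction per file, implied by the article's stated range:
# 50 files spanning +100 to -500 -> 600 / 50 = 12 points.
MAX_DEDUCTION = 12.0

# Illustrative assumptions (not the report's actual values):
PREVALENCE_WEIGHT = {"very high": 1.0, "high": 0.5, "medium": 0.2, "low": 0.05}
ACTION_FACTOR = {"block": 1.0, "warn": 0.5}  # blocking weighted heavier than warning

def false_positive_score(incidents):
    """incidents: iterable of (action, prevalence) pairs,
    e.g. ("block", "high"). Each deducts a weighted amount
    from a starting score of 100."""
    score = 100.0
    for action, prevalence in incidents:
        score -= MAX_DEDUCTION * ACTION_FACTOR[action] * PREVALENCE_WEIGHT[prevalence]
    return score

print(false_positive_score([]))  # no false positives -> 100.0
# The imaginary "block every program" antivirus, blocking all 50
# maximum-prevalence files, lands at the article's stated floor:
print(false_positive_score([("block", "very high")] * 50))  # -> -500.0
```

Under these assumed tiers, a single warning on a high-prevalence file would cost 12 × 0.5 × 0.5 = 3 points, leaving a score of 97.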

And the winners are...

Dennis Labs combined both tests to come up with a final score, and awarded each product a specific rating based on those scores. Kaspersky and Norton earned an AAA rating, the very best. Bitdefender and ESET took the AA award, not bad at all. Trend Micro got a B, and AVG got a C. McAfee and Microsoft didn't do well enough to earn any award at all.

The full report, available on the company's website, notes that while the paid products varied in effectiveness, they all beat Microsoft's free product. It also observes that an accurate reputation-based system that completely blocks access to malware-hosting websites can provide very effective protection. The report includes a full description of the testing methodology and very detailed low-level results.

I look forward to more reports like this from Dennis Labs. Edwards tells me they're already at work on the next one, incorporating more 2013 products.