Microsoft Security Essentials is free, which is great, but its reputation has been getting slammed in antivirus tests over the last few months. The vast majority of antivirus products manage to pass certification with AV-Test; not Microsoft. In November and again in January Microsoft failed certification.
The Microsoft product team issued a rebuttal, arguing that the test in question didn’t measure its actual level of real-world protection. However, a new test just released by London-based Dennis Technology Labs [PDF] puts Microsoft in last place, well behind all of its competitors.
Where AV-Test and AV-Comparatives generally include twenty or more products in a test, Dennis Labs focused on eight vendors in the consumer area: AVG, BitDefender, ESET, Kaspersky, McAfee, Microsoft, Norton, and Trend Micro. The commercial products all did well enough, some of them very well indeed.
The Dennis Labs accuracy test aims to measure a product’s ability to “block all threats and allow all legitimate applications.” Products gain points for correctly blocking threats and for correctly leaving legitimate software alone; they lose points for blocking legitimate software and for failing to identify malware. The best possible score is 400 points; the worst is -1000 points. With 388.5 points, Norton Internet Security (2013) came close to the maximum. All the rest earned at least 300 points, except Microsoft, which took a paltry 30 points.
Some products lost significant points to false positives. Trend Micro in particular would have scored noticeably higher without those deductions. Microsoft, by contrast, earned its low score strictly through poor detection of threats, with no deduction for false positives.
Dennis Labs researchers also rated each product on its ability to resist real-world malware attacks. A detailed point system “gives credit to products that deny malware any opportunity to tamper with the system and penalises heavily those that fail to prevent an infection.”
The best protection, completely defending the system against attack, is worth three points. Letting the malware launch initially but then cleaning up all hazardous traces is worth two points. Finally, terminating a running malicious process without actually cleaning up its traces earns one point.
As for the heavy penalties, those kick in when the malware totally gets past all defences, or if the system is damaged after the security product’s response. Every such failure reduces the overall score by five points. With 100 samples tested, the best possible score is 300, the worst is -500.
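The arithmetic described above can be sketched in a few lines of Python. The outcome labels here are my own naming, not Dennis Labs terminology; only the point values (+3, +2, +1, -5) and the 100-sample bounds come from the report as described in this article:

```python
# Point values per sample outcome, as described in the article.
POINTS = {
    "defended": 3,            # attack completely blocked; no tampering
    "neutralised_clean": 2,   # malware launched, but all traces cleaned
    "neutralised": 1,         # malicious process killed, traces left behind
    "compromised": -5,        # malware got past all defences
}

def protection_score(results):
    """Sum the per-sample points for a list of outcome labels."""
    return sum(POINTS[outcome] for outcome in results)

# With 100 samples, the bounds match the article's figures:
best = protection_score(["defended"] * 100)      # 300
worst = protection_score(["compromised"] * 100)  # -500
```

A single compromise wipes out the credit from defending five samples perfectly, which is why a product that misses even a modest fraction of attacks can fall below zero, as Microsoft did.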
Norton topped this list too, with 289 points, and all the rest earned at least 200 points. All but Microsoft, that is. In a rare sub-zero score, Microsoft took -70 points.
Dennis Labs also released a report on five Enterprise-level security products and five SMB products. Tested in the Enterprise group, Microsoft System Center Endpoint Protection fared even worse than the free consumer antivirus, with negative scores in both accuracy and protection.
McAfee posted the second-lowest score in both of the consumer tests, and its technology fared even worse in the other two. McAfee “VirusScan, HIPs and SiteAdvisor” earned very low scores in the Enterprise group, and McAfee Security-as-a-Service actually came in below zero for accuracy in the SMB test.
Simon Edwards, Technical Director of Dennis Technology Labs, observed: “It’s interesting to see how badly Microsoft does in the consumer and enterprise tests, particularly when noting that its products also fared poorly in the last AV-Test report. As you no doubt know Microsoft was dismissive of that test but my view is that if lots of different tests, from competing test houses that use different methodologies/approaches, reach similar conclusions then those conclusions start to appear increasingly convincing.”
I have to agree. In my own hands-on testing, Microsoft Security Essentials has never performed well. At the other end of the scale, Norton Internet Security is the only product to receive the top AAA rating in the Dennis Labs test, while Kaspersky and ESET took AA ratings, and all of these do well in my tests.
A post on Sophos’ NakedSecurity blog praised the Dennis Technology Labs report, noting that: “Conducting these types of tests is not easy or even straightforward, but Dennis Technology showed that it is indeed possible.”
It seems to me that Microsoft should look at what the top-rated vendors are doing and try doing the same. Sure, it’s good to emphasise “customer-focused processes,” but it’s even better to do that and pass the lab tests.