AV-Comparatives: File Detection Test September 2015
http://chart.av-comparatives.org/chart1.php?chart=chart1&year=2015&month=9&sort=1&zoom=2
http://www.av-comparatives.org/detection-test/
We have now also redesigned our weblog. You can now find info on several topics:
http://weblog.av-comparatives.org/?cat=2 -> AV-C Tests
http://weblog.av-comparatives.org/?cat=4 -> Other Tests and Security News (various things)
http://weblog.av-comparatives.org/?cat=12 -> Books
http://weblog.av-comparatives.org/?cat=14 -> Links (various links)
Thanks for sharing. This test is most relevant for my on-demand-only AV setup. Great result from Avira.
Does prevalence matter? A different approach to traditional antimalware test scoring http://blogs.technet.com/b/mmpc/arc...-to-traditional-antimalware-test-scoring.aspx
Very interesting. Avast and AVG are the worst in this particular test, although it must be said that it is just a detection test.
I find it interesting that three of the bottom five are heavy hitters in the business/corporate antivirus industry: Trend, Sophos, and McAfee. I know the business/corporate versions were not tested here, and I am sure that if one of these undetected files were executed, any of the products I mentioned could still prevent an infection with behavior analysis or intrusion detection.
This approach is working well for Microsoft. They maintain their position from the last prevalence test.
Microsoft-prevalence-based analysis of the File Detection Tests - Sept 2015
http://www.av-comparatives.org/microsoft-prevalence-based-analysis-file-detection-tests/
The results can be used by editors / media / bloggers etc. for free. Please cite as source: http://www.av-comparatives.org

Edit: Does prevalence matter? A different approach to traditional antimalware test scoring
http://blogs.technet.com/b/mmpc/arc...-to-traditional-antimalware-test-scoring.aspx
Hat tip: Thankful
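For anyone curious what "prevalence-based" scoring actually means in practice: instead of counting every sample equally, each miss is weighted by how many machines that sample was actually seen on. A minimal sketch of the idea, with entirely made-up prevalence counts and detection results (not data from the actual test):

```python
# Hypothetical illustration of flat vs. prevalence-weighted detection scoring.
# The sample data below is invented for the example, not from AV-Comparatives.

samples = [
    # (prevalence: machines affected, detected: bool)
    (100_000, True),   # very common malware, detected
    (50_000, True),    # common malware, detected
    (5_000, False),    # less common, missed
    (10, False),       # rare, missed
    (3, False),        # extremely rare, missed
]

# Traditional scoring: every sample counts the same.
flat_rate = sum(1 for _, detected in samples if detected) / len(samples)

# Prevalence-weighted scoring: misses on rare samples barely matter.
total_prevalence = sum(p for p, _ in samples)
weighted_rate = sum(p for p, detected in samples if detected) / total_prevalence

print(f"Flat detection rate:      {flat_rate:.1%}")
print(f"Prevalence-weighted rate: {weighted_rate:.1%}")
```

This is why a product can look mediocre in a flat file-detection test yet score well under prevalence weighting: missing three rare samples barely dents the weighted score, while it costs 60 percentage points in the flat one.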
It all depends if you buy into this regional "prevalence of malware approach." I don't. After all, it's called the "worldwide" web.
Eset's results are not surprising. They have good signature detection, but not great; Avira and Bitdefender almost always score better on blacklist signatures. Eset's strength is its heuristics, so it detects most of the samples it misses when they attempt to execute. I think I'm right, anyway. Do the testers attempt to execute the missed samples after they do an on-demand scan? I have assumed all these years that they do not, but never bothered to check. edited 11/24 @2:57
Why does Emsisoft always score lower in the File Detection test? They are supposed to be using the same signatures. Is there a delay in Emsisoft receiving Bitdefender's signatures?