SE Labs Home Anti-Malware Protection Jul-Sep 2020 (2020 Q3) https://selabs.uk/reports/epp-home-20q3-anti-malware-testing/ https://selabs.uk/wp-content/uploads/2020/10/jul-sep-2020-home.pdf
From report: "Only Kaspersky blocked all of the public and targeted attacks." and: "The leading products from Kaspersky, ESET, McAfee, Avast, Trend Micro, AVG, Microsoft, Avira and NortonLifeLock all win AAA awards"
Well, I was about to post a positive comment in the Malwarebytes thread, because in some other test it performed quite well for the first time ever. But in this test it's pretty much a joke again. However, I would like to know how this test was done: was it a simulated test or real life? By that I mean, did they really find these exploits and malicious URLs online, or did they simply download the malware and then execute it? And how many samples and what types of malware were tested?
Found this, but I don't know whether any of these guides apply to the current test. https://selabs.uk/reviewers-guides/
Well, I've found some more reports. I haven't read them in depth yet, but it seems they are trying to simulate real-life attacks. I'm still not sure how many malware samples they test, but overall I really like their tests. Here you can find reports about CrowdStrike and SentinelOne. https://blog.selabs.uk/2020/10/serial-hackers-breach-response-testing/ https://selabs.uk/reports/breach-response-2020-q2-crowdstrike-falcon/ https://selabs.uk/reports/breach-response-2020-q2-sentinelone/
Granted, the tests are far from perfect. However, there is no other auditable basis for selecting an AV to trust/purchase. Without objective testing, the only factors for selecting an AV would be advertisements, word of mouth, name recognition, "expert opinion," & "a roll of the dice."
I strongly suspect that the product providers themselves care very much about the results of such tests. Beyond "bragging rights" or "pride of workmanship", there is the obvious benefit of convincing potential customers that their products are beneficial and worth purchasing. Many, if not most, prominently feature the awards received from such testing on their websites. Poor performance breeds poor reputation, which can negatively impact sales. The results of these tests are often widely reported in trade publications, blogs, etc., so the general public may be influenced by the performance ratings. Competition and visibility of efficacy engender motivation to strive for excellence. The end result is improved sales for the vendors and optimum quality for the consumer.
MS Defender is the perfect example: we were able to monitor its improving detection capabilities through the years by checking the results of testing organizations. These tests are useful in giving an indication of how well products fare under different conditions. Malwarebytes, on the other hand, is not showing good results in several tests; if that continues, it is not a sign of reliability, although it might still be an excellent cleaner of infected machines.
Although I won't even bother discussing MB as a primary, I can state definitively that as a second-opinion scanner it is oblivious to worm infections and is wretched at detecting many persistence mechanisms of malware already extant on one's system.
The last post in this thread sums it all up nicely. https://malwaretips.com/threads/how-the-hell-wd-works-on-windows-home-pro.95146/page-3