AV-Comparatives: Real-World Protection Test – April 2018 http://chart.av-comparatives.org/chart1.php?chart=chart2&year=2018&month=4&sort=1&zoom=2 https://www.av-comparatives.org/dynamic-tests/
I was thinking the same thing. I have not used Emsisoft in a long time, but I saw a post mentioning changes that were being made to the BB in order to get false positives down. I thought to myself that the BB would likely miss many threats if it becomes too lenient. I hope they find the right balance soon.
I'm shocked McAfee got a perfect score! They have been performing horribly for several years now. Edited: 5/15/18 @ 1:14 Congrats to McAfee for such a huge improvement! I don't know what you did, but keep up the good work. I'm guessing better management.
The reduction of FPs had absolutely nothing to do with this. Over the last three months, we gradually rolled out new infrastructure for our behaviour blocker as part of the monthly feature updates. We decided to introduce this new tech gradually so as to avoid the headaches of switching everything at once. In addition, the infrastructure on its own had major benefits, like fixing several long-standing compatibility issues with products like Kaspersky, Avast and AVG that rendered systems unusable, as no process could be started on systems running both EAM and their product in real-time.

The rollout itself was pretty smooth, and we didn't see anything unusual in our telemetry or continuous daily testing either. However, it turned out that there was a rare race condition with certain malware obfuscators that caused some 32-bit processes to not be monitored correctly on Windows 10 64-bit systems. AV-C did report the issue to us as part of their normal report at the end of March, and we fixed and released it as an update during the 2018.3 lifecycle, but by then we had already racked up a couple of misses in the April test period as well.

It is true that we reduced the false positive rate considerably over the last few months. However, that was achieved by improving our cloud systems, and none of the misses this year were caused by our cloud accidentally flagging a behaviour-based detection as a false positive.
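To make the failure mode concrete: the post describes a race between the behaviour blocker attaching its hooks and a newly created process starting to execute. Below is a minimal toy model of that ordering problem, not Emsisoft's actual code; the class and method names (`ProcessMonitor`, `install_hook`, `process_started`) are hypothetical.

```python
import threading

class ProcessMonitor:
    """Toy model of a behaviour blocker hooking newly created processes.

    Hypothetical illustration only: whether a process is monitored
    depends on the order of two events, "hook installed" and
    "process starts executing".
    """
    def __init__(self):
        self._hooked = threading.Event()

    def install_hook(self):
        # In a real product this would be driver/hook registration.
        self._hooked.set()

    def process_started(self):
        # True only if the hook was in place when execution began.
        return self._hooked.is_set()

# Correct ordering: hook first, then let the process run -> monitored.
m = ProcessMonitor()
m.install_hook()
assert m.process_started() is True

# Racy ordering: the process (e.g. an obfuscated loader) begins executing
# before the hook lands, so its first instructions escape monitoring.
m2 = ProcessMonitor()
assert m2.process_started() is False
```

In the scenario described, the window only opened for certain obfuscated 32-bit binaries on 64-bit Windows 10, which is why routine telemetry and daily testing did not surface it.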
You might want to check this out: https://www.wilderssecurity.com/threads/se-labs-home-anti-malware-protection-q1-2018.403678/
One test good, another test bad - nothing new for me. SE Labs sounds like a homemade test; maybe it's time for a "fmon security labs" test.
Actually, it's probably because they use more zero-day samples, according to some. Malware that has already been in the wild is of course easier to spot.
https://www.av-comparatives.org/com...=2018&chart_month=4&chart_sort=1&chart_zoom=3