Sometimes "the devil is in the detail." As far as 0-day protection goes for Dec., 2015, WD's score was 90%. Eset's score was 98.3%. Comodo's was 100%. Appears both Eset and Comodo to name a few were penalized for false positives.
Glad to see the improvement... I know they said there would be a significant improvement in WD on Windows 10.
I hope they go further: make the engine faster, improve the database, give it a new UI, etc. Maybe even add behavioral blocking at some point.
It's nice to see WD's protection improving. It means its users are better protected, and it also gives other AV vendors another reason to improve their products. For actual usage I prefer ESET over WD: it's much more configurable and IME feels lighter on my system. But that's just my personal preference.
Let's run Windows Defender's numbers through the calculator:

Zero-day samples, month 1: 140 × 0.025 = 3.5 ≈ 4 missed
Zero-day samples, month 2: 140 × 0.1 = 14 missed
Widespread samples, month 1: 14658 × 0.004 = 58.6 ≈ 59 missed
Widespread samples, month 2: 14658 × 0.004 = 58.6 ≈ 59 missed
Total missed: 4 + 14 + 59 + 59 = 136
Total samples: 140 + 140 + 14658 + 14658 = 29596
136 / 29596 ≈ 0.00459, i.e. about 0.459%, and 100 − 0.459 = 99.541

So during these two months Windows Defender blocked 99.541% of everything they threw at it.
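For anyone who wants to double-check, here's a quick Python sketch of the same arithmetic. The sample counts and miss rates are the ones quoted above; the labels and the rounding of each test's misses to a whole number of samples are just my assumptions about how the post computed it:

```python
# Per-test (samples_tested, fraction_missed), taken from the figures quoted above
results = [
    (140, 0.025),    # zero-day samples, month 1 (97.5% blocked)
    (140, 0.100),    # zero-day samples, month 2 (90% blocked)
    (14658, 0.004),  # widespread samples, month 1 (99.6% blocked)
    (14658, 0.004),  # widespread samples, month 2 (99.6% blocked)
]

# Round each test's misses to a whole number of samples, as the post does
total_missed = sum(round(samples * miss_rate) for samples, miss_rate in results)
total_samples = sum(samples for samples, _ in results)

blocked_pct = 100 * (1 - total_missed / total_samples)
print(f"missed {total_missed} of {total_samples} -> blocked {blocked_pct:.2f}%")
# missed 136 of 29596 -> blocked 99.54%
```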
How do they even count Comodo detections? Only signatures, or do they also count all the sandboxed samples as protected?
https://www.av-test.org/en/test-procedures/test-modules/protection/ It seems the first part tests all the technology being used (sandboxing, behavioral detection, URL blocking, etc., the way a user would actually encounter threats), and the second part is static detection of widespread/prevalent threats, a la WildList testing @ virusbtn.
They don't mention any "isolation", just pure hard detections/blocking. I'm asking because, to be honest, I haven't ever seen anything get past Comodo. Past its signature AV, yes. Past the sandbox and on to wrecking the OS, not really.
Well, they got 100% on the first part, which includes actually executing the samples, and 96% on the static testing, which is the signature stuff.
It's a hodgepodge of semi-useful data. Comodo gets the second-worst protection score despite stopping 100 percent of zero-day threats and being one of the toughest programs out there. That's just mind-numbing in its ignorance. There is useful data in there, but you can't take any of it at face value.
This sentence makes absolutely no sense. You can't have the best score and be the second worst. It just doesn't compute.
If you think that is the case, you're not putting much thought into it. The testing criteria are problematic. In fact, that's an issue with all the testing organizations in one form or another. There is useful data in all the tests, but you have to know what to look for. Comodo highlights the problem with this test. Comodo has always had a so-so signature AV, but its proactive stuff is as good as it gets; penetrating a properly configured Comodo system is about as tough as it comes. That's why the zero-day performance was 100 percent, outperforming heavyweights such as Kaspersky. Yet their "protection" scoring system drops it all the way to second to last. Why? Because it's NOT a protection test, it's a signature AV test. And don't think for a second that the problem is limited to Comodo. I just used them as an example because of the huge disparity between its actual protection and its "protection" ranking in this particular test series.
I'm wondering: if WD continues to improve to the point that it's a strong competitor, will that cause a decline in third-party AVs? Maybe even to the point where the business model changes?