I disagree. They use different testing methods and samples, and if a product scores similarly in more tests, it means it has consistency across all conditions. If it only scores well in one test and not the others, you start to wonder why...
Even then, I still think it's better for a smaller vendor like Emsisoft to be tested by one of the big, well-known orgs than by several lesser-known testing orgs just because it may be cheaper; customers may not trust the results if they have never heard of that tester before. Symantec didn't want to be included in the file-detection test, so they chose not to be tested at all by AV-C, even though they most likely would be great in the RWPT and others. G-Data, on the other hand, didn't want to be included in the AV-C RWPT, so they are not tested either. For them it's not a financial question but a methodology question, and vendors can't cherry-pick and skip over tests that are included in a test series. Suck in one test, perform great in the others... what's the problem?

Different samples, sure, but it wouldn't hurt if AV-Test found a way to increase their 0-day sample set a bit, as they only include 208 samples. We see it all the time: some products score average year after year with one tester, and better with other testers. Then you may wonder if there is a problem in the product or...
Yes, according to that test. But MSE scores badly in all tests year after year... a "good" example of the consistency that you mentioned earlier, I would say.
"DRep" is a new type of detection that takes into account the origin of the executable. Originally it was short for "Domain Reputation", but now it goes well beyond that. In Avast, we take special care to be able to track the source URLs of binary files. That is, for these detections, it makes a difference whether the scanned files are downloaded from URL X or Y. It turns out that these are very powerful detections, especially when connected with machine learning models that minimize the FP rates. This is another reason why we introduced the HTTPS scanner (and why it's so important to us) -- to be able to track the origin even for files received via HTTPS channels (which is about 30% of all web traffic today). Thanks, Vlk
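To make the idea concrete, here is a minimal sketch of what origin-aware detection could look like in principle: the verdict for a downloaded binary depends not only on how suspicious the file itself looks, but also on the reputation of the domain it came from. All function names, reputation values, and thresholds below are illustrative assumptions for this toy example, not Avast's actual implementation.

```python
# Toy sketch of origin-aware ("DRep"-style) detection.
# Assumption: a file-level suspicion score and a domain reputation score
# are combined into one risk value. Names and numbers are hypothetical.
from urllib.parse import urlparse

# Toy reputation table: score in [0, 1], higher = more trusted.
DOMAIN_REPUTATION = {
    "download.example-vendor.com": 0.95,
    "cdn.known-good.org": 0.90,
    "free-cracks.example.net": 0.05,
}

def domain_score(source_url: str) -> float:
    """Look up the reputation of the domain a file was downloaded from."""
    host = urlparse(source_url).hostname or ""
    # Unknown domains get a neutral-but-cautious default.
    return DOMAIN_REPUTATION.get(host, 0.5)

def classify(file_suspicion: float, source_url: str,
             block_threshold: float = 0.6) -> str:
    """Combine a static suspicion score with origin reputation.

    The same file (same suspicion score) can be blocked when it arrives
    from a low-reputation domain but allowed from a trusted one.
    """
    risk = file_suspicion * (1.0 - domain_score(source_url))
    return "block" if risk >= block_threshold else "allow"

# The identical binary, judged differently by origin:
suspicion = 0.8
print(classify(suspicion, "https://download.example-vendor.com/setup.exe"))  # allow
print(classify(suspicion, "http://free-cracks.example.net/setup.exe"))       # block
```

This illustrates why tracking source URLs (including through HTTPS) matters: without the origin, both downloads above would have to receive the same verdict.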
Which is why antivirus companies want to track certain things that users do online. People think it's just because they are evil and want to track users, but the reality is that they want to monitor conditions so they can provide better protection. Detection with context can explain a lot about how malware originated, where it came from, whether it is spread from the same domains, etc.
So Avast is really getting back to good detection again? I hope so. I also like the good scores of Eset. I think it is maybe time to switch from Kaspersky to Avast or Eset.
The problem is, products like Norton rely on several cohesive engines functioning together. A single file-scan test doesn't show the whole picture. Norton scores amazingly well when all things are factored in and the test is conducted properly: signatures, SONAR, Insight, reputation, heuristics, and IPS. But when any one thing is disabled, or not tested, it can impact the overall performance. Norton has a very powerful IPS integrated that works with its AV to detect some pretty cool threats most things would miss. A similar concept is in place with ZyXEL and Fortinet UTM appliances that leverage IPS as part of the detection process. I don't know any security researcher who doesn't consider Norton 2015 a serious contender these days when all technologies are factored together.
Looks like it. I'll wait for one or two more tests to see if they keep consistency or even improve the scores, but yeah...
See the post "Windows Defender Whacked"; it also applies to the faulted AV testing discussed in this thread. Let's discuss this in the other thread, please.
With all those layers, it surprises me that NS is one of the lightest solutions. Great engineering, I must say.
MSE is not light by any definition, especially considering how few technologies do the security checking on stuff.
In that case, does anyone know why Avira only requests that their Pro version be tested, not their Free version, unlike both Avast and AVG? Something fishy concerning protection levels here? On the other hand, Panda ONLY submits its free version... /J
Protection, I believe, is the same for both, except you don't get the mail module and game mode with the free version. For testing purposes, I think offering the pro version is a form of promotion along the lines of the perception that 'paid is better'. There is nothing fishy about protection levels; the free version, I believe, has the same detection as the pro version, except you get some promotional material with it.
Doesn't the pro version come with the web protection module? In the free version, I believe you are required to install the toolbar for web protection.