Discussion in 'other anti-virus software' started by Inside Out, Oct 14, 2013.
It's always nice to see how well Emsisoft performs. Rock solid file and behaviour detection.
Some people never read the test reports and therefore easily misunderstand/misinterpret test results. If you see e.g. VB tests where practically everyone gets 100.0%, you should understand that this is because the products were tested against a very specific set (e.g. Wildlist samples). Most file detection tests nowadays use only a small set of a few hundred thousand samples and limit the samples to prevalent ones (either prevalent files or prevalent families), so it is no surprise that most products score between 89-99% (we never had an FDT at AV-C where 100% was reached; only in the WPDT does this happen). It is all written in the test reports of every testing lab, so make sure to read before you interpret.
They do not provide either with the latest product versions.
@ IBK: Would you please fix the broken link to the false alarm test?
And Microsoft says exactly the same:
"We used to have part of our team directed towards predicting test results and figuring out what might be in someone’s test. There’s always a cost to that. If they are doing that work they are not looking at those threats that are affecting our customers. We always felt that was wrong. There’s something not right about that – we’re not doing the best job for our customers."
They stopped doing that and they began to score poorly in file detection tests but they maintain that their protection is now better than before because they allocate resources where they should be.
So it's Kaspersky, Symantec and Microsoft all saying the same thing. Are they wrong? Are they lying? It doesn't matter, you'll keep reading here:
"Great result for QuadrupleScanner Soft "
Well done Panda and Forti, what happened to Avast?
Yes, dodgy stuff does happen in detection tests, but what are they really boycotting? Kaspersky, who can surely do R&D just fine, make it clear that they're no fans of detection tests either, but they enter them anyway and even get good results legitimately.
AV-C is not a testing lab where "predicting samples" is possible; that is another lab, where, based on its current sample selection criteria, this would indeed be possible. AV-C makes use of telemetry data and prevalence, as Microsoft wrote here.
Kaspersky wrote several blogs about testing, including one which covers what happens when testing labs change procedures (e.g. remove some tests that certain vendors do not like or find difficult to pass) to accommodate those vendors, so if you link to them, you should read them in full, e.g. here.
AV-C is amongst the labs which prevent vendors from cheating and has in several cases demonstrated how some vendors were trying to cheat in tests (so even if you do not see us pointing fingers, you may hear vendors talking about it so that other testers become aware and take countermeasures like us too).
Nearly two years have passed since your organization tested Symantec in the Real-World Protection test in the final quarter of 2011. You probably need no reminder that Symantec was clearly the strongest vendor in that final series of tests.
Perhaps you can explain why you have ignored Symantec in your most advanced testing series? Meanwhile, who is paying for Symantec's testing now that they are no longer participating?
Why would Microsoft admit to having cheated if they hadn't?
Avira is the real winner.
A question for IBK:
How is AV-Comparatives going to up its game on the next Mac tests? Knowing what "features" a Mac AV has is nice, but we are way past that now. Right-clicking on a folder full of Mac malware, while not scientific testing, sure tells us Mac people which AV is being proactive. A simplistic Mac malware test with 300 files in it would not work in the PC world of malware testing, but it works in the very, very young field of Mac malware testing. Because up till now you have only given us a "features" test. It appears Intego, Comodo and MacKeeper (<-- shady business practices) have written some more definitions to do better in the test. That can only help us Mac users.
I can only assume you have seen this Mac malware test.
This test at least gives us Mac people an idea of which programs are worth our time to test for our liking.
I have no idea how you would be able to do a real world test of Mac AV.
Maybe hire a grayhat Unix guy to write some iframe redirects and some common Mac authentication-bypass droppers in your lab with known Mac malware and see who catches it; duplicate the best "real world" scenario you can in the lab.
What are your thoughts on AV-Comparatives doing more advanced Mac malware testing??
Surprised Avast! did as poorly as it did, but I don't put too much stock in on-demand tests, especially if the samples aren't allowed to execute.
We already did a real-world test of Mac malware where samples were not only scanned but also executed/installed.
A side note included in the Mac PDF at AV-C:
"All are distinctly malicious, functioning programs and were seen in-the-field in 2013. As usual, we did not include any potentially unwanted or grey samples (adware, hacking tools, etc.) in the set. We also excluded component files (which could be in the thousands) as these cannot run and do not pose a risk by themselves; certain magazine tests tend to use such files just because they are detected by various products, but we consider components to be irrelevant. "
Thanks for the news and for the mention
File detection tests are always my favorite "reviews" and the one I blindly trust above all others!
You know, Avira Free is probably the AV product that has garnered the most attention and recognition from millions of users around the world.
This is something that competitors with paid products just can't stand... and they have friends
Just read a test report on a Bugatti Veyron: they tried to drive it up an unpaved road and over fields to the top of a mountain, and because it didn't do what the testers wanted they concluded it was a poor car, even though that's outside its designed use. Bit like testing Norton in a way it's not designed to protect a PC and concluding it's not up to the task! (Made the bit about the road test up; used it to make a point!)
I have only one complaint regarding this test: why the heck did they choose Avira's paid product and compare it to free products like Panda and Avast? I like Avira, but I can't really compare it to the likes of Avast etc.
Panda Cloud performing quite impressively. Perfect score with AV-C's September Real World Test (along with EMSI and TrendMicro). Not bad for a free product.
You wouldn't believe how well Avira did, either.
Afaik, it is up to each vendor to supply the product they want to be tested, so Avast chose Avast Free, and Avira their paid product. But Avira could have entered Avira Free if they wanted to, and Avast could have entered Avast Pro/IS if they liked. This is afaik, but it may have changed.
@SweX: it is as you said. Panda and Avast wanted to participate with their free versions, mainly in order to show that their free versions can stand up to the commercial versions (as the free versions often contain the same protection features as the paid versions, but may lack other additional non-security-related features, premium support, etc.).
My guess is they put forward the product they felt would give the best result(s).
The last time I looked, it was a Mac AV "feature review"; I saw one sentence on testing malware. All I got out of that test was what "features" each AV program had. I really did not see any AV malware test. Did I miss a true AV test? I will have to go back to your site and look.