AV Comparatives Sep 2013 File Detection Test

Discussion in 'other anti-virus software' started by Inside Out, Oct 14, 2013.

Thread Status:
Not open for further replies.
  1. FleischmannTV

    FleischmannTV Registered Member

    Joined:
    Apr 7, 2013
    Posts:
    1,094
    Location:
    Germany
    It's always nice to see how well Emsisoft performs. Rock solid file and behaviour detection.
     
  2. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,873
    Location:
    Innsbruck (Austria)
    Some people never read the test reports and therefore easily misunderstand/misinterpret test results. If you see e.g. VB tests where practically everyone gets 100.0%, you should understand that this is because the products were tested against a very specific set (e.g. Wildlist samples). Most file detection tests nowadays use only a small set of a few hundred thousand samples and limit the samples to prevalent ones (either prevalent files or prevalent families), so it is no surprise that most products score between 89-99% (we never had an FDT at AV-C where 100% was reached; this only happens in the WPDT). It is all written in the test reports of every testing lab, so make sure to read before you interpret ;).
     
  3. malexous

    malexous Registered Member

    Joined:
    Jun 18, 2010
    Posts:
    830
    Location:
    Ireland
    They do not provide either with the latest product versions.
    https://community.norton.com/t5/Nor...nitions-on-off-switch-missing/m-p/983927#M298
     
  4. Inside Out

    Inside Out Registered Member

    Joined:
    Sep 17, 2013
    Posts:
    421
    Location:
    Pangea
    @ IBK: Would you please fix the broken link to the false alarm test?
     
  5. vojta

    vojta Registered Member

    Joined:
    Feb 26, 2010
    Posts:
    830
    And Microsoft says exactly the same:

    http://www.pcpro.co.uk/news/securit...signed-to-be-bottom-of-the-antivirus-rankings

    "We used to have part of our team directed towards predicting test results and figuring out what might be in someone’s test. There’s always a cost to that. If they are doing that work they are not looking at those threats that are affecting our customers. We always felt that was wrong. There’s something not right about that – we’re not doing the best job for our customers."

    They stopped doing that, and they began to score poorly in file detection tests, but they maintain that their protection is now better than before because they allocate resources where they should be.

    So it's Kaspersky, Symantec and Microsoft all saying the same thing. Are they wrong? Are they lying? It doesn't matter, you'll keep reading here:

    "Great result for QuadrupleScanner Soft :thumb: "
     
  6. ance

    ance formerly: fmon

    Joined:
    May 5, 2013
    Posts:
    1,359
    Well done Panda and Forti, what happened to Avast? :doubt:
     
  7. Inside Out

    Inside Out Registered Member

    Joined:
    Sep 17, 2013
    Posts:
    421
    Location:
    Pangea
    Yes, dodgy stuff does happen in detection tests, but what are they really boycotting when Kaspersky, who can surely do R&D just fine, make it clear that they're no fans of detection tests either, yet enter them anyway and even get good results legitimately?
     
  8. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,873
    Location:
    Innsbruck (Austria)
    AV-C is not the testing lab where "predicting samples" is possible; that is another lab where, based on its current sample selection criteria, this would indeed be possible. AV-C makes use of telemetry data and prevalence, as Microsoft wrote here.
    Kaspersky wrote several blogs about testing, including one which covers what happens when testing labs change procedures (e.g. remove some tests that certain few vendors do not like or find difficult to pass) to accommodate some vendors, so if you link to them, you should read them in full, e.g. here.
    AV-C is among the labs which prevent cheating by vendors and has in several cases demonstrated how some vendors were trying to cheat in tests (so even if you do not see us pointing fingers, you may hear vendors talking about it so that other testers become aware and take countermeasures like us too).
     
  9. scoopnog

    scoopnog Registered Member

    Joined:
    Oct 3, 2012
    Posts:
    8
    Nearly two years have passed since your organization tested Symantec in the Real-World Protection Test, in the final quarter of 2011. You probably need no reminder that Symantec was clearly the strongest vendor in that final series of tests.

    Perhaps you can explain why you have ignored Symantec in your most advanced testing series? Meanwhile, who is paying for Symantec's testing now that they are no longer participating?
     
  10. Inside Out

    Inside Out Registered Member

    Joined:
    Sep 17, 2013
    Posts:
    421
    Location:
    Pangea
    Why would Microsoft admit to having cheated if they hadn't?
     
  11. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA

    Avira is the real winner. :argh: :thumb:
     
  12. Blueshoes

    Blueshoes Registered Member

    Joined:
    Feb 13, 2010
    Posts:
    226
    A question for IBK:


    How is AV-Comparatives going to up its game on the next Mac tests? Knowing what "features" a Mac AV has is nice, but we are way past that now. Right-clicking on a folder full of Mac malware, while not scientific testing, sure tells us Mac people which AV is being proactive. A simplistic Mac malware test with 300 files in it would not work in the PC world of malware testing, but it works in the very, very young field of Mac malware testing, because up till now you have only given us a "features" test. It appears Intego, Comodo and MacKeeper (<-- shady business practices) have written some more definitions to do better in the test. That can only help us Mac users.

    I can only assume you have seen this Mac malware test.


    http://securityspread.com/detection-rate-results



    This test at least gives us Mac people an idea of which programs are worth our time to test for our liking.

    I have no idea how you would be able to do a real world test of Mac AV.

    Maybe hire a grayhat Unix guy to write some iframe redirects and some common Mac authentication-bypass droppers in your lab with known Mac malware and see who catches it; do the best "real world" scenario you can duplicate in the lab.

    What are your thoughts on AV-Comparatives doing more advanced Mac malware testing??
     
  13. Brandonn2010

    Brandonn2010 Registered Member

    Joined:
    Jan 10, 2011
    Posts:
    1,853
    Surprised Avast! did as poorly as it did, but I don't put too much stock in on-demand tests, especially if the samples aren't allowed to execute.
     
  14. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,873
    Location:
    Innsbruck (Austria)
    We already did a real-world test of Mac malware where samples were not only scanned but also executed/installed.
    A side note included in the Mac PDF at AV-C:
    "All are distinctly malicious, functioning programs and were seen in-the-field in 2013. As usual, we did not include any potentially unwanted or grey samples (adware, hacking tools, etc.) in the set. We also excluded component files (which could be in the thousands) as these cannot run and do not pose a risk by themselves; certain magazine tests tend to use such files just because they are detected by various products, but we consider components to be irrelevant. "
     
  15. Macstorm

    Macstorm Registered Member

    Joined:
    Mar 7, 2005
    Posts:
    2,641
    Location:
    Sneffels volcano
    Thanks for the news and for the mention :)

    File detection tests are always my favorite "reviews" and the one I blindly trust above all others!
     
  16. Macstorm

    Macstorm Registered Member

    Joined:
    Mar 7, 2005
    Posts:
    2,641
    Location:
    Sneffels volcano
    You know, Avira Free is probably the AV product that has garnered the most attention and recognition from millions of users around the world.
    This is something that competitors with paid products just can't stand... and they have friends ;)
     
  17. steve1955

    steve1955 Registered Member

    Joined:
    Feb 7, 2004
    Posts:
    1,384
    Location:
    Sunny(in my dreams)Manchester,England
    Just read a test report on a Bugatti Veyron: they tried to drive it up an unpaved road and over fields to the top of a mountain, and because it didn't do what the testers wanted, they concluded it was a poor car, even though that's outside its designed use. Bit like testing Norton in a way it's not designed to protect a PC and concluding it's not up to the task! (Made the bit up about the road test; used it to make a point!)
     
  18. Malware fighter

    Malware fighter Registered Member

    Joined:
    Jan 31, 2011
    Posts:
    253
    I have only one complaint regarding this test: why the heck did they choose Avira paid and compare it to other free products like Panda and Avast? o_O I like Avira but I can't really compare it to the likes of Avast etc.
     
  19. jnthn

    jnthn Registered Member

    Joined:
    Sep 22, 2010
    Posts:
    185
    Panda Cloud is performing quite impressively: a perfect score in AV-C's September Real-World Test (along with EMSI and TrendMicro). Not bad for a free product. :D
     
  20. Inside Out

    Inside Out Registered Member

    Joined:
    Sep 17, 2013
    Posts:
    421
    Location:
    Pangea
  21. SweX

    SweX Registered Member

    Joined:
    Apr 21, 2007
    Posts:
    6,429
    Afaik, it is up to each vendor to supply the product they want to be tested, so Avast chose Avast Free, and Avira their paid product. But Avira could have tested Avira Free if they wanted to do that.
    And Avast could have tested Avast Pro/IS if they would like.

    This is afaik, but it may have changed too. :doubt:
     
  22. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,873
    Location:
    Innsbruck (Austria)
    @SweX: it is as you said. Panda and Avast wanted to participate with their free versions, mainly in order to show that their free versions can stand up to the commercial versions (as the free versions often contain the same protection features as the paid versions, but may lack other additional non-security-related features or premium support etc.).
     
  23. aztony

    aztony Registered Member

    Joined:
    Sep 9, 2012
    Posts:
    603
    Location:
    The Valley Arizona
    My guess is they put forward the product they felt would give the best result(s).
     
  24. Blueshoes

    Blueshoes Registered Member

    Joined:
    Feb 13, 2010
    Posts:
    226
    The last time I looked, it was a Mac AV "feature review"; I saw one sentence on testing malware. All I got out of that test was what "features" each AV program had. I really did not see any AV malware test. Did I miss a true AV test? o_O I will have to go back to your site and look.
     
  25. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,873
    Location:
    Innsbruck (Austria)