AV-Comparatives May 2006 Retrospective Test Table Now Available

Discussion in 'other anti-virus software' started by profhsg, May 26, 2006.

Thread Status:
Not open for further replies.
  1. .....

    ..... Registered Member

    Joined:
    Jan 14, 2005
    Posts:
    312
    The false positive test set is different from the test set in the detection.
     
  2. TeknO

    TeknO Registered Member

    Joined:
    Feb 18, 2005
    Posts:
    147
    Location:
    Istanbul, TURKEY
    The test results of Norman and BitDefender are behind TrustPort's. That was very interesting to me :) Could you explain it? Perhaps it just means 1+1=2 :)
     
  3. TeknO

    TeknO Registered Member

    Joined:
    Feb 18, 2005
    Posts:
    147
    Location:
    Istanbul, TURKEY
    Could you explain the detection formulation?
    Total detection = correct detection + false positive ... or ??
     
  4. rdsu

    rdsu Registered Member

    Joined:
    Jun 28, 2003
    Posts:
    4,537
    Remember that Norman and BitDefender don't detect the same samples, so if you add up the detection of both you will have more samples detected... ;)

    That is why some AVs have two engines...
     
  5. .....

    ..... Registered Member

    Joined:
    Jan 14, 2005
    Posts:
    312
    IBK uses two sets of samples: Malware and "Normal" files.

    For detection the malware samples are used.
    For Speed/ false positives the "normal" files are used.

    The samples for each test are different and do not affect the results of either test, but rather the RATING of a product. You should read the report; it contains lots of useful information.
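    The two-set methodology described above can be sketched as follows. This is a minimal illustration of the idea only; the numbers are made up and are not from the AV-Comparatives report.

    ```python
    # Sketch of the two independent measurements: detection is measured
    # only against the malware set, false positives only against the
    # separate clean-file set. Illustrative numbers, not IBK's data.

    def detection_rate(detected: int, total_malware: int) -> float:
        """Percentage of the malware sample set that was detected."""
        return 100.0 * detected / total_malware

    def false_positive_count(flagged_clean: int) -> int:
        """False positives counted on the separate clean-file set."""
        return flagged_clean

    # Example: a scanner detects 5400 of 10000 malware samples
    # and flags 3 files from the clean set.
    rate = detection_rate(5400, 10000)   # 54.0 - unaffected by the clean set
    fps = false_positive_count(3)        # 3 - unaffected by the malware set
    print(rate, fps)
    ```

    The point being made in the post: changing the clean-file set can change a product's rating (via FPs), but never its detection percentage.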
     
  6. rdsu

    rdsu Registered Member

    Joined:
    Jun 28, 2003
    Posts:
    4,537
    No. The false-positive test only measures the quality of an AV's detection...

    I will never trust an AV that has a huge number of false positives...
     
  7. TeknO

    TeknO Registered Member

    Joined:
    Feb 18, 2005
    Posts:
    147
    Location:
    Istanbul, TURKEY
    Yes, of course, I know that. But my calculation holds on the FP side. :)
    BitDefender FP (5) + Norman FP (3) = TrustPort ("8")
    Sorry, it was only a joke.

    Thanks AV-Comparatives.
     
  8. rdsu

    rdsu Registered Member

    Joined:
    Jun 28, 2003
    Posts:
    4,537
    :D :D :D
     
  9. Wolfe

    Wolfe Registered Member

    Joined:
    Jan 23, 2005
    Posts:
    160
    Great test once again, IBK - truly one of the very few independent tests on the web! ;)
     
  10. TAP

    TAP Registered Member

    Joined:
    Aug 17, 2004
    Posts:
    344
    I totally agree. If you manage a large number of computers on a corporate network, a single false positive can sometimes cause as big a headache as (or a bigger one than) a false negative (missing a real virus).

    Good to see Norton/Symantec didn't cause any false positives; it's always a very good and reliable AV solution.
     
  11. TNT

    TNT Registered Member

    Joined:
    Sep 4, 2005
    Posts:
    948
    Not in my experience. It might not have a large amount of false positives, but its detection of in-the-wild trojans is awful. Sorry for anybody who likes them. From my actual, real-world experience, I honestly wouldn't recommend Symantec to my worst enemy.
     
  12. kalpik

    kalpik Registered Member

    Joined:
    May 26, 2005
    Posts:
    369
    Location:
    Delhi, India
    Seems like NOD32 is not the *undisputed* market leader in heuristics anymore! :(
     
  13. SDS909

    SDS909 Registered Member

    Joined:
    Apr 8, 2005
    Posts:
    333
    VBA32 scoring 56% is absolutely NOT surprising.

    Superb.
     
  14. Stan999

    Stan999 Registered Member

    Joined:
    Sep 27, 2002
    Posts:
    566
    Location:
    Fort Worth, TX USA
    VBA32 has too many FP's.
     
  15. FRug

    FRug Registered Member

    Joined:
    Feb 7, 2006
    Posts:
    309
    I wonder whether AntiVir would have gotten an Advanced+ rating if IBK had tested with the Medium heuristic setting... It seems somewhat weird to see it rated Advanced alongside other AVs that have almost 30% less detection....
    False positives may be an issue, but I still find it kind of weird to see really effective detection end up, in the overall classification, on par with scanners that have quite low detection rates...

    The version IBK tested was the very first release of the new AntiVir heuristics; FPs have to be expected from a brand-new version that was only beta-tested by a couple of hundred or thousand users. I'm pretty sure the FPs will have been eliminated or reduced by now; after all, there have been numerous new engine versions since...

    What would also be interesting: how many files were in that false-positive set?

    I'm not criticizing IBK's testing methodology, but I'm kind of surprised by the percentages needed for the Standard/Advanced/Advanced+ ratings. It seems like times are changing for those who actually work on heuristics, those who do it a bit, and those who don't seem to care all that much and are lucky to have had some samples before IBK had them in his set.

    Looking at the report graph, I would distinguish between Advanced+ (50%+), Advanced (40%+, or maybe 35%+), and Standard (20%+). Everything below 20% can in my opinion be considered 'not really trying to do proactive detection'. Many FPs might lead to a downgrade as well, but I think the minimum level needed to reach an Advanced rating is set too low to be meaningful.... This may have been different in previous tests, but the numbers have changed significantly since then.
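    The rating bands proposed in that paragraph could be written out like this. To be clear, these thresholds are FRug's suggestion, not AV-Comparatives' actual scheme, and the FP-based downgrading he mentions is omitted.

    ```python
    # Sketch of the rating bands FRug proposes (his suggestion, not the
    # scheme AV-Comparatives actually used). FP downgrading omitted.

    def proposed_rating(detection_pct: float) -> str:
        if detection_pct >= 50:
            return "Advanced+"
        if detection_pct >= 40:
            return "Advanced"
        if detection_pct >= 20:
            return "Standard"
        return "not really trying"

    print(proposed_rating(56))  # Advanced+
    print(proposed_rating(25))  # Standard
    ```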

    I'm also surprised by Norman; after all, their sandbox was supposed to be pretty good, no?
     
  16. FRug

    FRug Registered Member

    Joined:
    Feb 7, 2006
    Posts:
    309
    IBK: Uh, I just looked at the NOD32 FPs: "SFX archives". What does that mean? They had FPs on numerous SFX archives? If yes, shouldn't each be counted as a separate false positive? After all, you did count TrendMicro's installers separately, and the two MS service pack thingies as well.
     
  17. Blackcat

    Blackcat Registered Member

    Joined:
    Nov 22, 2002
    Posts:
    4,024
    Location:
    Christchurch, UK
    Looks poor when Heuristic Analysis is set to the HIGHEST level. BUT this is not the default setting, and if you try to crank the heuristics up to excessive, it gives you a warning window that this will lead to false positives.

    So in fact, when Heuristic Analysis is set to lower settings, the number of false positives drops to 24. Still high, but not as bad as 73!

    In fact, I have used VBA32 as a primary scanner for about 6 months now and I have yet to come across one false positive using the default settings.
     


  18. waters

    waters Registered Member

    Joined:
    Nov 8, 2004
    Posts:
    958
    AntiVir got 32 percent better detection than Avast, yet both are rated Advanced. The new AntiVir heuristics have only been out a short while, so how good are they now, or going to be?
     
  19. Blackcat

    Blackcat Registered Member

    Joined:
    Nov 22, 2002
    Posts:
    4,024
    Location:
    Christchurch, UK
    The scanning speed test is also interesting; I was very surprised that Panda seemingly has the fastest scanner.

    Apparently the scan settings for all AVs were set to the "best possible detection settings", but again, users of these AVs would not always use such settings.

    For example, in VBA32 most users would not use "thorough mode" all the time for on-demand scans. But I can confirm that TrustPort is a very slow scanner.
     


  20. Stefan Kurtzhals

    Stefan Kurtzhals AV Expert

    Joined:
    Sep 30, 2003
    Posts:
    702
    From my personal experience, it is very easy to get to 30% detection (so everyone below this didn't put any real work into their heuristics, IMHO).
    Getting above 50% is *a lot* of work, however.

    Also keep in mind that the tested version of AntiVir had a very small number of static runtime unpackers, no generic unpacking, no sandboxing and no variant detection. I can't understand how scanners with all these abilities are unable to score higher.

    And 54% isn't so far away from 58%. :cool:
     
  21. chaos

    chaos Registered Member

    Joined:
    Jan 19, 2006
    Posts:
    97
    Location:
    Greece
    People shouldn't pay too much attention to these independent results, as sometimes these "independent" testing organisations may not be so independent after all.

    People involved in these efforts usually start with good intentions.
    But after their tests have gained publicity and begun to determine the quality of a product in the eyes of consumers, companies come into the game. They find it more cost-effective to "donate" some cash for "independent" research than to advertise their product.
    Also, for activities that require this much effort and time, doing it for free is not such a good idea. Additionally, sponsors would like to see some kind of profit in exchange for their cash, and preferably, of course, more cash.

    I already found it suspicious the previous time that a company in the AT test decided not to participate for their own reasons. The guy who makes the tests was being quite aggressive, bashing the company for not wanting to participate and implying, without any specific results, that the product had a very low malware detection rate.
    It is also very suspicious that some companies only get the Advanced rating because of a relatively high number of false positives. We are talking about detection rates here, not how convenient a program is.

    What is my point?
    Study the results of all the available tests, but always be aware that companies will do almost anything to gain customers, and some people will also do anything for cash. The consensus should always be based on facts, not blind trust in shady methodologies or ratings that try to sum up things that are actually irrelevant.
     
  22. Longboard

    Longboard Registered Member

    Joined:
    Oct 2, 2004
    Posts:
    3,238
    Location:
    Sydney, Australia
    So, my impressions:

    On-demand: Symantec has the best detection with no FPs.
    Crap proactive detection, though.

    AVK, NOD32 and KAV have the best proactive detection but are not the best scanners.

    My head hurts.

    Does anything here make anybody change their first-choice AV?

    It also reinforces the need for an AT as part of the fortress.
     
    Last edited: Jun 1, 2006
  23. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    It may not be a good idea, but I find it more fair.

    That's wrong thinking. What people want are unbiased results; if results were faked, they would be senseless. If a company asked me for such an illegal thing, I would probably post it somewhere, as I would find it very bad behaviour of which people should be informed.

    The reason was the low detection rate (I know what the result was, and I told that AT company; after they saw the result, they asked not to be listed). I noted that fact in the report, and people used it for bashing, while I was trying to defend their decision a bit. Note: in the AT test too, no company donated anything, nor did I ask for anything. It was done just to inform people. My philosophy may sound stupid, but I find it deals justly.

    A product that gets 99% in the proactive test but also flags all clean files is not really useful, and not as good as one that gets e.g. 41% but has no false positives.
    If you do not care about false positives, you can always just rely on the number in the proactive test. ;)
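    IBK's point can be illustrated with a toy comparison. The ranking rule below (fewest FPs first, then highest proactive detection) and all numbers are invented for illustration; they are not AV-Comparatives' actual weighting.

    ```python
    # Toy comparison in the spirit of IBK's remark: once false positives
    # are taken into account, a modest detection rate with no FPs can
    # rank above a near-perfect rate that also flags clean files.
    # Ordering rule and numbers are assumptions for illustration only.

    products = [
        ("flags-everything", 99.0, 500),  # (name, proactive %, FP count)
        ("cautious",         41.0,   0),
    ]

    # Sort by fewest FPs first, then by highest proactive detection.
    ranked = sorted(products, key=lambda p: (p[2], -p[1]))
    print([name for name, *_ in ranked])  # ['cautious', 'flags-everything']
    ```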
     
  24. Macstorm

    Macstorm Registered Member

    Joined:
    Mar 7, 2005
    Posts:
    2,642
    Location:
    Sneffels volcano
    Anyone using GData AntivirusKit 2006 here? I'm very impressed by how it scores in all the tests :eek: Opinions, please..
     
  25. Blackcat

    Blackcat Registered Member

    Joined:
    Nov 22, 2002
    Posts:
    4,024
    Location:
    Christchurch, UK
    It is a very good AV; the KAV engine for signatures and BitDefender for heuristics.

    However, some users have reported problems with resource usage and support.
     