AV-Comparatives Results - Nov 2007 Retrospective/ProActive Test

Discussion in 'other anti-virus software' started by C.S.J, Nov 30, 2007.

Thread Status:
Not open for further replies.
  1. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    well, we weren't the ones with the highest FP count :)

    and it was new technology being tested, and the detection did increase, so I'm happy enough with this test :)

    just need to work on signature detections now and, of course, like many others... the FPs.
     
  2. 031

    031 Registered Member

    Joined:
    Sep 5, 2007
    Posts:
    187
    Location:
    Bangladesh
    NOD32 is not just an antivirus, it is a piece of art.
    71% detection and not a single false positive...
    unbelievable. 10-20% more detection and they won't need any signatures.
     
  3. Arup

    Arup Guest


    Agreed, and it's still free :)

    FPs aside, Avira is still at 81%.
     
  4. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    well, praise must be given to BitDefender,

    the BIGGEST, most under-rated AV/suite on the planet.

    I'm glad that one of my 3 licences is in fact a BitDefender one.

    it truly is the lightest and fastest suite available. :thumb:

    my brother uses it constantly, while I use Dr.Web... but seeing as it's a 2-PC licence, I have it on a snapshot too. :)

    nice proactive detection, but the FPs are there... doesn't bother me :)
     
  5. Mele20

    Mele20 Former Poster

    Joined:
    Apr 29, 2002
    Posts:
    2,495
    Location:
    Hilo, Hawaii
    I tend to agree. They did incredibly well. This test comes at the right time for Eset, as it will offset some of the disappointment so many are voicing with regard to stability etc. with 3.0. In this test NOD32 is way above the others.
     
  6. Mele20

    Mele20 Former Poster

    Joined:
    Apr 29, 2002
    Posts:
    2,495
    Location:
    Hilo, Hawaii
    What happened to F-Secure? They did poorly, except that they had only one FP, so they did very well in that regard. Don't they still use Kaspersky's engine and F-Prot?
     
  7. ASpace

    ASpace Guest


    Amen! :thumb:
     
  8. Stefan Kurtzhals

    Stefan Kurtzhals AV Expert

    Joined:
    Sep 30, 2003
    Posts:
    702
    My 2 cents...

    First, almost every program in the test improved by about 8-10%, even without changing their scan engines much. My guess is that this time the test set contains a huge amount of malware that was easy to catch, even for those without any proper heuristics. So don't get too excited about 40% = advanced+. In my eyes, that is silly - 40% is easy to get without putting too much work into the heuristics. >40% should be standard, >60% advanced and >80% adv+.

    NOD32 and zero false positives? That's utter nonsense. They have a very low rate of false positives, I agree with that - but nothing near zero. And the FP rate of the KAV7 heuristics was somewhere near Dr.Web's when I was scanning things... VERY weird that both programs score so low FP-wise just in this test...

    BTW, the programs with "many" false positives got reduced by 2 levels (from adv+ to std). Why didn't those programs with "some" false positives get reduced by 1 level?

    Also, this test does not reflect the real-life situation "on the ground" at all. The daily new ITW malware I see has almost zero proactive detection - by *any* program. Just take a brand new Zlob, DnsChanger, Zhelatin/Tibs, Vundo, Pandex, Zbot, Ntos, Bzub... The test was performed with 3-month-old versions of the programs - no way in hell they could detect the above malware if they are older than 2-3 weeks, sometimes even after 1 day already.
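
    To put rough numbers on what I mean, here is a quick sketch of how I would score it - the detection thresholds are the ones I suggested above, while the FP bands and cut-offs are made up for illustration and are not AV-Comparatives' actual methodology:

    [CODE]
    # Sketch of the rating scheme suggested above (hypothetical FP bands,
    # not AV-Comparatives' real methodology).
    AWARDS = ["no award", "standard", "advanced", "advanced+"]

    def proactive_award(detection_pct, false_positives):
        # Suggested detection thresholds: >40% standard, >60% advanced, >80% adv+
        if detection_pct > 80:
            level = 3
        elif detection_pct > 60:
            level = 2
        elif detection_pct > 40:
            level = 1
        else:
            level = 0

        # Hypothetical FP penalty: "some" FPs cost one level, "many" FPs cost two
        if false_positives > 15:      # "many"
            level -= 2
        elif false_positives > 5:     # "some"
            level -= 1

        return AWARDS[max(level, 0)]

    print(proactive_award(81, 16))  # high detection, many FPs -> "standard"
    print(proactive_award(71, 0))   # good detection, zero FPs -> "advanced"
    [/CODE]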
     
  9. solcroft

    solcroft Registered Member

    Joined:
    Jun 1, 2006
    Posts:
    1,639
    Agreed. The detections for new variants of prevalent malware are abysmal, as can be seen at Castlecops and some underground malware exchange forums. Personally I have no idea how or why some products score so high; it'd be interesting to know what samples Andreas uses for these tests.
     
  10. Stefan Kurtzhals

    Stefan Kurtzhals AV Expert

    Joined:
    Sep 30, 2003
    Posts:
    702
    solcroft, only very few malware families are "highly optimized" to avoid detection. Compared to the total number of new malware that shows up every day, they are maybe 1-2% of the samples.

    However, those are the malware families that users mostly get infected with. Of course, the commercial malware authors who want to earn money keep making their "products" undetectable all the time and also seek to achieve "optimal distribution". More infections that stay undetected = more money. :gack:
     
  11. solcroft

    solcroft Registered Member

    Joined:
    Jun 1, 2006
    Posts:
    1,639
    I'm not really a hardcore malware collector, so I don't know how many zoo samples get released each day. Of the prevalent ones that I do see everywhere, though, it's the situation I mentioned earlier: really horrible proactive detection, save for a few select vendors - maybe because the malware writers don't bother evading those scanners, or maybe because their proactive detection really is outstanding.

    At any rate, it raises an interesting question about the test: do scanners that score well only do so on non-optimized zoo samples? Because it really doesn't matter if you score 95% proactive if the prevalent malware that is really ITW out there, and accounts for the majority of infections, consistently falls within the 5% you can't detect.
     
  12. Stefan Kurtzhals

    Stefan Kurtzhals AV Expert

    Joined:
    Sep 30, 2003
    Posts:
    702
    Most of the time, yes. How do you proactively detect malware when the malware writer actively changes the code, encryption etc. until no antivirus program detects it anymore, and only THEN releases the malware? The only thing you can do is update the detection as fast as possible and try to cover every possible detection-avoiding trick the malware writers could imagine.

    In some collections, when scanning without signatures, I get above 90% detection. That doesn't make me happy at all anymore, knowing that another wave of new/adapted Vundos is hailing down on users at the same time.
     
  13. R8y

    R8y Registered Member

    Joined:
    Nov 5, 2007
    Posts:
    33
    Location:
    South Africa
    Agreed. Zhelatin's and Zlob's authors were, and still are, very busy updating their little executables and distributing them through various channels. But from my past observation, major AV vendors have implemented generic/packer detection for these popular families. My own small and probably biased sample scan showed that for Zlob and Zhelatin, the so-called zero detection is quite difficult to achieve, and there will always be some AV on VT that picks them up with either heuristic/generic or packer detection (such as the technique used in Zhelatin/Storm)... But again, I am not into this and have never had a complete picture of the real-world situation with these malware families.
     
  14. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    It's not 3 months, it's 3 weeks this time.
     
  15. solcroft

    solcroft Registered Member

    Joined:
    Jun 1, 2006
    Posts:
    1,639
    One of the reasons why I don't trust blacklist scanners entirely.

    Just out of curiosity, is there any feasible method vendors can adopt to overcome this problem other than playing catch-up? I personally can't think of any short of introducing new technologies such as sandboxing or behavioral detection, but in my experience anti-malware vendors have proven brilliantly ingenious in the past.
     
  16. DVD+R

    DVD+R Registered Member

    Joined:
    Aug 2, 2006
    Posts:
    1,979
    Location:
    The Antipodes
    Re: Av-Comparatives Results


    Well, in that case, how come BitDefender only gets Standard when it got 44%? o_O And Avira gets Standard with 81%? :blink: Somehow that report seems like bollocks
     
  17. solcroft

    solcroft Registered Member

    Joined:
    Jun 1, 2006
    Posts:
    1,639
    Re: Av-Comparatives Results

    Penalized for FPs. Read the entire thread, or the report itself.
     
  18. R8y

    R8y Registered Member

    Joined:
    Nov 5, 2007
    Posts:
    33
    Location:
    South Africa
    A bottleneck, as I called it. I have been thinking about this for quite a while. Maybe there are some new technologies brewing which AV vendors would not disclose at the moment, both for competitive reasons and because of malware authors' interest, but with all the numerous new packers released daily and the evasive techniques, one of the major challenges for AV vendors is how to be more accurate and effective.
     
  19. EraserHW

    EraserHW Malware Expert

    Joined:
    Oct 19, 2005
    Posts:
    588
    Location:
    Italy
    I strongly agree
     
  20. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    I agree, and that is why these controlled laboratory tests are really insufficient in today's real world.
     
  21. Arup

    Arup Guest

    Re: Av-Comparatives Results

    Avira was penalized for 16 FPs even though it detected 81% of the samples. I would take the 16 FPs for the 2,500+ extra detections; I would live with the FPs for more detection.
     
    Last edited by a moderator: Dec 1, 2007
  22. zfactor

    zfactor Registered Member

    Joined:
    Mar 10, 2005
    Posts:
    6,102
    Location:
    on my zx10-r
    Re: Av-Comparatives Results


    agreed from me. Avira would remain on my system if it were not so crazy in price here... I really do love Avira
     
  23. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    the 16 FPs are not from the 25,000-sample test set.

    example: if there were only 50 files in the FP test, would you still accept 16?

    I think IBK will keep the ratings as they are. Personally it would not bother me, as I always check a file before deciding what to do with it, first by submitting it and then checking on the various free scan websites.
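
    to put that point into numbers, a quick back-of-envelope sketch (the clean-set sizes here are made up, since the report's actual FP-set size isn't quoted in this thread):

    [CODE]
    # Back-of-envelope: the same 16 FPs look very different depending on how
    # large the clean (FP) test set actually is. Set sizes below are hypothetical.
    false_positives = 16

    for clean_set_size in (50, 1000, 100000):
        fp_rate = false_positives / clean_set_size * 100
        print(f"{false_positives} FPs out of {clean_set_size} clean files = {fp_rate:.2f}% FP rate")

    # For comparison, the detection side is measured against a ~25,000-sample malware set.
    extra_detections = 2500
    print(f"{extra_detections} extra detections = {extra_detections / 25000 * 100:.0f}% of the malware set")
    [/CODE]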
     
  24. Arup

    Arup Guest

    The FP test did not have 50 files, and I repeat: I will take the 2,503 more detections and live with 16 FPs. I am fine with IBK's testing.
     
  25. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!