Virus Weekly Stats....0Day

Discussion in 'other anti-virus software' started by apm, Jan 9, 2008.

Thread Status:
Not open for further replies.
  1. apm

    apm Registered Member

    Joined:
    Mar 15, 2006
    Posts:
    164
    http://www.shadowserver.org/wiki/pmwiki.php?n=Stats.VirusWeeklyStats

    Vendor  Detected  Total  Percent

    AntiVir 329415 332300 99.13%
    Vexira 326631 332300 98.29%
    DrWeb 326452 332300 98.24%
    VirusBuster 326468 332300 98.24%
    AVG7 326360 332300 98.21%
    Clam 326195 332300 98.16%
    Norman 325852 332300 98.06%
    F-Secure 325728 332300 98.02%
    Avast 325388 332300 97.92%
    F-Prot6 325283 332300 97.89%
    McAfee 323491 332300 97.35%
    VBA32 322576 332300 97.07%
    F-Prot 316545 332300 95.26%
    Panda 248000 332300 74.63%
    BitDefender 159137 332300 47.89%
    Kaspersky 159126 332300 47.89%
    NOD32 99250 332300 29.87%
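
    (For what it's worth, the percent column looks like it is simply detected/total. A quick sketch in Python, using a few rows from the table above, reproduces it; the counts are just taken at face value from the page:)

        # Hypothetical helper, not part of the Shadowserver page: reproduce the
        # "percent" column from the detected/total counts listed above.
        stats = {
            "AntiVir":   (329415, 332300),
            "Kaspersky": (159126, 332300),
            "NOD32":     (99250, 332300),
        }
        for vendor, (detected, total) in stats.items():
            print(f"{vendor:10s} {detected / total:.2%}")  # e.g. AntiVir -> 99.13%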

    :rolleyes:
     
  2. JasSolo

    JasSolo Registered Member

    Joined:
    May 9, 2007
    Posts:
    414
    Location:
    Denmark
    How about FPs in this test?

    Cheers
     
    Last edited: Jan 9, 2008
  3. plantextract

    plantextract Registered Member

    Joined:
    Feb 13, 2007
    Posts:
    392
    Something doesn't compute:
    F-Secure 325728 332300 98.02%
    Kaspersky 159126 332300 47.89%

    I can't believe F-Secure's heuristics engine is that much better; no other test shows such a huge gap.

    BitDefender 159137 332300 47.89%
    BitDefender has a great heuristic engine and a very good detection rate in general (AV-Test, AV-Comparatives).
     
  4. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    Looks accurate to me. ;)
     
  5. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
  6. Edwin024

    Edwin024 Registered Member

    Joined:
    Nov 14, 2004
    Posts:
    1,008
    I'm curious about how they test; these results are so different from other tests... If this is true, Eset especially has a problem, with BitDefender and Kaspersky also in the relegation zone. Weird.

    EDIT: Another strange thing - the last edited files are from November 12, 2007. And that is for the weekly list....
     
    Last edited: Jan 9, 2008
  7. Bunkhouse Buck

    Bunkhouse Buck Registered Member

    Joined:
    May 29, 2007
    Posts:
    1,286
    Location:
    Las Vegas
    As usual, Avira is a wise choice.
     
  8. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    I too would like to know how they test.
     
  9. Bunkhouse Buck

    Bunkhouse Buck Registered Member

    Joined:
    May 29, 2007
    Posts:
    1,286
    Location:
    Las Vegas
    However they test, it makes Dr. Web look a whole lot more effective than the AV-Comparatives routine. I might even give the good Dr. another try on my boxes myself! :D
     
  10. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    That doesn't matter; what does is whether it's credible or not.

    I leave it up to anti-malware to give drweb some light ;)
     
  11. Bunkhouse Buck

    Bunkhouse Buck Registered Member

    Joined:
    May 29, 2007
    Posts:
    1,286
    Location:
    Las Vegas
    How do we know if any of them are credible? The empirical tests and the resulting data presuppose that the rigors of the test were/are as stated by those performing it. In reality, trust in the objective validity of various AV tests may beg the question. Not to argue from skepticism (which is a philosophical contradiction), but there must necessarily be a lot of "faith" in the integrity of the testers, and from my long years in the computer industry I would advise caution.
     
  12. MalwareDie

    MalwareDie Registered Member

    Joined:
    Dec 8, 2006
    Posts:
    500
    Absolute BS, just by looking at ClamAV's score.
     
  13. rogervernon

    rogervernon Registered Member

    Joined:
    Jul 16, 2006
    Posts:
    289
    Who was it who said "Never trust ANYTHING posted on Wiki"?
     
  14. dNor

    dNor Registered Member

    Joined:
    Oct 3, 2007
    Posts:
    212
    Location:
    Irvine, CA, USA
    Those are certainly some...unique...results. o_O
     
  15. larryb52

    larryb52 Registered Member

    Joined:
    Feb 16, 2006
    Posts:
    1,131
    They look a bit off, especially Kaspersky...
     
  16. Blackcat

    Blackcat Registered Member

    Joined:
    Nov 22, 2002
    Posts:
    4,024
    Location:
    Christchurch, UK
    Some discussion on previous results from this site here.
     
  17. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    Results on zero-day threats are interesting, but we need more information about how they test them and where the samples are from.

    Results over the whole year, 2007:

    It puts ClamAV, rightly so, in its place for 2007.


    F-Secure 98.89%
    F-Prot6 97.58%
    DrWeb 96.31%
    AntiVir 94.11%
    Norman 90.12%
    Kaspersky 83.19%
    McAfee 81.80%
    NOD32 81.54%
    Vexira 81.19%
    AVG7 78.94%
    F-Prot 77.03%
    BitDefender 74.16%
    VirusBuster 71.91%
    VBA32 67.62%
    Avast 63.59%
    Panda 44.75%
    Clam 11.18%

    Good results for Dr.Web over the last year for zero-day, but like I said... we need to dig deeper to see how the tests are done :)


    -------------------------------------------------

    If you look at the daily results for today:

    AntiVir scores well as always, but Dr.Web is fantastic too.

    We know AntiVir has the overall detection, so how can AntiVir detect 99% today on the daily tests?

    Surely these are not all FPs.

    Also, AVG and Avast scored 99% for today, and they are not known for FPs.

    NOD32 is at the bottom of the weekly test, but today it got 98%, while Kaspersky remains at 0.07%.

    So, are these tests legitimate for zero-day testing?

    But...

    Maybe someone can tell me why the results and the charts do not match?
     
    Last edited: Jan 9, 2008
  18. patrikr

    patrikr AV Expert

    Joined:
    Aug 9, 2005
    Posts:
    97
    Location:
    California, USA
    Looks good to me ;)

    But yeah, it's strange that our product's detection of Worm.Win32.VB.es (311390 last week) would be better than Kaspersky's detection of the same malware (153978 in the same period).

    I don't have insight into how these numbers are calculated, but I know who to ask; I will check it out.

    --
    Patrik
    F-Secure Security Labs
     
  19. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    With all due respect, you don't know your software too well. This is the second test in two weeks that points to F-Secure's ability to handle zero-day threats better than all the rest. It is because of DeepGuard, which Kaspersky does not have. The results are accurate based on an AV really being good at detecting zero-day threats, which F-Secure is because of its internal HIPS, while others are designed to detect anything, FPs and all.
     
  20. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    Doesn't KAV have the PDM?
    Anyway, the samples were not executed in this test, so DeepGuard and PDM were not in action.
     
  21. Edwin024

    Edwin024 Registered Member

    Joined:
    Nov 14, 2004
    Posts:
    1,008
    Guys... what 0-days? It says that all tests are from November 12, 2007...
     
  22. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    Let us know what you find out about it, Patrik ;)
     
  23. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    Kaspersky is fighting back with #1 for today, so it can do it...

    But over the past year, Dr.Web has been flying high. So much for the really low detection rate, apparently ;)
     
  24. Bunkhouse Buck

    Bunkhouse Buck Registered Member

    Joined:
    May 29, 2007
    Posts:
    1,286
    Location:
    Las Vegas
    And you replied to me above:

    "That doesn't matter; what does is whether it's credible or not."

    So now you think the test is credible?
     
  25. patrikr

    patrikr AV Expert

    Joined:
    Aug 9, 2005
    Posts:
    97
    Location:
    California, USA
    Of course I know our software and I also know Shadowserver and which version of our scanner they're using. The one they're using doesn't have DeepGuard, simple as that.

    The difference in this case seems to come down to one single detection (Worm.Win32.VB.es), which is a detection by the Kaspersky engine in our product. It should therefore be detected by both products, but it isn't for some reason.

    The c't test that showed DeepGuard performing really well was conducted in a totally different manner than simply scanning a file (in which case DeepGuard doesn't do anything). DeepGuard is behaviour-based and only triggers when a process is started, as IBK correctly pointed out.

    --
    Patrik
     