Pro-active virus detection test

Discussion in 'NOD32 version 2 Forum' started by mrtwolman, Jun 3, 2004.

Thread Status:
Not open for further replies.
  1. mrtwolman

    mrtwolman Eset Staff Account

    Joined:
    Dec 5, 2002
    Posts:
    613
  2. tazdevl

    tazdevl Registered Member

    Joined:
    May 17, 2004
    Posts:
    837
    Location:
    AZ, USA
    Let's also be sure to look at numbers from the other tests as well where NOD32 doesn't fare well at all.
     
  3. steve1955

    steve1955 Registered Member

    Joined:
    Feb 7, 2004
    Posts:
    1,384
    Location:
    Sunny(in my dreams)Manchester,England
    Strange how some folk "pick and choose" test results to fit their "wanted" result and ignore ones that don't fit (even different parts of the same test!)
     
  4. bsilva

    bsilva Registered Member

    Joined:
    Mar 24, 2004
    Posts:
    238
    Location:
    MA, USA
    I agree that it didn't do as well on other parts of the test, but it did get an overall rating of number 1 along with McAfee. No one was perfect.
     
  5. explanation

    explanation Guest

    NOD32 has roughly the same results for ITW viruses and new zoo samples (33% and 37%). The same holds for Panda. The other antivirus products have far worse results with ITW samples.
    Two explanations are possible:
    1. Virus writers test their creations against Norton, Kaspersky, and McAfee and don't consider NOD32 a threat (virus writers who want their creations to enter the wild list, of course).
    2. A virus detected heuristically by the best-selling AVs has no chance of entering the wild list.

    More seriously, what do the results for already-known viruses mean? Isn't that more a test of how quickly signatures are written?
     
  6. Arin

    Arin Registered Member

    Joined:
    May 1, 2004
    Posts:
    997
    Location:
    India
    Also remember that it's a pro-active test. Three months without updates gives me the creeps, but you have to admit NOD32 scored the best on new ITW viruses. If those were zoo viruses I'd have argued that those samples were unworthy. Good work ESET!
     
  7. Mr. Smith

    Mr. Smith Guest

    Way to half-quote a test there, pal! Let's look at the rest of them...

    Proactive Detection of Already-Known Samples:

    KAV: 82%
    McAfee: 72%
    Norton: 59%
    NOD32: 48%

    Proactive detection of other samples:

    McAfee: 59%
    RAV: 50%
    KAV: 23%
    Norton: 16%
    NOD32: 6%

    Overall though, total results look like KAV and McAfee won, with RAV and Panda close behind.
     
  8. Paul Wilders

    Paul Wilders Administrator

    Joined:
    Jul 1, 2001
    Posts:
    12,472
    Location:
    The Netherlands
    Ladies and gents,

    There's an older thread running on this test over here, coming with comments from IBK/Andreas Clementi, who actually performed this test. You might find his comments very enlightening.

    regards.

    paul
     
  9. Where's Norman in that test? What we can surmise from that test is that true heuristic detection is still pretty elusive for most AV products, don't you think?

    I noticed multi-engine products are missing from the test; it would have been interesting to see how products like F-Secure and AVK showed up.

    But I would have been most interested to see how Norman worked out, because its Sandbox system, while slower at scanning, has been proven to truly find REAL new threats.
     
  10. kjempen

    kjempen Registered Member

    Joined:
    May 6, 2004
    Posts:
    379
    @Where's Norman:

    Maybe if you actually read the posts in the thread which Paul Wilders suggested above, you would understand? That's the thread where the discussion of this test first started. Especially read IBK's posts in that thread.
     
  11. ronjor

    ronjor Global Moderator

    Joined:
    Jul 21, 2003
    Posts:
    57,737
    Location:
    Texas