ESET NOD32 Ranked 26th Most Effective Antivirus Tool Against New Malware by SRI

Discussion in 'ESET NOD32 Antivirus' started by JAB, Apr 15, 2008.

Thread Status:
Not open for further replies.
  1. JAB

    JAB Registered Member

    Joined:
    Apr 17, 2007
    Posts:
    36
  2. The Hammer

    The Hammer Registered Member

    Joined:
    May 12, 2005
    Posts:
    5,619
    Location:
    Toronto Canada
    Interesting? No, it's not even that. :rolleyes: Behind Clam, eTrust and Quickheal? o_O It would have made a good April Fools' Day post.
     
  3. JAB

    JAB Registered Member

    Joined:
    Apr 17, 2007
    Posts:
    36
    That's part of what makes it interesting. It makes me wonder why ESET would perform so poorly in those tests, when my general opinion of ESET is quite high.

    /jab
     
  4. stratoc

    stratoc Guest

    That's done it for me! Two years ago I went with NOD because of its lightness and reputation. I hate v3, and v2.7 has started throwing up Vista compatibility warnings since SP1. These forums were much more positive even a year ago. I have cut my losses and gone with Antivir for a while.
    I don't know how reputable those tests are, but the results are pretty shocking and don't fill me with confidence at all.
     
  5. Marcos

    Marcos Eset Staff Account

    Joined:
    Nov 22, 2002
    Posts:
    14,375
    1. Honeypots are prone to catching corrupted files; we've seen this many times in the past. ESET does not detect corrupted files. A serious tester provides the samples used in his test to AV vendors for verification. I'm not aware of ESET being contacted by that tester.

    2. A test set comprising 1,030 files is too small IMHO, compared to the roughly 800,000-900,000 samples used by other testers, and given that tens of thousands of new threats emerge daily.
     
  6. JAB

    JAB Registered Member

    Joined:
    Apr 17, 2007
    Posts:
    36
    1. That sounds like a reasonable hypothesis.

    2. If one is measuring detection against emerging threats, I see no reason why a sample of 1,000 would not be statistically meaningful. The key is whether the sample is approximately random. If SRI's sample is random, basic statistics suggests that the standard error of their measurement is only about 1.4 points. One can then be reasonably sure that if ESET were run against tens of thousands of emerging threats, ESET's success rate would fall between 69% and 75% (roughly a 95% confidence interval), which is not enough of a range to change its relative ranking by more than about two positions, to 24th.

    It is important to note that SRI is not a small, fly-by-night organization or a single individual with a hobby, nor do they purport to be performing a comprehensive test of anti-virus products. They submit what their honeypot catches to VirusTotal and tabulate the results. This means, I think, that ESET receives a copy of any missed files, although ESET wouldn't know they had originated with SRI.

    One of the reasons I posted the link to the article was in the hope that ESET would reach out to SRI to discuss methodology and obtain samples for analysis.

    /jab
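
    JAB's error bar above can be checked with a quick binomial calculation. Here is a minimal sketch in Python; the 72% detection rate and 1,030-sample test set are from the thread, and the 1.96 z-value is the standard multiplier for an approximate 95% confidence interval:

    ```python
    import math

    def detection_ci(detected_fraction, n, z=1.96):
        """Return the standard error and approximate 95% confidence
        interval for a detection rate measured on n random samples,
        using the normal approximation to the binomial distribution."""
        se = math.sqrt(detected_fraction * (1 - detected_fraction) / n)
        return se, (detected_fraction - z * se, detected_fraction + z * se)

    # ESET's reported result: 72% detection on 1,030 samples
    se, (lo, hi) = detection_ci(0.72, 1030)
    print(f"standard error: {se * 100:.1f} points")      # about 1.4 points
    print(f"95% CI: {lo * 100:.0f}% to {hi * 100:.0f}%")  # about 69% to 75%
    ```

    The numbers match the post: a standard error of about 1.4 points, and a 95% interval of roughly 69% to 75%, assuming the samples are a random draw from the population of emerging threats.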
     
  7. SmackyTheFrog

    SmackyTheFrog Registered Member

    Joined:
    Nov 5, 2007
    Posts:
    767
    Location:
    Lansing, Michigan
    Why the heck isn't v3 included in those tests?
     
  8. JAB

    JAB Registered Member

    Joined:
    Apr 17, 2007
    Posts:
    36
    Presumably because VirusTotal isn't using v3. That doesn't diminish the importance of your observation. Why doesn't VirusTotal use v3?

    /jab
     
  9. Marcos

    Marcos Eset Staff Account

    Joined:
    Nov 22, 2002
    Posts:
    14,375
    It doesn't matter whether v2 or v3 is used for testing. The question is how many of those files were actually functional, as files collected by honeypots are prone to corruption. Also, NOD32 users are protected against threats not otherwise detected at VirusTotal or by other on-demand scanners, as the web/email protection modules use much more sensitive heuristics as well as blocking of suspicious websites.
     
  10. Nike_P

    Nike_P Registered Member

    Joined:
    Mar 30, 2008
    Posts:
    122
    Location:
    Europe
    Very interesting...
    I see Avira, BitDefender, and AVG at the top again...
     
  11. JAB

    JAB Registered Member

    Joined:
    Apr 17, 2007
    Posts:
    36
    Yeah, I really like Avira, but their central management tool is unusable. And, their false positive rate is a bit high.

    /jab
     
  12. n8chavez

    n8chavez Registered Member

    Joined:
    Jul 19, 2003
    Posts:
    2,305
    Location:
    Location Unknown
    Is this just a result of 'faulty', or disagreeable, testing practices? I know that some AVs, such as Dr Web, dislike the way tests are done. DRW gets low results because of this. Is this the case with NOD?
     
  13. lucas1985

    lucas1985 Retired Moderator

    Joined:
    Nov 9, 2006
    Posts:
    4,047
    Location:
    France, May 1968
    - The tester uses Virustotal. He has no control over the settings of the AVs/AMs.
    - AVG better than Webwasher and Antivir, LOL :rolleyes:
     
  14. nameless

    nameless Registered Member

    Joined:
    Feb 23, 2003
    Posts:
    1,184
    Exactly! Like I always say, if it goes against preconceived, unsubstantiated notions, it's nonsense!
     
  15. steve1955

    steve1955 Registered Member

    Joined:
    Feb 7, 2004
    Posts:
    1,384
    Location:
    Sunny(in my dreams)Manchester,England
    You're not saying the only test results that are valid are the ones the guys complaining about this one agree with?!?
     
  16. eidolon

    eidolon Registered Member

    Joined:
    Apr 11, 2008
    Posts:
    7
    The question is, who gets to decide which are preconceived, unsubstantiated notions, and which are facts that are common knowledge?

    That task should not be left to the ignorant, I say...
     
  17. ok33

    ok33 Registered Member

    Joined:
    Mar 21, 2008
    Posts:
    12
    Interesting; I wonder why they used v2... I would say it's fair game anyway, since the detection would be the same as v3, am I correct? Congrats to BitDefender; no hard feelings, every vendor has its own way of doing things and no vendor is perfect. NOD32 still rocks!
     
  18. The Hammer

    The Hammer Registered Member

    Joined:
    May 12, 2005
    Posts:
    5,619
    Location:
    Toronto Canada
    You must be correct as far as detection goes, as Marcos says it doesn't matter whether they used v2 or v3.
     
  19. SmackyTheFrog

    SmackyTheFrog Registered Member

    Joined:
    Nov 5, 2007
    Posts:
    767
    Location:
    Lansing, Michigan
    I think he was saying it doesn't matter what version they tested with because the methodology was flawed in the first place.
     
  20. dazulrich

    dazulrich Registered Member

    Joined:
    Nov 2, 2007
    Posts:
    6
    How can the last product have missed 1031 binaries, if there were only 1030? o_O
     
  21. mrtwolman

    mrtwolman Eset Staff Account

    Joined:
    Dec 5, 2002
    Posts:
    613
    If you read between the lines: "The results do not take into consideration the false positive rate of a given tool, and thus a tool that declares everything to be infected would appear to have the highest true positive percentage rate."

    On the other hand, NOD32 detected 72 percent of samples, which is very close to the detection levels usually achieved in proactive/retrospective tests.

    IMHO, nothing to worry about.
     
  22. JAB

    JAB Registered Member

    Joined:
    Apr 17, 2007
    Posts:
    36
    That's an interesting point. But, I find it difficult to reconcile with the AV Comparatives reports. Looking at the SRI numbers vs AV Comparatives proactive/retrospective report:

    Low false positives:
    * F-Secure: 87% vs. 14%
    * Symantec: 78% vs. 35%
    * ESET: 72% vs. 71%

    Medium false positives:
    * AVG: 95% vs. 25%
    * Kaspersky: 93% vs. 40%
    * Avast: 88% vs. 37%
    * Microsoft: 88% vs. 35%
    * Norman: 88% vs. 33%
    * McAfee: 74% vs. 34%
    * Fortinet: 74% vs. 3%

    High false positives:
    * Avira: 95% vs. 81%
    * Bitdefender: 95% vs. 44%
    * Dr. Web: 83% vs. 39%
    * F-Prot: 83% vs. 33%

    Why is it that everyone but ESET has a significantly easier time detecting the SRI samples than the AV Comparatives proactive/retrospective samples? It looks like the similarity in detection rates for ESET is a coincidence.

    I'm not hammering on ESET here. I'd just like to better understand this difference. I think the bulk of the evidence suggests that ESET is the best product on the market.

    /jab
     
  23. mrtwolman

    mrtwolman Eset Staff Account

    Joined:
    Dec 5, 2002
    Posts:
    613
    Well, I can't speak on behalf of others, but it seems that something in our software causes rock-solid detection rates :)

    From a statistical point of view, the test set was limited, and no data are known about its composition, etc...
     
  24. JAB

    JAB Registered Member

    Joined:
    Apr 17, 2007
    Posts:
    36
    I think the concern is rock-solid at 72%. :)

    I've already addressed the statistical issue. The small test set is irrelevant to the conclusions, provided it is a random sample. The power of statistics is the ability to draw high-confidence conclusions from limited data.

    Nonetheless, I agree that knowing much more about the composition of the test set would be very useful. I'm hoping ESET will reach out to SRI, whether the results get shared with anyone else or not.

    /jab
     