New AV-Test.org malware testing (Avira finished 1st, CA eTrust finished last)

Discussion in 'other anti-virus software' started by InfinityAz, May 23, 2007.

Thread Status:
Not open for further replies.
  1. BlueZannetti

    BlueZannetti Registered Member

    Joined:
    Oct 19, 2003
    Posts:
    6,590
    Like I said, the calculation is very crude and only a rough estimate. The known numbers from www.av-comparatives.org August 2006 and February 2007 results are:
    • Total samples in each category (Trojans, Backdoors, etc.). This allows me to calculate a difference in each category. I assume this represents all new samples and that there has been no culling of bad samples from the testbed. If there has been trimming of bad samples from the testbed, this is obviously a low estimate.
    • Samples missed are also tallied in each test for each category. The worst case scenario is that all misses from August 2006 are now handled and that the misses now shown involve only the new samples introduced for Feb. 2007. Keep in mind that the new samples estimate is the basis for the calculation. The best case scenario (in terms of the new samples) is that none of the August 2006 misses were corrected which means that the Feb 2007 misses were inflated by an amount equal to that seen in August 2006. Obviously, some of the August 2006 misses will have been addressed, I simply have no way of knowing how many - hence correcting for all of the Aug 2006 misses will yield a high estimate.
    • Like I said, do not overinterpret. The estimates pertain to a partial testbed. The entire range of the estimates may not include either test result, since it tries to recast the www.av-comparatives.org results into a partial test with a testbed somewhat analogous to that used by www.AV-Test.org. Since it attempts to mimic "new samples only", I'd expect it to come in low relative to the two parent cases (Aug. 2006 and Feb. 2007). With respect to its relation to the www.AV-Test.org results, there is no guarantee with respect to testbed overlap. In terms of being "new samples only", there is correspondence. Note also, the estimated "testbed" is not small. It is calculated to comprise roughly 176,000 samples, of which 150,000 are Trojans and Backdoors, so these two categories dominate the results.
    I performed this calculation because, to my initial eye, the results for NOD32 and F-Prot seemed a bit out of place relative to the others. After calculating these estimates, I don't believe that's the case. They do seem to reflect a trait seen in the www.av-comparatives.org tests as well, particularly since the Estimated Min and Max are reasonably close in this case. One can argue what this means in terms of performance; what I would say one cannot argue (myself included) is that the www.AV-Test.org results are somehow skewed against NOD32. They appear to represent a current snapshot in time of immediate performance. The Estimated Min and Max for F-Prot are sufficiently different that one really cannot draw any conclusions from those values in that single case. However, by inference from the remainder of the comparisons, it's reasonable to conclude that the overall test is in control and that the results reflect current reality for F-Prot as well. Again, the disparities between the two tests were fairly stark for those two cases, but they do seem to reflect the different natures of the two tests - i.e. all malware vs. only "new" malware.
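    To make the estimate concrete, here is a minimal sketch of the min/max calculation described above. The function and the sample figures are hypothetical illustrations, not the actual www.av-comparatives.org numbers; it assumes, as stated, that no samples were culled from the testbed between the two tests.

    ```python
    # Sketch of the "new samples only" miss estimate. All figures below are
    # hypothetical placeholders, not the real av-comparatives.org data.

    def new_sample_miss_estimates(total_aug, total_feb, missed_aug, missed_feb):
        """Estimate min/max misses among samples added between the two tests.

        Assumes no culling of the testbed, so the new-sample count is
        simply the growth in the category total.
        """
        new_samples = total_feb - total_aug
        # Worst case: every August miss was fixed, so all February misses
        # fall on the newly introduced samples.
        max_misses = missed_feb
        # Best case (in terms of new samples): no August miss was fixed,
        # so the February misses are inflated by the August count.
        min_misses = max(missed_feb - missed_aug, 0)
        return new_samples, min_misses, max_misses

    # Hypothetical Trojans category: testbed grew from 100,000 to 175,000.
    new, lo, hi = new_sample_miss_estimates(100_000, 175_000, 2_000, 3_500)
    print(new, lo, hi)  # 75000 1500 3500
    ```

    The true number of corrected August misses lies somewhere between the two extremes, which is why the result is reported as a range rather than a single figure.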

    Blue
     
  2. MalwareDie

    MalwareDie Registered Member

    Joined:
    Dec 8, 2006
    Posts:
    500
    thanks for the explanation
     
  3. De Hollander

    De Hollander Registered Member

    Joined:
    Sep 10, 2005
    Posts:
    718
    Location:
    Windmills and cows
    @IBK

    Would that be your test...?
     
  4. Pfipps

    Pfipps Registered Member

    Joined:
    May 15, 2007
    Posts:
    181
    AV-test.org needs a meaningful rating system like AV-Comparatives (no certification, Standard, Advanced, Advanced +) so consumers don't get confused by the results.

    As far as I am concerned, any anti-virus with the Advanced or Advanced+ rating is fine. Everything else comes down to personal preference. They even point out that a Standard rating is sufficient if the product has an ICSA certification - but I wouldn't take that risk.
     
  5. Pfipps

    Pfipps Registered Member

    Joined:
    May 15, 2007
    Posts:
    181
    The Sophos rating is scary. My college depends on Sophos! Mind you, the word on the street is that my college's IT department sucks :p .
     
  6. MalwareDie

    MalwareDie Registered Member

    Joined:
    Dec 8, 2006
    Posts:
    500
    I wouldn't trust Sophos; it is actually quite poor, and its detection rate is inflated here.
     
  7. Pfipps

    Pfipps Registered Member

    Joined:
    May 15, 2007
    Posts:
    181
    Why is it inflated?
     
  8. MalwareDie

    MalwareDie Registered Member

    Joined:
    Dec 8, 2006
    Posts:
    500
    Because Sophos' detection rate is not above 80% - it's more like 75%, and that is not adequate.
     
  9. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    8,225
    Location:
    The land of no identity :D
    Again you are exhibiting your distrust of AV-test.org... The fact is AV-test shows Sophos as getting around 80%, while another test showed it getting around 75%. In either case Sophos is not so good; it's better to use something else.
     
  10. Pfipps

    Pfipps Registered Member

    Joined:
    May 15, 2007
    Posts:
    181
    "Products were set with their most aggressive detection options, such as using all heuristics and testing inside archives."

    I overlooked that one. It is important because I run NOD32 at max settings and I can't tell it's running. With programs like Norton and Kaspersky, I can't stand the max settings. It's a huge issue, since it means that vendors like Panda, Alwil, and AVG may have better real-life protection, since they can be maxed out without much PC slowdown. Well, not necessarily better; but I suspect the top vendors would see their detection rates go down on the usable settings (am I wrong on this one?). The only exception I know of is Antivir, which is apparently quite fast anyway.
     
  11. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    AV-TEST don't know what they are talking about,

    drweb should easily score 99%, or probably even 100%

    I'm never reading into these results again, they are such rubbish. Do they even know what malware is?

    :rolleyes:

    lol :D
     
  12. solcroft

    solcroft Registered Member

    Joined:
    Jun 1, 2006
    Posts:
    1,639
    C.S.J,

    It took me a few seconds before I realized the sarcasm.

    :D
     
  13. Pfipps

    Pfipps Registered Member

    Joined:
    May 15, 2007
    Posts:
    181
    Who was it directed to? Did my post sound like a fanboy message? Or am I too self-important? ;)
     
  14. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    sarcasm, what's that? :shifty:
     
  15. masqueofhastur

    masqueofhastur Registered Member

    Joined:
    Nov 19, 2005
    Posts:
    109
    Is the WebWasher listed CyberGuard WebWasher?
     
  16. pilotart

    pilotart Registered Member

    Joined:
    Feb 14, 2006
    Posts:
    377