New AV-Test.org malware testing (Avira finished 1st, CA eTrust finished last)

Discussion in 'other anti-virus software' started by InfinityAz, May 23, 2007.

Thread Status:
Not open for further replies.
  1. BlueZannetti

    BlueZannetti Registered Member

    Joined:
    Oct 19, 2003
    Posts:
    6,590
    Yes, this does make sense, although I was initially surprised at the relative drop. Actually, you can get some quantitative insight into the trend by looking at differential changes in detection, and that does give a result consistent with the www.AV-Test.org results, at least for NOD32.

    The basic estimation process is straightforward. Successive tests have an incremental increase in samples in each category. Let's say that Trojans increased by 100,000 samples, from 300,000 in a previous test to a current total of 400,000 samples. Assume the current detection rate is 96%, while the previous detection rate was 98.5%. That means 16,000 samples were missed in the current test, while 4,500 samples were missed in the previous test. In that time, 100,000 samples were added to the testbed. You can't calculate the detection rate for the added samples from the information provided, but you can place upper and lower bounds on it. For the lower bound, assume all missed samples are among the new members of the testbed. This result is simply (100,000 - 16,000)/100,000 = 84%. For the upper bound, assume that none of the previously missed detections were fixed and subtract them from the current number of misses before attributing the remainder to the new samples. This gives an upper bound of (100,000 - (16,000 - 4,500))/100,000 = 88.5% for the example calculation.
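    A minimal sketch of this bounding arithmetic, using only the illustrative numbers from the example above (not actual AV-Test figures):

    ```python
    # Sketch of the bounding estimate described above.
    # The inputs are the illustrative example numbers from this post,
    # not actual AV-Test results.

    def detection_bounds(prev_total, prev_rate, curr_total, curr_rate):
        """Bound the detection rate on samples added between two tests."""
        added = curr_total - prev_total             # new samples in the testbed
        prev_missed = prev_total * (1 - prev_rate)  # misses in the previous test
        curr_missed = curr_total * (1 - curr_rate)  # misses in the current test

        # Lower bound: assume every current miss lies among the added samples.
        lower = (added - curr_missed) / added
        # Upper bound: assume none of the old misses were fixed, so only the
        # "new" misses (current minus previous) fall among the added samples.
        upper = (added - (curr_missed - prev_missed)) / added
        return lower, upper

    # Worked example from above: 300,000 -> 400,000 Trojans, 98.5% -> 96%.
    low, high = detection_bounds(300_000, 0.985, 400_000, 0.96)
    print(f"lower bound: {low:.1%}, upper bound: {high:.1%}")  # 84.0%, 88.5%
    ```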

    I won't provide the complete analysis for NOD32, but following this procedure over the entire testbed gives bounding estimates of 90.7% (lower) and 94.2% (upper). I view this as entirely consistent with the www.AV-Test.org result (88.32%), given the crude level of approximation I've used. Note that my estimates cover a 6 month window.

    Blue
     
    Last edited: May 25, 2007
  2. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    8,251
    Location:
    The land of no identity :D
    I sent off another e-mail to Andreas Marx about the doubts some people have raised regarding availability of the 16 GB worth of samples before the actual test itself. Mr. Marx was again very kind to provide these comments:

    And with that, I hope all doubts about this latest test are cleared. I thank Mr. Marx again for taking the time to clarify the various doubts presented by me as well as the forum posters here. His words make perfect sense, and to me, AV-Test remains as reliable as ever. :)
     
  3. Thankful

    Thankful Savings Monitor

    Joined:
    Feb 28, 2005
    Posts:
    6,567
    Location:
    New York City
    Hardly. As I stated in an earlier post, the discrepancies for all AV vendors between the April and May tests by the same testing company (AV-Test) are just too great.
     
    Last edited: May 25, 2007
  4. MalwareDie

    MalwareDie Registered Member

    Joined:
    Dec 8, 2006
    Posts:
    500
    Agreed
     
  5. MalwareDie

    MalwareDie Registered Member

    Joined:
    Dec 8, 2006
    Posts:
    500

    So does that mean that back in 2004 I should have trusted Norton? It was far too bloated. Just because it is the most trusted name in the industry doesn't mean it should have my trust.
     
  6. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    Just because it used to be bloated doesn't mean it offers less protection.
    So, yes, you should trust it.
     
  7. Blackcat

    Blackcat Registered Member

    Joined:
    Nov 22, 2002
    Posts:
    4,024
    Location:
    Christchurch, UK
    A pity that some other testing sites have set conditions that vendors must meet before they receive the missed samples. The "low-scoring" vendors therefore never see them again, ad infinitum.
    Completely agree :thumb:
     
  8. Thankful

    Thankful Savings Monitor

    Joined:
    Feb 28, 2005
    Posts:
    6,567
    Location:
    New York City
    If we look at another specific example, AVG's Trojan detection: the detection rate for April was 91%, and for May it was 96%, a five percentage point difference. Based on a Trojan sample size of 407,487 for May, this represents roughly 20,374 additional Trojans detected in May. This hardly seems possible.
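    For reference, a quick sketch of the arithmetic behind that figure, using the sample count quoted above:

    ```python
    # Back-of-the-envelope: a five-point jump in detection rate applied to the
    # May Trojan testbed size quoted in this post.
    trojan_samples = 407_487
    extra_detected = (0.96 - 0.91) * trojan_samples
    print(round(extra_detected))  # ~20,374 additional Trojans detected
    ```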
     
  9. Zombini

    Zombini Registered Member

    Joined:
    Jul 11, 2006
    Posts:
    469
    Yes. Why not.
     
  10. solcroft

    solcroft Registered Member

    Joined:
    Jun 1, 2006
    Posts:
    1,639
    I'll have to voice a differing opinion on that. Testers are testers, not servants of security vendors. As far as I'm concerned, they've contributed enough by volunteering time, effort, and money to bring professionally conducted tests to the public.

    With my extremely limited knowledge of how the industry works, I do expect any vendor worth its salt to be able to collect their own malware samples without having to rely on testers to do the work for them.
     
  11. extratime

    extratime Registered Member

    Joined:
    Oct 14, 2005
    Posts:
    100
    Firecat thanks for asking Andreas Marx those questions, and a big thanks to Andreas for his candid and detailed responses.

    It only bolsters their reputation as a testing body. It also makes me realize that there is a lot more malware out there than I could have imagined.
     
  12. flyrfan111

    flyrfan111 Registered Member

    Joined:
    Jun 1, 2004
    Posts:
    1,229
    Not always true. When you take a test in school, isn't it better to know what you got wrong? More is learned from knowing what you got wrong than from knowing what you got right. It is only fair to inform vendors of what their product got wrong (or, more accurately, missed).
     
  13. solcroft

    solcroft Registered Member

    Joined:
    Jun 1, 2006
    Posts:
    1,639
    The answer to your question is: undoubtedly. But the testers are not teachers, and neither are vendors little kids.

    It would be in the vendor's best interests to get the samples they missed; obviously they want all the malware they can get. However, if a vendor consistently displays a lack of malware-collecting resources to improve their product unless they are spoon-fed by testers, that says a lot about the vendor; at least, it does for me. Not to mention that testers aren't duty-bound to aid the less capable vendors. It would be nice of them, certainly, but I don't think it's a responsibility.
     
  14. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    Whether testers want to 'spoon-feed' the vendors doesn't matter.

    I personally would rather have my AV have the samples than not.
     
  15. besafe

    besafe Registered Member

    Joined:
    Mar 29, 2007
    Posts:
    222
    The point of the test is to determine which vendors produce a quality product. If a certain vendor is poor at finding and detecting the most recent malware, that is good for the consumer to know.

    If the testing organization is feeding the missed samples to the security companies, this could artificially inflate the test results and potentially mask a security vendor's faults.

    The security company should be doing the research and finding the malware samples, not the testing organization. I want to know which vendors are good at finding new malware, not which vendors are good at updating their signatures based on the testing organization's samples.
     
  16. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    8,251
    Location:
    The land of no identity :D
    Regarding AVG specifically, I have stated in another post that AVG Professional was used in the PC World review, while AVG Anti-Malware (which uses the Ewido and AVG engines) was used this time. This would explain the difference in Trojan detection rate. :)

    Regarding sending samples to AV vendors, Andreas' view on the matter is highlighted in post #89. However, he also says that every vendor should be given a chance to improve, regardless of detection rate, by receiving the missed samples. Essentially, this means that tester-provided samples should not be the only source for AV companies; rather, AV companies should also gather samples themselves. Only a combination of the two creates a good AV product (and improves existing ones). Though Andreas' sample set is HUGE, I'm sure no one on Earth has every piece of malware ever released, so it is as much the AV vendors' duty to collect their own samples as it is the testers' to let vendors verify test results and add undetected/missed samples to improve their overall detection. If an AV vendor relies solely on tester-submitted samples, there is every chance that by the time those samples are added to the database, they are already old. By paying attention only to these samples, the company has already missed the current malware roaming around the Internet and hence has failed to protect its users from the latest threats.
     
  17. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    8,251
    Location:
    The land of no identity :D
    BTW, regarding some other discrepancies people may have noticed between the PC World tests in April and the current test performed by AV-Test.org: the PC World article "Top Antivirus Performers" was published on April 23, 2007, but the actual testing was finished quite a bit earlier than that date. The editors need time to write the review after looking at the lab results, and of course it takes a couple of weeks to try out all the products and form an opinion on factors other than detection rate (since detection rate is not the only factor for a potential customer).

    And on top of this, one needs to remember that it takes time to print the PC World magazine, so in truth there was a lot more time between the PC World results and the latest AV-Test results than just one month. :)
     
  18. Thankful

    Thankful Savings Monitor

    Joined:
    Feb 28, 2005
    Posts:
    6,567
    Location:
    New York City
    And the nearly eight percent increase in Trojan detection rate for Avast! (88% to 95.94%), how is that explained? This would be an increase of roughly 32,354 Trojans detected in one month!! They must be pretty busy at Avast!
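    The same back-of-the-envelope arithmetic as before, applied to the Avast! figures quoted above (assuming the jump is measured against the same May Trojan testbed size quoted earlier in the thread):

    ```python
    # Rough sketch: the Avast! detection-rate jump applied to the May Trojan
    # testbed size of 407,487 samples (an assumption carried over from an
    # earlier post, not a figure stated in the Avast! post itself).
    trojan_samples = 407_487
    extra_detected = (0.9594 - 0.88) * trojan_samples
    print(round(extra_detected))  # ~32,354 additional Trojans
    ```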
     
  19. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    8,251
    Location:
    The land of no identity :D
    Again, the time difference was quite a bit longer than one month; it's not like Avast! wouldn't improve in that period....
     
  20. solcroft

    solcroft Registered Member

    Joined:
    Jun 1, 2006
    Posts:
    1,639
    Obviously you are not doing a very good job of reading what people say before chiming in with your bit.
     
  21. Thankful

    Thankful Savings Monitor

    Joined:
    Feb 28, 2005
    Posts:
    6,567
    Location:
    New York City
    Thanks. I've noticed your posts are always positive and supportive of others.
     
  22. Thankful

    Thankful Savings Monitor

    Joined:
    Feb 28, 2005
    Posts:
    6,567
    Location:
    New York City
    Even if the PC World test was performed six months ago, that would mean Avast! averaged an increase in Trojan detection of about 5,400 samples per month for six months.
    Is this credible?
     
  23. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    Based on the nature of the samples used, it's quite possible that they all get detected with just a few definitions.
     
  24. MalwareDie

    MalwareDie Registered Member

    Joined:
    Dec 8, 2006
    Posts:
    500
    Generics?
     
  25. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    You could call it generic.
     