AV Comparatives Tests

Discussion in 'other anti-virus software' started by JerryM, Aug 31, 2012.

Thread Status:
Not open for further replies.
  1. JerryM

    JerryM Registered Member

    Joined:
    Aug 31, 2003
    Posts:
    4,306
    Looking at the AV-Comparatives Performance Test involving the PC Mark score (May 2012) and the Whole Product Dynamic Test (March-June 2012), I notice that Avast, for instance, shows the second-highest score on the Performance Test.
    However, it is down in the pack on the Whole Product Dynamic Test.

    I am not so concerned with the specific scores, but I wonder why the capabilities of the AVs differ between the two tests. o_O

    Thanks,
    Jerry
     
  2. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    Two completely different tests covering two completely different aspects of the products.
     
  3. JerryM

    JerryM Registered Member

    Joined:
    Aug 31, 2003
    Posts:
    4,306
    Thanks, IBK. I figured as much, but can you give a quick explanation that a dummy can partially understand?
    Don't both tests seek to determine the protection the AV provides?

    I realize that if I read all of the methodology and such it might be clear, but I do not understand all I read in that area.

    Does either test determine how the AV acts when malware attempts penetration, versus finding it and removing it?

    Thanks, and I realize you may be too busy to attempt to explain it to someone who is about like an oak stump.

    Jerry
     
  4. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    The performance test only looks at how much impact the products have on performance. It has nothing to do with malware.
    If anything is unclear, please just send an email and let me know your native language; maybe I can find a better way to explain (most performance reports which explain it are only in English, but maybe I can find a translation).
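
    To illustrate the idea (a minimal sketch, not the actual AV-Comparatives methodology): a performance test times everyday operations, such as copying files, with the product installed and compares the result against a clean baseline. The file path in the usage comment below is a hypothetical placeholder.

        import os
        import shutil
        import statistics
        import tempfile
        import time

        def median_copy_time(src_path: str, runs: int = 5) -> float:
            """Copy a file several times; return the median duration in seconds.

            Run once on a clean system and once with the AV installed; the
            ratio of the two medians is that product's impact on this one
            operation.
            """
            durations = []
            with tempfile.TemporaryDirectory() as dst_dir:
                for i in range(runs):
                    dst = os.path.join(dst_dir, f"copy_{i}.bin")
                    start = time.perf_counter()
                    shutil.copyfile(src_path, dst)
                    durations.append(time.perf_counter() - start)
                    os.remove(dst)  # keep the target dir empty between runs
            return statistics.median(durations)

        # Hypothetical usage, compared against a baseline measured without the AV:
        # print(median_copy_time("C:/testdata/sample.bin"))

    Suites such as PC Mark aggregate many operations of this kind (application launches, downloads, archiving) into a single score, which is why a product can do well here and still sit mid-pack in the protection test.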
     
  5. AlexC

    AlexC Registered Member

    Joined:
    Apr 4, 2009
    Posts:
    1,288
    This really emphasizes the importance of keeping the software updated:
    From the Whole Product Dynamic “Real-World” Protection Test, March-June 2012: http://www.av-comparatives.org/comparativesreviews/dynamic-tests
     
  6. JerryM

    JerryM Registered Member

    Joined:
    Aug 31, 2003
    Posts:
    4,306
    Hi again IBK,
    I do appreciate your patience. I did not understand that "the performance test only looks at how much impact the products have on performance."

    I thought it was a measure of the protection from the various kinds of malware. Now that I know the difference, I understand why the percentages differ. :oops:
    English is my native language. I'm in the US. But I am not very smart, as is evidenced here. :)

    As I have stated at various times, yours is the testing organization in which I have the most confidence. Accordingly, the level of protection shown by your tests does influence which AV I use. Hence I wanted to understand the results better. Now I do. Many thanks.

    Regards,
    Jerry
     
  7. AlexC

    AlexC Registered Member

    Joined:
    Apr 4, 2009
    Posts:
    1,288
    One thing I don't understand about these tests is how all products rate above 90% detection, and yet in real life so many computers show up infected...

    The truth is that most PCs I see infected:

    - are using outdated AVs,

    - have all kinds of crapware that isn't detected by the AVs, probably (I guess) for legal reasons (Conduit, Babylon, iLivid, Ask, etc.), and that makes the computer and browsing sluggish, so the effect is the same as that of "detected" malware.

    IBK, maybe you can help clarify why products like the ones I mentioned above aren't detected by most AVs?

    And the inconsistency between detection rates above 90% and what happens in real life?

    Thanks!

    (And sorry for my poor English.)
     
    Last edited: Aug 31, 2012
  8. qakbot

    qakbot Registered Member

    Joined:
    Aug 25, 2010
    Posts:
    380

    You answered your own question, right? The AV guys are only right 90% of the time, but they would have to be right 100% of the time. Even 99.9999% is not good enough, because sooner or later you end up with an infection.
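
    To put rough numbers on that (a back-of-the-envelope sketch with hypothetical figures, not anyone's test data): if each malware encounter is treated as an independent event, the chance of at least one miss compounds quickly with exposure.

        def infection_odds(block_rate: float, encounters: int) -> float:
            """Chance of at least one miss across independent malware encounters."""
            return 1.0 - block_rate ** encounters

        # A 90% block rate is almost certain to miss something over 100
        # encounters; even 99% misses more often than not.
        print(infection_odds(0.90, 100))  # ~0.99997
        print(infection_odds(0.99, 100))  # ~0.634

    That is why a product scoring "above 90%" in a test can still leave real-world machines infected over months of use.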
     
  9. For 0-day malware, 70% is a good score; the best products range from 70% to 90%, and the worst I've seen scored 30%, yet it is widely used by the Wilders community, so make sense of that. These are real-world tests by people interested in security, so I trust them more than most tests.

    Nothing is 100%, though, and it never will be. So just choose the right solution for your needs. Part of the blame must go to the AV companies who spread the total-protection myth.
     