AV-Test Product Review and Certification Report - 2010/Q2

Discussion in 'other anti-virus software' started by malexous, Aug 16, 2010.

Thread Status:
Not open for further replies.
  1. Nanobot

    Nanobot Registered Member

    Joined:
    Jun 23, 2010
    Posts:
    473
    Location:
    Neo Tokyo
    Last edited by a moderator: Aug 17, 2010
  2. 3GUSER

    3GUSER Registered Member

    Joined:
    Jan 10, 2010
    Posts:
    812
    How can you even compare AV-Test.org with Raymond.cc?!

    And they don't need to mention the tools in each and every report; this is AV-Test.org, AFAIK the biggest testing organisation in the AV industry.
     
    Last edited: Aug 17, 2010
  3. Baserk

    Baserk Registered Member

    Joined:
    Apr 14, 2008
    Posts:
    1,321
    Location:
    AmstelodamUM
    Agreed. A sample set of 16 seems very low.
    Does anyone know how much the Dynamic Detection Testing part counts towards the total score?

    Not so fishy.
    The Avast security suite was tested, not the AV.
     
  4. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    I too am surprised that only 16 samples were used in the “blocking of malware on or post execution” portion of the test. I wonder: what was the rationale for AV-Test to use so few?

    The “real-world testing” component is based upon 59 samples. However, over a roughly 90-day period (April-June 2010), I would have expected the test to encounter a greater number of zero-day malware attacks.
     
  5. Nanobot

    Nanobot Registered Member

    Joined:
    Jun 23, 2010
    Posts:
    473
    Location:
    Neo Tokyo
    Okay, I take back what I said about this test being fishy.

    This test is the biggest laughingstock of any test EVER.
     
  6. 031

    031 Registered Member

    Joined:
    Sep 5, 2007
    Posts:
    187
    Location:
    Bangladesh
    I totally agree with you. For me the surprise was the result of BitDefender :blink: :blink:
    BitDefender is in no way lighter than Avast or Avira :cautious: :cautious: :cautious:
     
  7. dw426

    dw426 Registered Member

    Joined:
    Jan 3, 2007
    Posts:
    5,543
    Is that your opinion, or do you have something to back it up? Sorry if I'm being hard, but typical Wilders behavior has shown itself again. One minute a test is looked at with interest and congratulations are spread around. Then, inevitably, someone's favorite vendor doesn't do so hot, the test suddenly has "something wrong with it", and from there on, reason after reason comes along from others to dismiss the entire thing. And no, I'm not saying that because I have a favorite that did badly or well; I say it because this happens every time, without fail. Is 16 samples enough? Maybe not, but that doesn't automatically mean the testers are complete idiots.
     
  8. Thankful

    Thankful Savings Monitor

    Joined:
    Feb 28, 2005
    Posts:
    6,564
    Location:
    New York City
    Overall, I think the test is pretty good. Symantec and Kaspersky scored at the top, which shouldn't be a surprise. I don't know enough about Panda to comment about it. Historically, Eset has never done that well with AV-Test.
    A big surprise to me is the high performance of AVG. The only part that seems unreliable is the average system slowdown. This is only one test, and should be treated as such.
     
  9. Baserk

    Baserk Registered Member

    Joined:
    Apr 14, 2008
    Posts:
    1,321
    Location:
    AmstelodamUM
    Perhaps there is some confusion here and there, as some seem to think that only AVs have been tested, whereas AV-Test.org has tested complete security suites (besides MSE and Spyware Doctor + AV), afaict.

    I don't understand why they would compare the usability/slowdown of such different products.
    Suites, AVs, and combinations of AM/AV products are likely to have a different impact on a system.

    A test like this seems to imply that AV-Test.org assumes that those different products should, in principle, have the same resource usage.

    I don't think I could agree with such an assumption.
     
    Last edited: Aug 17, 2010
  10. Ibrad

    Ibrad Registered Member

    Joined:
    Dec 8, 2009
    Posts:
    1,972
    Just to add, for those of y'all wondering why AVG scored so high: remember that the suite and pro versions have Identity Protection (behavior protection, essentially the old Norton AntiBot), and from reading about tests, AVG has really kept it a strong tool.
     
  11. Nanobot

    Nanobot Registered Member

    Joined:
    Jun 23, 2010
    Posts:
    473
    Location:
    Neo Tokyo
    Don't jump to conclusions so easily. On each of the three PCs (2 desktops, 1 laptop) I have at home, I have installed 3 different AVs, so the part you said about favorite vendors is totally wrong, at least for me.

    Now, on the other thing you said: nobody said the testers are idiots, but that doesn't mean we have to ignore some real facts just because they happen to be the "biggest testing organisation in the AV industry".
     
  12. LODBROK

    LODBROK Guest

    There is no AVG Pro, and only the IS suite has IDP.
    http://www.avg.com/gb-en/free-antivirus
    Free and the AV can add IDP for $20, which, if you think about it, makes Free plus IDP look like a pretty good deal for really good protection.
     
  13. Kees1958

    Kees1958 Registered Member

    Joined:
    Jul 8, 2006
    Posts:
    5,857
    Spain - Holland : 1 - 0

    I posted that suggestion a long time ago at PrevX; wait and see.

    PrevX4 - Panda2 : 1 - 0

    :p
     
    Last edited: Aug 18, 2010
  14. Ibrad

    Ibrad Registered Member

    Joined:
    Dec 8, 2009
    Posts:
    1,972
    Oh, my mistake. I thought AVG had a Pro AV that had the IDP module.
     
  15. clocks

    clocks Registered Member

    Joined:
    Aug 25, 2007
    Posts:
    2,787
    I think when v10 comes out in a few months, the packages are supposed to change.
     
  16. webster

    webster Registered Member

    Joined:
    Feb 23, 2004
    Posts:
    285
    Location:
    Denmark
    BullGuard uses the BitDefender scanner, and should have the same detection o_O
     
  17. guest

    guest Guest

  18. InfinityAz

    InfinityAz Registered Member

    Joined:
    Jul 23, 2005
    Posts:
    828
    Location:
    Arizona
    I ranked them based on AV-Test's protection score, with 0 equal to worst and 6 equal to best.

    Product - (Protection Score)
    AVG: Internet Security 9.0 - (5.5)
    G Data: Internet Security 2010/2011 - (5.5)
    Panda: Internet Security 2010 - (5.5)
    Symantec: Norton Internet Security 2010 - (5.5)
    F-Secure: Internet Security 2010 - (5)
    Kaspersky: Internet Security 2010 - (5)
    McAfee: Internet Security 2010 - (5)
    PC Tools: Spyware Doctor with AntiVirus 7.0 - (5)
    BitDefender: Internet Security Suite 2010 - (4.5)
    Avira: Premium Security Suite 10.0 - (4)
    Microsoft: Security Essentials 1.0 - (4)
    Avast: Internet Security 5.0 - (3.5)
    BullGuard: Internet Security 9.0 - (3.5)
    Eset: Smart Security 4.0 - (3.5)
    Webroot: Internet Security Essentials 6.1 - (3.5)
    Trend Micro: Internet Security Pro 2010 - (2.5)
    Norman: Security Suite 8.0 - (2)

    Why rank on protection? As others have stated, I'm more interested in whether a product prevents infection, and there are good, free tools for removing malware.
     
  19. Kees1958

    Kees1958 Registered Member

    Joined:
    Jul 8, 2006
    Posts:
    5,857
  20. guest

    guest Guest

    Very inspirational speech ;) but maybe a bit aggressive for football, hehe
     
  21. fsr

    fsr Registered Member

    Joined:
    Jul 26, 2010
    Posts:
    190
    I haven't formed my opinion on this new testing yet, but so far, from the consumer point of view, it seems to be the most important certification the industry is currently issuing, and I'm thinking of competitors like VB100, WCL and ICSA. Probably more re$ource$ are needed to increase the statistical relevance of the "Blocking of Malware on or post Execution" test (the same can be said of the latest Dynamic Testing from AV-C), and I also wish they were more transparent regarding the methodology used. Looking forward to the follow-up. :thumb:
     
  22. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    Last night I took 16 samples and tried them on product A. It got 15 of them, and would have received an A rating from me.

    Then I tried another 16 with product A and it only got 4 of them. In that test, a D rating.

    How quickly one can rise or fall depending on how narrow the scope is.
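
    Just to put a rough number on how much luck is involved with only 16 samples, here's a quick Python sketch (my own back-of-the-envelope illustration, nothing to do with AV-Test's actual methodology). It computes a 95% Wilson score interval for the "true" detection rate behind each of those two runs:

    Code:
    # Illustrative only: how uncertain a detection rate is when it is
    # estimated from a handful of samples (95% Wilson score interval).
    import math

    def wilson_interval(hits, n, z=1.96):
        """Return the 95% Wilson score interval for hits successes out of n."""
        p = hits / n
        denom = 1 + z**2 / n
        centre = (p + z**2 / (2 * n)) / denom
        half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
        return centre - half, centre + half

    for hits, n in [(15, 16), (4, 16)]:
        lo, hi = wilson_interval(hits, n)
        print(f"{hits}/{n} detected -> plausible true rate {lo:.0%} to {hi:.0%}")

    Even the 15/16 run only pins the true rate down to somewhere around 72-99%, which is exactly why results can swing so much from one sample set to the next.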
     
  23. andyman35

    andyman35 Registered Member

    Joined:
    Nov 2, 2007
    Posts:
    2,336
    The extremely low number of samples does seem to fly in the face of reason; with such a small amount there's a huge possibility of wide variance in the results. o_O
     
  24. clocks

    clocks Registered Member

    Joined:
    Aug 25, 2007
    Posts:
    2,787
    For the 0-day testing they used 59 samples. I think that is reasonable considering how new the samples have to be, and the time frame to test in.

    I think this is one of the better tests I have seen done. I like the way the results are laid out, and the fact there is a system performance impact measurement.

    I just wish more products were included.
     
    Last edited: Aug 18, 2010
  25. flik

    flik Registered Member

    Joined:
    May 21, 2006
    Posts:
    49
    About the 0-day protection testing:
    I think it would be better to test 4-5 sets of 0-day samples and then take the mean of each product's scores.
    Most of the products would detect some more samples after the next signature update, so it can be a matter of just a few hours whether a product scores 10/16 or 15/16.
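    Something like this, just as a rough sketch of the averaging idea (the per-set scores below are invented purely for illustration):

    Code:
    # Rough sketch: average each product's detection rate over several
    # independent 0-day sample sets instead of relying on a single run.
    # The numbers below are made up, just to show the calculation.
    from statistics import mean

    runs = {
        "Product A": [10/16, 15/16, 12/16, 13/16],
        "Product B": [14/16, 13/16, 14/16, 12/16],
    }

    for product, scores in runs.items():
        print(f"{product}: mean over {len(scores)} sets = {mean(scores):.0%}")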
     