AV-Comparatives: Whole Product Dynamic Test (2010)

Discussion in 'other anti-virus software' started by Pleonasm, Sep 22, 2010.

Thread Status:
Not open for further replies.
  1. guest

    guest Guest

    The implementation of TruPrevent in Panda Cloud Pro is a bit different, but the score would probably be close.
     
  2. Matthijs5nl

    Matthijs5nl Guest

    PC Tools surprises me in a positive way, and so does Trend Micro.

    F-Secure proves there is not a single reason to buy BitDefender: it is buggy, and F-Secure's in-house technology (the behavior-based DeepGuard and BlackLight) also makes F-Secure's product the better one.

    So far, this review supports my opinion about the freebies; Panda Cloud and AVG simply outperform avast! (a shame we can't see MSE).

    All the others are performing as expected.
     
  3. smage

    smage Registered Member

    Joined:
    Sep 13, 2008
    Posts:
    378
    Yes, hopefully Avast 5.1 will do better next year when the new BB and cloud technology come into play.

    Regards
     
  4. Nevis

    Nevis Registered Member

    Joined:
    Aug 28, 2010
    Posts:
    812
    Location:
    255.255.255.255
    Seems I'm sailing in a good Norton boat.

    Surprising:

    Trend Micro performing so well
    BitDefender performing so badly :D

    lol, so now BitDefender has to reconsider whether to sell its engine to some AV that criticizes only BitDefender yet makes a better product
     
  5. ReverseGear

    ReverseGear Guest

    Thanks, guest :)
     
  6. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    IBK, three questions and comments....

    1. IBK, as a consequence of this observation concerning the operation of SONAR, are you considering modifying the testing procedure to more realistically report the results from Norton Internet Security 2011?

    2. IBK, in the testing process, is each malicious URL sent to each test machine at the same time? I suspect this is true and, if so, it would be worth noting in the description of the testing procedure.

    3. IBK, it is delightful to see the simplified description of confidence intervals in the report (page 10) to educate readers who are not statistically savvy. Yet, the actual results displayed on page 7 of the report fail to display the error bars that you describe. Can you correct this shortcoming in the next release of the report? For example, by my calculation, a product exhibiting an 80% malware blocking rate based upon a sample size of N=307 has a “true” rate residing between 75.5% and 84.5% (with p=95% confidence); see the sketch at the end of this post.

    And one minor note...

    4. The font size within the diagram on page 8 is so small that it is almost illegible.
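
    A minimal sketch of the interval calculation in point 3, assuming the normal (Wald) approximation for a binomial proportion; the function name and the z = 1.96 multiplier for 95% confidence are illustrative assumptions, not taken from the report:

    import math

    def blocking_rate_ci(p_hat, n, z=1.96):
        # Wald interval for a binomial proportion: p_hat +/- z * sqrt(p_hat * (1 - p_hat) / n)
        se = math.sqrt(p_hat * (1.0 - p_hat) / n)  # standard error of the observed rate
        return p_hat - z * se, p_hat + z * se

    # The example from point 3: an 80% blocking rate over N=307 samples
    low, high = blocking_rate_ci(0.80, 307)
    print(f"95% CI: {low:.1%} to {high:.1%}")  # roughly 75.5% to 84.5%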
     
  7. gery

    gery Registered Member

    Joined:
    Mar 8, 2008
    Posts:
    2,175
    AVG finally got a seriously fine result; it has always scored average, but I guess it will keep heading in the right direction for good now.
     
  8. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    @Pleonasm: as you may know, we are in contact with the vendors and issues will be discussed; if there is something to address, it will be addressed. In the graph you see "synchronization", which means they are now synchronized. In the final report the graphs/fonts will also be better. We wrote this report in a hurry, as we first intended to deliver only an overview and then the final report in December. The current report explains a bit how the test works, but there is still room for improvement (which we are still working on).
     
  9. atomomega

    atomomega Registered Member

    Joined:
    Jul 27, 2010
    Posts:
    1,292
    @IBK: Is there a chance that we will see Sunbelt included in the final report in December? o_O
     
  10. firzen771

    firzen771 Registered Member

    Joined:
    Oct 29, 2007
    Posts:
    4,815
    Location:
    Canada
    That's not exactly 100% applicable to the free solutions they each offer. The free version of avast! is, I believe, the same as the suite minus the firewall and the script shield, while the free version of AVG is far more limited than their suite, which includes an actual behavior blocker (not like avast!'s misnamed thing, lol), and Panda Cloud would most likely also differ in its results compared to their suite.

    So really, the results here cannot be 100% applied to each vendor's free offerings, since for most of them there are big differences between the products.
     
  11. 3GUSER

    3GUSER Registered Member

    Joined:
    Jan 10, 2010
    Posts:
    812
    :isay:

    I am beginning to lose faith in AVC. Writing this in a hurry means more mistakes that otherwise would not pop up.
     
    Last edited: Sep 23, 2010
  12. Thankful

    Thankful Savings Monitor

    Joined:
    Feb 28, 2005
    Posts:
    6,567
    Location:
    New York City
    He's still one of the few who actually know what they're doing. He is also nice enough to answer all our questions. Hi tech, let's not be too critical.
     
    Last edited: Sep 23, 2010
  13. firzen771

    firzen771 Registered Member

    Joined:
    Oct 29, 2007
    Posts:
    4,815
    Location:
    Canada
    Did you not read that they released this report in a hurry just for people to read? He's already said the full report will be released later; this is just for people to get a quick glimpse... :rolleyes:
     
  14. JerryM

    JerryM Registered Member

    Joined:
    Aug 31, 2003
    Posts:
    4,306
    I remain confident that AVC is the most objective and competent test organization there is.

    Regards,
    Jerry
     
  15. Fajo

    Fajo Registered Member

    Joined:
    Jun 13, 2008
    Posts:
    1,814

    AVC has brought us years of good testing services. My view on their tests and how they conduct them will not change unless they give me a damn good reason to change it.

    A simple, quick write-up on what is going on in their latest test was for our benefit and nothing more. He has even been nice enough to stop by and answer our questions, which I'm pretty sure is not in his job description.
     
  16. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    "in a hurry" means for me: more time = more textpages to write (which 99% of users sadly may not read/understand anyway, as most just jump to results/graphs). less time, shorter text.
     
    Last edited: Sep 23, 2010
  17. begemot64

    begemot64 Registered Member

    Joined:
    Jul 28, 2010
    Posts:
    71
    Pretty much. Most people, including advanced users, do not have the technical knowledge to fully understand the methodology of testing security suites. One's knowledge of statistics needs to be way beyond 101 for that.
     
  18. hckyo

    hckyo Registered Member

    Joined:
    Jul 27, 2010
    Posts:
    93
    Wow, AV detection rates increased a lot in this test... (the on-demand August test)
    Well done Avira 10, BitDefender 2011, F-Secure 2011, Avast, and even NOD32 :argh:
     
  19. clocks

    clocks Registered Member

    Joined:
    Aug 25, 2007
    Posts:
    2,789

    Are we looking at the same results? BitDefender did the second worst, ahead of only Kingsoft.
     
  20. iwod

    iwod Registered Member

    Joined:
    Jun 25, 2004
    Posts:
    708
    Great. I think this report shows how each product does better than those scanning tests do.

    Anyway: Avira, AVG, F-Secure, Panda, Symantec/Norton, Trend Micro.

    Which one uses the fewest resources? Is Trend Micro any good? (It seems to be called PC-cillin over here in Asia.)
     
  21. hckyo

    hckyo Registered Member

    Joined:
    Jul 27, 2010
    Posts:
    93
  22. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    IBK, hopefully you did not interpret my questions/comments as criticisms, since they were not intended to be so. Please consider them as suggestions for enhancements in future editions of the report.

    Additionally, to clarify, ...

    1. For the malware that successfully compromised the test PCs, was it generally the same malware that bypassed all anti-malware products? Or, were the findings more mixed, with one anti-malware product missing samples that the others blocked and vice-versa?

    2. Which classes of malware were most successful in compromising the test PCs (e.g., rootkit, bot, virus, spyware, etc.)?

    3. Did the malware samples used in the test exclude polymorphic variations of the same threat? In other words, are the 307 test cases distinct malware threats?

    Thank you.
     
  23. Rampastein

    Rampastein Registered Member

    Joined:
    Oct 16, 2009
    Posts:
    290
    Interesting: Norton and Kaspersky don't seem to be as dominant in this test (although they're not doing badly) as in other recent dynamic tests. I wonder what the "user dependent" part of KIS was; I thought it was entirely automatic with default settings.

    It'll be interesting to see the final reports. I've always considered AV-C tests to be the most important when forming my opinion about AV software.
     
  24. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    I was surprised to read this comment in the report.

    To the forum community, in general terms, how could a vendor possibly “game” this test “easily”, given that each vendor did not know which malware samples would be tested on any given day?
     
    Last edited: Sep 23, 2010
  25. guest

    guest Guest

    Don't worry: the tests sponsored by Symantec were done much, much faster, while this one is going to take four months, so you should first start to lose your faith in Symantec's sponsored tests.
    The final report will be ready in December; the test finishes at the end of November.
     
    Last edited by a moderator: Sep 23, 2010