AV-Test.Org Releases Real-World Malware Protection Report

Discussion in 'other anti-virus software' started by King Grub, Dec 18, 2009.

Thread Status:
Not open for further replies.
  1. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    You are absolutely correct. As I stated, the paper authored by AV-Test “suggests” that they followed the procedure which they describe as “ideal” -- but it doesn’t, of course, prove that the procedure was in fact implemented.

    A complete descriptive report from AV-Test would be informative and helpful for answering such questions....
     
  2. simmikie

    simmikie Registered Member

    Joined:
    Nov 11, 2006
    Posts:
    321
    they did mention that they used all of their 150 PCs and all 40-50 full-time staff during this test.

    easy to see how they would have staggered their resources to accommodate cleaning yesterday's infected PCs while testing on a different subset of 12 clean PCs.

    Mike
     
  3. Page42

    Page42 Registered Member

    Joined:
    Jun 18, 2007
    Posts:
    6,944
    Location:
    USA
    Part of the reason I wonder about their methodology is that this test seems to be a departure from previous tests... the 10-new-threats-a-day-for-60-days aspect, I mean. What do you think they really did, i.e. what would be your guess? A clean system each day? I'd like them to keep introducing new threats to the same system, without cleaning it, for ultra-real-world testing.
     
  4. dawgg

    dawgg Registered Member

    Joined:
    Jun 18, 2006
    Posts:
    818
    Thank you for the information, Pleonasm. A strange action to take, but I guess the more details, the better -- it allows users to consider what they feel is more important :)
    So the main difference between Detection and Blocking is that Blocking allows all popups and alerts, including website blacklists and product warnings?
     
  5. Fajo

    Fajo Registered Member

    Joined:
    Jun 13, 2008
    Posts:
    1,814

    It's definitely a new way of testing and, believe it or not, a welcome one in my view. It shows a lot more of what products can do. I'm sure the methodology will be tweaked over time, just as the older approach has been. The main thing is: is the community ready for a new way of doing things?
     
  6. simmikie

    simmikie Registered Member

    Joined:
    Nov 11, 2006
    Posts:
    321
    well, a nice question, but as far as me and mine go, the few times i have had an infected pc, the pc does something strange. i shut down and let the hero of the day clean it, usually FDISR or ShadowProtect. my point is i do not know how realistic getting hammered for 60-plus days is.

    i find it interesting that some are all agog at the results of some of the entrants. they are suites, and were tested without respect to which component of the suite detected/blocked.

    i could be mistaken, but PC Tools has included ThreatFire in its suite, and Norton has the 360 gig and other technologies i've never heard of. had these AV engines been tested on their own merits, i suspect the results would have been more in line with what folks anticipated. and actually, for me, that is fine; while i don't run a suite, i do run the layered approach. at the end of the day i don't care which security tool stops an infection, as long as it is stopped.

    for me, this test and the resources committed to it are the new gold standard of real-world testing methodology. i hope to see more like it.

    Mike
     
  7. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    I agree. My expectation is that those anti-malware products that do not perform at the top of these real-world assessment ratings will be motivated to innovate and to improve their approaches, which will in turn cause the current leaders to do the same. As a consequence, over time, such competition will drive enhanced protection for users of almost all security solutions.
     
  8. Page42

    Page42 Registered Member

    Joined:
    Jun 18, 2007
    Posts:
    6,944
    Location:
    USA
    Regarding whether or not these systems were cleaned after each day, I posed the question to Neil Rubenking. His response...
     
  9. Zombini

    Zombini Registered Member

    Joined:
    Jul 11, 2006
    Posts:
    469
    Here is a similar result from AV-Comparatives. Read it and believe: Norton is the best when it comes to detection in the real world.

    https://www.wilderssecurity.com/showthread.php?t=260933
     
  10. Macstorm

    Macstorm Registered Member

    Joined:
    Mar 7, 2005
    Posts:
    2,642
    Location:
    Sneffels volcano
    It's Christmas time my friend, gifts are for everyone $$$ ;)

    can't wait to see what people will comment on the next "real world" (?) test..
     
  11. Jav

    Jav Guest

    Yep, it seems Symantec did really good work on this edition.
    It came out on top in 2 real-world tests. :)
     
  12. progress

    progress Guest

    Symantec and Kaspersky are back, and the detection rate of AVG IS is good as well :thumb:

    But what happened to Avira - 87.7% and many FPs... :rolleyes:
     
  13. elapsed

    elapsed Registered Member

    Joined:
    Apr 5, 2004
    Posts:
    7,076
    Avira Premium Security Suite 9.0 - Blocked: 87.2%, FPs: none
    AVG Internet Security 9.0 - Blocked: 84.2%, FPs: few

    I know which I'd choose ;)
     
  14. noblelord

    noblelord Registered Member

    Joined:
    Aug 19, 2009
    Posts:
    162
    Location:
    UK
    Trend Micro wasn't tested, I think you mean TrustPort. :)
     
  15. elapsed

    elapsed Registered Member

    Joined:
    Apr 5, 2004
    Posts:
    7,076
    I think you need to read the page again.

     
  16. progress

    progress Guest

    That's what I mean: You can't block anything you can't detect, am I right? :doubt: Hm ...

    Why are the results so different compared to AV-C? Avira and BitDefender are much better rated at AV-C :doubt:
     
  17. dawgg

    dawgg Registered Member

    Joined:
    Jun 18, 2006
    Posts:
    818
    They're testing different things - not the same websites/malware.

    ... and I guess, because of the relatively small sample sizes in the most recent tests, results will not be as consistent as when testing on larger sample sets, e.g. in on-demand tests.
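    The sample-size point can be made concrete with a quick statistical sketch (the 87%/84% block rates and n = 100 below are illustrative assumptions, not figures from either report): with only around a hundred samples, the 95% confidence intervals around two nearby detection rates overlap heavily, so rank swings between tests of this size are expected.

    ```python
    import math

    def wilson_interval(successes, n, z=1.96):
        """95% Wilson score interval for a binomial proportion."""
        p = successes / n
        denom = 1 + z**2 / n
        centre = (p + z**2 / (2 * n)) / denom
        half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
        return centre - half, centre + half

    # Hypothetical figures: two products blocking 87 and 84 of ~100 samples.
    for name, blocked, n in [("Product A", 87, 100), ("Product B", 84, 100)]:
        lo, hi = wilson_interval(blocked, n)
        print(f"{name}: {blocked/n:.0%} blocked, 95% CI {lo:.1%} to {hi:.1%}")
    ```

    With these assumed numbers the two intervals (roughly 79-92% and 76-90%) overlap almost entirely, which is why a few percentage points of difference on a small real-world test set says little about which engine is actually stronger.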
     
  18. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    That is a reasonable conclusion, I believe. Under these conditions, one should look for convergent validity -- comparable patterns of results across different replications of tests using similar methods. If an anti-malware product performs well on several such tests, then one’s confidence in the quality of that product’s performance is enhanced.
     
  19. Fajo

    Fajo Registered Member

    Joined:
    Jun 13, 2008
    Posts:
    1,814

    Could you imagine them doing the test with a bigger test set? By the time they were done testing, it would already be an outdated test.

    In all honesty, a test that is run like this has no choice but to use a smaller sample set; otherwise it would really be inaccurate due to how long it would take to execute. This test and the one done by AV-Comparatives at least give us a view of how the product as a whole performed.

    I do applaud AV-Test and AV-Comparatives for doing these tests. This is not a cheap way for them to test.
     
  20. dawgg

    dawgg Registered Member

    Joined:
    Jun 18, 2006
    Posts:
    818
    I know and agree... I wasn't criticising the size; it's commendable work by them. I was just explaining why there may be such a difference in rankings between the two tests - that's all.
     