PC World - Top Standalone Antivirus Software for 2010

Discussion in 'other anti-virus software' started by Dave53, Nov 24, 2009.

Thread Status:
Not open for further replies.
  1. Macstorm

    Macstorm Registered Member

    Joined:
    Mar 7, 2005
    Posts:
    2,642
    Location:
    Sneffels volcano
    I disagree here. I think my beloved Avira would take all honors in this category, yet it's placed at the very bottom...
     
  2. RejZoR

    RejZoR Lurker

    Joined:
    May 31, 2004
    Posts:
    6,426
    Why is that so? AVIRA's interface is pretty good. I like the way it's organized, but nothing too flashy.
     
  3. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    This quotation about Norton AntiVirus illustrates why this test is poorly constructed, in my opinion. I seriously doubt that many users care about how well individual facets of a security solution work (e.g., behavioral analysis versus “zoo sample” detection rates) -- rather, they care (or, in my opinion, should care) more about how all these features, and all the others not included in the test, come together to provide “protection in practice”: that is, protection against malware in the real world, an outcome which isn't assessed by this test.

    I’m disappointed by the PC World review. I suspect it does more to obscure than to clarify real-world differences in malware protection among products.
     
  4. Fuzzfas

    Fuzzfas Registered Member

    Joined:
    Jun 24, 2007
    Posts:
    2,753
    I agree. With Avast, I can't understand why they use that grey skin as the default. It's the ugliest skin I've ever seen in an AV. The blue one is so much better. Why don't they make blue the default?

    The other problem with Avast's blue skin is that I usually don't remember which button does what and have to hover over a button to see the pop-up. I also find it confusing that the tray right-click menu has one entry for database updates and another for program updates. They should have put a single "check for updates" and left it at that.
     
  5. InfinityAz

    InfinityAz Registered Member

    Joined:
    Jul 23, 2005
    Posts:
    828
    Location:
    Arizona
    Here are some of the numbers from the test.

    Signature-based detection of malware and spyware:
    1. G-Data AV 2010 - 99.90%
    2. McAfee AV Plus 2010 - 99.90%
    3. Panda AV 2010 - 99.80%
    4. F-Secure AV 2010 - 99.80%
    5. BitDefender AV 2010 - 99.70%
    6. Norton AV 2010 - 99.50%
    7. Avira 9 - 99.30%
    8. Avast 4 - 97.80%
    9. KAV 2010 - 97.30%
    10. Trend Micro AV 2010 - 96.90%
    11. NOD32 4 - 94.80%

    Adware detection:
    1. G-Data AV 2010 - 99.80%
    2. Panda AV 2010 - 99.50%
    3. F-Secure AV 2010 - 99.30%
    4. McAfee AV Plus 2010 - 99.30%
    5. Norton AV 2010 - 98.90%
    6. Avira 9 - 96.80%
    7. KAV 2010 - 96.20%
    8. BitDefender AV 2010 - 95.50%
    9. Avast 4 - 91.40%
    10. NOD32 4 - 88.70%
    11. Trend Micro AV 2010 - 84.50%

    Heuristic detection of new malware (with two-week-old signatures):
    1. G-Data AV 2010 - 71.90%
    2. Avira 9 - 69.20%
    3. KAV 2010 - 66.80%
    4. F-Secure AV 2010 - 66.70%
    5. BitDefender AV 2010 - 65.30%
    6. NOD32 4 - 63.60%
    7. McAfee AV Plus 2010 - 56.80%
    8. Panda AV 2010 - 53.70%
    9. Trend Micro AV 2010 - 51.70%
    10. Norton AV 2010 - 42.00%
    11. Avast 4 - 38.50%
     
  6. De Hollander

    De Hollander Registered Member

    Joined:
    Sep 10, 2005
    Posts:
    718
    Location:
    Windmills and cows
    So their bottom-line conclusion, that "Norton Antivirus is a good choice for nontechies, but those willing to deal with a less-refined interface can get stronger protection," looks valid according to their numbers.
     
  7. Macstorm

    Macstorm Registered Member

    Joined:
    Mar 7, 2005
    Posts:
    2,642
    Location:
    Sneffels volcano
    Organized, maybe. I'd like it a bit more "polished" though (better font, UI layout, etc.), but that's just my taste; we'll see how the new v10 evolves ;)

    Let's get back on topic...
     
  8. Fajo

    Fajo Registered Member

    Joined:
    Jun 13, 2008
    Posts:
    1,814

    Yes, because this is all that matters in an AV.
     
  9. Fly

    Fly Registered Member

    Joined:
    Nov 1, 2007
    Posts:
    2,201
    I have to doubt the test.

    McAfee having 99.9% detection? :blink:
     
  10. Macstorm

    Macstorm Registered Member

    Joined:
    Mar 7, 2005
    Posts:
    2,642
    Location:
    Sneffels volcano
    I think they misplaced the decimal point; 9.99% is the right value :D
     
  11. Osaban

    Osaban Registered Member

    Joined:
    Apr 11, 2005
    Posts:
    5,616
    Location:
    Milan and Seoul
    IMO, the whole test at PC World has little to do with detections and numbers (see the ridiculous importance given to GUIs); it looks more like a well-planned strategy to eliminate troublesome contenders: companies offering free and decent solutions.

    MSE and AVG weren't even tested, and Avira and Avast, the most dangerous competitors, were given 7th place and an inglorious last, respectively. G Data is the perfectly convenient number one, as it is not free and practically unknown, and guess which two giants sit at numbers two and three.

    The average user will go for well-known companies within the first three positions. This isn't a tirade against Symantec and Kaspersky, who have every right to promote their products as they see fit, but another example of how journalism on the internet has become a dangerous weapon.
     
  12. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    I can’t agree that your assessment is a valid “bottom-line conclusion,” since the PC World test appears to fail to utilize many features of Norton AntiVirus, such as Browser Protection, Download Insight and Intrusion Prevention. I seriously doubt that many users disable these features, and thus a test that doesn’t leverage them may severely underestimate the quality of the malware protection provided.
     
  13. dcrowe0050

    dcrowe0050 Registered Member

    Joined:
    Sep 1, 2009
    Posts:
    378
    Location:
    NC
    PC World places way too much importance on the GUI, and one could argue that the behavioral detection results carried more weight than other parts of the test. Take F-Secure, for example: it was in the top five of every category, but it didn't stand out in behavioral detection and ended up in 6th place. I may be wrong, and if I did the math I might be proved wrong, but F-Secure scored better than Kaspersky, BitDefender, and Panda in almost all categories of General Performance, did as well as those three in Disinfection, and did better than all but Kaspersky in Rootkit Detection. Yet all of them except Panda scored better on the Behavioral section (just one part of a four-part test), and the result is that F-Secure ends up behind all of these products. Maybe their data was reported wrong, or I read it wrong, but I have never trusted PC World's tests.
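    A rough sketch of how that kind of weighting can play out (the category scores and weights below are invented for illustration; they are not PC World's actual figures or formula):

    # Hypothetical example: a heavily weighted category can flip an overall ranking.
    weights = {"signatures": 0.30, "disinfection": 0.20, "rootkits": 0.20, "behavioral": 0.30}

    scores = {
        # "Product A" leads in three of the four categories...
        "Product A": {"signatures": 99.8, "disinfection": 95.0, "rootkits": 93.0, "behavioral": 60.0},
        # ...but "Product B"'s stronger behavioral score outweighs the rest.
        "Product B": {"signatures": 97.3, "disinfection": 94.0, "rootkits": 92.0, "behavioral": 75.0},
    }

    def weighted_score(product):
        return sum(weights[c] * scores[product][c] for c in weights)

    for name in scores:
        print(name, round(weighted_score(name), 2))
    # Product A -> 85.54, Product B -> 88.89: A ranks lower despite winning three of four categories.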
     
  14. The Hammer

    The Hammer Registered Member

    Joined:
    May 12, 2005
    Posts:
    5,752
    Location:
    Toronto Canada
    Nice to see a test of Panda, finally. I don't agree with the assessment that Avira needs more features. I'll be keeping an eye on the bugginess users here have reported in BitDefender, as I've liked them in the past.
     
  15. acr1965

    acr1965 Registered Member

    Joined:
    Oct 12, 2006
    Posts:
    4,995
    It just seems kind of odd to me that they grade on "options" in a standalone antivirus review. If options are so important (beyond the basic components of AVs), then they should have graded suites.
     
  16. tgell

    tgell Registered Member

    Joined:
    Nov 12, 2004
    Posts:
    1,097
    I went to the site and the Eset and Avast reviews are still there.
     
  17. Fuzzfas

    Fuzzfas Registered Member

    Joined:
    Jun 24, 2007
    Posts:
    2,753
    I think this explains the final ranking. Simply put, you don't see the whole picture here.

    Detection rates are irrelevant.


    From Trend Micro's CEO:


    So I guess, from now on, they will be competing on better GUIs. :D
     
  18. steve1955

    steve1955 Registered Member

    Joined:
    Feb 7, 2004
    Posts:
    1,384
    Location:
    Sunny(in my dreams)Manchester,England
    I love it when these tests are posted on here: if a test doesn't agree with a member's preconceived ideas, then the test is rubbish and not to be believed; if it does, then it's an accurate assessment of the state of play.
     
  19. InfinityAz

    InfinityAz Registered Member

    Joined:
    Jul 23, 2005
    Posts:
    828
    Location:
    Arizona
    Agreed, and I see it all too often. That's why I posted some of the results earlier (there are more results, but PC World did a poor job of making them easily accessible and/or summarizing them like they used to).

    Instead of focusing on what the magazine says, look at all of the actual results and make your own determinations.
     
  20. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    I’m not even sure that the behavioral detection rates reported by the PC World test are a realistic indication of the extent to which the heuristics within the anti-virus products effectively block malware in the real world.

    Consider, for example, that the engine (“SONAR2”) used by Norton Internet Security 2010 relies upon and leverages insight from community-based reputation scores. Since those scores were presumably not accessed in this test (because it used two-week-old signatures), the assessment of its heuristic performance isn't an accurate reflection of the actual power of the product. Similar arguments could probably be made for many of the other anti-virus products tested.

    On a general note, the PC World test suffers from what I call the “fallacy of decomposition” -- namely, that you can understand the performance of the whole by independently analyzing its constituent parts. The difficulty is that the performance of the whole isn’t an additive summation of the performance of the parts -- there are interactions to be considered. Systems thinking is the alternative to reductionism, and it isn’t employed in this test by PC World.
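    A toy illustration of that point, with made-up numbers: if two protection layers each catch 60% of a sample set, the combined product could stop anywhere from 60% (the layers completely overlap) to nearly 100% (the layers are complementary); a component-by-component test can't tell you which.

    p_sigs = 0.60       # hypothetical catch rate of the signature layer
    p_behavior = 0.60   # hypothetical catch rate of the behavioral layer

    additive = p_sigs + p_behavior                      # 1.20 -- meaningless as a detection rate
    independent = 1 - (1 - p_sigs) * (1 - p_behavior)   # 0.84 if the layers miss independently
    fully_overlapping = max(p_sigs, p_behavior)         # 0.60 if both layers catch the same samples

    print(additive, independent, fully_overlapping)     # the whole is not the sum of the parts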
     
  21. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    For what it is worth, this test basically agrees with my “pre-conceived ideas” about the merits of Norton AntiVirus, and yet I still criticize the methodology for its lack of realism.
     
  22. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    I disagree. The problem isn’t the metric -- it is the methodology by which the measurement is derived. When detection rates are computed through a scan of “zoo samples,” then yes, the score is meaningless. When detection rates are assessed by a more robust method that more accurately simulates a user’s real-world interaction with the web, then the scores are very meaningful, indeed.
     
  23. the Tester

    the Tester Registered Member

    Joined:
    Jul 28, 2002
    Posts:
    2,854
    Location:
    The Gateway to the Blue Hills,WI.
    Good point.
    I lose interest when I read about the GUI being an important facet.
    Looking at the results that you posted earlier makes the test more relevant to me.
    And the av that I run isn't one of those tested, so I don't have a personal bias on the test.
     
  24. Zombini

    Zombini Registered Member

    Joined:
    Jul 11, 2006
    Posts:
    469
    PC World is a home-user magazine, let's not forget that. A lot of the things that malware enthusiasts (like us) pay no attention to are in fact important to the home user, and therefore to a magazine catering to home users. Ease of use is a big factor, and AVIRA just doesn't cut it.
     