AV-TEST - February 2016 (Windows 7)

Discussion in 'other anti-virus software' started by LagerX, Mar 29, 2016.

  1. LagerX

    LagerX Registered Member

    Joined:
    Apr 16, 2008
    Posts:
    565
  2. Thankful

    Thankful Savings Monitor

    Joined:
    Feb 28, 2005
    Posts:
    6,555
    Location:
    New York City
    Thanks. Nice to see Emsisoft tested and performing well.
    AVG continues to shine.
    Microsoft dropping back a bit.
     
  3. Rompin Raider

    Rompin Raider Registered Member

    Joined:
    May 6, 2010
    Posts:
    1,254
    Location:
    Texas
    Thanks for posting...Windows 7...the good old days! LOL Miss it!
     
  4. roger_m

    roger_m Registered Member

    Joined:
    Jan 25, 2009
    Posts:
    8,627
    It's surprising to see AhnLab doing so well.
     
  5. SM_Unlimited

    SM_Unlimited Registered Member

    Joined:
    Jun 8, 2010
    Posts:
    32
    Interesting that in these latest results they have fleshed out the performance section to show the % change in performance across a number of areas.
     
  6. LittleDude

    LittleDude Registered Member

    Joined:
    Mar 22, 2008
    Posts:
    79
    The performance section is much better now. It was interesting to see how much ESET slows browsing (I ran some tests to confirm).
    I think it is strange that AV-Comparatives does not include browsing impact in their performance tests.
     
  7. clocks

    clocks Registered Member

    Joined:
    Aug 25, 2007
    Posts:
    2,787
    Yes, the performance numbers are scary. Panda Cloud......yikes!
     
  8. haakon

    haakon Guest

    Hmmmm. 20% slowdown as an industry standard for launching popular websites. I wonder how much that value is reduced by the use of EasyList and EasyPrivacy blocking, some tracking blocker (Ghostery, PrivacyBadger, etc.) and a Flash/HTML5 blocker. On the sites I whitelist, it takes significantly longer to render a page while I watch endless connections to dozens and dozens of blockable elements, trackers and too many video ads.

    Anyhow, don't forget to disable your AV when browsing those "popular" websites. Wink wink nudge nudge. :D
     
    Last edited by a moderator: Mar 29, 2016
  9. haakon

    haakon Guest

    Do you think the average user actually notices that their favorite pop culture page loads in 3.84 seconds instead of 3.20?

    Performance data is for OCD geeks, anyway. ;)
     
  10. LittleDude

    LittleDude Registered Member

    Joined:
    Mar 22, 2008
    Posts:
    79
    I hate surfing without an ad blocker... the ads are the biggest slowdown, not to mention annoying.
    At the end of the day, if I'm comparing two security products that are the same price and offer similar protection but one slows browsing,
    I know which one I'm going to choose.
     
  11. stapp

    stapp Global Moderator

    Joined:
    Jan 12, 2006
    Posts:
    23,933
    Location:
    UK
    The platform used for the test was Windows 7 SP1, 32-bit.
     
  12. phalanaxus

    phalanaxus Registered Member

    Joined:
    Jan 19, 2011
    Posts:
    509
    The test is nice and all, but their heavier weighting of detection over protection is wrong imo. It's like saying: this system protected you from all the thieves but didn't know the names of some of them, so we're going to give a higher score to this other system, which didn't protect you against some of the thieves but named more of them.
     
  13. Solarlynx

    Solarlynx Registered Member

    Joined:
    Jun 25, 2011
    Posts:
    2,015
    They have returned Qihoo to the tests. Does that mean it's a trustworthy company again?
     
  14. m0unds

    m0unds Registered Member

    Joined:
    Nov 12, 2015
    Posts:
    219
    as far as i can tell, their categorical ratings are based on relative numbers - e.g. if the average is 97% and something hits 98% or 99%, it's above the average and is awarded a higher score. conversely, if the average is 97% and a product hits 91.8% or 86.4%, it's awarded a lower score. in the case of MSE (which is what my link is) you can see they obviously didn't "weigh" wildlist/high-prevalence stuff more than newer (protection category) threats.

    protection testing methodology is explained here
     
  15. phalanaxus

    phalanaxus Registered Member

    Joined:
    Jan 19, 2011
    Posts:
    509
    Comodo's results indicate otherwise, or they have some really weird scoring on the detection part.
    P.S. I know the methodology, but I don't think they explain how they score.
     
  16. tgell

    tgell Registered Member

    Joined:
    Nov 12, 2004
    Posts:
    1,097
    I don't understand the reasoning behind testing the Chinese version of Qihoo 360 Antivirus v5 and not 360 Total Security. Looks like it's using Qihoo's own engines plus Bitdefender.
     
  17. m0unds

    m0unds Registered Member

    Joined:
    Nov 12, 2015
    Posts:
    219
    They look like they're weighted the same. Fewer prevention cases vs wildlist cases. MSE was well below average on prevention but hit 99.6% & 99.7% on wildlist, yet scored 3. Comodo was way below average on wildlist but scored 100% on prevention and got a 4/6. Probably something like 3 points each. At any rate, I contacted them to ask, because there's no point in speculating.
     
    Last edited: Mar 30, 2016
  18. itman

    itman Registered Member

    Joined:
    Jun 22, 2010
    Posts:
    8,592
    Location:
    U.S.A.
    Here's a link to the Protection category methodology: https://www.av-test.org/en/test-procedures/test-modules/protection/

    As far as the awarding of points:

    Home-user products must achieve at least 10 of the 18 points available and at least 1 point in each category in order to earn an "AV-TEST CERTIFIED" seal of approval.

    Corporate solutions must achieve 10 of the 18 points available and at least 1 point in each category in order to receive the "AV-TEST APPROVED" seal of approval.
    Each category is awarded a maximum of 6 points; that's what the "circles" represent on the test report. How the points are given in each category is a mystery.

    My guess is it is based on the percentages awarded in the test sub-categories. For the protection category, there are two. That would mean a maximum score of 3 points each. I suspect there is a minimum percentage threshold, let's say 70%. Anything below that gets 0 points. Above 70%, points are awarded in increments of 1 for every 10% increase above 70%.
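
    A rough Python sketch of this guess, purely illustrative (the 70% floor and the 1-point-per-10% step are just the assumptions above, not AV-TEST's published formula):

    Code:
    def subcategory_points(pct, floor=70.0, max_points=3):
        # Guessed mapping from a sub-category percentage to points:
        # 0 below the floor, then 1 point per full 10% above it, capped.
        if pct < floor:
            return 0
        return min(max_points, int((pct - floor) // 10) + 1)

    # Hypothetical product: 99.6% in one sub-category, 89.0% in the other
    print(subcategory_points(99.6) + subcategory_points(89.0))  # 3 + 2 = 5 of 6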
     
  19. SM_Unlimited

    SM_Unlimited Registered Member

    Joined:
    Jun 8, 2010
    Posts:
    32
    As above, just ignore the points system and look at the actual percentages to compare effectiveness between products.
     
  20. Hiltihome

    Hiltihome Registered Member

    Joined:
    Jul 5, 2013
    Posts:
    1,131
    Location:
    Baden Germany
    To me it looks like there are engineers at work who do not know much about the real world.

    They tested the Chinese version of Qihoo 360 Antivirus v5, a version for the Chinese market, and linked to a Chinese website...
     
    Last edited: Apr 1, 2016
  21. roger_m

    roger_m Registered Member

    Joined:
    Jan 25, 2009
    Posts:
    8,627
    The problem with tests like this is that the performance impact of an antivirus will vary from one computer to the next. So, on your own computer/s it's possible that the performance ranking will change.
     
  22. m0unds

    m0unds Registered Member

    Joined:
    Nov 12, 2015
    Posts:
    219
    I received a response from AV-TEST this morning. The scoring for protection is 3 points per protection subcategory (prevention + detection = 6). There's no weighting (e.g. wildlist/zoo detection isn't more important to the scoring than prevention of zero-day/early-lifecycle malware).

    yep, i agree. that's the best way to look at these tests (and others). most don't seem to want to, for whatever reason.

    product performance in lab testing is supposed to be considered relative to other products in the same test. it's not supposed to provide you with an idea of how it will perform on your very specific machine. it's just intended to provide an idea of how well products perform relative to each other on the same test hardware.

    if av-test is anything like AV-C, the vendor tells them which product version/release to test, and they test it. that would suggest that qihoo asked them to test the product version they tested.
     
  23. roger_m

    roger_m Registered Member

    Joined:
    Jan 25, 2009
    Posts:
    8,627
    I know that, and I see that as a problem, because on different hardware the relative performance will vary. If you were to list the products in order from least to most system impact, that order would vary from one computer to the next. For example, I have seen cases where an antivirus has very little system impact on one computer, but on a different computer running the same operating system and the same version of the antivirus, there have been very noticeable slowdowns.
     
  24. phalanaxus

    phalanaxus Registered Member

    Joined:
    Jan 19, 2011
    Posts:
    509
    @m0unds
    Thanks for clearing that up. Still can't figure out how the hell they score exactly, though.
     
  25. m0unds

    m0unds Registered Member

    Joined:
    Nov 12, 2015
    Posts:
    219
    it's still not about trying to correlate performance gleaned from testing with your very specific machines and experiences (this is inclusive of stuff like "different computer, same os", etc). it's about taking a bunch of products, installing them in a controlled testing environment, then performing the same operations with each product installed and computing a comparative result for all the products running on the same hardware and os, under the same lab conditions, vs a baseline configuration (which, i'd assume, is just the test bed without a product installed).
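
    for illustration, a toy python sketch of that comparison (every timing here is made up, and the baseline value and product names are hypothetical):

    Code:
    # Made-up page-load timings, in milliseconds
    baseline_ms = 3200              # test bed with no product installed
    products = {"Product A": 3520,  # same operation with each product installed
                "Product B": 3840,
                "Product C": 4100}

    for name, with_av_ms in products.items():
        slowdown = (with_av_ms - baseline_ms) / baseline_ms * 100
        print(f"{name}: {slowdown:.1f}% slower than baseline")

    ("Product B" in this made-up data lands right on the 20% industry-standard slowdown mentioned earlier in the thread.)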

    the point of the controlled environment is to make sure it can be reproduced, ensuring the result has validity, and that the data is useful.

    yeah, they provide a pretty good description of the methodology used but leave out the scoring part. it's strange.
     