AV-Comparatives: Real World Protection Test February 2016

Discussion in 'other anti-virus software' started by anon, Mar 16, 2016.

  1. m0unds

    m0unds Registered Member

    Joined:
    Nov 12, 2015
    Posts:
    219
    They say in the 2016 Feb report to refer to older reports for more information about the "Real World Test". Sure enough, the last test from 2015 says:

    "In this test, all protection features of the product can be used to prevent infection - not just signatures or heuristic file scanning. A suite can step in at any stage of the process – accessing the URL, downloading the file, formation of the file on the local hard drive, file access and file execution – to protect the PC. This means that the test achieves the most realistic way of determining how well the security product protects the PC. Because all of a suite’s components can be used to protect the PC, it is possible for a product to score well in the test by having e.g. very good behavioural protection, but a weak URL blocker."

    and

    "Security products should protect the user’s PC. It is not very important at which stage the protection takes place. It could be while browsing to the website (e.g. protection through URL Blocker), while an exploit tries to run, while the file is being downloaded/created or when the malware is executed (either by the exploit or by the user). After the malware is executed (if not blocked before), we wait several minutes for malicious actions and also to give e.g. behaviour-blockers time to react and remedy actions performed by the malware. If the malware is not detected and the system is indeed infected/compromised, the process goes to “System Compromised”. If a user interaction is required and it is up to the user to decide if something is malicious, and in the case of the worst user decision the system gets compromised, we rate this as “user-dependent”. Because of this, the yellow bars in the results graph can be interpreted either as protected or not protected (it’s up to each individual user to decide what he/she would probably do in that situation)."
     
  2. Fly

    Fly Registered Member

    Joined:
    Nov 1, 2007
    Posts:
    2,201
    @m0unds,

    So ESET misses approximately 2.5%, while Kaspersky, Bitdefender and some other vendors score 100%.
    It's light and easy to use, but the protection score is not that great.
     
  3. m0unds

    m0unds Registered Member

    Joined:
    Nov 12, 2015
    Posts:
    219
    honestly, i wouldn't be concerned about a 2.5% differential between eset and "top tier" products.
     
    Last edited: Mar 19, 2016
  4. itman

    itman Registered Member

    Joined:
    Jun 22, 2010
    Posts:
    8,592
    Location:
    U.S.A.
    The question is "what are your odds" of actually being infected by any of the 0-day malware used in the test?

    One thing the AV labs don't publish is statistics on the malware samples used. For example, were these malware that infected thousands, hundreds, or just one web site? Nor do the AV labs publish details on which of the aforementioned instances the vendor product missed.

    Finally, the AV labs' tests are performed with default OS, browser, and vendor product settings, with all software fully patched. ESET, along with a few others that were tested, does not configure its product for maximum security out of the box, although they claim they do. The product is configured for what I would classify as "pretty good protection with maximum usability", which is what was commented on as ESET's "lightness". Whereas products like Kaspersky, Trend, and F-Secure have caused noticeable performance impacts in the past for a number of users.
     
  5. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    F-Secure is lighter on my computer than Eset.

    Eset definitely has more of a performance impact than the newer versions of F-Secure.
     
  6. m0unds

    m0unds Registered Member

    Joined:
    Nov 12, 2015
    Posts:
    219
    that's ultimately why a +/- 2.5% differential doesn't bother me in the slightest.

    afaik, this is provided to the vendors, so they'd at least be aware of whether there's some sort of product deficiency that led to the test system(s) being compromised and would provide them an opportunity to rectify that.

    I don't believe they make cfg changes unless the vendor requests it. I'd imagine the vast majority of eset's users install the product and use it as-is without tweaking anything.

    Eh, Kaspersky & F-Secure have both had decent performance showings in AV-C's and AV-Test.org's testing over the last few years. I generally ignore users' performance grievances, as they're nearly always completely subjective and impossible to replicate without the same hw/sw cfg. I've used both products and haven't noticed any more or less system impact than anything else I've tried on my systems.
     
  7. PeZzy

    PeZzy Registered Member

    Joined:
    Apr 2, 2011
    Posts:
    56
    I recently needed to uninstall Kaspersky, because it became a memory hog. While user experience is subjective, a sterile 'lab' is not the best place to test products. I highly doubt the tests that claim Kaspersky has lower cpu and memory impact compared to EIS (for example). Maybe on a fresh OS with minimal software installed. In the real world with a fully utilized computer, that's not the case.
     
  8. m0unds

    m0unds Registered Member

    Joined:
    Nov 12, 2015
    Posts:
    219
    That's precisely my point. Your experience with Kaspersky has nothing in common with *my* experience with it. A competent lab gives a comparison between products running on the same hardware/software configuration as a relative number.

    e.g. "in our test env this product is X percent faster at xyz operations than this other product is". There are actual numbers, not anecdotal stuff like "our test rig /felt/ slower", or "our test rig /seemed/ to browse slower", etc.

    Test results are more interesting to me as a user than users on a forum telling me that they feel test results are suspect because their specific experience is different than the test results indicate. Of course your experience is different. It's a different environment, different hardware, etc. Too many variables, and usually completely subjective, as I said.
     
    Last edited: Mar 21, 2016
  9. Antarctica

    Antarctica Registered Member

    Joined:
    Feb 25, 2003
    Posts:
    2,177
    Location:
    Canada
    I fully agree with you, because my experience with Kaspersky is totally different from PeZzy's. I had to uninstall another brand of antivirus because it had become too heavy on my computer. I have had Kaspersky Internet Security installed for six months and my PC is light as a feather...
     
  10. waters

    waters Registered Member

    Joined:
    Nov 8, 2004
    Posts:
    958
    Where is Panda? Normally at the top, and now it's gone.
     
  11. Pat MacKnife

    Pat MacKnife Registered Member

    Joined:
    Mar 31, 2014
    Posts:
    620
    Location:
    Belgium
    They didn't participate.
     
  12. Marcos

    Marcos Eset Staff Account

    Joined:
    Nov 22, 2002
    Posts:
    14,456
    Do you mean lighter in terms of impact on browsing, on-demand or real-time scanning, or is it something else where you don't feel ESET to be light? I'd be glad to work with somebody who is having some kind of performance issue with ESET to pinpoint the cause.
     
  13. avman1995

    avman1995 Registered Member

    Joined:
    Sep 24, 2012
    Posts:
    944
    Location:
    india
    Avira is doing well as usual... Avast is losing its way.
     
  14. Pat MacKnife

    Pat MacKnife Registered Member

    Joined:
    Mar 31, 2014
    Posts:
    620
    Location:
    Belgium
    Every company has its ups and downs... but these are still good results.
     