AV-Comparatives Aug-Nov Real World Test Complete

Discussion in 'other anti-virus software' started by Inside Out, Dec 12, 2013.

Thread Status:
Not open for further replies.
  1. Dave0291

    Dave0291 Registered Member

    Joined:
    Nov 17, 2013
    Posts:
    553
    Location:
    U.S
    Thousands of samples for each product, and 40-50 false positives are bad? Either I am very much misunderstanding these tests, or you and I have extremely different ideas of the term "bad". o_O o_O I, again, am not overly concerned about a false positive unless it is an example of what I spoke of earlier. What matters most to me is the actual blocking vs malware not getting blocked. Most antivirus products have the ability to override a false positive, usually through some type of "vault" like Avast has.
     
  2. blasev

    blasev Registered Member

    Joined:
    Oct 25, 2010
    Posts:
    763
    FortiClient has the usual result.
     
  3. Inside Out

    Inside Out Registered Member

    Joined:
    Sep 17, 2013
    Posts:
    421
    Location:
    Pangea
    So even 52 out of 1,000 (×2) clean samples isn't that bad.
     
  4. dawgg

    dawgg Registered Member

    Joined:
    Jun 18, 2006
    Posts:
    818
    I'd rather have an AV which doesn't give user-dependent prompts every time popular software is installed or updated just because it isn't in the whitelist - especially for less tech-savvy users, or when I'm installing a big program and the installer halts because it wants me to Allow an action while I'm away from my desk.

    Only want prompts for the less popular and more dangerous programs.

    Some AVs have more comprehensive whitelists than others.
     
  5. Dave0291

    Dave0291 Registered Member

    Joined:
    Nov 17, 2013
    Posts:
    553
    Location:
    U.S
    I had read a little over 3000. But no, 52 out of 1000 is not bad to me. It seems to me like some folks, and not just here of course, treat these tests like contests. I look at them as general ideas of what to expect out of any given product over the course of time. That is why blocked vs not blocked is so important to me. I am not concerned if a few specialized tools or applications get flagged as suspicious, nor files from relatively obscure domains. If they are system files or otherwise extremely critical, then I most certainly will care.
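    To put the false-positive numbers being discussed in perspective, here is a quick back-of-the-envelope rate calculation in Python (the sample counts are the rough figures quoted in this thread, not exact report values):

    ```python
    # Rough false-positive rate for the figures quoted above.
    # "1000(x2)" clean samples = roughly 2000 clean files tested.
    clean_samples = 2000
    false_positives = 52   # worst-case FP count cited in this thread

    fp_rate = false_positives / clean_samples
    print(f"FP rate: {fp_rate:.1%}")  # prints: FP rate: 2.6%
    ```

    So even the worst FP count mentioned here means roughly 97 % of clean files were left alone, which is the context behind "not bad to me."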
     
  6. Macstorm

    Macstorm Registered Member

    Joined:
    Mar 7, 2005
    Posts:
    2,642
    Location:
    Sneffels volcano
    Where's the "like" button here? :thumb:
     
  7. SweX

    SweX Registered Member

    Joined:
    Apr 21, 2007
    Posts:
    6,429
    On Facebook where it belongs :D
     
  8. THESAWISFAMILY2005

    THESAWISFAMILY2005 Registered Member

    Joined:
    Aug 10, 2012
    Posts:
    198
    Location:
    SACRAMENTO CALIFORNIA
    Can Bitdefender Free run alongside Webroot?
     
  9. Macstorm

    Macstorm Registered Member

    Joined:
    Mar 7, 2005
    Posts:
    2,642
    Location:
    Sneffels volcano
    lol ;)
     
  10. smage

    smage Registered Member

    Joined:
    Sep 13, 2008
    Posts:
    378
    Nice test; I really like that false positives are also being considered now.

    Hopefully Comodo and Symantec will be included one day, as there is high demand for them.
     
  11. Rompin Raider

    Rompin Raider Registered Member

    Joined:
    May 6, 2010
    Posts:
    1,254
    Location:
    Texas
    Yes!:thumb:
     
  12. guest

    guest Guest

    Actually, I disagree with that. IMO, just because a piece of software is unknown doesn't mean AVs can flag it all they want. Not to mention the bad reputation that software (and its developer) gets from FPs.
     
  13. ance

    ance formerly: fmon

    Joined:
    May 5, 2013
    Posts:
    1,360
    I remember Norton and Avira flagging unknown files, really stupid and senseless. :gack:
     
  14. guest

    guest Guest

    Excellent for Panda, 2nd place, and they are testing Panda Cloud Antivirus FREE.
     
  15. VXB

    VXB Registered Member

    Joined:
    Oct 2, 2010
    Posts:
    18
    Hi guys,

    In the 3-star quadrant, which one would you say has the smallest resource footprint?

    Back in the days when I was up to date on the AV industry, ESET was one of my favorites. Then they released V3 and things went south from there.
    Just to give an idea... I'm familiar with all the brands in the 3-star group except Emsisoft (never heard of them).

    I chose to fall back on F-Prot for a few years, but since Commtouch took over it looks like the product is being artificially kept alive.

    I'm a bit lazy to do my homework... It's time for me to move to another product.

    I'm looking for a product that is top-ranked in credible tests AND has the lowest resource requirements AND the least possible interference with normal user activities.

    What do you suggest?

    EDIT: After reading a bit about Emsisoft and their dual engine (E1: proprietary, E2: Bitdefender), it sounds like something of interest, but how's the system footprint?
     
    Last edited: Dec 15, 2013
  16. Inside Out

    Inside Out Registered Member

    Joined:
    Sep 17, 2013
    Posts:
    421
    Location:
    Pangea
    Panda Cloud or paid Avira (the latter has no firewall of its own), maybe Emsi (F-Secure is fast too, but its RAM usage can be high at times in my experience).
     
    Last edited: Dec 15, 2013
  17. VXB

    VXB Registered Member

    Joined:
    Oct 2, 2010
    Posts:
    18
    Thanks for your input... Is F-Secure still based on the F-Prot Engine?
     
  18. Inside Out

    Inside Out Registered Member

    Joined:
    Sep 17, 2013
    Posts:
    421
    Location:
    Pangea
    No, it's actually *based on* Bitdefender now, but uses a clever mix of in-house engines and licensed ones like BD's engine, Norman's sandbox, and F-Prot heuristics. In other words, a vastly "overengineered" product, but in a good way.
     
  19. VXB

    VXB Registered Member

    Joined:
    Oct 2, 2010
    Posts:
    18
    Holy moly, I really have been living in a cave these past few years. Lots of catching up to do.
     
  20. Sandunes

    Sandunes Registered Member

    Joined:
    Dec 2, 2013
    Posts:
    75
    Am I reading these charts wrong or is MSE missing from this test?
     
  21. FleischmannTV

    FleischmannTV Registered Member

    Joined:
    Apr 7, 2013
    Posts:
    1,094
    Location:
    Germany
    It's the dotted 91.9% baseline.
     
  22. halcyon

    halcyon Registered Member

    Joined:
    May 14, 2003
    Posts:
    373
    Nobody changes their AV every single month - in advance - anticipating which product/platform will perform best the next month.

    Hence, we select on a backwards-looking basis. What we're after is tight quality control (small variance) and a high protection rate (month to month).

    To me the top three show an interesting combo of tight variability and high protection. This doesn't mean (of course) that they are the best; it's just what I wrote.

    http://i.imgur.com/FA1C4gx.jpg

    This shows the variability and ranking for Aug-Nov.

    Now how consistent has this been over the past 4 years:

    http://chart.av-comparatives.org/chart2.php

    Panda not so good, Bitdefender a couple of drops, F-Secure several.

    Then the performance:

    Bitdefender 99% perf
    F-Sec / Kaspersky 98.8% perf
    Panda 95.5% perf

    This performance test data is mirrored by another independent lab that does its own testing (AV-Test).

    So, when you look for a tool that is suitable for you, look for an overall trend and small variability in the performance level you accept.

    A product that performs well this month and much worse the next may not be for you - unless you know beforehand which one is going to perform best each month, and buy and install it accordingly.

    I for one do not have a working crystal ball into the future, so I base my decision on variability and trend statistics :)
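    The selection rule halcyon describes - rank by average protection while penalising month-to-month swings - can be sketched in a few lines of Python. The monthly rates below are made-up placeholders, not AV-Comparatives data, and the mean-minus-stdev score is just one simple way to combine the two criteria:

    ```python
    from statistics import mean, stdev

    # Hypothetical monthly protection rates (%), Aug-Nov.
    # Placeholder numbers only - NOT real AV-Comparatives results.
    monthly_rates = {
        "Product A": [99.5, 99.4, 99.6, 99.5],  # high and consistent
        "Product B": [99.9, 97.0, 99.8, 96.5],  # high peaks, large swings
        "Product C": [98.0, 98.1, 97.9, 98.0],  # slightly lower, very stable
    }

    def score(rates):
        # Mean protection, penalised by month-to-month variability.
        return mean(rates) - stdev(rates)

    for name, rates in sorted(monthly_rates.items(), key=lambda kv: -score(kv[1])):
        print(f"{name}: mean={mean(rates):.2f}%  stdev={stdev(rates):.2f}  score={score(rates):.2f}")
    ```

    With these placeholder numbers, the consistent Product A ranks first and the stable-but-lower Product C beats the swingy Product B - which is exactly the "tight variability plus high protection" preference described above.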
     
  23. Inside Out

    Inside Out Registered Member

    Joined:
    Sep 17, 2013
    Posts:
    421
    Location:
    Pangea
    Trends in test results are quite accurate, but sometimes a vendor can underachieve despite having shown real potential before. Avira was doing great in every test from 2007 to 2009 (the Avira/AntiVir unification happened in 2006), then suddenly became second-tier and users went mad. Did that mean they'd forgotten how to make a good AV? No. Not much later they added the Ask toolbar and made other questionable decisions, so it became obvious they had stopped caring about quality long ago, but that didn't necessarily suggest incompetence. They wouldn't get away with all that controversy, sloppiness and complacency for long anyway, so after they scrapped the large ads, I knew they'd become good again. IMO they're a great vendor with actual brains and ideas, despite not flaunting them all the time unlike some; they just need to keep their greed in check.

    On the other hand, G Data's brainpower has always been questionable, because they licensed two engines even back when it used to be a good, effective AV. Historically they've shown themselves to be a very ambitious vendor: KAV + BD had turned out way too heavy before, later KAV + Avast gave average test results at times, then they found some balance in BD + Avast. Now it's BD + CG, neither light nor capable despite being designed to run as light as any other AV with only a slight loss in detection, though there are some *supposed* improvements in the recent beta with the same engines. They are as committed as anyone and they do address things, but doing some things by themselves, like:

    -look after their own update servers before claiming their product can look after our systems
    -become at least decent at removal and anti-phishing
    -add heuristics to CG
    -make code that doesn't require a supercomputer to run
    -promptly fix bugs and false positives
    -do enough malware research to update their blog more than once or twice a month on average
    -lastly, not laugh at others' rare mistakes while the only reason they themselves weren't h(ij)acked worse was that the hackers didn't even bother. Seriously, what an arrogant, delusional, self-aggrandizing blog entry. WTF?!? :rolleyes:
    ...have always been beyond their capabilities so far, and likely always will be, so I wasn't surprised at all when the recent tests and reviews exposed them.

    In short, sometimes test trends don't indicate the competence level of the vendor. I'll take a "poorly-performing" AV developed by a capable yet lazy vendor that will eventually be forced to step up anyway over a "star" product entirely built on others' talent and brains which will only go down the toilet when its enablers go away.
     
    Last edited: Dec 18, 2013
  24. The Hammer

    The Hammer Registered Member

    Joined:
    May 12, 2005
    Posts:
    5,752
    Location:
    Toronto Canada
    I doubt you'll find F-Prot anywhere in their products now.
     
  25. Inside Out

    Inside Out Registered Member

    Joined:
    Sep 17, 2013
    Posts:
    421
    Location:
    Pangea
    Doesn't Returnil still use it?
     