AV-C On-Demand Comparative Test

Discussion in 'other anti-virus software' started by Cloud, Apr 12, 2011.

Thread Status:
Not open for further replies.
  1. sg09

    sg09 Registered Member

    Joined:
    Jul 11, 2009
    Posts:
    2,811
    Location:
    Kolkata, India
    Hmm.. Interesting results...
    1. Surprised at Qihoo! I knew it uses the Bitdefender engine, but why so many FPs? BD has only 3. o_O
    2. Avira is consistent, well done as usual..!! :thumb:
    3. Avast, well done again. Waiting for the retrospective test on v6. :argh:
    4. AVG drops so much in detection after showing promise last year. :blink:
    5. Like Avira, products using the BD engine are consistent. :thumb:
    6. It seems K7 is working hard. But to cut FP rates they have cut detection too... :doubt:
    7. Well done as usual, Kaspersky. :cool:
    8. McAfee, a huge surprise... :D
    9. Not very astonished at the ESET FPs. I have encountered a few on some well-known legit installers recently. *puppy*
    10. Symantec, keeping up their record of not being consistent. :cautious:
    11. Panda seems to be working on FPs. :thumb:
    12. Sophos, Microsoft: well done as before... :D
    13. Trend Micro, PC Tools: still not gaining trust... :ninja:
    14. Surprised at Webroot!! How come they score so many FPs and yet lower detection than Sophos... o_O

    Where is Quick Heal?!
     
  2. 3x0gR13N

    3x0gR13N Registered Member

    Joined:
    May 1, 2008
    Posts:
    850
    I was talking about on-demand scans exclusively, as the AV-C report only mentions on-demand performance.
    What you're saying is that other vendors are at fault for setting their on-demand scanners to be more thorough (since it comes at a "speed" price)... but I'll say it again: some people like their on-demand scans to be quicker, some like them to be more thorough. There's no "right" or "wrong", just "choice".

    And I completely agree with you on the on-access scanning.
     
  3. RejZoR

    RejZoR Lurker

    Joined:
    May 31, 2004
    Posts:
    6,426
    Well, the on-demand scan specifically is no less thorough than any other vendor's, really. Otherwise it wouldn't score this well, don't you think?
     
  4. 3x0gR13N

    3x0gR13N Registered Member

    Joined:
    May 1, 2008
    Posts:
    850
    Thoroughness has nothing to do with the detections, as the samples used are unpacked and/or use simple PE file packers, which any self-respecting AV will unpack and catch on-demand (if it has the signature, naturally).
    Avast doesn't scan a lot of archive types by default; other vendors (no naming, to avoid an A-vs-B debate) scan those types of archives by default. Which is better? Neither - it's choice (talking about on-demand again, of course).

    Added: the clean file set used by AV-C to test on-demand speed is bound to include all file types, including archive/compound files, which is less likely for the malware file set.
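
    For illustration, "simple PE file packers" can often be spotted before any unpacking even happens, e.g. from tell-tale section names or high section entropy. Below is a minimal, hypothetical sketch using Python's pefile library; the UPX name check and the entropy threshold are illustrative assumptions, not how any real AV engine decides:

    Code:
    import sys
    import pefile  # pip install pefile

    # Hypothetical heuristic: flag a PE file that looks packed. Real AV
    # engines go much further (emulation, generic unpacking, signatures).
    def looks_packed(path):
        pe = pefile.PE(path)
        for section in pe.sections:
            name = section.Name.rstrip(b"\x00").lower()
            if name.startswith(b"upx"):      # UPX names its sections UPX0/UPX1
                return True
            if section.get_entropy() > 7.2:  # compressed data nears 8 bits/byte
                return True
        return False

    if __name__ == "__main__":
        print("possibly packed" if looks_packed(sys.argv[1]) else "no obvious packer signs")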
     
    Last edited: Apr 12, 2011
  5. Cutting_Edgetech

    Cutting_Edgetech Registered Member

    Joined:
    Mar 30, 2006
    Posts:
    5,694
    Location:
    USA
    So how is it that Webroot can score much lower than Sophos when it uses the Sophos engine plus Spy Sweeper together? Is Webroot using an outdated engine?

    Webroot = 85.5%
    Sophos = 94.1%
     
  6. Marcos

    Marcos Eset Staff Account

    Joined:
    Nov 22, 2002
    Posts:
    14,456
    It's not really that bad with FPs. If you look at the detection names, you'll see that mainly packers exploited to a large extent by malware were detected (they are covered by the potentially unwanted applications group, which means the user must agree to the detection - and enabling it is what vendors want for tests). Given that such detections on legitimate applications are fixed immediately once reported by vendors or users, or once a higher prevalence of such applications is observed in the cloud (ThreatSense.Net) statistics, draw your own conclusion as to how prevalent those applications are and how serious those FPs are.
     
  7. Marcos

    Marcos Eset Staff Account

    Joined:
    Nov 22, 2002
    Posts:
    14,456
    Please PM me links to some examples of such installers. We are not aware of triggering an alarm on any installers unless there was a reason to.
     
  8. codylucas16

    codylucas16 Registered Member

    Joined:
    Nov 17, 2009
    Posts:
    267
    The results are interesting but not groundbreaking. It is an on-demand test, after all. I use other scanners for on-demand. I don't need my protection to detect threats when I scan; I need it to detect them when they're being launched.

    The false positives are interesting for sure, though o.o. I can't say I've ever seen Avast give me a false positive, even with heuristics set to high. Nor Trend Micro, yet it had the most false positives. Very interesting.
     
  9. TonyW

    TonyW Registered Member

    Joined:
    Oct 12, 2005
    Posts:
    2,741
    Location:
    UK
    It does depend on whether one actually has, on their system, the listed products that triggered those 19 FPs. I looked over the Avast list and I don't have a single one of those items on my system. That being the case, I, and many others like yourself, could have AV settings up to max and report no FPs simply because we don't have the more obscure packages on our computers. :)
     
  10. SM_Unlimited

    SM_Unlimited Registered Member

    Joined:
    Jun 8, 2010
    Posts:
    32
    The retrospective test will still be on 5.1, as the retrospective tests are part 2 of the on-demand tests. The August on-demand test will likely be on v6, unless Avast specifically tells AV-C to do otherwise.
     
  11. dschrader

    dschrader AV Expert

    Joined:
    Mar 10, 2009
    Posts:
    54
    Don't read too much into this one test - it is a test that scans a bunch of files in a folder for malware - no real-time protection, no network IPS, no host IPS, no sandbox . . . .

    It is a data point - similar to judging car safety by testing only the airbags - nice to know, but how about testing stopping distance, seat belts, traction, the safety cage .....

    Real-world tests by AV-Comparatives and AV-Test are a lot more informative.
     
  12. eBBox

    eBBox Registered Member

    Joined:
    Aug 10, 2006
    Posts:
    482
    Location:
    Aalborg, Denmark
    Yup, that's where e.g. Norton excels. Funny how many "trajams" we've got in this forum. One "bad" test and people go nuts, changing products and starting to bash. I've been using Norton and F-Secure for years on many clients' computers, no matter how they performed in AV-C, and they have never let me down. Disappointed to see how people in here react - they should know better.

    Other than that, it's always nice to see a good full-featured freebie like Avast improving :thumb:
     
  13. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    Been that way for years here. I didn't invent it, I don't think. o_O

    Actually, thanks to the trajams: more people try vendors' products = more products bought = more feedback on those products.
     
  14. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    Folks, IBK will be in later this week, but I know he said it was very important that when you review these results you also read the linked details of the discovered false alarms. You really have to take that into consideration and not just focus on the one finding.
     
  15. TonyW

    TonyW Registered Member

    Joined:
    Oct 12, 2005
    Posts:
    2,741
    Location:
    UK
    The false alarms do need to be taken into account, I agree, but I can't help wondering how many of those software packages one actually has on their system in the first place. Admittedly, if you downloaded them all and ran the tests at the same time as IBK and his team, at the settings given, you'd probably see the same FPs. Of course, by now they should be fixed.

    As I've said many times before, I hardly know most of the items listed, let alone have them installed, so as someone else said, it isn't a real-world scenario, and this is why many here have said they don't get FPs with the product of their choice even at the highest settings. I think all this needs to be put into perspective. :)
     
  16. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    Oh, he is vacationing in Scotland, and he just sent me this photo he took of himself today.

    Talk about a false positive :D
     

    Attached Files:

    Last edited: Apr 12, 2011
  17. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    Maybe not real-world, as you say, but it does show which ones may be prone to them, and as we all know: one FP in the wrong hands and, well....
     
  18. Marcos

    Marcos Eset Staff Account

    Joined:
    Nov 22, 2002
    Posts:
    14,456
    In theory, does even one FP on, say, a 10-20-year-old file matter, and does it indicate how prone a security program is to triggering FPs on current applications? And what about applications that have no, or a negligible, number of occurrences in the cloud reputation system?
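
    To make that prevalence argument concrete, here is a toy sketch in Python. The fp_impact function, the log weighting, and the prevalence counts are all hypothetical illustrations, not any vendor's actual scoring:

    Code:
    import math

    # Toy model (hypothetical): weight each false positive by the prevalence
    # of the affected file, e.g. its occurrence count in a cloud reputation
    # system. The log10 weighting is an illustrative assumption only.
    def fp_impact(prevalence_counts):
        return sum(math.log10(1 + count) for count in prevalence_counts)

    # One FP on a file seen by a million users vs. ten FPs on files
    # with zero occurrences in the cloud:
    print(fp_impact([1_000_000]))  # ~6.0 -- one widespread FP hurts a lot
    print(fp_impact([0] * 10))     # 0.0  -- FPs on files nobody has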
     
  19. Osaban

    Osaban Registered Member

    Joined:
    Apr 11, 2005
    Posts:
    5,616
    Location:
    Milan and Seoul
    I only use Avira on-demand lately, so this test is very important for me. As somebody has already mentioned, Avira is always a good performer, although I am a bit disappointed to see its detection rate beaten by 7 competitors (see the graph of missed samples). I for one would prefer an Avira with a few more FPs and the best detection among the single-engine AVs over the current situation. Avast has done really well, which shows that free can be a real alternative to paid software.
     
  20. J_L

    J_L Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    8,738
    Interesting as always, but I'm far more interested in the Retrospective Tests.
     
  21. firzen771

    firzen771 Registered Member

    Joined:
    Oct 29, 2007
    Posts:
    4,815
    Location:
    Canada
    I'm not. I'd say the retrospective is the least important of all the tests, because at least in this test you see what the detection rates are when doing an on-demand scan on your computer, and with the real-world test they do you see the real-time detection ability. The retrospective creates a situation that no PC user will ever be in: nobody runs ancient signatures with their AV relying on just its heuristics, and so on.
     
  22. sg09

    sg09 Registered Member

    Joined:
    Jul 11, 2009
    Posts:
    2,811
    Location:
    Kolkata, India
    I have already submitted them...:)
     
  23. J_L

    J_L Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    8,738
    My mistake, I meant the Dynamic Tests. I just wish they'd test the free versions as well, instead of only the suites.
     
  24. sg09

    sg09 Registered Member

    Joined:
    Jul 11, 2009
    Posts:
    2,811
    Location:
    Kolkata, India
    Oh! Yeah! I forgot... :p
     
  25. firzen771

    firzen771 Registered Member

    Joined:
    Oct 29, 2007
    Posts:
    4,815
    Location:
    Canada
    Oh OK, yeah, I agree, the dynamic tests should be more interesting to see.
     