AV-Comparatives 2011 dynamic tests

Discussion in 'other anti-virus software' started by InfinityAz, Apr 22, 2011.

Thread Status:
Not open for further replies.
  1. Osaban

    Osaban Registered Member

    Joined:
    Apr 11, 2005
    Posts:
    5,618
    Location:
    Milan and Seoul
    There's no doubt that version 10 has been marred by several serious problems from the beginning, and it took a long time to rectify those issues. I suppose all companies have their ups and downs; I'm confident they'll get their act together eventually.

    I've just read the introduction to the dynamic test ('Settings', page 5): it says that every security suite is tested with its default (out-of-the-box) settings. This sounds fair as a testing approach, but I know Avira performs very well with heuristics set to high, which isn't a default setting, whereas some competitors ship with their settings maxed out by default. I'm not saying this is why Avira had a lousy performance, but perhaps the detection rate could have been a little better.
     
  2. Stefan Kurtzhals

    Stefan Kurtzhals AV Expert

    Joined:
    Sep 30, 2003
    Posts:
    702
    No, setting the heuristics to a higher level won't get a much better result for Avira. I haven't been writing heuristic detections for a long while now; instead I release the detection rules directly as generics. The samples' lifetime is just too short; it doesn't make any sense to release them as heuristics.

    The Avira behaviour blocker is far from the detection level I would like it to have. I am now doing the redesign for version 2.0 of it and also helping out with the Avira Cloud design. But this also means I have even less time for generic detections. :doubt:

    The results are no surprise to me; I predicted them at the end of 2009 and gave recommendations on what was necessary.
     
  3. sg09

    sg09 Registered Member

    Joined:
    Jul 11, 2009
    Posts:
    2,811
    Location:
    Kolkata, India
    Symantec failed the dynamic test?!o_O Have you read the report properly? It's among the toppers...
    All credit goes to their web filter...:)
     
  4. King Grub

    King Grub Registered Member

    Joined:
    Sep 12, 2006
    Posts:
    818
    Symantec was at the top of the dynamic tests if you only look at prevention. What brought them down was false positives.
     
    Last edited: Aug 22, 2011
  5. ams963

    ams963 Registered Member

    Joined:
    May 3, 2011
    Posts:
    6,039
    Location:
    Parallel Universe
    Exactly.

    Good for Avira ;)
     
  6. raven211

    raven211 Registered Member

    Joined:
    May 4, 2005
    Posts:
    2,567
    Wow! I'm surprised and impressed by Trend Micro - I will take it for a spin. :D
     
  7. toxinon12345

    toxinon12345 Registered Member

    Joined:
    Sep 8, 2010
    Posts:
    1,200
    Location:
    Managua, Nicaragua
    They are Advanced, not at the top. :(
     
  8. Matthijs5nl

    Matthijs5nl Guest

    Trend Micro really should make a free DNS service out of their great URL-blocking capabilities. That DNS service would easily outperform the old ClearCloud DNS and Norton DNS. I am sorry for Trend Micro, but the rest of their product is plain ****. However, I am also tempted to try it out again. The last time I had a look at Trend Micro was when the 2011 version entered its beta phase, let's say 15 months ago.

    Also, it is really unbelievable that McAfee provides such poor protection, in every test. If you look at the budget and human resources available, they are one of the top three companies, together with Trend Micro and Symantec. It is simply pathetic.
     
  9. sg09

    sg09 Registered Member

    Joined:
    Jul 11, 2009
    Posts:
    2,811
    Location:
    Kolkata, India
    :D I see...;)
     
  10. sg09

    sg09 Registered Member

    Joined:
    Jul 11, 2009
    Posts:
    2,811
    Location:
    Kolkata, India
    :thumb: Same thinking here. But then who would buy their AV? Even if they made that DNS service paid, I would be ready to buy it....:argh: TM DNS coupled with AVs like Symantec/F-Secure, or free ones like Avira/MSE, would be great.
     
  11. toxinon12345

    toxinon12345 Registered Member

    Joined:
    Sep 8, 2010
    Posts:
    1,200
    Location:
    Managua, Nicaragua
    Interesting... you have edited my original post ;)
     
  12. dschrader

    dschrader AV Expert

    Joined:
    Mar 10, 2009
    Posts:
    54
    Re: AV-Comparatives 2011 dynamic tests - some thoughts

    It looks like these tests are of files from malicious websites identified by AV-Comparatives' web crawler.

    This won't necessarily say very much about how well a product will detect threats from USB devices, email, file shares, and so on. Products that rely on URL blacklists might do very well on this test but perform less well at detecting non-web-based threats. This might seem like a minor point - after all, most malware is web-based. However, there are some notable exceptions - such as Stuxnet, or every email worm you have ever heard of.

    That's not to say these tests aren't valuable - they are a big improvement over reactive static file testing. However, there is still a need for a more comprehensive set of tests.

    I would like to know how AV-Comparatives' web crawler identifies malicious sites.
     
  13. cruelsister

    cruelsister Registered Member

    Joined:
    Nov 6, 2007
    Posts:
    1,649
    Location:
    Paris
    Totally agree with you. A product that relies on URL filtering may not protect against threats in the forms you listed. Also, someone with stolen FTP credentials may replace a legitimate file on an otherwise trusted website with a malicious one; URL filtering wouldn't be of much value then.
     
  14. luciddream

    luciddream Registered Member

    Joined:
    Mar 22, 2007
    Posts:
    2,545
    It looks to me like what hurts Avira in this test is that they have no "user dependent" results to help their score out, as they lack a sandbox or HIPS, and this test evaluates entire suites, not just AVs. Avira's strength is as an AV only, not as an entire product. Heck, CIS would probably have waxed Avira in this test too, even though its AV isn't very good, because of its HIPS. I'll bet there would have been a lot of yellow on their graph.

    And let's face it, people who are serious about "user dependent" type programs are probably using Sandboxie and/or a HIPS. I'm only interested in tests that test AVs as AVs. I wonder: taking my Comodo HIPS and Sandboxie into account, how many of those samples "missed" by Avira would have been user dependent for me instead? Probably almost all of them.

    I apologize if I'm not interpreting the data or the methods correctly, but that's what I get out of it. I still think it's a great product, and very light on my system. I'm thinking about ditching it, but it certainly isn't on account of this test. It's because I want my system as light as possible while still feeling safe, and I feel safe with Sandboxie auto-deleting the box after every session.
     
  15. toxinon12345

    toxinon12345 Registered Member

    Joined:
    Sep 8, 2010
    Posts:
    1,200
    Location:
    Managua, Nicaragua
    Re: AV-Comparatives 2011 dynamic tests - some thoughts

    I agree completely with you.

    Another thing is that threats detected before/at runtime should be detectable on demand too, similar to the Virus Bulletin methodology.
     
  16. Stefan Kurtzhals

    Stefan Kurtzhals AV Expert

    Joined:
    Sep 30, 2003
    Posts:
    702
    Ideally that would be true, but some malware families are just much easier to catch with some detection/protection technologies and very difficult with others. The combination of all possible technologies is what you want, and if you can make use of all the available data from all the various technologies, you will have an excellent level of protection.
    The goal is to have either proactive protection (detecting even new, unknown malware before it can be activated) or a reaction time across all your AV systems and components of just a couple of minutes, if not ideally seconds.

    But then, detection just creates itself, all on its own, no? The false positives are the real hard work... :rolleyes:
     
  17. toxinon12345

    toxinon12345 Registered Member

    Joined:
    Sep 8, 2010
    Posts:
    1,200
    Location:
    Managua, Nicaragua
    If the on-demand scanner is unable to detect such threats, how can you make sure that your USB key is malware-free?

    You could end up infecting other users who aren't using such on-execution features.
     
  18. dschrader

    dschrader AV Expert

    Joined:
    Mar 10, 2009
    Posts:
    54
    Re: AV-Comparatives 2011 dynamic tests and Symantec FPs

    Symantec's FP count seemed large because of the definition Andreas used: he counted any warning that a file might be unsafe as an FP.
    This works against how our Insight technology operates.

    Essentially, Insight is based on the concept that the prevalence of a file - how often it appears across the internet - tells you something about its risk. After all, only malware mutates. Besides, do you really want to be the first person to run some program you found posted on some random website?

    The Insight network tracks nearly every program file on the internet. It assigns a security rating to each file, based on what our scan engine found and on the file's prevalence and associations.

    So Insight warns the user not only when a file is potentially malicious, but also when it is brand new or extremely uncommon. Andreas's test set included a number of unsigned, unique (but not malicious) programs - each of which Insight warned the user about, and each of which was counted as half an FP. So Symantec had the highest detection rates, but also a lot of warnings to users and a poor FP score.
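
    To make the idea concrete, here is a minimal sketch of prevalence-based reputation scoring. The names and thresholds are invented for illustration; this is not Symantec's actual implementation:

        # Hypothetical sketch of prevalence-based reputation (illustration only;
        # names and thresholds are invented, not Symantec's).
        def reputation_verdict(prevalence, age_days, scanner_flagged):
            """Classify a file the way a reputation cloud might."""
            if scanner_flagged:
                return "malicious"   # classic engine detection
            if prevalence >= 10_000 and age_days >= 30:
                return "trusted"     # widely seen, stable file
            if prevalence < 10 or age_days < 2:
                return "unproven"    # brand new or extremely rare: warn the user
            return "unknown"

        # A clean but unique, unsigned tool lands in "unproven" and triggers a
        # warning - exactly the kind of case the test counted as half an FP.
        print(reputation_verdict(prevalence=1, age_days=0, scanner_flagged=False))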
     
  19. Hungry Man

    Hungry Man Registered Member

    Joined:
    May 11, 2011
    Posts:
    9,146
    Re: AV-Comparatives 2011 dynamic tests and Symantec FPs

    How do companies manage this?
     
  20. lodore

    lodore Registered Member

    Joined:
    Jun 22, 2006
    Posts:
    9,065
    Re: AV-Comparatives 2011 dynamic tests and Symantec FPs

    Hello dschrader,
    File Insight seems like a good idea, but many files are incorrectly flagged.
    The other big issue is that Norton will quarantine the files without asking.

    A typical user may ignore smaller companies' products simply because Norton File Insight says the file isn't very well known and can't verify its safety. The other likely outcome is that users will get annoyed that a lot of the software they want to use keeps getting quarantined by Norton, decide "this product sucks", and turn off File Insight because all it does is block everything.
     
  21. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    Only 7 cases were Download Insight warnings, and those were counted as 0.5 each.
    24 cases were considered threats and blocked without asking the user. Several of the files can be found on download.cnet.com too. As the downloads from the author's website are all blocked and considered threats, such downloads will also disappear from some download portals which rely on Symantec's verdict, and therefore Symantec's prevalence figures will be lower (while other vendors may see a higher prevalence). The author may not know why download portals refuse to host his clean applications, and may have a hard time telling users that his program is clean and that the AV is wrong to block the downloads from his website.
    The Insight network clearly does not track every program file on the internet, as other clouds have higher numbers and see files more often. What you can state, and what can be considered true, is that Symantec's Insight network tracks nearly all executables used by Norton community users / Symantec's user base.
    A clean file which is blocked/treated as malicious (as counted/seen in the protection part) is considered wrongly blocked. It does not matter how your technology works; after all, the industry asked for a technology-independent test.
    All other comments can be read in the report. I just jumped in because I considered the statement inexact.
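
    Putting those counts together, the FP tally works out as follows (my own arithmetic from the numbers above, using the 0.5 weighting for Download Insight warnings; see the report for the official figure):

        # FP tally from the counts above: 24 outright blocks of clean files
        # count fully, 7 Download Insight warnings count 0.5 each.
        full_blocks = 24
        insight_warnings = 7
        fp_score = full_blocks * 1.0 + insight_warnings * 0.5
        print(fp_score)  # 27.5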
     
  22. d0t

    d0t Registered Member

    Joined:
    Apr 23, 2011
    Posts:
    181
    I totally agree with you. It's a very good AV (y)
     
  23. acr1965

    acr1965 Registered Member

    Joined:
    Oct 12, 2006
    Posts:
    4,995
    Is there somewhere to find out how many cnet files there were, and any other details?
     
  24. Hippocrates

    Hippocrates Registered Member

    Joined:
    Dec 21, 2008
    Posts:
    12
    Re: AV-Comparatives 2011 dynamic tests and Symantec FPs

    Hi dschrader, to a user who's not tech-savvy, Norton Insight simply doesn't help much, because it doesn't tell the user what should be done. I'm not sure if it's technologically feasible, but it would be great if Norton could ask the user to re-execute the file after 10 minutes (for example), while the file's signature is sent to Norton to be analyzed thoroughly (something like Norton Power Eraser). I'm sure most, if not all, users wouldn't mind a 10- or even 20-minute wait before re-running the application. This would certainly reduce the FPs, and it would ensure the safety of the system in the event that the executable really is malware.
     
  25. qakbot

    qakbot Registered Member

    Joined:
    Aug 25, 2010
    Posts:
    380
    *I think these tests you and AV-Test are doing are becoming a joke. Here are a few reasons why:

    Firstly, you come up with these mindless scoring schemes - I mean, WHY IN THE WORLD is 0.5 used as the penalty for an FP? Why not 0.75, or 2.3, or some other random number? How does 0.5 quantify the impact that an FP is going to have in the real world? It could affect one user, or it could affect an entire country, or worse. You just don't know. 0.5!!!

    Secondly, and this is a BIG ONE: you, as one of a handful of testing houses in the world, don't actually pick your test samples by analyzing them independently to determine whether they are legitimate or malicious, do you? No, you don't. If a vendor misses a sample, you let them know. If they contest that the sample is not really malicious, you THROW IT OUT of the test set and pick a replacement!! What kind of a test is that? It's like throwing a bunch of results at all the AV vendors and seeing what sticks. So why would I believe you when you say something is clean and was an FP by Symantec or Kaspersky or anybody else, when you don't analyze any of the samples yourself? Were any of these FPs on executables that were signed by their vendors, so you could tell with 100% certainty that they are clean? If they were, please let us know.

    Lastly, the scoring system used to decide how many points to give for an automatic block versus a user-dependent block is, again, COMPLETELY random. You have no idea what percentage of users out there would choose BLOCK or ALLOW on an alert, so you pick something at random.
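
    For reference, the half-credit weighting being complained about works roughly like this (a sketch of the scheme discussed in this thread; the sample counts are made up, and the exact formula is in the report):

        # Half-credit protection scoring as discussed in this thread.
        # Weights: 1.0 for automatic blocks, 0.5 for user-dependent prompts
        # (which implicitly assumes 50% of users would choose BLOCK).
        blocked = 280        # samples stopped automatically
        user_dependent = 20  # samples stopped only if the user picks BLOCK
        missed = 0
        total = blocked + user_dependent + missed
        protection = (blocked * 1.0 + user_dependent * 0.5) / total
        print(f"{protection:.1%}")  # 96.7%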


    All these quirks contribute to these rankings, which have a real impact on real companies and real people, and yet there is little scientific basis for the testing methodology.

    Very sad.
     
    Last edited by a moderator: Aug 26, 2011