AV-Comparatives has released the newest report.

Discussion in 'other anti-virus software' started by trjam, Mar 19, 2010.

Thread Status:
Not open for further replies.
  1. xorrior

    xorrior Registered Member

    Joined:
    Mar 22, 2010
    Posts:
    66
    I've watched these benchmarks over the years. For the most part their tests are useless because they are largely generic. You can write an injector or backdoor in something easy like BASIC and it will pass through most of these engines even on their best possible configurations, because it's hand-written rather than the generated or copied-and-compiled code that most of these test sets consist of. The only exception is if the compiler builds the PE the way it built another program with identical flow.

    Only now is behavioral analysis being implemented, but there is still no memory protection, and checking API calls is useless if the import table is obfuscated... current behavioral analysis like Avira's dumps the easy-to-hide imports from a PE along with two other easy-to-hide sections.
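
    To illustrate the point about obfuscated import tables, here is a minimal sketch of the kind of static import check such engines rely on, written in Python with the third-party pefile library; the file name and the list of "suspicious" API names are invented for the example. Malware that resolves its APIs at runtime via LoadLibrary/GetProcAddress leaves nothing in the static import table for a check like this to find.

        import pefile

        # APIs often flagged by naive static checks
        # (illustrative list only, not any vendor's actual rule set)
        SUSPICIOUS_APIS = {b"WriteProcessMemory", b"CreateRemoteThread",
                           b"VirtualAllocEx", b"SetWindowsHookExA"}

        pe = pefile.PE("sample.exe")  # hypothetical file name

        hits = []
        for entry in getattr(pe, "DIRECTORY_ENTRY_IMPORT", []):
            for imp in entry.imports:
                if imp.name in SUSPICIOUS_APIS:
                    hits.append((entry.dll.decode(), imp.name.decode()))

        # A sample that resolves APIs with GetProcAddress at runtime leaves
        # DIRECTORY_ENTRY_IMPORT empty or harmless, so 'hits' stays empty.
        print("suspicious imports:", hits or "none found")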

    Just my two cents
     
    Last edited: Mar 23, 2010
  2. andyman35

    andyman35 Registered Member

    Joined:
    Nov 2, 2007
    Posts:
    2,336
    Perfectly valid points, however the likelihood of the vast majority of average users encountering this type of niche malware is extremely minimal. Those individuals who operate in the murky depths and may get targeted in such a way are the users who will employ HIPS, sandboxing/virtualisation, etc. to mitigate against said malware.
     
  3. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    Vik and Stefan, thank you for the replies. I was aware, of course, that anti-virus products employ an emulator as one technique when evaluating the trustworthiness of an executable file. I was not aware, however, that the technique was employed during an on-demand scan. I mistakenly understood that it was done only “on-execution,” before authorizing the file to run.

    Here is some additional information that might be of interest to readers of this thread....

     
  4. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    If the question to be answered is “What product provides the best detection of malware?”, then an on-demand detection test seems to provide the most relevant information.

    If, however, the question to be answered is “What product provides the best protection against malware?”, then a whole-product dynamic test seems to provide the most relevant information.

    Different objectives merit different test procedures. Using an on-demand test to infer overall protection against malware seems as silly as using a whole-product dynamic test to infer on-demand detection performance.

    The key difficulty, in my opinion, is that casual users might interpret the AV-Comparatives on-demand tests as being indicative of real-world malware protection, which would be an incorrect inference as AV-Comparatives itself notes:

    The Whole-Product-Dynamic Test is not a simple “detection” test as usual, it is more a “protection/prevention” test. The test mimics malware reaching and executing on a user’s machine, as it happens in the real world.... This means that not only the signatures, heuristics and in-the-cloud detection are evaluated, but URL-blockers, Web reputation services, exploit-shields, in-the-cloud heuristics, HIPS and behavioral detection are also considered.... What matters is that the products provide reliable protection to the user....
     
  5. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    If casual users actually READ the reports, they would understand which conclusions can be drawn and which cannot. Each report clearly states its limitations and what was tested.
     
  6. zx81

    zx81 Registered Member

    Joined:
    Jan 25, 2010
    Posts:
    58
    I'm a casual user who just stumbled onto this thread. It's timely for me as my Kaspersky subscription runs out in a couple of weeks, and I wanted to check it against the current competition.

    Thanks for the links and the high-level discussion. I'm starting a trial of GData Internet Security on one machine and will trial Avira on another.

    I have had most of the main vendors at one time or another. I'm not dissatisfied with Kaspersky, which I have had for 2 years; I just want the best, or a change. Plus, I got annoyed that the renewal offer for a subscription costs more than what you can get it for on Amazon - that makes no sense to me; if it were discounted for loyalty I'd have taken the path of least resistance. Nice to read that there are a lot of competitive products out there.

    I remember when I had Trend Micro and the mouse pointer was suddenly taken over while I was using public internet on my laptop; that's what got me onto Kaspersky, with its software firewall, that very day.
     
    Last edited: Mar 23, 2010
  7. NAMOR

    NAMOR Registered Member

    Joined:
    May 19, 2004
    Posts:
    1,530
    Location:
    St. Louis, MO
    You should just get rid of the % tables and bury the results in paragraphs to force people to read the PDF and not just look at the %'s. "Avira scored ninety-nine point six percent in detection of Windows viruses..." LOL
     
  8. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
  9. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    Yes, I agree. Yet, it’s not just “casual users” who are making this error -- witness some of the discussions that have occurred on this forum, unfortunately.

    To reiterate, knowing that the malware detection rate for Product A is better than that of Product B doesn’t necessarily support the inference that Product A provides better protection against malware than Product B. As reported in post #130 in this thread, the rank-order correlation between the On-Demand Detection of Malicious Software (February, 2010) test and the Whole Product Dynamic Test (December, 2009) for the 15 products common to both is low: r = 0.31. That coefficient is not statistically different from zero (p = 0.27, df = 13), meaning that the ordinal association between "detection" and "protection" is very weak.
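
    For readers who want to check the arithmetic, here is a minimal sketch of both calculations in Python with SciPy. The two score lists are placeholders showing where the per-product percentages from the two reports would go; the second half recomputes the p-value directly from the r = 0.31 and n = 15 figures quoted above.

        from math import sqrt
        from scipy import stats

        # Placeholder vectors: one entry per product common to both tests,
        # to be filled with the actual percentages from the two reports.
        detection_scores  = [99.6, 98.2, 97.5, 96.8]   # hypothetical values
        protection_scores = [97.0, 99.0, 95.5, 98.0]   # hypothetical values

        rho, p_value = stats.spearmanr(detection_scores, protection_scores)
        print(f"rank-order correlation r = {rho:.2f}, p = {p_value:.2f}")

        # Significance of the quoted r = 0.31 with n = 15 products (df = n - 2 = 13):
        r, n = 0.31, 15
        t = r * sqrt(n - 2) / sqrt(1 - r**2)
        p = 2 * stats.t.sf(t, df=n - 2)
        print(f"t = {t:.2f}, two-sided p = {p:.2f}")  # ~0.26, consistent with the quoted p = 0.27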
     
  10. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    @Pleonasm: I think there were mainly two things which contributed to those results in the dynamic test:
    - the URL filter
    - the detection provided by the on-access scanner
    In a few cases the HIPS and behaviour blocker also came into play (as the last line of defence).
    The whole-product dynamic test is done with out-of-the-box settings.
    If you look at those results you will see that the products that were not good in the detection tests were also not that good in the dynamic test. On the other hand, some products which were good ended up very good thanks to additional things like URL filters etc.
    Anyway, let's wait for the next whole-product dynamic tests and see what they show over a period of time and with a higher number of samples.
     
  11. InfinityAz

    InfinityAz Registered Member

    Joined:
    Jul 23, 2005
    Posts:
    828
    Location:
    Arizona
    Interesting observations. Any timetable as to when we can expect the next dynamic test and/or the frequency of dynamic tests?

    TIA
     
  12. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    We do not announce publication dates publicly - reports are released when they are done (otherwise people will again start asking/mailing whether it will be released today, tomorrow or whenever). Just be patient.
     
  13. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    ...sounds like sage advice. :)
     
  14. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    Now who would ever do such a thing? :blink:
     
  15. Zyrtec

    Zyrtec Registered Member

    Joined:
    Mar 4, 2008
    Posts:
    534
    Location:
    USA
    Hello,

    I've been reading the results over and over, and I have noticed that the posters here are so focused on the overall results for each AV that they are missing a category where nearly all of the AVs tested scored very low. It's a category that should be taken seriously, because the majority of Trojans/fake AVs are using it to infect computers.


    Script Malware

    Avira = 91.6%
    Avast = 95.5%
    AVG = 60.0%
    BitDefender = 89.1%
    eScan = 89.1%
    F-Secure = 89.2%
    G-Data = 99.7%
    K7 = 42.4%
    Kaspersky = 90.9%
    Kingsoft = 17.6%
    McAfee = 66.9%
    NOD32 = 82.9%
    Norman = 63.7%
    Norton = 89.9%
    Panda = 40.8%
    MSE = 77.2%
    Sophos = 66.1%
    SpywareDoctor = 90.0%
    Trend Micro = 76.7%
    TrustPort = 91.0%

    I think that kind of malware is the one we should be most concerned about, since it interacts with your browser [unless you are using Firefox + NoScript] and delivers malware embedded in malicious scripts on compromised web sites, which in turn redirect your browser to shady web sites that download Trojans and rogue AVs.

    So it looks like everyone here is focused only on the general results and is overlooking this row in the .pdf report, right?


    Carlos
     
  16. Fly

    Fly Registered Member

    Joined:
    Nov 1, 2007
    Posts:
    2,201
    I usually don't quote myself!

    But: how was Avira tested? Were all the extended threat categories included?
    And: by default, Avira's (suite) on-demand scan heuristics are at MEDIUM, even when you have set the heuristics to high instead of medium. I know there is a way around that, but it's far from intuitive.

    So, how was Avira tested?
     
  17. Thankful

    Thankful Savings Monitor

    Joined:
    Feb 28, 2005
    Posts:
    6,564
    Location:
    New York City
    Not everyone overlooked it.


     
  18. Leo2005

    Leo2005 Registered Member

    Joined:
    May 31, 2007
    Posts:
    179
    Location:
    Braunschweig (Germany)
    Reading the report will answer your question.
     
  19. Kees1958

    Kees1958 Registered Member

    Joined:
    Jul 8, 2006
    Posts:
    5,857
    Well, at least it is the malware with the lowest threshold to cross. Good point though; it made me realise something :thumb:
     
  20. vojta

    vojta Registered Member

    Joined:
    Feb 26, 2010
    Posts:
    830
    About scripts, from Panda (post 93 in this thread):

    Other companies have their specific modules too, like Norton's IPS and AVG's Linkscanner, for example.
     
  21. zx81

    zx81 Registered Member

    Joined:
    Jan 25, 2010
    Posts:
    58
    Gdata is quite heavy on my system - like treacle on increased security settings - and I had to reinstall Avira twice on the laptop, maybe a conflict with Comodo firewall. I'll try the Avira security suite, as I found the separate firewall a pain; the AV would not start on system restart.

    Gdata found viruses in years-old archived .pst files which Kaspersky had not.

    Now running Gdata on the laptop too, to see if the speed is okay with default settings this time. Fun, fun - maybe I'll even go back to the Kaspersky suite after all this; we'll see how they do.
     
  22. xorrior

    xorrior Registered Member

    Joined:
    Mar 22, 2010
    Posts:
    66
    I think some people confuse the static analysis in sandboxing/emulation engines with some form of runtime environmental analysis. Most engines I see analyze the PE structure for patterns. Here is what most can detect:

    -Statically linked encryption
    -Suspicious native API import patterns
    -Known 'stubs' at the top of the code section and in special sections of a PE

    I've yet to see one that actually does some form of tracing to analyze flow. The detours-style hooking used by Sandboxie and BufferZone is, I think, also used in Panda and Norman, but like those products the hooks are detected by malware because they don't hide.
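
    As a concrete example of the first item in that list, here is a minimal sketch of an entropy-based check for statically linked encryption or packed code, again in Python with the third-party pefile library; the file name and the 7.2-bit threshold are invented for illustration. Note that it only inspects the on-disk PE structure - it does no tracing of execution flow at all.

        import pefile

        PACKED_ENTROPY = 7.2  # illustrative threshold; entropy near 8 bits/byte
                              # suggests encrypted or compressed section data

        pe = pefile.PE("sample.exe")  # hypothetical file name

        for section in pe.sections:
            name = section.Name.rstrip(b"\x00").decode(errors="replace")
            entropy = section.get_entropy()  # Shannon entropy, 0.0 - 8.0
            executable = bool(section.Characteristics & 0x20000000)  # IMAGE_SCN_MEM_EXECUTE
            if executable and entropy > PACKED_ENTROPY:
                print(f"{name}: entropy {entropy:.2f} - looks packed/encrypted")

    A check like this flags the packer or crypter stub rather than the real payload, which only appears after the stub decrypts it in memory - which is why the lack of memory protection mentioned earlier in the thread matters.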
     
    Last edited: Mar 24, 2010
  23. BenMar522

    BenMar522 Registered Member

    Joined:
    Mar 12, 2010
    Posts:
    67
    Congrats to Avira for an impressive showing, and with the release of Avira 10.0 there may still be more improvement to come.

    I'm disappointed to see that Vipre was not part of the testing, as I was very interested in seeing how this light AV stacks up against the major players in AV-Comparatives' thorough testing.

    It would be nice if they included how light or heavy the AVs were on the test systems. For me, this, along with real-time detection/blocking rates, is most important. JMHO.

    -BenMar
     
  24. Firefighter

    Firefighter Registered Member

    Joined:
    Oct 28, 2002
    Posts:
    1,670
    Location:
    Finland
    Congrats to Avast. I've been using Avast for about 3 years now and have had no problems here. After I started to play with digital photography, I'm thinking of becoming a Mac user - much better at RAW conversions and no worries about infections. Quite expensive, I have to admit.

    Best regards,
    Firefighter!
     
  25. Get

    Get Registered Member

    Joined:
    Nov 26, 2009
    Posts:
    384
    Location:
    the Netherlands
    Just read it and I am very content with the Avast score. Its lowest is 93.1 and the script result is impressive. In the "lowest score counts" department only Gdata scores better.

    @JoeBlack40: Also "impressed" with your signature. Great line!
     