AV-Comparatives Whole Product Dynamic Tests updated

Discussion in 'other anti-virus software' started by King Grub, Apr 16, 2012.

Thread Status:
Not open for further replies.
  1. Brandonn2010

    Brandonn2010 Registered Member

    Joined:
    Jan 10, 2011
    Posts:
    1,854
    Ouch, Avast! Its detection was better than Avira's and AVG's, but it got bumped to Standard because of user-dependent cases :doubt: Yet Bitdefender, which always scores near the top in detection, had no user-dependent cases and no FPs. Might switch to Roboscan.
     
  2. Technical

    Technical Registered Member

    Joined:
    Oct 12, 2003
    Posts:
    471
    Location:
    Brazil
    avast! performed very well against 0-day malware.
     
  3. zfactor

    zfactor Registered Member

    Joined:
    Mar 10, 2005
    Posts:
    6,102
    Location:
    on my zx10-r
    Agreed, tech, but... avast seems to be getting a bit too interactive. I get a lot of clients calling at times asking something like, "I have this pop-up saying there was not enough info, should I allow this to run or not?" And this is with many common programs like WinRAR and many others. I really think they should take a step back for just a minute and look at what made them the number one choice. Adding too much is as bad as not enough.

    I also agree Roboscan does very well, but be aware there is no web scanner and it can also slow down boot times a LOT. Otherwise it's VERY light. Probably my favorite free one out right now.
     
  4. a256886572008

    a256886572008 Registered Member

    Joined:
    Oct 26, 2007
    Posts:
    103
    Can the new version of the avast! 7 sandbox automatically isolate malware executed by the Blackhole exploit kit?
     
  5. RejZoR

    RejZoR Lurker

    Joined:
    May 31, 2004
    Posts:
    6,426
    This has changed with the latest build. If the analysis doesn't find anything, it will exclude the file from future autosandboxing. I complained that this wasn't a good idea, but they insisted it was. It's more user-friendly, but it isn't nearly as secure as it was before, interactivity or not...
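
    A rough sketch of the behaviour being described, under my reading of it (this is my own illustration, not avast!'s actual code or API): once a sandboxed analysis comes back clean, the file is remembered as excluded and never auto-sandboxed again.

        # Hypothetical model of the new autosandbox behaviour described above.
        autosandbox_exclusions: set[str] = set()

        def should_autosandbox(file_hash: str) -> bool:
            # A file is only auto-sandboxed if it has never been analysed clean before.
            return file_hash not in autosandbox_exclusions

        def on_analysis_complete(file_hash: str, found_malware: bool) -> None:
            # A clean verdict permanently excludes the file from future sandboxing,
            # which is exactly the part I think weakens the protection.
            if not found_malware:
                autosandbox_exclusions.add(file_hash)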
     
  6. Technical

    Technical Registered Member

    Joined:
    Oct 12, 2003
    Posts:
    471
    Location:
    Brazil
    The autosandbox has been turned into a very advanced scanning feature.

     
  7. Technical

    Technical Registered Member

    Joined:
    Oct 12, 2003
    Posts:
    471
    Location:
    Brazil
    Sorry, I'm not technically skilled enough to answer that.
     
  8. zfactor

    zfactor Registered Member

    Joined:
    Mar 10, 2005
    Posts:
    6,102
    Location:
    on my zx10-r
    Thanks for the updated info, good to know. Hopefully they will make it even more user-friendly, as it was back in v5, which I loved.
     
  9. toxinon12345

    toxinon12345 Registered Member

    Joined:
    Sep 8, 2010
    Posts:
    1,200
    Location:
    Managua, Nicaragua
    28 false alarms!
    Disappointing results for avast. They need to improve their quality assurance.
     
  10. Brandonn2010

    Brandonn2010 Registered Member

    Joined:
    Jan 10, 2011
    Posts:
    1,854
    Actually, those were user-dependent cases, not false positives.
     
  11. Dru

    Dru Registered Member

    Joined:
    May 3, 2012
    Posts:
    15
    They have had some FP problems recently though; I've had far more recently than I ever have in the past, especially as far as games go. Steam was broken twice due to FPs, Crysis was detected as a trojan, and PunkBuster and a few others were flagged too.
     
  12. itman

    itman Registered Member

    Joined:
    Jun 22, 2010
    Posts:
    8,592
    Location:
    U.S.A.
    BIS, NIS, and KIS all have to be "tweaked" for optimization from the default settings. Once done, all perform well on most PCs.

    "No pain, no gain" folks. Read the user manual at least three times to fully utilize your investment.

    -Edit- Also make sure you thoroughly read the prerequisites for the AV-Comparatives tests. Many vendors request that their products be run at their "max" security settings. If you expect to receive the same results from your respective security product, ensure you do the same. Also be prepared for the end result of doing so: slow performance on many PCs.
     
    Last edited: Jul 31, 2012
  13. toxinon12345

    toxinon12345 Registered Member

    Joined:
    Sep 8, 2010
    Posts:
    1,200
    Location:
    Managua, Nicaragua
    Please see the <wrongly blocked section>
    You'll see 28 false alarms for avast

    That is more false alarms than the average (7), which is why avast only got a Standard award.
     
    Last edited: Aug 1, 2012
  14. vlk

    vlk AV Expert

    Joined:
    Dec 26, 2002
    Posts:
    621
    No, that's not true.
    They were all user-dependent (and pretty much all of them were from FileRep, just informing the user that the file he's just downloaded is very rare and it's recommended to treat it with caution).
     
  15. toxinon12345

    toxinon12345 Registered Member

    Joined:
    Sep 8, 2010
    Posts:
    1,200
    Location:
    Managua, Nicaragua
    28 user-dependent threats blocked + 28 user-dependent false alarms to be more exact.
     
    Last edited: Aug 2, 2012
  16. itman

    itman Registered Member

    Joined:
    Jun 22, 2010
    Posts:
    8,592
    Location:
    U.S.A.
    My complaint about AV-Comparatives testing is that it's misleading.

    In this latest proactive test detail document, page 5, it states "Avast, Avira, and Kaspersky all requested that their products be tested at high/advanced settings."

    First, no vendor should be able to dictate test criteria. Second, none of these products install with these high/advanced settings enabled. Why? Because their customers' PCs would slow to a crawl. Using Kaspersky IS 2012 as an example, its "optimal" default heuristics setting is low for the File AV scanner and medium for the web scanner.

    Since most retail users run with default settings, the AV-Comparatives tests are misleading in my opinion.
     
  17. roger_m

    roger_m Registered Member

    Joined:
    Jan 25, 2009
    Posts:
    8,627
    I agree with you 100%. They should all be tested with the default settings; the vast majority of users will never change these settings, making the test results from AV-Comparatives invalid. It would be good to see them tested with the default settings as well, and it would be interesting to see the difference in results.
     
  18. toxinon12345

    toxinon12345 Registered Member

    Joined:
    Sep 8, 2010
    Posts:
    1,200
    Location:
    Managua, Nicaragua
    This statement is incorrect, as vendors don't know the malware test set; they are asked before the test starts. Any change to heuristic settings could increase their detection rate, but also their false alarm rate :D
    Some products have maximum settings enabled by default, e.g. ESET uses advanced heuristics when writing to the filesystem and when downloading files.
     
  19. roger_m

    roger_m Registered Member

    Joined:
    Jan 25, 2009
    Posts:
    8,627
    Yes, that's certainly true. But what if the detection rate increases with no extra false positives? Then the software will perform better in the test than it will for regular users who do not change the default scan settings, thereby giving an unrealistic test result.
     
  20. toxinon12345

    toxinon12345 Registered Member

    Joined:
    Sep 8, 2010
    Posts:
    1,200
    Location:
    Managua, Nicaragua
    It's up to the vendor whether they want to assume the risk.
     
  21. roger_m

    roger_m Registered Member

    Joined:
    Jan 25, 2009
    Posts:
    8,627
    I think you are missing my point. If the higher levels don't give any extra false positives, but do lead to extra detections, then the AV product is performing better for the testers than for the majority of users who don't change the default scan settings. So an AV product may outperform a competitor in testing, but with the default settings the other product may do better.

    I've got no problem with products being tested with non-default settings, but in order to make the testing fair, they should be tested using the default settings as well.
     
  22. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    This thread is about the Whole Product Dynamic Real-World Protection Test. In this test, all products are tested with out-of-the-box settings.

    In the other test you refer to, the products would in any case either switch automatically to higher settings after a detection, or use higher settings (in some cases even higher than the highest settings configurable inside the product interface) when downloading/executing files.
     
  23. vlk

    vlk AV Expert

    Joined:
    Dec 26, 2002
    Posts:
    621
    That's what I am saying. Zero "normal" FPs + 28 "user-dependent" FPs, which in the case of Avast meant FileRep on-download warnings about the file being too rare or too new.

    Andreas counted the "user-dependent" FPs as half points so the total FP score was 14.
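
    For anyone following the arithmetic, here is a minimal sketch of that weighting in Python (my own illustration, not AV-Comparatives' actual tooling): hard false alarms count as 1 point each and user-dependent prompts as 0.5, which is how 0 normal FPs plus 28 user-dependent warnings end up as an FP score of 14.

        # Hypothetical scoring helper; the half-point weighting for
        # user-dependent cases is the only detail taken from the discussion above.
        def weighted_fp_score(normal_fps: int, user_dependent_fps: int) -> float:
            """Full false alarms count as 1.0, user-dependent prompts as 0.5."""
            return normal_fps * 1.0 + user_dependent_fps * 0.5

        print(weighted_fp_score(0, 28))  # -> 14.0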

    Vlk
     
  24. itman

    itman Registered Member

    Joined:
    Jun 22, 2010
    Posts:
    8,592
    Location:
    U.S.A.
    Yes, I was referring to the heuristics testing AV-Comparatives performs.

    I assume this would be the case, but without observing the internal processing of the product there is no way of knowing this actually occurs.

    The question is: does AV-Comparatives anonymously download the software from the vendors' or third-party web sites, or are they using software provided directly by the vendors? If the latter, there is always the possibility the software has been "tweaked."
     
  25. RejZoR

    RejZoR Lurker

    Joined:
    May 31, 2004
    Posts:
    6,426
    It was well known that avast! switched to its highest settings if it detected one piece of malware. I'm sure others have told the testers about tricks like this. It's not exactly top-secret stuff that compromises anything if you tell the public. But in general, most users don't really care, as long as it works as it should.
     