Conclusion of AV Tester Workshop: Wildlist is CRAP

Discussion in 'other anti-virus software' started by Inspector Clouseau, Jun 4, 2007.

Thread Status:
Not open for further replies.
  1. Inspector Clouseau

    Inspector Clouseau AV Expert

    Joined:
    Apr 2, 2006
    Posts:
    1,329
    Location:
    Maidenhead, UK
    Some more info here: [attached screenshots]

    Last edited: Jun 4, 2007
  2. Sjoeii

    Sjoeii Registered Member

    Joined:
    Aug 26, 2006
    Posts:
    1,240
    Location:
    52°18'51.59"N + 4°56'32.13"O
    Thanks for the info. Nice piece.
     
  3. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
  4. Inspector Clouseau

    Inspector Clouseau AV Expert

    Joined:
    Apr 2, 2006
    Posts:
    1,329
    Location:
    Maidenhead, UK
    :D :D :D Did you recover from the attacks and from Vesselin? :D :D :D
     
  5. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    more or less... :p
     
  6. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    I don't know what to think of this piece.

    I don't like how VB now links to av-test and av-comparatives;

    it seems that if an AV fails one, it now fails all three.

    Is this how it is, or have I misread something here? :rolleyes:

    There was already a farce of an enquiry due to the workshop, and now this piece is from VB. It all seems very... fake, to me.
     
  7. Inspector Clouseau

    Inspector Clouseau AV Expert

    Joined:
    Apr 2, 2006
    Posts:
    1,329
    Location:
    Maidenhead, UK
    Fake? You think I'm posting a FAKE Virus Bulletin page here? Sorry, usually I'm very polite, but I have to ask this now: are you nuts? :eek:
     
  8. FanJ

    FanJ Updates Team

    Joined:
    Feb 9, 2002
    Posts:
    4,660
    Hi Mike,

    May I ask a question (but it could very well be that I am misunderstanding things)?

    Your thread title says "Wildlist is CRAP".

    What exactly is the crap?

    1.
    Is it the collection of samples (samples supposed to be in the wild), called the Wildlist?
    2.
    Or is it the way testing is done on those samples?
     
  9. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    No, I didn't mean that at all, IC; I have no doubt that it's a proper page.

    I mean all these tests seem fake if they are all linking up like 'it sounds'.

    I was very surprised by the VB results, and if they have changed their testing 'thanks to av-test and av-comp' (no offence, IBK), it all just seems fake.

    It now seems that if an AV fails one... it fails all.

    But like I said, IC, I've probably misread something here, or I'm getting a vibe from it that I don't like that probably isn't there (misreading).

    No, I'm not nuts, I don't think so... although my doctor does o_O

    I'm not trying to create arguments here, IC; I just think VB, av-test and av-comp should keep their testing methods to themselves and leave it be, without the influence.

    lol
     
    Last edited: Jun 4, 2007
  10. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    mainly that
     
  11. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    Who creates the so-called wildlists?

    The so-called malware on them is stuff I never get, so they are not as wild as they lead us to believe.
     
  12. Inspector Clouseau

    Inspector Clouseau AV Expert

    Joined:
    Apr 2, 2006
    Posts:
    1,329
    Location:
    Maidenhead, UK
    Vesselin explained that years ago already. Maybe he'll post something about it here.
     
  13. Inspector Clouseau

    Inspector Clouseau AV Expert

    Joined:
    Apr 2, 2006
    Posts:
    1,329
    Location:
    Maidenhead, UK
  14. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
  15. ASpace

    ASpace Guest

    Very interesting reading! Thanks :thumb:
     
  16. bontchev

    bontchev AV Expert

    Joined:
    Nov 13, 2005
    Posts:
    38
    Viruses that are ITW are not included in the WildList. Viruses that are included in the WildList are not ITW. The so-called "WildList reporters" don't really bother to monitor what is actually ITW and just keep "confirming" that the same things they have reported before are still ITW. They often keep sending one and the same sample over and over.

    There are so many things wrong with the WildList that I can't hope to list them all here. Just refer to my paper on this subject. I wrote it 8 years ago, but most of the problems discussed there have never been fixed - mostly due to the incompetence of the people behind the WildList (and sometimes because of their ego, which does not allow them to admit that I am right and they are wrong).

    Basically, the viruses that are actually ITW and the WildList have very little in common. Nevertheless, "everybody" loves it. The WildList people love it because it gives them a sense of self-importance. The testers love it because it's easier to test AV products against a small test set somebody else provides you for free than against a huge virus collection that you build and maintain yourself. The AV producers love it because it's easier to score high detection rates against 200+ viruses than against 300,000+. Of course, the only losers are the users, who are lulled into a false sense of security.

    There was a valid question (I think in the VB article Mike posted) - if the WildList virus set is so easy to detect, why are so many products failing the "VB 100%" detection tests? The answer is simple - passing these tests does not mean only detecting the viruses on the WildList. It also means no false positives, reasonably high (>90%) detection of the "zoo virus set", and an equal detection rate between the on-access and the on-demand scanners (sometimes there is a difference due to a bug, an OS quirk, or a configuration issue). Sometimes there is also new stuff unexpectedly added to the WildList and used by the testers before the AV producers can adapt, and so on.
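
    To make those criteria concrete, here is a minimal sketch in Python of the pass/fail logic just described. The function name, the input format, and the exact thresholds are illustrative assumptions only - this is not Virus Bulletin's actual procedure.

        def vb100_pass(wildlist_detected, wildlist_total, false_positives,
                       zoo_detection_rate, on_access_rate, on_demand_rate):
            # Hypothetical VB100-style verdict: detecting the WildList set
            # is necessary but not sufficient. Thresholds are illustrative.
            return (wildlist_detected == wildlist_total    # 100% of the WildList
                    and false_positives == 0               # no false alarms at all
                    and zoo_detection_rate > 0.90          # "reasonably high" zoo detection
                    and on_access_rate == on_demand_rate)  # on-access/on-demand parity

        # A product can detect every WildList sample and still fail,
        # e.g. because of a single false positive:
        print(vb100_pass(250, 250, false_positives=1,
                         zoo_detection_rate=0.95,
                         on_access_rate=1.0, on_demand_rate=1.0))  # False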

    Regards,
    Vesselin
     
  17. comma dor dash

    comma dor dash Registered Member

    Joined:
    Jun 5, 2005
    Posts:
    146
    Maybe we should go one step further and ask whether ordinary QUANTITATIVE testing is the most relevant part of an AV test. Does it really say that much about the effectiveness of an AV scanner?

    If I had to decide on a scanner, I would like to know whether:

    1.
    the scanner is unlikely to corrupt my windows installation, brutally slow down my computer, disrupt internet traffic or cause conflicts with other apps;

    2.
    the scanner suffers from a vulnerability and will allow for privilege escalation attacks (it appears that many scanners do actually decrease security...);

    3.
    the scanner will be able to detect modified variants of known malware (i.e., type and effectiveness of unpacking engine used, mem scanning abilities, quality of heuristics, quality of signature database);

    4.
    the scanner sports decent rootkit detection technology;

    5.
    the scanner frequently produces false alarms.

    Apart from a few exceptions (e.g., IBK's retrospective/proactive tests), it seems to me that no reputable tester makes an effort to engage in QUALITATIVE testing. And for sure, AV companies are not interested in such tests... :rolleyes:
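
    As a rough sketch of what such QUALITATIVE testing could report, the Python snippet below folds the five criteria above into one weighted score. The weights and the example ratings are invented purely for illustration; no tester uses these exact numbers.

        # Illustrative only: the weights and ratings are invented.
        WEIGHTS = {
            "system_stability":  0.25,  # criterion 1: no OS corruption, slowdowns, conflicts
            "self_security":     0.20,  # criterion 2: no privilege-escalation holes
            "variant_detection": 0.30,  # criterion 3: unpacking, memory scanning, heuristics
            "rootkit_detection": 0.15,  # criterion 4: rootkit detection technology
            "false_alarm_rate":  0.10,  # criterion 5: higher rating = fewer false alarms
        }

        def qualitative_score(ratings):
            # Weighted average of 0-10 ratings, one per criterion.
            return sum(WEIGHTS[name] * ratings[name] for name in WEIGHTS)

        example = {"system_stability": 8, "self_security": 7,
                   "variant_detection": 9, "rootkit_detection": 8,
                   "false_alarm_rate": 8}
        print(round(qualitative_score(example), 2))  # 8.1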
     
  18. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    Testing, testing, testing,

    it's almost taking over now.

    What we need are just some credible reviews of different AVs, and also reviews of the different technologies that some AVs create; there don't seem to be many of those.

    Everyone likes a good review, as long as it's credible and valid.

    I'm really surprised this isn't a field av-comparatives have 'tried' to get into in some way.
     
  19. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    The most credible review is when a member posts here that their product failed to detect something. Not one post about a vendor, but when you start to see a track record. Other than that, there isn't any more legit testing that can tell the true story.
     
  20. FanJ

    FanJ Updates Team

    Joined:
    Feb 9, 2002
    Posts:
    4,660
    Big thanks to Mike (IC), Andreas (IBK) and in particular to Vesselin (bontchev).

    Vesselin,

    That was absolutely an interesting reading; thanks very much !

    Best regards,
    Jan
     
  21. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    Then, bontchev, based on your comments, Eset's high standing would hold true in meeting these criteria. Hmmm? :)
     
  22. FanJ

    FanJ Updates Team

    Joined:
    Feb 9, 2002
    Posts:
    4,660
    Hi,

    Of course "everyone likes a good review, as long as it's credible and valid".

    But doing it in a credible and valid way isn't so easy.
    There was a reason for the recent AV test conference (workshop).
    And read Vesselin's article, and what Mike has been posting several times about testing, and what Blue has been posting about statistics.

    I thought that Andreas (IBK) did try it!
     
  23. dan_maran

    dan_maran Registered Member

    Joined:
    Aug 30, 2004
    Posts:
    1,053
    Location:
    98031
    As stated in this thread, Dr. Vesselin Bontchev declared this years ago, yet many in the industry cried foul. Hopefully this meeting produces quality test criteria for the future.

    I agree that the only "real" way to test is on end users, and this brings up an interesting experiment:
    Take 1000/10000 (or so) internet/computer users from all walks of life (high risk, low risk, corporate, etc.) and geographic locations, and have them run one AV with out-of-the-box (OOB) settings that cannot be changed or disabled for 3-6 months. At the end of the "experiment", evaluate the totals and compare results. I know this is a broad-scoped idea with a multitude of its own issues, but IMO this would be the only way to evaluate the true protection levels. Simulated environments are great, but as always they are no substitute for real-world applications.
    Before anyone says it is impossible: it is not impossible, just not likely, as the funds required would have to come from somewhere, and who would be willing to place money in an experiment such as this if the results for the particular application are not guaranteed? I know I wouldn't.

    //EDIT//
    Thinking about it, I bet Blue could come up with the numbers needed to verify the validity - see the sketch below.
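
    As a back-of-the-envelope version of the numbers Blue might produce, here is a standard two-proportion sample-size calculation in Python. The 5% and 3% infection rates, the 80% power, and the 5% significance level are invented assumptions, chosen only to show the scale involved.

        from math import ceil, sqrt
        from statistics import NormalDist

        def users_per_group(p1, p2, alpha=0.05, power=0.80):
            # Users needed per AV product to distinguish infection rate p1
            # from p2 (two-sided two-proportion test, normal approximation).
            z_a = NormalDist().inv_cdf(1 - alpha / 2)
            z_b = NormalDist().inv_cdf(power)
            p_bar = (p1 + p2) / 2
            numerator = (z_a * sqrt(2 * p_bar * (1 - p_bar))
                         + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
            return ceil(numerator / (p1 - p2) ** 2)

        # Telling a 5% infection rate from a 3% one takes ~1,500 users per
        # product, consistent with the 1000/10000 scale suggested above.
        print(users_per_group(0.05, 0.03))  # 1506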
     
    Last edited: Jun 4, 2007
  24. bellgamin

    bellgamin Registered Member

    Joined:
    Aug 1, 2002
    Posts:
    8,102
    Location:
    Hawaii
    Totally fascinating reading. My special thanks to IC & bontchev.

    For me, for now, my anti-malware buy decisions are based 80% on AV-Comp reports, and 20% on the fact that I am a devout fanboy of IC and Stefan K.

    Hmmm.... I can't help but wonder if, unbeknownst to all, even one black hat attended the AV Tester workshop. {Don't flame me for wondering -- I have a diseased mind.}

    ^_^
     
  25. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    8,251
    Location:
    The land of no identity :D
    Interesting...:)

    Well, I think credentials would be required in order to get into the workshop, but... if we had a James Bond in there, then, well :p :D
     