Malware-Test Lab: Antivirus Comparison Report (February 26, 2007)

Discussion in 'other anti-virus software' started by sai7sai, Feb 26, 2007.

Thread Status:
Not open for further replies.
  1. lucas1985

    lucas1985 Retired Moderator

    Joined:
    Nov 9, 2006
    Posts:
    4,047
    Location:
    France, May 1968
    Good results for VBA32, ArcaVir and Ikarus. Really bad results for Comodo.
    Why was VirusBuster dropped?
     
  2. Smokey

    Smokey Registered Member

    Joined:
    Apr 1, 2002
    Posts:
    1,514
    Location:
    Annie's Pub
    Of course, feel free to put your faith and trust in that, ehhh, "test".
    I don't have such positive thoughts about it; in fact, I see it as The Ultimate Year 2007 Joke ;)

    Regards,

    Smokey
     
  3. CJsDad

    CJsDad Registered Member

    Joined:
    Jan 22, 2006
    Posts:
    618
    The Comodo wolves are going to start howling now :rolleyes:
     
  4. pykko

    pykko Registered Member

    Joined:
    Apr 27, 2005
    Posts:
    2,236
    Location:
    Romania...and walking to heaven
    See this thread for discussions about the test. :D
     
  5. sai7sai

    sai7sai Registered Member

    Joined:
    May 3, 2006
    Posts:
    21
    Location:
    Taiwan
    That means no one can verify your test results, or you are afraid that other people will verify your test report (we are not afraid)... :(
     
    Last edited: Mar 7, 2007
  6. dan_maran

    dan_maran Registered Member

    Joined:
    Aug 30, 2004
    Posts:
    1,053
    Location:
    98031
    AV-Comparatives tests are verified by the vendors.
    Have a look at the methodology write-up.

    @IBK - You still use CRC??
     


    Last edited: Mar 6, 2007
  7. sai7sai

    sai7sai Registered Member

    Joined:
    May 3, 2006
    Posts:
    21
    Location:
    Taiwan
    If the detection rate is not over 85%, would they send the missed samples to these vendors without receiving any fees?

    Are you sure the vendors can verify their test report?

    As far as I know, vendors in the AV community often exchange virus samples, right?

    From our standpoint, if we cannot verify their test results, we don't think this test report is meaningful for us.
     
  8. FRug

    FRug Registered Member

    Joined:
    Feb 7, 2006
    Posts:
    309
    SHA1 values don't help you evaluate the legitimacy of his results beyond checking whether he has multiple copies of the same file.

    What he does, sending the samples to trusted companies and people with a proven moral background and well-founded knowledge of the subject for rechecking, is probably the only way to weed out a large number of broken samples. It's simply too much work for a single person.

    Again, a list of checksums proves nothing beyond "each file is unique". A broken sample is unique too, in most cases.

    If he took money for the submission of missed samples, he would lose all credibility in a blink. That would be a VERY stupid thing to do. No serious AV researcher would still want to be involved with him.
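
    To illustrate the point, here is a minimal sketch (Python, with a placeholder "samples" directory) of what a checksum list can actually verify: computing SHA1 digests over a sample set only reveals which files are byte-for-byte duplicates, and says nothing about whether any individual sample is a valid, working piece of malware.

        import hashlib
        from collections import defaultdict
        from pathlib import Path

        def sha1_of(path: Path) -> str:
            """Return the SHA1 hex digest of a file, read in chunks."""
            h = hashlib.sha1()
            with path.open("rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
            return h.hexdigest()

        # Group files by digest; any group with more than one member is a duplicate.
        groups = defaultdict(list)
        for sample in Path("samples").iterdir():  # "samples" is a placeholder path
            if sample.is_file():
                groups[sha1_of(sample)].append(sample.name)

        for digest, names in groups.items():
            if len(names) > 1:
                print(f"duplicate content {digest}: {', '.join(names)}")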
     
  9. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    You want SHA1 values from me so that you can check the validity of YOUR files. You have to check your files yourself. I do not want to get involved in your tests in any way.
     
  10. lucas1985

    lucas1985 Retired Moderator

    Joined:
    Nov 9, 2006
    Posts:
    4,047
    Location:
    France, May 1968
    Andreas,
    If a vendor not included in the main test achieves 85% overall detection, will it receive the missed samples? VBA32, ArcaVir and Ikarus are very close to reaching that mark.
     
  11. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    Yes, I think so. But very close is not the same as reaching it.
     
  12. lucas1985

    lucas1985 Retired Moderator

    Joined:
    Nov 9, 2006
    Posts:
    4,047
    Location:
    France, May 1968
    Yes, if they work a little more, they will reach the minimum detection rate. So, in the next round of the "Other AV test", they can receive the missed samples.
    Thanks for the feedback.
     
  13. MalwareDie

    MalwareDie Registered Member

    Joined:
    Dec 8, 2006
    Posts:
    500
    When is the next round of other AV tests?
     
  14. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    Maybe next year.
     
  15. MalwareDie

    MalwareDie Registered Member

    Joined:
    Dec 8, 2006
    Posts:
    500
    Aw, I think a couple will be able to get 85% before then, but oh well.
     
  16. MalwareDie

    MalwareDie Registered Member

    Joined:
    Dec 8, 2006
    Posts:
    500
    I think eScan should be taken off next year, as its detection rate is the same as Kaspersky's. If it scores the same, then what is the point of testing it? I think it should be removed to make room for new blood.
     
  17. lodore

    lodore Registered Member

    Joined:
    Jun 22, 2006
    Posts:
    9,065
    I think F-Secure should be removed; it mainly gets the same detection rate as Kaspersky because it uses the KAV engine, but this time it got about 0.1 percent more detection with its other 4 engines. Why doesn't F-Secure just use KAV?
    Why bloat a product with 4 extra engines that account for less than 1 percent of extra detection?
    lodore
     
  18. MalwareDie

    MalwareDie Registered Member

    Joined:
    Dec 8, 2006
    Posts:
    500
    Yeah, it doesn't make sense. The other engines just consume resources for almost no added detection. Maybe they want the pride of beating Kaspersky, even if it is by a fraction of a percentage point.
     
  19. sai7sai

    sai7sai Registered Member

    Joined:
    May 3, 2006
    Posts:
    21
    Location:
    Taiwan
    If we can get the SHA1 or MD5 values of the test samples, we can check whether the samples used by other testing organizations are the same as ours, instead of having to verify the validity of the test samples ourselves. We can also check how old the samples used are.

    We (Malware-Test Lab) will continue to publish this kind of information on our web site, so you can verify our test results; other testing organizations perhaps decline to publish this information. Why?

    If you are familiar with quality assurance, one of the most important things is that test results can, and must, be verified by other teams, much like the separate roles of developers and testers.

    We are not afraid of you challenging our test results. If you find that we did make mistakes, please inform us; at the very least, we can improve our test process.
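
    As a minimal sketch of that comparison (Python, with placeholder file names): if each lab published one SHA1 or MD5 digest per line, the two sample sets could be compared directly. Note this only shows the overlap between the lists; it says nothing about the age or validity of any sample.

        def load_hashes(path: str) -> set:
            """Load one hex digest (SHA1 or MD5) per line, ignoring blank lines."""
            with open(path) as f:
                return {line.strip().lower() for line in f if line.strip()}

        # Placeholder file names; each lab publishes one digest per line.
        ours = load_hashes("our_hashes.txt")
        theirs = load_hashes("their_hashes.txt")

        shared = ours & theirs
        print(f"{len(shared)} of {len(ours)} of our samples also appear in the other list")
        print(f"{len(ours - theirs)} samples are only in our list")
        print(f"{len(theirs - ours)} samples are only in the other list")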
     
  20. Primrose

    Primrose Registered Member

    Joined:
    Sep 21, 2002
    Posts:
    2,743

    And thank you for that position you take :thumb:
     
  21. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    Sorry, had to come out of my one month hiatus for this one.

    What in the hell are you asking IBK to do, your job for you? He said he isn't going to have anything to do with this, so let it lie, my friend. This is a dead horse that keeps getting beaten more and more.

    See ya in 29 days.:rolleyes:
     
  22. solcroft

    solcroft Registered Member

    Joined:
    Jun 1, 2006
    Posts:
    1,639
    I find it funny that this band of jokers is still trying to put on a completely phony air of professionalism and respectability while trying to parley with IBK. Being the charlatans that they are, they still claim that their opinions of IBK's testing procedures matter at all, even though he is a respected researcher whose work is recognized by industry experts. In fact, I'd hazard a guess that they're thrilled simply by having IBK pay them any attention.

    To sai7, or whatever you choose to call yourself: please quit the pretentious act of inviting people to tell you what your mistakes are so you can "improve" your test process. The experts have already said enough about the matter, and if your claims of wishing to improve hold any weight, you'd find very good advice right there. Since you're "not afraid" of being challenged, as you put it, you should spend your time getting to work instead of making these poorly disguised remarks.

    Sorry if this sounds less than polite, but I believe it needs to be said.
     
  23. Escalader

    Escalader Registered Member

    Joined:
    Dec 12, 2005
    Posts:
    3,710
    Location:
    Land of the Mooses
    Why are we wasting posting time on these guys?

    There is no way we should even get involved with such a source.

    IBK is not involved and won't get involved, and that is the end of it. :thumbd:
     
  24. lucas1985

    lucas1985 Retired Moderator

    Joined:
    Nov 9, 2006
    Posts:
    4,047
    Location:
    France, May 1968
    On-demand detection is about the same, but proactive detection is quite different. Pegasus (Norman Sandbox) is detecting quite a few samples. Better heuristics = fewer pop-ups from the behaviour blocker (DeepGuard).
     
  25. lodore

    lodore Registered Member

    Joined:
    Jun 22, 2006
    Posts:
    9,065
    Norman Sandbox is only one of those 4 engines.
    The so-called improved Ad-Aware engine isn't doing much, really.
    The F-Secure heuristics aren't needed now that they have Norman Sandbox;
    that cuts out two engines already.
    KAV + Norman would be fine and would take out a lot of those processes and bloat.
    lodore
     