New Malware-Test results

Discussion in 'other anti-virus software' started by Firecat, Jul 4, 2007.

Thread Status:
Not open for further replies.
  1. plantextract

    plantextract Registered Member

    Joined:
    Feb 13, 2007
    Posts:
    392
    Well, if the test was correct, it shouldn't make any difference; the signatures are the same throughout the entire v6 line.
     
  2. RogerC

    RogerC Registered Member

    Joined:
    Jul 4, 2007
    Posts:
    9
    You can see the pattern file is the same; the only difference is the product version....
     
  3. RogerC

    RogerC Registered Member

    Joined:
    Jul 4, 2007
    Posts:
    9
    I don't trust other testing organizations... because I cannot verify their test results. :rolleyes:
     
  4. RogerC

    RogerC Registered Member

    Joined:
    Jul 4, 2007
    Posts:
    9
    Don't be subjective. Please point out the problem you mentioned.
     
  5. Don johnson

    Don johnson Registered Member

    Joined:
    Jun 14, 2007
    Posts:
    77
    I don't know what to say. I think it's a very “special” result.
     
  6. VikingStorm

    VikingStorm Registered Member

    Joined:
    Jun 7, 2003
    Posts:
    387
    Do you have some affiliation with this test?
     
  7. FRug

    FRug Registered Member

    Joined:
    Feb 7, 2006
    Posts:
    309
    IIRC, NOD32's Advanced Heuristics are off by default for on-demand scans. Since the document states they used default settings for heuristics, I guess it remained off. If that's the case, NOD32's results are quite accurate for that setting. I think things would look different if it were switched on.
    Avira's heuristics are set to medium by default; I'd guess there'd be quite a difference with high as well. Not sure about the other tested AVs' default settings....
     
  8. ASpace

    ASpace Guest

    No, I won't ask Kaspersky when the answer is so obvious - the problem is not in Kaspersky but in the testers. Even if KAV were version 4.5, the results should be the same as AOL AVS if both are updated at the same time and with the best possible settings.

    The on-demand scan results do not vary because of engine versions, at least for these products from 4.5 to v6. Since AVS is a KAV6-based product, the best possible settings are equal for both products. I see they say they used default settings, so the default settings should be the same for both.
     
  9. pykko

    pykko Registered Member

    Joined:
    Apr 27, 2005
    Posts:
    2,236
    Location:
    Romania...and walking to heaven
    Just because their packer support is weak. ;) Maybe in addition to corrupted samples there were also unsupported packers. :)
     
  10. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    Totally rubbish,

    drweb - Application Error

    and it has been in most of their tests.

    :rolleyes:

    There are no progress bars on removal; you just have to wait till it's done removing. Obviously, if there are 200,000 viruses, this could take a while, so just leave it, grrrr, and let's see the result.
     
    Last edited: Jul 5, 2007
  11. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    8,251
    Location:
    The land of no identity :D
    Well, I will say that the results are indeed somewhat strange and inconsistent with what we've been seeing from AV-test or AV-comparatives, for example.

    For example, the report mentions that the malware files were collected from a honeypot, but it does NOT mention how they were collected, how the files were sorted, whether there were any corrupted files, or what methods (if any) were used to remove corrupted files from the sample set.

    I do have a few comments about this test.... From the PDF file:

    Your test machine seems to have a good configuration :D

    This could be one reason why we are seeing such strange things from Malware-Test compared to other tests like AV-test, AV-comparatives, or even VirusP's tests. All three of those have the AVs set to max settings and max heuristics; I think Malware-Test should do the same. Of course, even that would probably not explain some of the strange results in this test.

    Since some details are still missing (see above), it is difficult for people to draw a definitive conclusion just yet. Perhaps if you could elaborate further on the testing method and process, things would become clearer? :)

    No, the screenshot suggests they were using the same signatures.

    Maybe the difference is due to different default settings? If I remember correctly, AOL AVS also lacks the ability to quarantine. So if KAV 6 was set to quarantine infected files while AVS was set to simply delete them, that could explain how AVS scores better, because sometimes a few files cannot be properly quarantined (don't ask me why, but I have seen it happen, for example, in BitDefender). If the detections were counted based on the number of files remaining in a folder after the AV has detected and quarantined (or deleted) everything it can, then there might be more files left in the folder after KAV's scan than after AVS' scan, because KAV may not have been able to quarantine some of the files, while deletion works in most cases except when the file is locked.

    Just a theory though.
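
    To illustrate the theory with some made-up numbers, here is a rough Python sketch. The folder-count scoring, the quarantine failure count, and all the numbers are purely my assumptions about how such a test *might* work, not anything the report confirms:

    # Hypothetical model: the tester scores detections by counting the files
    # left in the sample folder after each AV has acted on its detections.
    # All numbers below are invented for illustration only.

    def files_remaining(total_samples, detected, action, unmovable=0):
        """Files left in the folder after the AV acts on its detections.

        action: "quarantine" is assumed to fail on `unmovable` files
                (e.g. locked files); "delete" is assumed to always succeed.
        """
        removed = detected - unmovable if action == "quarantine" else detected
        return total_samples - removed

    def apparent_rate(total_samples, remaining):
        """Detection rate a folder-count-based test would *report*."""
        return 100.0 * (total_samples - remaining) / total_samples

    total = 200_000
    truly_detected = 190_000  # same signatures -> same real detections

    # KAV set to quarantine; suppose 2,000 files cannot be quarantined.
    kav_left = files_remaining(total, truly_detected, "quarantine", 2_000)
    # AOL AVS set to delete; no quarantine step to fail.
    avs_left = files_remaining(total, truly_detected, "delete")

    print(f"KAV apparent rate: {apparent_rate(total, kav_left):.2f}%")  # 94.00%
    print(f"AVS apparent rate: {apparent_rate(total, avs_left):.2f}%")  # 95.00%

    Identical engine, identical real detections, yet the folder count would make AVS look a full point better purely because of the post-detection action.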

    ZoneAlarm and F-Secure may score higher than KAV due to the addition of extra engines (ZA's own antispyware engine works on-demand along with the KAV engine, and F-Secure, as you know, has multiple engines). However, it is also possible that this "quarantine/delete" issue affects other AVs as well (hence possibly artificially lowering their scores). At this point I have no way to know for sure.

    I think so. If I am correct, RogerC is Roger Chiu, founder of Malware-Test lab. :)
     
  12. FRug

    FRug Registered Member

    Joined:
    Feb 7, 2006
    Posts:
    309
    Counting by deleted files would be stupid (I'm not saying they do it that way, though!) since most AVs don't allow deletion of heuristic detections / suspicious files. If they did delete them, they'd fail every VB test.
     
  13. Osaban

    Osaban Registered Member

    Joined:
    Apr 11, 2005
    Posts:
    5,618
    Location:
    Milan and Seoul
    The test is either rubbish or biased (or both). My computer (running NOD32) flagged more than 70 nasties coming from computers ALL running AhnLab V3 (it's all documented if you check my threads).

    Unfortunately, anybody can post anything on the internet behind a legitimate facade.
     
  14. Tunerz

    Tunerz Registered Member

    Joined:
    Jun 12, 2007
    Posts:
    110
    Location:
    Philippines
    Another AV-comparatives wannabe, I say :cautious:
     
  15. RogerC

    RogerC Registered Member

    Joined:
    Jul 4, 2007
    Posts:
    9
    Of course.... Most antivirus software's setting is report only.
     
  16. RogerC

    RogerC Registered Member

    Joined:
    Jul 4, 2007
    Posts:
    9
    Don't be subjective; you can get the user dump and Dr. Watson log from here.
     
  17. RogerC

    RogerC Registered Member

    Joined:
    Jul 4, 2007
    Posts:
    9
    You can try testing KAV and AOL yourself; you will see the product version is different...
     
  18. plantextract

    plantextract Registered Member

    Joined:
    Feb 13, 2007
    Posts:
    392
    That doesn't say anything. Even if it is, the engine & signatures are the same throughout the v5/v6 line. There are no build-specific heuristics, signatures & stuff like that ;)
     
  19. RogerC

    RogerC Registered Member

    Joined:
    Jul 4, 2007
    Posts:
    9
    Thanks for your suggestions. We hope we can improve the documentation in the future and use other test methodologies to test antivirus software's total protection capabilities, instead of using old methodologies that only test against known viruses/spyware.

    For settings, the heuristic setting is medium (the default; this is the antivirus company's recommendation - turning it to high makes it very easy to cause false positives), and the rest are max settings (report only when a virus is found, unless the antivirus software lacks such a setting).

    For AOL and KAV, the settings are the same; the only difference is the product version. We don't know why.

    Finally, yes, I am Roger Chiu. :D
     
    Last edited: Jul 6, 2007
  20. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    This test sucks if you think drweb throws an "application error".

    Sort it out, or keep your day job away from my eyes :cool:
     
  21. Tokar

    Tokar Registered Member

    Joined:
    Jul 22, 2005
    Posts:
    81
    It is great to see ClamAV at nearly a 75% detection rate, but I know something is fishy when barely 1% separates Kaspersky and ClamAV......
     