Antispyware comparison from Malware Test-lab (07 Aug 06)

Discussion in 'other anti-malware software' started by phasechange, Aug 7, 2006.

Thread Status:
Not open for further replies.
  1. phasechange

    phasechange Registered Member

    Joined:
    Aug 10, 2004
    Posts:
    359
    Location:
    Edinburgh
    Hi!

    I am not familiar with http://www.malware-test.com/test_reports.html. Are their methods and results well regarded?

    I was surprised at how well Trend did and even more so at the Windows Defender result. Are these what you guys would expect?

    Fairy
     
  2. Chubb

    Chubb Registered Member

    Joined:
    Aug 9, 2005
    Posts:
    1,967

    Attached Files:

  3. WSFuser

    WSFuser Registered Member

    Joined:
    Oct 7, 2004
    Posts:
    10,639
    Re: New Antispyware comparison from Malware Test-lab (07 Aug 06)

    I was expecting SUPERAntiSpyware to do better. Guess I'd better look for an alternative.
     
  4. SUPERAntiSpy

    SUPERAntiSpy Developer

    Joined:
    Mar 21, 2006
    Posts:
    1,088
    Re: New Antispyware comparison from Malware Test-lab (07 Aug 06)

    I am trying to get in contact with Malware-Test to see what sample set they used. The fact that Ewido, Kaspersky, and SUPERAntiSpyware are towards the bottom of the list suggests (in my opinion and testing) that the test may indeed be inaccurate. I say this because both Ewido and Kaspersky have high detection and removal rates in our own testing, and SUPERAntiSpyware also fares well in the real world, as many users can testify based upon actual use on actual infections. The fact that McAfee and Norton are high up on the list is also interesting, because they focus on viruses, not spyware, and they tend to be further behind the zero-day threat curve for spyware. I am not putting down any of the products; the tests simply don't match up with testing of the actual real-world threats that we see from infections reported on users' machines.

    Example CNET Download.com user reviews for SUPERAntiSpyware:
    http://www.download.com/SUPERAntiSpyware-Free-Edition/3640-8022_4-10564983.html?sb=1&v=0

    Another review of anti-spyware testing on Spyware Warrior, SAS comes out on top:
    http://www.spywarewarrior.com/viewtopic.php?t=22191

    Suzi Turner, of Spyware Warrior also added us to the Trustworthy AntiSpyware Application List here:
    http://www.spywarewarrior.com/rogue_anti-spyware.htm#trustworthy

    It is hard to actually test anti-spyware applications as the threats change daily. They need to be tested in the real-world against actual sample sets that are current and changing.

    We update our definitions daily to handle new threats:
    http://www.superantispyware.com/definitionupdatehistory.html

    They do not answer e-mails, nor do they provide the samples they are using as a test set. They also do not clarify what was not removed, meaning whether it was errant registry keys or .GIF files, or actual harmful infected files.

    My hope is that they will actually answer our e-mails and provide the sample set, and we can promptly update our definition set if the samples are actually harmful and current.

    Nick Skrepetos
    SUPERAntiSpyware.com
    http://www.superantispyware.com
     
    Last edited: Aug 7, 2006
  5. Ngwana

    Ngwana Registered Member

    Joined:
    Jul 5, 2006
    Posts:
    156
    Location:
    Glasgow, United Kingdom
    Re: New Antispyware comparison from Malware Test-lab (07 Aug 06)

    Comparisons, ratings and rankings are coming thick and fast from reviewers, testing labs and security websites. My personal take is that though it may be impressive for a product to perform consistently well as the testing samples change, that is not the end of the story.

    In a real-life situation most users use layered security, and it is wise to regularly update and use the most secure configuration for whatever security product(s) they have installed. The wisdom of relying on the most popular or most rewarded products is becoming questionable. :D
     
  6. Legendkiller

    Legendkiller Registered Member

    Joined:
    Jun 29, 2006
    Posts:
    1,053
    Re: New Antispyware comparison from Malware Test-lab (07 Aug 06)

    That's pretty strange indeed... I never knew that McAfee AS was so good...
     
  7. Robyn

    Robyn Registered Member

    Joined:
    Feb 1, 2004
    Posts:
    1,189
    Re: New Antispyware comparison from Malware Test-lab (07 Aug 06)

    Very interesting. I am running the beta version of Trend for Vista. I was about to install what I thought was adware-free software (I specifically asked about this): the demo version of Applian's Replay AV or Radio (the full version has to be paid for).

    Thankfully this was on a beta computer running Vista, with Defender installed by default with the OS.
    I ran the initial install and Trend's active AS immediately stopped it, with detection name Adware_2020Search; I checked the link.
    Defender didn't get a chance to even blink.
    I know this isn't actual cleaning after something was installed, but Trend certainly didn't let the software install. It's a pity they don't have separate components for their AS. I don't have Trend on my XP computer, but I was impressed by the prevention (and disappointed the adware was in the software :'( ). I have never had any security alerts, so this one shocked me.
     
  8. pykko

    pykko Registered Member

    Joined:
    Apr 27, 2005
    Posts:
    2,236
    Location:
    Romania...and walking to heaven
    Re: New Antispyware comparison from Malware Test-lab (07 Aug 06)

    From my own experience McAfee AS was really good. ;) Spy Sweeper also. I never tested Trend Micro AS, which seems to be the leader. o_O
     
  9. pykko

    pykko Registered Member

    Joined:
    Apr 27, 2005
    Posts:
    2,236
    Location:
    Romania...and walking to heaven
    Re: New Antispyware comparison from Malware Test-lab (07 Aug 06)

    What software are you talking about?
     
  10. Robyn

    Robyn Registered Member

    Joined:
    Feb 1, 2004
    Posts:
    1,189
  11. TNT

    TNT Registered Member

    Joined:
    Sep 4, 2005
    Posts:
    948
    Re: Malware Test.com

    I can't tell whether you're surprised at how bad or how good the Windows Defender result is compared to what you expected. I never thought it was going to be anything special, and it doesn't seem to be very good, nor truly awful.

    Personally, I'm surprised that Ad-Aware got quite good results and I don't understand what Kaspersky Internet Security and NOD32 are doing here, since they are not really applications for "spyware cleanups". As for their test methods, they have a description on the site (though arguably not very detailed).
     
  12. phasechange

    phasechange Registered Member

    Joined:
    Aug 10, 2004
    Posts:
    359
    Location:
    Edinburgh
    Last edited: Aug 8, 2006
  13. Metting

    Metting Registered Member

    Joined:
    Aug 3, 2006
    Posts:
    100
    Re: New Antispyware comparison from Malware Test-lab (07 Aug 06)

    Hi

    The names of the samples used are given in their full report, and here they are:

    Able2Know Toolbar
    CAPP (CNNIC/3721 Chinese Keywords)
    InlookExpress
    PeopleOnPage
    Sidesearch
    Speedup
    StartSpy
    STIEBar (0CAT YellowPages)
    SideBySideSearch (sbsse)
    WildMedia
    Csnoop (computersnooper)
    SideStep
    WinHound
    SaveKeys (Save Keys Undetectable)
     
  14. SUPERAntiSpy

    SUPERAntiSpy Developer

    Joined:
    Mar 21, 2006
    Posts:
    1,088
    Now things make a little more sense: they are not running these samples in their native environment, meaning infecting a machine/system with the infection and then cleaning it. Many of them appear to have been simply copied to "c:\virus", which is not where these infections "live" in the real world. Our heuristic guarding and false-positive prevention will often kick these out and not detect them, so as not to produce false positives when the infections are completely outside their observed and researched environments.

    We could simply update our rules to catch those, but these are not real-world tests. I have contacted them several times, but can't seem to get a reply.
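
    (For illustration: a minimal sketch, in Python, of the general idea of a path-context check like the one described above. This is not SUPERAntiSpyware's actual code; the prefixes, paths and the should_flag function are invented assumptions. The point is simply that a signature hit can be suppressed when a sample sits outside the locations where the threat is normally observed, e.g. a bare copy under a test folder.)

    Code:
        # Hypothetical sketch: suppress a signature hit when the file is found
        # outside the locations where this threat family is normally observed.
        KNOWN_INSTALL_PREFIXES = [
            "c:\\program files\\",
            "c:\\windows\\system32\\",
            "c:\\documents and settings\\",
        ]

        def should_flag(file_path: str, signature_match: bool) -> bool:
            """Flag only when the signature matches AND the file is in a
            plausible install location; bare copies elsewhere are skipped
            to avoid false positives."""
            if not signature_match:
                return False
            path = file_path.lower()
            return any(path.startswith(p) for p in KNOWN_INSTALL_PREFIXES)

        # Example: the same file, out of and in its native environment.
        print(should_flag("C:\\virus\\sample.dll", True))                     # False
        print(should_flag("C:\\Program Files\\SampleApp\\sample.dll", True))  # True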

    Nick Skrepetos
    SUPERAntiSpyware.com
    http://www.superantispyware.com
     
    Last edited by a moderator: Aug 7, 2006
  15. dah145

    dah145 Registered Member

    Joined:
    Jul 3, 2006
    Posts:
    262
    Location:
    n/a
    All these antispyware reviews are making me crazy. o_O
    What can we trust? o_O
    KIS and SUPERAntiSpyware in last places... I just can't believe it. :cautious:
     
    Last edited: Aug 7, 2006
  16. lu_chin

    lu_chin Registered Member

    Joined:
    Oct 27, 2005
    Posts:
    295
    I think it benefits users most when a security application tries to protect them in scenarios as close to the real world as possible. Reviews that do not reflect real-world setups can be misleading.

     
  17. aigle

    aigle Registered Member

    Joined:
    Dec 14, 2005
    Posts:
    11,164
    Location:
    UK / Pakistan
    How can they do comparatives with just a dozen samples? It's a pity.
     
  18. TairikuOkami

    TairikuOkami Registered Member

    Joined:
    Oct 10, 2005
    Posts:
    3,417
    Location:
    Slovakia
    Thanks for the responses. I have concluded that Malware-Test Lab does not provide a trustworthy AS test.
     
  19. eburger68

    eburger68 Privacy Expert

    Joined:
    Mar 4, 2002
    Posts:
    244
    Nick:

    You wrote:

    Actually, it does appear that they ran these programs in their native environment -- see the test methodology on p. 25 of the report. Testing against threats in their native environment is one of the distinguishing features of many of the earlier rounds of tests from this group. Thus, I'm assuming that the C:\Virus path for some of the apps was a reporting error or oversight of some sort, because I've not seen that in previous reports from this group.

    No, the biggest problem with this round is their testbed, which leans heavily towards "low risk" apps that many vendors would likely have excluded from their definitions. See my comments over at DSLR:

    http://www.dslreports.com/forum/remark,16661920

    This was a problem with some of their earlier rounds as well:

    http://www.dslreports.com/forum/remark,16527721

    Folks, instead of judging the apps tested -- or even the testing itself -- by one round of tests, it would be much more productive to review the entire series of tests that this organization has conducted. This organization's tests aren't without problems, but there is a lot of promise here as well. See my comments in the second DSLR thread listed above.

    For more thoughts on what constitutes quality anti-spyware testing, see this thread over at Spyware Warrior:

    http://spywarewarrior.com/viewtopic.php?t=22210

    Best,

    Eric L. Howes
     
  20. sosaiso

    sosaiso Registered Member

    Joined:
    Nov 12, 2005
    Posts:
    601
    So, from what I gather, the testing LOGIC is good, but the samples are bad?

    But that leads to a question about the results: how much weight should we give them?
     
  21. eburger68

    eburger68 Privacy Expert

    Joined:
    Mar 4, 2002
    Posts:
    244
    sosaiso:

    Yes, that's essentially it -- good methodology (at least better than most); lousy test bed (doesn't reflect actual threats to users in the real world).

    The C:\Virus path is a bit puzzling, and that bears looking into.

    Again, I would urge folks to look over the entire series of tests, not just focus on one particular test.

    Best,

    Eric L. Howes
     
  22. sosaiso

    sosaiso Registered Member

    Joined:
    Nov 12, 2005
    Posts:
    601
    Interesting.

    Thanks, that clarifies what to look for in these tests. I was originally just going to disregard them due to the lack of replies on their forums and the supposedly lousy scores. I'll have to look at the reports now. So, less weight on the percentages and more weight on what each program did.
     
  23. dallen

    dallen Registered Member

    Joined:
    May 11, 2003
    Posts:
    825
    Location:
    United States
    I'm not sure if my method would qualify as sound scientific testing, and this may sound backwards, but when I look at comparison test results like these, I simply look for products I know, both good and bad. When I find them, I see where they ranked. Then I ask myself, "Does that sound right?" If the answer is no, then I am suspicious of the test right away. In this case, I see where Lavasoft's Ad-Aware is and I am shocked to find SUPERAntiSpyware ranked below it. SUPERAntiSpyware may not be the best on the market (maybe it is), but it certainly is better than Lavasoft's product. Something must be wrong here.

    Since SUPERAntiSpy is present in this thread, I ask you sir (without wanting to put you on the spot), in fair, accurate testing, wouldn't you expect your product to have higher detection and removal rates than Lavasoft's product and thus be ranked higher?
     
  24. Manny Carvalho

    Manny Carvalho Registered Member

    Joined:
    Jun 3, 2004
    Posts:
    270
    Eric, why do you think vendors would exclude the low-risk apps from their databases? It seems to me that they always add new signatures rather than remove them. Not that I disagree that they should have used higher-risk apps, but you have to think (perhaps erroneously) that if a vendor can handle the low-risk stuff then it can catch the hot ones too.
     
  25. Manny Carvalho

    Manny Carvalho Registered Member

    Joined:
    Jun 3, 2004
    Posts:
    270
    If fair, accurate testing doesn't produce results that "sound right to you", how can any testing be done? You can disagree with the methodology, sure, but not on an instinctual basis alone. If your instincts are 100% accurate then I've got some money I want you to bet for me, and I'll give you a nice cut. :D
     