malware scanner test

Discussion in 'other anti-malware software' started by Page42, May 21, 2008.

Thread Status:
Not open for further replies.
  1. Page42

    Page42 Registered Member

    Joined:
    Jun 18, 2007
    Posts:
    6,941
    Location:
    USA
    Rogue Detections (old, not so old, new threats) by malware scanners

    The following free antispyware/antimalware applications were tested:

    1. A-Squared by Emsisoft (A2)
    2. Ad-Aware 2007 by Lavasoft (AAW)
    3. Malwarebytes Anti-Malware by Malwarebytes (MBAM)
    4. RogueRemover by Malwarebytes (RR)
    5. SUPERAntispyware by Superantispyware.com (SAS)
    6. Spyware Doctor Starter Edition by PC Tools (SD)
    7. Spybot Search & Destroy by Safer Networking (SSD)
    8. Windows Defender by Microsoft (WD)
     
  2. Page42

    Page42 Registered Member

    Joined:
    Jun 18, 2007
    Posts:
    6,941
    Location:
    USA
    "Analysis:

    1. It goes without saying, but we'll say it anyway: a perfect malware scanner does not exist.
    2. Of the scanners tested, a-squared by Emsisoft performed significantly better than the others, failing to detect only six items. It performed equally well against the old, not-so-old and new threats.
    3. Malwarebytes' Anti-Malware and RogueRemover both performed well on the tests. RogueRemover failed to detect 24 rogue samples and, coincidentally, Malwarebytes' Anti-Malware also failed to detect 24 rogue samples. It should be noted that the missed samples were not the same items for both scanners: when both scanners were used together, the total number of samples missed was only ten. This is important to note, since it clearly displays the benefit of using multiple scanners. One succeeds where the other fails, and the two used together missed a much smaller group of samples.
    4. Ad-Aware, Spyware Doctor, Spybot S&D, Windows Defender and SUPERAntispyware all performed very poorly."
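    The overlap arithmetic in item 3 is just set intersection. A minimal sketch, using made-up sample IDs (the real test's sample list isn't published here), showing how two scanners that each miss 24 samples can jointly miss only ten:

    ```python
    # Hypothetical sample IDs: each scanner misses 24 samples, but only
    # 10 of those misses overlap, so running both scanners leaves far
    # fewer samples undetected than either scanner alone.
    rr_missed = set(range(0, 24))     # 24 samples RogueRemover missed (made up)
    mbam_missed = set(range(14, 38))  # 24 samples MBAM missed (made up)

    both_missed = rr_missed & mbam_missed  # missed even when using both scanners
    print(len(rr_missed), len(mbam_missed), len(both_missed))  # 24 24 10
    ```

    The same logic explains why layering scanners helps only when their miss sets differ; two scanners with identical signatures would miss the same 24 samples.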
     
  3. SUPERAntiSpy

    SUPERAntiSpy Developer

    Joined:
    Mar 21, 2006
    Posts:
    1,088
    How about testing some hidden rootkits that the other products won't even see? And maybe some threats that are in circulation still :)
     
  4. Page42

    Page42 Registered Member

    Joined:
    Jun 18, 2007
    Posts:
    6,941
    Location:
    USA
    The tester was asked whether the samples are in circulation.
    She responded, "Yes, they are in circulation and alive, because I gathered the samples only a few weeks ago.
    The samples were not all gathered in one day; they were gathered between April 23, 2008 and May 4, 2008.
    I scanned the samples on the day I downloaded them (between April 23 and May 4), and I updated the definitions each time before starting a scan."
     
  5. EASTER

    EASTER Registered Member

    Joined:
    Jul 28, 2007
    Posts:
    11,126
    Location:
    U.S.A. (South)
    I see a trend where everyone is now looking for ghosts that are either bodiless, with no muscle to size up to today's security ware (all AVs not included), or are pinched off at the source.

    But by all means keep the attention up; maybe something clever will finally surface to give serious attention to.

    Security vendors have the upper hand, and that's the bottom line. Read the BlackHat forums and the old Rohitab forums; they're out of breath and out of ideas. :D
     
  6. ErikAlbert

    ErikAlbert Registered Member

    Joined:
    Jun 16, 2005
    Posts:
    9,455
    It's always the same with these tests: incomplete and incorrect, and that will never change.
    Collect ALL the malware in the world in one or more test environments and then run all these scanners. Don't collect all the malware in one single test environment; the samples might kill each other.
    That is the only way to give each AV/AS/AT/AK/AR/... scanner a fair chance to prove its quality; then you will know for sure which ones are the winners and the losers.
    But collecting ALL the malware in the world is impossible and an enormous job that would last forever, while the bad guys keep creating malware so fast that the good guys can't keep up.

    What these testers do is collect malware on a random basis, and when they are sick and tired of collecting, or think they have enough (and it's never enough), they run all these scanners and publish the results.
    Because each tester performs a one-man show with his personal test environment, which is as incomplete as it can be, each test environment produces a different winner and a different loser, and all the readers are fooled, because the results are incorrect and will never be correct.

    The same goes for all scanners. It's impossible to create an antidote (signatures/heuristics) for every piece of malware in the world. So all scanners are as incomplete as they can be, and the user is lucky if his favorite scanner(s) detect and remove all the malware on his computer completely.

    When incomplete test beds and incomplete scanners meet each other, the results are unpredictable and incorrect. Isn't that logical?
    Is A2 the winner and SAS the loser? No, A2 was lucky and SAS was unlucky in this test, and only in this test; it's nothing more than that. If you have A2 on board, keep it running; if you have SAS on board, keep it running, and be glad they detect and remove some malware on your computer.

    These tests mean NOTHING to me and don't tell me anything I didn't already know; it's old news that repeats itself over and over again. I don't have to be knowledgeable or an expert to come to this conclusion; it's nothing but pure logical reasoning.
    I'm not interested in the results of SCANNERS vs. SCANNERS; I'm interested in the results of SCANNERS vs. SOMETHING ELSE, whatever that may be. I never read anything about this, which is why I do my own testing. :)
     
    Last edited: May 22, 2008
  7. Page42

    Page42 Registered Member

    Joined:
    Jun 18, 2007
    Posts:
    6,941
    Location:
    USA
    I fail to see where or why anyone can come up with reasons to fault these tests, other than that they don't like the results. I read so much about "real world" situations, well, what did this tester do? She downloaded malware from the internet. Isn't that what people do in "the real world" when they get infected?
     
  8. ErikAlbert

    ErikAlbert Registered Member

    Joined:
    Jun 16, 2005
    Posts:
    9,455
    She certainly showed how weak AS scanners can be as protection, but I knew that already.
    But the winners and losers are purely accidental, because the test bed was incomplete. Run A2 and SAS against another incomplete test bed and SAS could be the winner and A2 the loser.
    Personally, I don't care about the results, because I no longer use scanners to protect my system.
     
    Last edited: May 22, 2008
  9. Page42

    Page42 Registered Member

    Joined:
    Jun 18, 2007
    Posts:
    6,941
    Location:
    USA
    Another test should be coming soon...

    http://www.dozleng.com/updates/index.php?showtopic=18279
     
  10. bellgamin

    bellgamin Registered Member

    Joined:
    Aug 1, 2002
    Posts:
    8,099
    Location:
    Hawaii
    Whoops! I double posted this test. Abject apologies. :doubt: :oops:
     
  11. Page42

    Page42 Registered Member

    Joined:
    Jun 18, 2007
    Posts:
    6,941
    Location:
    USA
    Hey, the post was buried, with barely any commentary from the anti-malware users. No apologies needed! I look forward to the next one.
     
  12. bellgamin

    bellgamin Registered Member

    Joined:
    Aug 1, 2002
    Posts:
    8,099
    Location:
    Hawaii
    Bottom Line- The data from that test indicate that the answer to THIS question re MBAM is... so-so. :ouch:

    A-squared? Shocka brah!!! :thumb:
     
  13. nosirrah

    nosirrah Malware Fighter

    Joined:
    Aug 25, 2006
    Posts:
    561
    Location:
    Cummington MA USA
  14. Night_Raven

    Night_Raven Registered Member

    Joined:
    Apr 2, 2006
    Posts:
    388
    I'm no security expert, but I'm going to go ahead and call this test complete garbage. Spybot - Search & Destroy and SUPERAntiSpyware are two excellent free products that are perhaps the best overall AS products out there. And in this test they perform very poorly? Is this some kind of joke that I don't get?

    Malwarebytes' Anti-Malware may not be able to detect as much malware as the other two previously mentioned, but it's very efficient at removing some nasty and more common malware that other products might have serious problems with. Again, a top-notch product.

    a-squared is such a load of bull droppings that I'm having a difficult time finding the words to explain it. I scanned with it recently and it flagged so many false positives: HFS (HTTP File Server), one of WarCraft III's executables (flagged as some trojan), and more, and more. I now regret deleting the log file; I thought I wouldn't need it. AVG Anti-Spyware Free was a bit better, but still had false positives.

    Anyway, there is no way I can consider a program that produces a lot of FPs a good piece of software, and I certainly can't rate it above other products that have proven themselves very reliable and efficient on numerous occasions, including in my own experience.
     
  15. Page42

    Page42 Registered Member

    Joined:
    Jun 18, 2007
    Posts:
    6,941
    Location:
    USA
    You are overstating the scope of the test. It covered rogue detections only, so the results aren't complete garbage.

     
  16. bellgamin

    bellgamin Registered Member

    Joined:
    Aug 1, 2002
    Posts:
    8,099
    Location:
    Hawaii
    As the saying goes: "If you can't raise the bridge, lower the water." :cautious:
     
  17. Night_Raven

    Night_Raven Registered Member

    Joined:
    Apr 2, 2006
    Posts:
    388
    I still find it garbage. I just can't believe that a-squared would perform better than MBAM, Spybot and SAS. Even if we assume it actually does (which might be possible in another dimension), there are other factors that are important when considering AS software, so the test is very incomplete and therefore practically worthless.
    Again: if it's a joke, it didn't make me laugh.
     
  18. nosirrah

    nosirrah Malware Fighter

    Joined:
    Aug 25, 2006
    Posts:
    561
    Location:
    Cummington MA USA
    MBAM is quite comfortable raising the bar.

    The tech we have in the pipe will raise it quite a bit further over the next few months.

    When I decide to work 6 AM to 4 AM the next day, it's comments like "MBAM is... so-so" that keep me going, so in an odd kind of way, thanks. :thumb:
     
  19. Stijnson

    Stijnson Registered Member

    Joined:
    Nov 7, 2007
    Posts:
    533
    Location:
    Paranoia Heaven
    Can I add another 'rogue application': the user...

    I would like to see some test against that particular threat. :)
     
  20. GES/POR

    GES/POR Registered Member

    Joined:
    Nov 26, 2006
    Posts:
    1,490
    Location:
    Armacham
    I appreciate you sticking up for these underdogs, and as much as I love them, they are not as superior in detection as some might want to think. Spybot especially is nothing to brag about. :thumbd:

    MBAM and SAS are fine layers but not overly great in detection. The most important features, though, are lightness, extremely good cleanup and top-notch support, and that is good enough for me. If I want high detection I'll go with overly bloated products.

    I'm noticing an a-squared-bashing trend, and although I prefer not to use it at this time, it has always done well with detection, just less so in the cleanup and FP areas. There are reports to back up the rates, but if you take them as garbage then please show us some better ones.
     
  21. jdd58

    jdd58 Registered Member

    Joined:
    Jan 30, 2008
    Posts:
    556
    Location:
    Sonoran Desert
    Spybot doesn't scan alternate data streams (ADS), IIRC.

    One thing about A2 is that it's a large download, and the updates are large too. Dial-up users may find that inconvenient. It does work on 9x systems, as does SAS.

    I prefer smaller apps anyway; they make imaging quicker. MBAM is good in that regard. I can't wait to use it on a severely hosed machine.
     
  22. SUPERAntiSpy

    SUPERAntiSpy Developer

    Joined:
    Mar 21, 2006
    Posts:
    1,088
    The tests are interesting, but rogues don't represent the actual problems on users' systems. The rootkits that can't be seen, detected or noticed are some of the real problems, and our next-generation technology goes after those....

    http://www.superantispyware.com/blog/
     
  23. Night_Raven

    Night_Raven Registered Member

    Joined:
    Apr 2, 2006
    Posts:
    388
    They might not detect everything, but they do have a good detection rate. In my experience their detection rate is superior to that of most other products. I don't care how high the detection rate of a program is; if it flags too many FPs, it isn't worth using. Sometimes FPs are worse than a lack of detection. I don't give a damn if a-squared would detect a few more tracking cookies when it would also flag some harmless programs or files that are important to me as harmful and remove them. That would be unacceptable.
     
  24. Page42

    Page42 Registered Member

    Joined:
    Jun 18, 2007
    Posts:
    6,941
    Location:
    USA
    Here's what Eric Howes has to say about rogue listing criteria:

    In my book, those are "actual problems on user's systems", and I want an AM that detects them.
     
  25. GES/POR

    GES/POR Registered Member

    Joined:
    Nov 26, 2006
    Posts:
    1,490
    Location:
    Armacham
    A total lack of interest in fixing FPs was what drove me away some years ago.
    Right now I don't want a program that goes after cookies or even has cookie HIPS control. That's just absurd! Nice feature if you like Ad-Aware, though.
     