New AV Test From SSU

Discussion in 'other anti-virus software' started by guest, Oct 28, 2008.

Thread Status:
Not open for further replies.
  1. Don johnson

    Don johnson Registered Member

    Joined:
    Jun 14, 2007
    Posts:
    77
    :eek: :eek: :eek: Sunbelt is "crapware"? Wow! I hope whoever said that can write better software than the "crapware".
     
  2. IceCube1010

    IceCube1010 Registered Member

    Joined:
    Apr 26, 2008
    Posts:
    963
    Location:
    Earth
    Any so-called testing site that labels software as "crapware" really can't be taken seriously. It looked interesting at first, but now I have my doubts.

    Ice
     
  3. littlebits

    littlebits Registered Member

    Joined:
    Jul 7, 2006
    Posts:
    262
    It was only listed because it displays nag screens to buy their other products; it should have been removed from MalwareBytes.org's and a-squared's detection a long time ago. In other words, it can be considered nagware, but not harmful.

    It has had pretty good detection for a while and is really free.

    Thanks.
     
  4. Defcon

    Defcon Registered Member

    Joined:
    Jul 5, 2006
    Posts:
    337
    Why don't these tests also include the free version of Avira? I find that it's enough for my needs, but I want to know exactly what I'm missing out on.
     
  5. Marcos

    Marcos Eset Staff Account

    Joined:
    Nov 22, 2002
    Posts:
    14,456
    I wonder about several things:
    1. How was the test set chosen? Was it based on detection by certain AV programs?
    2. How were the samples verified? Did they test them to make sure they are actually functional and not just corrupt? For instance, advanced heuristics don't react to corrupt samples (see the sketch below).
    3. How did the testers prevent an AV program from winning the test by simply flagging every single file? Did they verify that the samples included in the set are not false positives?
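
    For illustration only, here is a minimal, hypothetical Python sketch of the kind of structural sanity check point 2 implies (a real lab would go much further and actually replicate the samples); it merely weeds out files that are not even valid PE executables:

    Code:
    import struct

    # Cheap structural check: DOS "MZ" magic plus "PE\0\0" signature.
    # A sample that fails this cannot run at all, so counting it as a
    # "miss" against heuristics-based products would be misleading.
    def looks_like_valid_pe(path):
        try:
            with open(path, "rb") as f:
                header = f.read(0x40)
                if len(header) < 0x40 or header[:2] != b"MZ":
                    return False
                # e_lfanew at offset 0x3C points to the PE header.
                pe_offset = struct.unpack_from("<I", header, 0x3C)[0]
                f.seek(pe_offset)
                return f.read(4) == b"PE\x00\x00"
        except OSError:
            return False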
     
  6. MarkW

    MarkW Registered Member

    Joined:
    Dec 24, 2006
    Posts:
    48
    I have an important question.

    Did they test the sweeping ability on an infected machine, or did they test the real-time protection on a machine exposed to these malware attacks from a second machine via the Internet or a network?

    Second, interrelated question: did they test the free or paid versions of these products? Many of the products (e.g. Malwarebytes) don't have real-time protection in their free trial versions.

    Knowing what I do about statistics (which is a lot), I have a very difficult time accepting the range of reported results, as well as the mean, median and approximate mode. Methodology is everything. Sometimes you read a report and it rings invalid. This one feels wrong. The shape of the curve is odd. Bell curves make statisticians sleep like babies. Things like this feel like fingernails on a chalkboard.

    Questions:

    1. Did you double-test the outliers to verify your test bed?
    2. Did you test antimalware against viral assaults?
    3. Did you use anonymously obtained, paid versions of each product?
    4. Would you hand your methodology off to someone else so that it could be repeated?
    5. Did you ask yourself why Malwarebytes, something well reviewed and never rogue-listed, would test so incredibly abysmally? It's either the product, your methodology, or the given execution of your test on that product.
     
    Last edited: Oct 29, 2008
  7. Saraceno

    Saraceno Registered Member

    Joined:
    Mar 24, 2008
    Posts:
    2,405
    As already mentioned, more questions are answered by the author on Comodo's forum:
    http://forums.comodo.com/feedbackco...cis/ssupdaters_antimalware_test-t29030.0.html

    In summary, people are most surprised by the low ratings/rankings of Nod32, AVG and VIPRE.

    And by the high ratings/rankings of A-squared.

    I don't think anyone needs to let loose at the author, as it is their test. The focus should be on discussing rather than simply dismissing. It would be better to have the author here discussing their results than to make them feel 'unwelcome'. There are plenty of tests around and not all of them are the same. So to put it into perspective, this is just one test. For example, I'm no expert, but maybe the author did not have Nod32, AVG and VIPRE set/configured to their optimal settings.

    Anyway, I would still use any of these programs and recommend them to others.

    But the author should provide/list information on their site, such as what is being provided in the Comodo forum, to avoid backlash. So in the Wilders forum's defense, the test appears to be lacking 'data' and testing-method information. And I agree, the authors should show more 'tact' in describing their 'under-performers', e.g. 'we hope product xyz does better in the next test'.
     
  8. Saraceno

    Saraceno Registered Member

    Joined:
    Mar 24, 2008
    Posts:
    2,405
  9. wildvirus88

    wildvirus88 Registered Member

    Joined:
    Feb 28, 2004
    Posts:
    331
    Nice to see Twister in the test. Kaspersky, F-Secure and Avira got good results.
    \o/
     
  10. cruelsister

    cruelsister Registered Member

    Joined:
    Nov 6, 2007
    Posts:
    1,649
    Location:
    Paris
    I have a feeling that, as this is an SSU test, those with specific problems/questions about it should go to their website to be satisfied. This wasn't, after all, a Wilders test.

    As to why Panda wasn't tested - I repeat my previous post - the AVs that were tested were selected by SSU's members. Panda just wasn't very popular.
     
  11. RubbeR DuckY

    RubbeR DuckY Developer

    Joined:
    Jul 7, 2006
    Posts:
    227
    This is probably the WORST test you could have performed. You should be embarrassed. Do you know nothing about malware?

    P.S. Where did you get the sample of 15-year-old malware?
     
  12. Saraceno

    Saraceno Registered Member

    Joined:
    Mar 24, 2008
    Posts:
    2,405
    For those not wanting to switch threads and read the other one:

    The author/administrator has updated their test method:

     
  13. Inspector Clouseau

    Inspector Clouseau AV Expert

    Joined:
    Apr 2, 2006
    Posts:
    1,329
    Location:
    Maidenhead, UK
    The test methods are laughable. They don't even replicate file infectors. What a joke. How do they make sure that the products they test are able to detect viruses in other files? Basically, you can just add all samples from a specific collection as full-file CRCs (or whatever) and detect every sample, but that doesn't protect the user in the end, because if a virus infects another file, the checksum will be completely different - it's another host program. Gosh...
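
    To make that concrete (a hypothetical sketch, not anything taken from the test itself): the same infector carried by two different host programs produces two completely different full-file hashes, so a checksum taken from one infected sample will never match the other:

    Code:
    import hashlib

    VIRUS_BODY = b"<the same parasitic code in every infection>"

    # Two different host programs carrying the identical infector:
    infected_a = b"...bytes of host program A..." + VIRUS_BODY
    infected_b = b"...bytes of host program B..." + VIRUS_BODY

    # The full-file checksums have nothing in common, so a signature
    # computed over one infected file tells you nothing about the other.
    print(hashlib.sha256(infected_a).hexdigest())
    print(hashlib.sha256(infected_b).hexdigest())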

    As for polymorphic viruses, you have to replicate them through several generations and take a snapshot from the middle (because then you know for sure that the virus was working, since it infected other files afterwards!)

    Regarding source code in the collection - since when is high-level compiler source (or even assembly source) considered to be malicious? The compiled result, the executable/binary, is, but not the source. Otherwise you would have to detect Word and Notepad files that contain the following:

    Hi there! I'm too dumb to write my own virus please erase some system files on your machine and give this document to a couple of friends and ask them for their support. Thanks for understanding and have a nice day!
     
  14. cruelsister

    cruelsister Registered Member

    Joined:
    Nov 6, 2007
    Posts:
    1,649
    Location:
    Paris
    Inspector C - I really think you are missing the point of the test. It never was to prove which product is the best (as no security product is 100% effective), but instead to steer SSU members away from marginal and poor products.

    I wish those who are so vocal here in their condemnation of the SSU test would contribute as much on the AV forums. I can't count how many times I've seen comments extolling the virtues of one AV or another because it is "light" on their systems (where was your expertise then?). At least the SSU folks can say they use a certain AV because it performs well against malware, and not because it has a pretty interface.
     
  15. geko

    geko Registered Member

    Joined:
    Jan 31, 2008
    Posts:
    35
    Looking at the results, they do not differ from what I already thought.

    Nice to see a FREEEEE product at the top: Comodo Internet Security.

    I expect VIPRE to get better; all I can say is, it's a new product (VIPRE™ Antivirus + Antispyware).

    Nod32, it was nice meeting you, maybe some day we will meet again.

    And why not say it to AV/AS/AM marketing: don't claim that a product does something when it doesn't. Well, wait a minute, it might do it, but others do it better.

    Great job (IMHO).
     
  16. doktornotor

    doktornotor Registered Member

    Joined:
    Jul 19, 2008
    Posts:
    2,047
    Riiiight, such "marginal" and "poor" products as, e.g., MBAM or SAS?! First "crapware", now "marginal and poor products" - what kind of insult comes next? Leaving the methodology details aside, comparing apples and oranges doesn't exactly fly, as this nonsensical test shows.

    Folks, really grow up first... :rolleyes: :thumbd:
     
  17. EraserHW

    EraserHW Malware Expert

    Joined:
    Oct 19, 2005
    Posts:
    588
    Location:
    Italy
    And how do you judge some products as "marginal and poor"? Using this test, I suppose. But if the test is not properly run (and you can read my, IC's and RubbeR DuckY's posts), then the test is only as valid as any user's opinion.
     
  18. Saraceno

    Saraceno Registered Member

    Joined:
    Mar 24, 2008
    Posts:
    2,405
    I think I can understand the debate about 'old and out-of-date samples' vs 'new and current samples'.

    Nosirrah (MBAM) mentioned gathering all the viruses found in the past month (as this is what affects current users) and then comparing the products, as this would be an unbiased test.

    I think that is fair.

    I don't know the proportion of old vs. new samples. But say this sample set (by SSUpdater) used 80% old viruses (7+ years old) and 20% new and current samples; then programs such as MBAM and VIPRE, which focus on protecting a user against everything out there today, are going to fare poorly in a test with mainly 'out-of-date' samples (rough numbers below).

    However, the out-of-date samples aren't affecting users' systems. So MBAM and VIPRE look bad in the test, but would actually protect users browsing and downloading from sites today.
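
    To put invented numbers on that 80/20 split, purely for illustration:

    Code:
    # Hypothetical detection rates for a scanner tuned for current threats:
    old_share, new_share = 0.80, 0.20
    detect_old, detect_new = 0.40, 0.95
    score = old_share * detect_old + new_share * detect_new
    print(f"{score:.0%}")  # 51% - looks awful, despite near-perfect
                           # protection against what users face today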

    :cautious:
     
  19. Pedro

    Pedro Registered Member

    Joined:
    Nov 2, 2006
    Posts:
    3,502
    Hard to get the point.

    Wrong (or fabricated) information, accusing AVs of making malware, weak/nonexistent methodology information (Marcos' questions illustrate this), and attacking AV Comparatives with no real arguments. Not to mention the "crapware" part.

    I don't even think AV Comparatives is that great, but comparing this to them is insulting.

    I would be embarrassed, but then again, I wouldn't do this.
     
  20. Sm3K3R

    Sm3K3R Registered Member

    Joined:
    Feb 29, 2008
    Posts:
    611
    Location:
    Wallachia
    This test seems to fit my own day-to-day usage experience, except for the first place. Avira, Avast, Kaspersky and BitDefender are in the top ten, and this means the tests are not so bad. AVG and Eset are also well placed; that is their value for the moment.
    I trust this test.
     
  21. doktornotor

    doktornotor Registered Member

    Joined:
    Jul 19, 2008
    Posts:
    2,047
    You cannot test products when you lack even a basic understanding of how they work and what their scope is, period. Well, actually you can, as this sad example proves - but then you inherently run a high risk of looking like a complete moron. :rolleyes:
     
  22. Saraceno

    Saraceno Registered Member

    Joined:
    Mar 24, 2008
    Posts:
    2,405
    Sorry, just quoting myself. :p

    Going by this same hypothetical, as we don't know the proportion of 'new viruses' used in this sample, a product which performed well in the past (but not so well now) might do extremely well in this (80/20) test.
     
  23. RejZoR

    RejZoR Lurker

    Joined:
    May 31, 2004
    Posts:
    6,426
    This is not entirely true. I can still get you malware that's over 5 years old on, let's say, P2P.
     
  24. Cooper_it

    Cooper_it Registered Member

    Joined:
    Jan 9, 2008
    Posts:
    6
    Strange that G Data 2009 is not in the test, as it took the #1 position in previous tests performed by those guys.
     
  25. nosirrah

    nosirrah Malware Fighter

    Joined:
    Aug 25, 2006
    Posts:
    560
    Location:
    Cummington MA USA
    I replied in their forum, said all that needed to be said, and issued a challenge that would give real results.

    People can read and come to their own conclusions; I trust the smarts of you guys over that test any day.

    As always, MBAM has no plans of bloating the DB to pass these pointless tests.
     