New Virus.gr Tests!

Discussion in 'other anti-virus software' started by ....., Apr 24, 2005.

Thread Status:
Not open for further replies.
  1. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    I think the results look strange, maybe due to reasons like the following:
    - I suppose the tests were again done on different days, so the products were not all updated on the same date (e.g. 2 April); some could have been updated two weeks later (and yes, the signatures can differ a lot in two weeks). I think this explains why some products which should have the same percentages show different ones.
    - the samples were chosen by using some scanners, which is a bad method (see the sketch below this list)
    - there is a category which includes: "Malware = Adware, DoS, Constructors, Exploit, Flooders, Hoax, Jokes, Nukers, Sniffers, Spoofers, Virus Construction Tools, Virus Tools, Corrupted, Droppers, Intended, PolyEngines." Maybe there are many jokes or corrupted samples included that only the original KAV detects (due to the inclusion of the x-files/extended databases or because KAV detects some garbage as infected).
    - etc.
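    A minimal sketch of why the second point matters, with purely hypothetical numbers and two made-up scanners (nothing here is taken from the actual test): if the test set is built by keeping only the files one scanner already flags, that scanner's measured rate is pushed toward 100% by construction, while every other scanner is also judged on the first scanner's blind spots.

```python
# Hypothetical illustration of sample-selection bias: if the test set is
# assembled with one scanner, that scanner scores 100% by construction.
import random

random.seed(0)

CANDIDATES = 10_000
# Two imaginary scanners that are equally good on the full candidate pool.
P_DETECT = {"scanner_A": 0.95, "scanner_B": 0.95}

detections = {
    name: {i for i in range(CANDIDATES) if random.random() < p}
    for name, p in P_DETECT.items()
}

# Biased test set: keep only the samples scanner_A flags.
test_set = detections["scanner_A"]

for name in P_DETECT:
    hits = len(detections[name] & test_set)
    print(f"{name}: {hits / len(test_set):.1%} on the scanner_A-selected set")
# scanner_A reports 100%; scanner_B reports roughly 95%, even though both are
# equally capable on the unbiased pool.
```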
     
  2. SDS909

    SDS909 Registered Member

    Joined:
    Apr 8, 2005
    Posts:
    333
    Of course, then add that to the other discrepancies I've noted, and we can discount the entire test as a bunch of useless numbers, computed incorrectly.

    Reading even more, it seems he "lumped" a bunch of useless crap into the Malware section, things such as jokes, constructors, etc. Well, duh, no wonder some AVs scored badly on that section; I'd be pretty pissed off if my AV detected jokes, constructors and whatnot. Sheesh.

    Also, his "File" section contains threats that are of no interest to Windows users. Hence, most of the AV's scored really bad on this section, severely slanting the results towards KAV engine products.

    If you take out the garbage in his test, products like NOD32 and DrWeb would suddenly score in the 98% range.

    As such, I have to discount this entire test.
     
  3. Blackcat

    Blackcat Registered Member

    Joined:
    Nov 22, 2002
    Posts:
    4,024
    Location:
    Christchurch, UK
    The 12-month free license they give out?
    Apparently, according to discussions here with FireFighter:

    "There is no error in the results. Drweb found 3% less dos viruses, 40% less malware files and 25% less script viruses.These are the reasons it had such a low rank."
     
  4. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    So now another question arises for me: did he use the evaluation versions? In evaluation mode DrWeb does not scan within archives, and as DrWeb considers HTML files as archives, this could be the reason why it scored lower in the mentioned categories. Or did he use valid licenses for all the products?
     
  5. Stephanos G.

    Stephanos G. Registered Member

    Joined:
    Mar 29, 2005
    Posts:
    720
    Location:
    Cyprus
    I think we have given that test more attention than it deserves. :)
     
  6. VirusP

    VirusP Registered Member

    Joined:
    Mar 23, 2003
    Posts:
    22
    Location:
    Athens, Greece
    To Happy Bytes: I did use all possible heuristics in NOD.

    As for the categories:

    File = BeOS, FreeBSD, Linux, Palm, OS2, Unix, BinaryImage, BAS viruses, MenuetOS.
    MS-DOS = MS-DOS and HLL* viruses.
    Windows = Win.*.* viruses.
    Macro = Macro and Formula viruses.
    Malware = Adware, DoS, Constructors, Exploit, Flooders, Nukers, Sniffers, Spoofers, Virus Construction Tools, Virus Tools, Corrupted, Droppers, Intended, PolyEngines.
    Script = BAT, Corel, HTML, Java, Scripts, VBS, WBS, Worms, PHP, Perl viruses.
    Trojans-Backdoors = Trojan and Backdoor viruses.
     
  7. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    8,251
    Location:
    The land of no identity :D
    "BeOS, FreeBSD, Linux, Palm, OS2, Unix, BinaryImage, BAS viruses, MenuetOS"

    I don't expect any Windows AV to detect non-Windows malware.

    16-bit MS-DOS viruses are no longer relevant in a world that's going to 64-bit...

    Malware - Anti-Virus programs are not primarily meant to tackle Adware, Spyware, PolyEngines, nukers etc.

    Detection of corrupted files is almost worthless.

    This category should be excluded from the tests in my opinion.
     
  8. Happy Bytes

    Happy Bytes Guest

    Since when do we consider corrupted viruses as malware? :eek:
     
  9. SDS909

    SDS909 Registered Member

    Joined:
    Apr 8, 2005
    Posts:
    333
    Exactly. Also, his list of "File" viruses is most annoying; why would he even include those when most of these products are specifically NOT targeting them?

    The "Malware" section is most dubious... Constructors? Corrupted Viruses? Jokes and Hoaxes? Please...

    This test does nothing but create false perceptions, misconceptions, and incorrect assumptions for potentially unknowing customers out there. Worse, it potentially damages the business of otherwise highly competent AV products. As such, this test has no merit for discussion other than to discount it and move on.
     
  10. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    8,251
    Location:
    The land of no identity :D
    Here's an update guys!

    http://www.virus.gr/english/fullxml/default.asp?ACT=17&grp=2&msg=614&vp=0

    "About the versions i used, i downloaded the available demo versions of all
    software, about a week before the beginning of the test. As for Drweb case, i
    had to use a valid lisence in order for the software to scan all files.

    P.S. The .xls and .pdf files of the test are corrected. Thank you for letting me
    know."

    I asked him the question IBK wanted to ask ;)
     
  11. SDS909

    SDS909 Registered Member

    Joined:
    Apr 8, 2005
    Posts:
    333
    Why bother, Firecat? I think most of the AV experts on this forum have shown that there are far too many questionable things about this test for it to merit any in-depth examination.

    Frankly, I don't think it deserves discussion to this length.
     
  12. rdsu

    rdsu Registered Member

    Joined:
    Jun 28, 2003
    Posts:
    4,537
    Another lame test...
     
  13. Diver

    Diver Registered Member

    Joined:
    Feb 6, 2005
    Posts:
    1,444
    Location:
    Deep Underwater
    It's lame if your favorite AV did not do well. It's no problem if your favorite AV did well. Ever notice how KAV and its clones always top the charts, no matter who is doing the testing and no matter what the methodology? I think that speaks volumes.

    Some of the weak ones like E-Trust are consistently weak, no matter who does the testing. So, I would not be so dismissive of these tests just because one or two of the local favorites did not meet expectations. Even good horses lose a race now and then.
     
  14. mnosteele

    mnosteele Registered Member

    Joined:
    Oct 19, 2003
    Posts:
    194
    Location:
    Chesapeake, VA USA
    Every single time anyone does an antivirus comparison, everyone flames the tester..... Why? The tests are not as flawed as everyone is making out; besides, there is no 100% foolproof test, and the best test is real-world experience. I think it's great to see people post these tests so that the average user can see what has an excellent detection rate and what doesn't.

    Personally, I think VirusP's tests are among the best because they are more realistic; he uses every type of malware out there.... something that every antivirus program should detect.... ALL malware. I hate when people use the excuse that an antivirus program is supposed to find only viruses, not trojans, spyware or anything else.... that's BS. The definition of a virus is: "A virus is a small piece of software that piggybacks on real programs. For example, a virus might attach itself to a program such as a spreadsheet program. Each time the spreadsheet program runs, the virus runs too, and it has the chance to reproduce (by attaching to other programs) or wreak havoc." Well, according to that definition, anything malicious is a virus..... including spyware.

    I don't call viruses "viruses"; I classify everything under malware, "malicious software", hence the term malware. It is all malicious and it all does harmful things to your PC in some shape or form. The classic "virus" is the least of your worries these days; spyware is the worst thing to hit the personal computer ever, we all know it and we have all seen the damage it does. Spyware costs the PC industry more money than any other single problem, and the average user only runs an antivirus program, so that's all the more reason it should detect everything.

    As for everyone stating "this antivirus app is a clone of KAV (or whomever), so it should have the same detection rate as KAV".... sorry, but you are all incorrect. For instance, Defender Pro, a supposed clone of KAV.... it has absolutely NO advanced features or customizing and does not allow for anything but updating and scanning..... no extended database or option to scan ALL files... just a scan. Just because a program uses the definitions from another doesn't mean it will have the same detection rate. Why do you think so many use KAV's databases? Because they're so good, but none of those products have the features that KAV has. If you want KAV (or whatever program), then buy it, not a "clone"; it's like anything else in life..... buy the original, not a knock-off.

    :) :cool:
     
  15. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    8,251
    Location:
    The land of no identity :D
    Yes, when KAV does well in every test, we can say things like:

    -KAV detects corrupted/garbage files
    -KAV detects Non-Windows and MS-DOS malware, and is good at it
    -KAV detects software which are really not malware (Riskware)

    However, we can also say that:

    -KAV is very good at overall malware detection

    That said, no true AV test will ever include such files (corrupted/garbage files, non-Windows/MS-DOS malware, riskware) in its test set.

    Remember,

    1) Corrupt/damaged files can never cause damage.
    2) The test is of Windows AVs, not Linux or other OSes.
    3) Riskware is not really malware.
     
  16. rdsu

    rdsu Registered Member

    Joined:
    Jun 28, 2003
    Posts:
    4,537
    I think you have to see for yourself the inconsistencies of the test, and some of the replies made by professionals in this topic...
     
  17. Tinribs

    Tinribs Registered Member

    Joined:
    Mar 14, 2002
    Posts:
    734
    Location:
    England
    Most AV producers and security product developers do not suddenly become very good at detection when they previously were not, nor do they suddenly become very bad.

    These periodical tests mean little to me, as the 'pecking order' changes very little each time a 'test report' is filed.

    If you can afford it, and have a capable system spec, then I suggest you go for one of the leading programs, as they are consistently up there.

    If an AV program suddenly climbs to the top from where it was previously ranked pretty low, then either the tester's abilities are to blame or the program really has hit upon a revolutionary approach to malware detection.

    Basically, my view is that if it works for you then continue letting it work, be clean be safe and be ready.


    Tinribs
    :) Kev
     
  18. richrf

    richrf Registered Member

    Joined:
    Dec 11, 2003
    Posts:
    1,907
    Hi all,

    What I take away from these tests, as well as my readings of other tests are:

    1) KAV's (and its clones) detection rates are excellent and it should be on everyone's review list.

    2) There are a number of excellent AV's that consistently do well and should also be reviewed, if the top rated ones do not fit the bill. Anyone can look at these tests (and others) and very quickly determine which other AVs should be evaluated along with KAV.

    3) All top-rated AVs have excellent anti-trojan detection, but for those who like layered protection, it seems like Ewido is doing a terrific job nowadays. In time, it would be great to see how BOClean stacks up.

    As such, I am pleased that I have chosen KAV and Ewido, based upon previous tests and overall experience with these products and others over a three-year period. I am comfortable that this combination, together with ZoneAlarm Pro (no anti-virus), ProcessGuard and RegDefend, creates a very nice security system.

    I think each user, and each person who is recommending products to friends, associates, acquaintances, and fellow forum members, should evaluate each product based upon a variety of tests and experiences and come up with an architecture that best fulfills security objectives without compromising required performance. For example, NOD32 + Ewido seems like an excellent combination with reasonable overlap, but it would be difficult for me to recommend NOD32 as a stand-alone defense at this time.

    Thanks for sharing the tests and the results with me.

    Rich
     
  19. SDS909

    SDS909 Registered Member

    Joined:
    Apr 8, 2005
    Posts:
    333
    Why is it difficult to understand that one of the reasons KAV's detection rates are good is that it detects junk? Don't make me dig out the thousands of files I have that KAV detects as threats and which are clearly garbage.

    1) Many sites use KAV to verify samples; that in itself grossly slants the test results in favor of the product used to verify them!

    2) When some threats consist of 1 legitimate threat file and 10 extra useless files, KAV detects all of the useless files as the same threat, while other AVs tend, rightfully, to detect the main threat and ignore the other files except for cleanup. (For example, I have many threats that leave harmless INI files and junk around; KAV labels the INI files the same as it labels the trojan DLL itself.) This is incorrect!

    3) In this test, the Malware and File sections are DUBIOUS at best, and clearly most AVs purposely don't score well in those categories - why should they? Constructors, joke programs, hoaxes, and corrupted viruses are *NOT* real threats. If you take out those categories, and the DOS viruses, then the differences between the top 20 AV products are very slight (see the rough calculation at the end of this post).

    I can't understand why more people don't grasp some of the basics of this. Every time a new test comes out, I think most of us in the AV business are left scratching our heads - because we know that these incorrect tests actually carry weight with consumers who don't know better.

    Finally, many things should be considered when picking an AV: size in memory, size on disk, drag on system performance, interface style, detection rate against common threats, etc. The last thing I think most people should consider is how many dead "zoo" threats from 20 years ago a product detects.
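    To make the third point concrete, here is a rough back-of-the-envelope calculation with entirely hypothetical percentages (not taken from the test): when an overall score averages across categories, a category full of jokes, constructors and corrupted files can drag a product's headline number down even if its detection of real Windows threats is excellent.

```python
# Hypothetical example of how dubious categories depress an overall score.
# Every percentage below is invented for illustration only.
categories = {
    "Windows":           0.99,
    "Macro":             0.98,
    "Script":            0.97,
    "Trojans-Backdoors": 0.98,
    "Malware (jokes, constructors, corrupted, ...)": 0.60,
    "File (non-Windows platforms)":                  0.55,
}

def overall(scores):
    """Simple unweighted average across categories."""
    return sum(scores.values()) / len(scores)

with_junk = overall(categories)
without_junk = overall({k: v for k, v in categories.items()
                        if not k.startswith(("Malware", "File"))})

print(f"Overall including dubious categories: {with_junk:.1%}")    # ~84.5%
print(f"Overall excluding dubious categories: {without_junk:.1%}") # 98.0%
```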
     
  20. richrf

    richrf Registered Member

    Joined:
    Dec 11, 2003
    Posts:
    1,907
    Hi SDS909,

    I was involved for many years in industry tests, so I can definitely understand your frustration if you believe that all of the tests are skewing consumers toward the wrong products. Do you have any evidence, either architectural or test-oriented, that one of the other, non-KAV products has better detection rates? I used to provide "architectural" evidence to my customers when test data was either incorrect or skewed. So I think some evidence must be presented by anyone who has contrary opinions. Frankly, I would even accept personal experiences, if for example someone told me that Norton continued to leak (as it did for me) and KAV (or some other product) never did.

    Cya,
    Rich
     
  21. Diver

    Diver Registered Member

    Joined:
    Feb 6, 2005
    Posts:
    1,444
    Location:
    Deep Underwater
    Vampiric Crow-

    I did not say the test was perfect, just that there is enough correlation with other tests like AV Comparatives, and even what Fire Fighter does around here, to show that it is not invalid.

    Apparently, you feel that if there is any inconsistency the entire test is invalid. I don't agree. If you ask two people who witnessed the same event for their story and both give exactly the same story, that is a strong indication they are both liars. If a few things are different, that is OK.

    So, how did your favorite AV do?
     
  22. richrf

    richrf Registered Member

    Joined:
    Dec 11, 2003
    Posts:
    1,907
    One thing I do not like about these tests is that it is not at all apparent how difficult it is to uninstall some of these products once someone has completed an evaluation. So users may go into an evaluation stage not knowing that they are installing programs that may be very, very difficult to uninstall. Norton, for example, is notorious for this. The alternate data streams (ADS) that KAV 5.0 Pro places on the hard drive are also almost impossible to get rid of, should one want to uninstall KAV 5.0 Pro (not an unusual occurrence). Pro, unlike Personal, does not have an option to suppress ADS during installation, so the ADS are placed on the hard drive before anyone has a chance to stop them (a small illustration of what an ADS is follows below).
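    For readers unfamiliar with the term: ADS are NTFS alternate data streams, extra named streams attached to a file that ordinary directory listings and file-size reports do not show. Below is a minimal sketch of the mechanism on a Windows/NTFS system, using a hypothetical file name; it only illustrates what an alternate stream is, not anything specific to KAV's installer.

```python
# Minimal Windows/NTFS sketch of an alternate data stream (ADS).
# Hypothetical file name; run on an NTFS volume.
import os

with open("example.txt", "w") as f:
    f.write("visible main stream")

# Writing to "name:stream" creates an alternate data stream on NTFS.
with open("example.txt:hidden", "w") as f:
    f.write("data tucked into an alternate stream")

# The reported size covers only the main stream, so the ADS is easy to miss.
print(os.path.getsize("example.txt"))              # size of main stream only
with open("example.txt:hidden") as f:
    print(f.read())                                # the ADS is still readable

# Enumerating streams needs an ADS-aware tool, e.g. PowerShell:
#   Get-Item .\example.txt -Stream *
```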

    It would be nice if users had some indication of these issues before they undertake a comparative evaluation.

    Rich
     
  23. rabmail

    rabmail Registered Member

    Joined:
    Feb 11, 2005
    Posts:
    47
    Location:
    Phuket, Thailand and Jakarta, Indonesia
    Rich, I agree with you that anybody evaluating an AV product and publishing the results should also include something about what it takes to uninstall the program.

    I don't agree that it is almost impossible to remove the ADS that KAV 5 places on the hard drive. The Kaspersky removal tool works fine for me.

    Probably the easiest AV program to remove is NOD.

    Dick
     
  24. Honyak

    Honyak Registered Member

    Joined:
    Jul 19, 2004
    Posts:
    346
    Location:
    Deep South
    I do not see how you can claim that tests are slanted toward Kaspersky; just about all tests confirm that Kaspersky has the best detection rates (by the way, I do not use KAV). On Jotti's site KAV rarely misses; now, Jotti's may not be a reliable indication of an AV's ability, but the fact remains that KAV rarely misses malware on that site.
    While these test results may carry some weight with a small group of people who do not know better, I don't think they have a profound effect on the vast majority. The members of this forum are going to choose the most efficient AV program for their computer - a sort of cost-to-benefit decision based on the criteria you mentioned above or their surfing habits (high-risk or low-risk surfers) - and the rest are going to use whatever comes on their computer.
    The virus.gr test may be flawed, but I still like to see different results, whether done by an expert or a novice.
    Look at it this way: even if KAV flags junk, it helps you keep your comp clean. LOL.
     
  25. Firefighter

    Firefighter Registered Member

    Joined:
    Oct 28, 2002
    Posts:
    1,670
    Location:
    Finland
    The more I read these anti-KAV posts, which seem more like a religion now, the more I become convinced that detection rate tests are only the second priority nowadays; the first priority is to make proper false positive (FP) tests. VirusBulletin, ICSA-Labs and Westcoastlabs-Checkmark have all failed in their FP tests by still accepting KAV and its clones so often. Also, av-test.org failed in its clean-file tests by showing KAV and its clones with very few FPs, even though NOD, DrWeb and BitDefender were all able to produce multiple FPs.

    Best regards,
    Firefighter!
     