VB 100 awards for this month!

Discussion in 'other anti-virus software' started by Technodrome, Jul 30, 2003.

Thread Status:
Not open for further replies.
  1. Technodrome

    Technodrome Security Expert

    Joined:
    Feb 13, 2002
    Posts:
    2,140
    Location:
    New York
    VB 100 awards for this month:

    CA Vet NetWare AntiVirus,
    DialogueScience Dr.Web, --- 100% all categories
    Eset NOD32, --- 100% all categories
    Kaspersky AntiVirus,
    NAI McAfee NetShield,
    Norman FireBreak,
    Sophos Anti-Virus,
    Symantec AntiVirus, --- 100% all categories
    VirusBuster VBShield


    tECHNODROME
     
  2. jdong

    jdong Registered Member

    Joined:
    Jul 21, 2003
    Posts:
    13
    Location:
    At DSLReports...
    Note -- This is the VB for NETWARE ;)
     
  3. Technodrome

    Technodrome Security Expert

    Joined:
    Feb 13, 2002
    Posts:
    2,140
    Location:
    New York
    It's a great way to see AV product detection rates across all platforms. ;)


    tECHNODROME
     
  4. JimIT

    JimIT Registered Member

    Joined:
    Jan 22, 2003
    Posts:
    1,035
    Location:
    Denton, Texas
    TD,

    How many AVs were submitted?
     
  5. Technodrome

    Technodrome Security Expert

    Joined:
    Feb 13, 2002
    Posts:
    2,140
    Location:
    New York
    11.



    tECHNODROME
     
  6. jdong

    jdong Registered Member

    Joined:
    Jul 21, 2003
    Posts:
    13
    Location:
    At DSLReports...
    Yes, I know. I value that, too. But when I first looked at it, I thought:


    (given) Some version of Windows was tested.
    (1) Avast failed
    (2) F-Secure failed
    (3) F-Prot failed
    (4) AVG failed
    (5) A lot more than normal failed...
     
  7. zazou112

    zazou112 Guest

    hello,

    how do you tell which antivirus passed ALL the categories? o_O


    what is the web page with all the details for each antivirus? o_O


    thank you
    zouave
     
  8. Technodrome

    Technodrome Security Expert

    Joined:
    Feb 13, 2002
    Posts:
    2,140
    Location:
    New York
    ITW, zoo, macro and polymorphic tests. See the Virus Bulletin test criteria.


    You need to be a subscriber to Virus Bulletin magazine to see this.


    tECHNODROME
     
  9. Bouton

    Bouton Guest

    Dr.Web missed 3 viruses and gave 12 false positives.

    Symantec AntiVirus missed 35 viruses.

    Rod Fewster (NOD32 Australia, formerly Kaspersky Australia, and before that ThunderBYTE Australia) has published many pieces exposing how misleading the "bare-bones" VB 100 figures are. It is clearer now than before, but unless you subscribe to Virus Bulletin, you will still be misled by the "bare-bones" figures.

    Snake oil is a sad fact of life in anti-virus marketing. Dr.Web, Symantec and Kaspersky alike are praising themselves in superlatives today, as if they had achieved perfection in the testing, but their words bend the truth a little.

    Only NOD32 achieved a genuine 100% detection of all viruses in all tests in all categories in the August 2003 VB 100, with no false positives.
     
  10. Blackcat

    Blackcat Registered Member

    Joined:
    Nov 22, 2002
    Posts:
    4,024
    Location:
    Christchurch, UK
    I have made a general comment about the Virus Bulletin Pass award status here:


    http://www.wilderssecurity.com/showthread.php?t=11872


    I presume the 100% award is for ITW detection and the viruses that Symantec missed were in other categories, e.g. zoo? However, it does state that Symantec achieved 100% detection in ALL categories. How do these two conflicting statements match up?

    This is very confusing, for newbies in particular, when trying to select the best AV. As long as running the program does not bring your computer to its knees, the virus detection ability of a particular AV is probably its most important property.

    The Virus Bulletin comparison tables should therefore come with some sort of health warning, as these 'summary' tables are highly misleading.

    I would much rather my primary AV show some false positives than miss some viruses.

    At least the other 2 well-known testing sites (University of Hamburg and AV-Test.org)

    http://agn-www.informatik.uni-hamburg.de/eng.htm

    http://www.av-test.org/

    give you a little more idea of the general picture of testing.

    Overall, it's probably best to look at a variety of test sites, forums, etc. to get some sort of idea, and then you can trial the particular AV to see how well or not it sits on your system.

    I initially found the recent posts confusing, as it was not made clear that this was the NetWare test, and the Virus Bulletin page still only lists the June XP results as its latest.
    Although it could be related to old age!

    Moreover, it is a pity that Bouton did not post first, as the true picture only unfolded once he did.

    PS If you take a look over at DSLReports you can see that I was not the only one confused ;) :D
     
  11. Acadia

    Acadia Registered Member

    Joined:
    Sep 8, 2002
    Posts:
    4,332
    Location:
    US
    LOL Yeah, you can tell that this is a NOD support site; it can get a little thick in here at times. NOD is indeed a superior product, but not the only one!

    Acadia.
     
  12. Bouton

    Bouton Guest

    Unfortunately, the choice of anti-virus testers is extremely limited if you want professional reporting. The University of Hamburg and University of Magdeburg testers are amateurs who produce amateur results. Virus Bulletin is IMHO the only truly honourable, credible and impartial anti-virus tester in the world today. I have no respect for paid tests, because they leave too much room for manipulation of the results.

    Virus Bulletin is not perfect, but I admire its honesty. Mistakes can be made in any test, but Virus Bulletin always admits any mistake it makes, e.g. in the August 2003 issue, Virus Bulletin confesses to a blunder in its May 2003 Linux tests. On the other side of the coin, www.av-test.org made a huge and very obvious blunder in a recent test, but adamantly refuses to admit it. Compare the honesty and integrity of the two testers, and decide for yourself where to place your faith.

    Security Forums like this one can be helpful, but do not believe everything you read in them without question. For all you know, I could be lying about the Virus Bulletin results, and relying on the fact that the odds against another subscriber being here to contradict me are high. :D
     
  13. Blackcat

    Blackcat Registered Member

    Joined:
    Nov 22, 2002
    Posts:
    4,024
    Location:
    Christchurch, UK
    When I commented about 'a true picture', I was referring to your fuller reporting of the Virus Bulletin facts.

    I would not rely solely on the results given by any ONE source.

    Unlike yourself, I cannot judge whether the other 2 testing sites I posted are run by 'amateurs', or whether the recent av-test.org testing was faulty. How can you be so certain?

    We only have YOUR views here.

    It would be nice if you registered and gave an honest indication of your background, so people can judge the fairness of your posts.

    In addition, I would still like an answer to the following:

    How could Symantec score 100% in ALL categories, yet miss 35 viruses?
     
  14. eda

    eda AV Expert

    Joined:
    Jan 29, 2003
    Posts:
    6
    I can't understand such a statement. The latest comparative review ran under NetWare, and you can see that some of the mentioned AVs weren't tested, so they couldn't have failed at all - see: http://www.virusbulletin.com/vb100/archives/products.xml?table

    There is a difference between the "Pass", "Fail" and "No entry" marks. Some vendors take a "No entry" because they don't support the tested platform; others support it but decided that it's much better to be listed under "No entry" than under a "Fail" mark, which could destroy a long-term run of "Pass" results.

    avast! is not available under NetWare, so it couldn't be marked with a "Fail" sign.

    Eduard
     
  15. Firefighter

    Firefighter Registered Member

    Joined:
    Oct 28, 2002
    Posts:
    1,670
    Location:
    Finland
    To Bouton from Firefighter!

    I don't think that testers other than Virus Bulletin are amateurs. VB does not test trojans or other nasties like av-test.org does, so how can you see detection rates against that kind of malware in VB tests?

    I think that the av-test.org testbed is wider than VB's, but I haven't seen it in full, only the summary numbers for each category!

    I manually counted the number of viruses in the macro, polymorphic and standard categories of the VB WinXP June 2003 test from the link below.


    http://www.virusbtn.com/old/comparatives/WinXP/2003/test_sets.html

    You can check whether my counting is right, but I don't think there are any mistakes big enough to matter.



    Total number of viruses in each category, VB WinXP June 2003:


    Macro: a total of 1 004 viruses in 4 106 samples; for example, "W97M/Vampire.J 4", which I counted as one virus in 4 samples.

    Polymorphic: a total of 43 viruses with 15 140 modifications; for example, "Russel.3072.A 500", which I counted as one virus with 500 modifications.

    Standard: a total of 551 viruses in 1 667 samples; for example, "W32.Tuareg.B 8", which I counted as one virus in 8 samples, and so on.


    After that I got the summary results like this:


    Testing summary in all categories: 1 598 viruses in 20 913 samples or modifications.
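
    In case anyone wants to repeat the tally, here is a minimal sketch in Python of the counting approach described above. It assumes every test-set entry is a line of the form "virus name" followed by a trailing sample count (e.g. "W97M/Vampire.J 4"); that line format and the helper name are my own assumptions based on the examples quoted here, not Virus Bulletin's documented format.

    # A minimal sketch of the manual tally described above, assuming each
    # test-set entry looks like "<virus name> <sample count>", as in the
    # quoted examples. The line format is an assumption, not VB's spec.
    def tally(entries):
        viruses, samples = 0, 0
        for entry in entries:
            name, count = entry.rsplit(None, 1)  # split off trailing count
            viruses += 1                         # one distinct virus per entry
            samples += int(count)                # its samples/modifications
        return viruses, samples

    # The three examples quoted above:
    print(tally(["W97M/Vampire.J 4"]))   # (1, 4)
    print(tally(["Russel.3072.A 500"]))  # (1, 500)
    print(tally(["W32.Tuareg.B 8"]))     # (1, 8)

    # Run over the full macro, polymorphic and standard test sets, this
    # should reproduce 1 004/4 106, 43/15 140 and 551/1 667, i.e. 1 598
    # viruses in 20 913 samples or modifications.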


    An example: VB WinXP June 2003, on-demand; viruses missed by Hauri ViRobot Exp 4.0 (VB 100% Award):


    Macro missed: 43
    Polymorphic missed: 10 795
    Standard missed: 530
    Total missed: 11 368

    How can Hauri miss 10 795 polymorphic viruses when the total number of polymorphic viruses was only 43? I think it doesn't miss 10 795 viruses, but rather modifications of those 43 viruses.
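
    A quick arithmetic check (my own, using only the figures quoted above) supports that reading:

    # Sanity check on the quoted Hauri figures (my own arithmetic):
    missed = {"macro": 43, "polymorphic": 10795, "standard": 530}
    assert sum(missed.values()) == 11368  # the quoted total adds up

    # 10 795 is far more than the 43 distinct polymorphic viruses but fits
    # within their 15 140 modifications, so the polymorphic "missed" figure
    # must be counting modifications, not distinct viruses.
    assert 43 < missed["polymorphic"] <= 15140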

    Personally, I don't know whether 1 598 different viruses are enough to test AV programs, but I doubt it!



    "The truth is out there, but it hurts!"

    Best Regards,
    Firefighter!
     
  16. Trevor Marsh

    Trevor Marsh Guest

    I am also somewhat sceptical of the value of VB100 awards. I am not a subscriber, so I don't have access to the detailed reports and have to rely on the published figures. The problem I have is that they either don't test all supported OSes every month, or don't publish the results for all OSes every month. After all, I run WinXP Home, and it doesn't matter one jot to me if, say, NOD32 failed a Win2k-based test, as I'm not running that OS. All I care about is whether the AV I am running consistently scores 100% in all categories on the OS that I'm using. So I would be much happier if VB100 actually tested all OSes supported by AV products every month.
     
  17. sig

    sig Registered Member

    Joined:
    Feb 9, 2002
    Posts:
    716
    VB is a magazine primarily directed at industry pros rather than home users. It doesn't test every single month, at least according to the archives. And when it does test, it's for one kind of system. Vendors are notified in advance which OS will be used for the upcoming test and can decide whether they want to submit their products (and which versions) to be tested. As mentioned previously, not all vendors support all systems, or the system up for testing, and sometimes, even if they do, they may choose not to submit their product for testing.

    Also, AV testing done in a professional manner is not some quickie endeavour, or so I understand. It takes time and resources, and VB is not simply a testing organization, although that's what it seems to be known for thanks to forums like these. Testing all types of systems for all AVs monthly? Does anyone do that?

    Paul Wilders is on vacation, as I recall, but Wilders does test the AVs and ATs they review. Since it's done by volunteers, they limit the products they test and the frequency thereof, I believe. Perhaps when he returns from his vacation he could describe some of what's involved in credibly testing such products.

    XP was most recently tested in June 2003. And W2K is sufficiently comparable that its test results should not be qualitatively irrelevant for XP.
     