Do NOD32 Personnel Agree With the Virus Test by GEGA IT-Solutions?

Discussion in 'NOD32 version 1 Forum' started by agoretsky, May 1, 2003.

Thread Status:
Not open for further replies.
  1. _anvil

    _anvil Guest

    I agree. :)
    ITW malware spreads fast (almost) by definition, so it attracts the attention of AV vendors quickly, and they can add signatures to their databases (whether packed or not). Still, good unpacking capabilities _can_ even help here, e.g. if a known worm is simply packed and redistributed (and there are examples of this!)


    Yes, a single non-spreading, uniquely packed trojan is not a great threat to the whole 'internet community.' ;)
    But ironically, _that_ is the big problem with non-spreading malware, which even kids can easily render 'undetected': it will likely never be added to AV signature databases.
    You cannot 'rate' this threat by single malware samples (like the one supposedly meant to "scare one's own granddad") - you have to look at the whole picture: how easy is it to create hundreds (thousands, millions, ...) of malware samples that will remain undetected for (probably) a long time?
    No _single_ uniquely modified trojan will do much damage... but many of them will...

    So, if you want to deal with trojans as an AV vendor, you have to cope with runtime compressors, crypters, hex-editing, patching, ... :rolleyes:
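
    A quick sketch of that point - why a plain scan string stops matching once a file is run through a packer - in toy Python (zlib stands in for a runtime compressor, and the "signature" is hypothetical, not any vendor's):

        import zlib

        SIGNATURE = b"known-worm-body"  # hypothetical scan string

        def naive_scan(data: bytes) -> bool:
            # A signature-only scanner: plain substring match.
            return SIGNATURE in data

        sample = b"MZ...header..." + SIGNATURE + b"...payload..."
        packed = zlib.compress(sample)  # stand-in for a runtime compressor

        print(naive_scan(sample))                   # True: signature is visible
        print(naive_scan(packed))                   # False: same malware, now missed
        print(naive_scan(zlib.decompress(packed)))  # True: unpacking restores detection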
     
  2. Krazykidjoe

    Krazykidjoe Guest

    Hey, someone sent me a herpes virus in the mail and NOD missed it. What's up with that?? LOL... Had to bring some humor to this thread, since I'm the one that started it all. Sorry folks..
     
  3. Tinribs

    Tinribs Registered Member

    Joined:
    Mar 14, 2002
    Posts:
    734
    Location:
    England
    :D Don't you worry Krazy, it's been on the cards for a long while now ;)

    And I heard Yoghurt is good for Herpes (or is that thrush??) ;)
     
  4. Hello!

    We performed a _small-scale_ (internal) heuristic test just a few days ago. For this, we used OLD (non-updated) virus scanners and ran them against the most recent viruses. What does such a test show? Whether the *heuristic* (some also call it *generic*) detection works well. Good unpacker (decompressor) engines also help to improve detection. These kinds of tests are called "retrospective tests". This test was only performed with two sets of old databases (three- and six-month-old scanners) against different new ItW viruses which can be found on the WildList (see below for more details). As a reference, I can tell you that all scanners (including Nod32, of course!) were able to find 100% of the ItW files from this test with the most current update.

    Again: that's a small-scale test, and the only conclusion should be "we need to regularly update the av program". It is not meant to show which av program is best, because the test did not provide statistically useful results. We will perform a much bigger test quite soon (we'll start on Monday next week) - with more scanners (25+) and many more product versions (about 30 that we've collected during 2002 and parts of 2001 and 2003). This bigger test will give a much better impression than this very short one.

    To the background of this test: in the past months, we have already collected lots of updates of nearly all av programs (e.g. all updates NAI, Symantec, Norman & Co. have released). However, Nod32 was not included in our collection, so I've asked Anton several times if it would be possible for him to send us the missing program versions, but I never got a useful answer about this. (He told me that they have a copy of all updates ever released.)

    But he doesn't want to send me the files I need? Does he want to hide something in Nod32? Anyway, if we get the updates fast enough, I'd be happy to include Nod32 in this retrospective (heuristic) test, too! Of course, everyone can easily check our results - they will be provided in the form you can find below: the full set of infected samples (for virus researchers only), log files (for everyone who is interested), and an XLS/CSV sheet (for everyone).

    Here is the posting I wrote a few days ago for another (more internal) newsgroup, announcing these small-scale test results. BTW, please excuse my relatively bad English; I know it's far from perfect. ->


    Hello!

    For the fans of heuristic tests - here is another one, with three- and six-month-old scanners. However, this time we have NOT used zoo viruses, but *only* ItW viruses!

    The results are presented in absolute values (xx out of 46 different *new* WL viruses were detected), not in %. The first value shows the detection rate of a six-month-old (2002-09-23) scanner, the second value that of a three-month-old (2002-12-16) scanner. WildLists 2002-10 (released on 2002-10-31) up to 2003-03 (released on 2003-04-03) were used. The test set was the same for the three- and six-month-old scanners (the three-month-old scanners therefore already detected a few more WL viruses, because they had been updated with more signatures; however, this has no influence on the rankings below).

    Reference date: 2003-03-27 (the date of the main test for all other results besides heuristics, by which all scanners detected all of the viruses up to WildList 2003-02). Test platform: Windows XP (SP1), English.


    OK, here we go (again: xx out of 46 viruses were detected):

    AVK (Kaspersky+RAV): 19 / 35
    F-Secure (Kaspersky+F-Prot+Orion): 15 / 33
    Kaspersky: 15 / 32
    McAfee: 21 / 34
    Norton: 15 / 30
    Panda: 12 / 30
    RAV: 19 / 35
    PC-cillin: 15 / 29


    Therefore, the ranking would be for the six month old scanners (best first):
    1. McAfee (21)
    2. AVK, RAV (19)
    4. F-Secure, Kaspersky, Norton, PC-cillin (15)
    8. Panda (12)

    And here the ranking for the three month old scanners (best first):
    1. AVK, RAV (35)
    3. McAfee (34)
    4. F-Secure (33)
    5. Kaspersky (32)
    6. Norton, Panda (30)
    8. PC-cillin (29)


    Please note that this was only one very small-scale test!!! It does not show much about how well the heuristics of a program really work. I'm currently working on more reliable results (with much more data (different updates) and many more programs). But this can take a few more weeks/months... ;-)


    For those who like details - here are the results as a CSV file ("0" means not detected; "1" means detected). You can also play a bit with this sheet to get more results (e.g. how many programs were able to detect the Opasoft variants in advance, etc.) - see the short script after the table for one way to start.

    ---start---

    "# Small-Scale Heuristic Test, Copyright (c) 2003 Andreas Marx, AV-Test.org, University of Magdeburg, Germany - Last update: 2003-04-23 #";;;;;;;;;;;;;;;;;;

    Name of the product;;;AVK;AVK;FSAV;FSAV;Kaspersky;Kaspersky;McAfee;McAfee;Norton;Norton;Panda;Panda;ReliableAV;ReliableAV;PC-cillin;PC-cillin
    Last updated at;;;2002-09-23;2002-12-16;2002-09-23;2002-12-16;2002-09-23;2002-12-16;2002-09-23;2002-12-16;2002-09-23;2002-12-16;2002-09-23;2002-12-16;2002-09-23;2002-12-16;2002-09-23;2002-12-16
    WildList#;WildCore released at;;;;;;;;;;;;;;;;;
    2002-10;2002-10-31;VBS/Redlof.A;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1
    ;;Win32/Bugbear.A;0;1;0;1;0;1;0;1;0;1;0;1;0;1;0;1
    ;;Win32/Chir.A;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1
    ;;Win32/Opasoft.A;0;1;0;1;0;1;0;1;0;1;0;1;0;1;0;1
    ;;Win32/Opasoft.B;0;1;0;1;0;1;0;1;0;1;0;1;0;1;0;1
    ;;Win32/Opasoft.D;0;1;0;1;0;1;0;1;0;1;0;1;0;1;0;1

    2002-11;2002-12-03;Win32/Braid.A;0;1;0;1;0;1;0;1;0;1;0;1;0;1;0;1
    ;;Win32/Opasoft.E;0;1;0;1;0;1;0;1;0;1;0;1;0;1;0;1
    ;;Win32/Opasoft.G;0;1;0;1;0;1;0;1;0;1;0;1;0;1;0;1

    2002-12;2003-01-05;W97M/Bottra.C;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1
    ;;Win32/Dupator.1503;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1
    ;;Win32/Opasoft.H;0;1;0;1;0;1;0;1;0;1;0;1;0;1;0;1
    ;;Win32/Oror.B;1;1;0;1;0;1;1;1;0;1;0;1;1;1;0;1
    ;;Win32/Parite.A;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1
    ;;Win32/Winevar.A;0;1;0;1;0;1;0;1;0;1;0;1;0;1;0;1
    ;;Win32/Yaha.C;1;1;1;1;1;1;1;1;1;1;0;0;1;1;1;1

    2003-01;2003-02-10;DOS/Fono;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1
    ;;Win32/CIH.1049;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1
    ;;Win32/ExploreZip.L;1;1;0;0;0;0;0;0;0;0;0;0;1;1;0;0
    ;;Win32/Fono;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1
    ;;Win32/Frethem.K;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1
    ;;Win32/Hantaner;0;1;0;1;0;1;0;0;0;1;0;1;0;1;0;0
    ;;Win32/Holar.C;0;1;0;1;0;1;0;1;0;1;0;1;0;1;0;1
    ;;Win32/Lioten;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0
    ;;Win32/Lirva.A;1;1;0;0;0;0;0;0;0;0;0;0;1;1;0;0
    ;;Win32/Lirva.C;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0
    ;;Win32/Lorez.1766;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1
    ;;Win32/Opasoft.F;0;1;0;1;0;1;0;1;0;1;0;1;0;1;0;1
    ;;Win32/Opasoft.M;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0
    ;;Win32/Opasoft.N;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0
    ;;Win32/Opasoft.R;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0
    ;;Win32/Sobig;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0
    ;;Win32/Surnova.E;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1
    ;;Win32/Whog.878.B;1;1;1;1;1;1;1;1;1;1;0;1;1;1;1;1
    ;;Win32/Yaha.J;0;1;0;1;0;1;1;1;0;0;0;0;0;1;0;0
    ;;Win32/Yaha.K;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0

    2003-02;2003-03-04;Win32/Bodgy;1;1;1;1;1;1;1;1;1;1;0;1;1;1;0;0
    ;;Win32/Opasoft.Q;0;1;0;1;0;1;0;1;0;0;0;0;0;1;0;0
    ;;Win32/Yaha.L;0;0;0;0;0;0;1;0;0;0;0;0;0;0;0;0

    2003-03;2003-04-03;W97M/Ekiam.A;1;1;0;1;0;0;1;1;0;0;0;1;1;1;1;1
    ;;Win32/Gibe.B;0;0;0;0;0;0;1;1;0;0;0;0;0;0;0;0
    ;;Win32/Lovgate.C;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0
    ;;Win32/Opasoft.C;0;1;0;1;0;1;0;1;0;1;0;1;0;1;0;1
    ;;Win32/Opasoft.G;0;1;0;1;0;1;0;1;0;1;0;1;0;1;0;1
    ;;Win32/Oror.Y;0;0;0;0;0;0;1;1;0;0;0;0;0;0;0;0
    ;;Win32/Parite.B;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1;1

    ;;SUM TOTAL:;19;35;15;33;15;32;21;34;15;30;12;30;19;35;15;29

    ---stop---
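
    If you want to play with the sheet programmatically, here is a short Python sketch (assuming you saved the block above as "heuristic_test.csv" - the file name is just an example) that recomputes the per-product totals and pulls out the Opasoft variants:

        import csv

        # Read the semicolon-separated sheet, dropping fully empty rows.
        with open("heuristic_test.csv", newline="") as f:
            rows = [r for r in csv.reader(f, delimiter=";") if any(r)]

        products = rows[1][3:]   # "Name of the product" row: 8 products x 2 ages
        updated = rows[2][3:]    # "Last updated at" row: matching update dates
        data = [r for r in rows[4:]
                if len(r) > 3 and r[2] and not r[2].startswith("SUM")]

        # Recompute the SUM TOTAL row from the 0/1 cells.
        totals = [0] * len(products)
        for r in data:
            for i, cell in enumerate(r[3:3 + len(products)]):
                totals[i] += int(cell or 0)

        for name, date, total in zip(products, updated, totals):
            print(f"{name} ({date}): {total} of {len(data)}")

        # Example query: how many Opasoft variants are in the set?
        opasoft = [r for r in data if "Opasoft" in r[2]]
        print(len(opasoft), "Opasoft variants listed")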

    cheers,
    Andreas
     
  5. solarpowered candle

    solarpowered candle Registered Member

    Joined:
    Jan 9, 2003
    Posts:
    1,181
    Location:
    new zealand
    Hey Spy1

    I know this question is diverging from the main flow of this thread, but I would really love to hear your results using Benign if you were sent that set from Vamp. I use Benign also. :)
     
  6. sig

    sig Registered Member

    Joined:
    Feb 9, 2002
    Posts:
    716
    OK. Someone starts the bashing in the second post in this thread by declaring NOD is just "hype" not because he really dislikes or has problems with the product but because he doesn't like the users?

    Then he proceeds to complain about bashing? Why not start the discussion off in a professional, objective tone rather than bashing a product AND a testing organization because of a personal animus against the product's users? (Which, really, is rather silly.) Not as much fun perhaps, but more credible, since there actually are issues worthy of discussion, although initially obscured by a personal agenda.
     
  7. JacK

    JacK Registered Member

    Joined:
    Jun 20, 2002
    Posts:
    737
    Location:
    Belgium - Liège
    Hullo sig,

    The thread seems to be heading in the right direction now ;)

    Rgds,
     
  8. Vampirefo

    Vampirefo Guest

    Spy1, you got mail.

    I hope you understand these pictures. As you can see, there are 12 viruses: 6 packed, 6 unpacked. See, McAfee got all 12, NOD only 6.
     

    Attached Files:

  9. krazykidjoe

    krazykidjoe Guest

    Maybe Vampirefo knows more than he's getting credit for.
     
  10. wizard

    wizard Registered Member

    Joined:
    Feb 9, 2002
    Posts:
    818
    Location:
    Europe - Germany - Duesseldorf
    What do these pictures tell me? Nothing. I can dig into my collection and I am pretty sure I will find some samples that NOD32 catches while McAfee VirusScan misses them. So what?

    wizard
     
  11. wizard

    wizard Registered Member

    Joined:
    Feb 9, 2002
    Posts:
    818
    Location:
    Europe - Germany - Duesseldorf
    I have had some discussions about your "retrospective tests" before, and I came to the conclusion that these kinds of tests are too "theoretical". The first weakness is that your scenario is based on improper usage of AV software. I assume that with nearly all av products you tested, you will get an outdated virus signature warning.

    The second weakness is that your test scenario assumes that nobody is developing and updating their heuristic/generic detection. Let us take the following example: on a Thursday, a new virus is released. On Monday, a virus scanner adds generic detection for that virus, so the scanner detects the new virus heuristically. But since your test scenario, as described, only uses 3- or 6-month-old products, such an example would be rated as a failure.

    Please don't get me wrong: it is an interesting test, but for practical usage I think it does not have that much value.

    Just my two cents on this topic, ;)

    wizard
     
  12. Yes, but this is the ONLY scientific method to test heuristic/generic virus detection. There are other possibilities - like writing new viruses, creating some using a virus construction kit, etc. - but I'm sure I do not need to explain why we do not perform that kind of test. (Igor Muttik from Network Associates gave a quite interesting presentation at a Virus Bulletin Conference two years ago about this topic, and last year I gave a presentation about the retrospective tests we've performed - you can find it at http://www.av-test.org -> References -> Paper -> VB 2002.)

    The test admittedly only shows how well the program performed *in the past* at detecting new, yet-unknown viruses. However, as in history courses, we can learn a lot from the past and draw conclusions for the future. For example, if a program already detects almost 50% of the new ItW viruses without updates, it has quite good heuristics. If it only detects 10%, the heuristics are likely not working very well - in this case, the product may not have heuristics at all; instead, some signatures may "accidentally" catch the new variants, too. But there are other things we can learn from such tests - for example, whether the detection rate for unknown viruses has increased (in which case we can be sure that the company has added a lot of heuristic technology) or even decreased (in which case it's likely that they are no longer updating their heuristics).

    And you can measure a lot more - like the standard update interval, the size of the updates (and how fast or slowly they have grown over time) and so on. A lot of data, indeed. But you can really learn from it how well a product has performed in the past, and you can draw your conclusions for the future...
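
    To make that concrete, here is a small sketch with made-up numbers (the dates and counts below are hypothetical, not our measured data) of the kind of statistics such a data set allows:

        from datetime import date
        from statistics import mean

        # Hypothetical snapshots: (signature update date, new ItW viruses
        # caught WITHOUT that update, total new ItW viruses in the period).
        snapshots = [
            (date(2002, 9, 23), 15, 46),
            (date(2002, 12, 16), 21, 46),
            (date(2003, 3, 27), 25, 46),
        ]

        # Average interval between the collected updates, in days.
        gaps = [(b[0] - a[0]).days for a, b in zip(snapshots, snapshots[1:])]
        print("mean update interval:", mean(gaps), "days")

        # Proactive detection rate per snapshot; a rising trend suggests the
        # vendor keeps improving its heuristics, a falling one the opposite.
        for day, caught, total in snapshots:
            print(day, f"{100 * caught / total:.0f}% caught in advance")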

    Oh, please read my message carefully: I was telling you that the results with only two measured values (the detection rates 3 months and 6 months ago) have indeed no real statistical or scientific weight - it was just an *internal* test for us. We only wanted to check our test procedures for the "big test".

    For the "big test" we are going to use ALL updates of ALL products we have. These results will be indeed much more useful. And of course, they will really show how well (or poor) a product was performing.

    We hope we can also tell you something like: "Nod32 with the update from Monday was able to detect the new virus which was released three days later. The heuristic caught it. And two days later, the company added a proper name (using signatures)." I don't know if this is possible, but we'll try!

    cheers,
    Andreas
     
  13. mrtwolman

    mrtwolman Eset Staff Account

    Joined:
    Dec 5, 2002
    Posts:
    613
    Your primary assumption is not correct. There is at least ONE (one) other option to test heuristic/generic virus detection: get the actual scanner and turn off the scan strings. This would show how strong the heuristic engine of a particular antivirus is at any given point in time. Maybe this is called proof by contradiction :cool:

    Hope this method is scientific enough .....
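
    A toy model of that idea in Python (not NOD32's engine or any real product's switch - just an illustration): with the scan strings disabled, anything still flagged must come from the heuristics.

        SIGNATURES = {b"known-virus-body"}  # hypothetical scan strings

        def heuristic(data: bytes) -> bool:
            # Stand-in heuristic: flag a suspicious trait, not an exact string.
            return data.count(b"\x90") > 8  # e.g. a long NOP sled

        def scan(data: bytes, scan_strings: bool = True) -> bool:
            if scan_strings and any(sig in data for sig in SIGNATURES):
                return True
            return heuristic(data)

        old_virus = b"xx" + b"known-virus-body" + b"xx"  # already in the database
        new_virus = b"yy" + b"\x90" * 12 + b"yy"         # unknown, no signature yet

        print(scan(old_virus, scan_strings=False))  # False: only the signature knew it
        print(scan(new_virus, scan_strings=False))  # True: heuristics alone catch it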
     
  14. Vampirefo

    Vampirefo Guest

    Too bad the pictures tell you nothing; they tell me a lot. NOD can't unpack the files, so it can't detect them - 12 files. They are the same viruses; I just packed six of them.

    I show you a picture and it teaches you nothing; I am sure it taught some people something. I can show you a picture of NOD missing 50 viruses, or more. NOD can't unpack, so it will miss every single virus I pack.
     
  15. mrtwolman

    mrtwolman Eset Staff Account

    Joined:
    Dec 5, 2002
    Posts:
    613
    It is not a wise idea to state in a public forum that you create viruses :D
     
  16. JacK

    JacK Registered Member

    Joined:
    Jun 20, 2002
    Posts:
    737
    Location:
    Belgium - Liège
    Hello Vampirefo,

    And I could do exactly the same for any AV product by choosing an exotic packer :8

    AFAIK, NOD32 claims 100% detection of ITW viruses, nothing more, nothing less.
    Once they are packed, they are in essence no longer the same virus but a new variant ;)

    If you begin to spread them widely, maybe they will be added to the ITW list and NOD32 will include them in the defs ;)

    Rgds,
     
  17. Technodrome

    Technodrome Security Expert

    Joined:
    Feb 13, 2002
    Posts:
    2,140
    Location:
    New York
    Just because an antivirus can unpack does not mean you are safe. There are plenty of packers out there that will make a trojan undetectable. So I agree with JacK on the above point.



    Technodrome
     
  18. Vampirefo

    Vampirefo Guest

    Jack, NOD's claim is false; they can't detect 100% of anything. I ran NOD against a collection of 50 ITW viruses; NOD detected 37. That's not 100%.

    50 unpacked viruses - I had to unpack them so NOD could scan them, LOL. Kind of like opening the candy wrapper for a baby, and NOD still misses a lot.
     
  19. Technodrome

    Technodrome Security Expert

    Joined:
    Feb 13, 2002
    Posts:
    2,140
    Location:
    New York
    Show the list of those 50 ITW viruses that NOD32 missed.
    By your claim, NOD32 will only detect around 25% of ITW? Funny.



    Technodrome
     
  20. Vampirefo

    Vampirefo Guest

    LOL, well you know NOD missed all 50 when they were packed. So after I unpacked them, NOD could detect 37 - so a little better than 25%.
     
  21. Technodrome

    Technodrome Security Expert

    Joined:
    Feb 13, 2002
    Posts:
    2,140
    Location:
    New York
    Well, show those 13 then. If those 50 files (packed or unpacked) were ITW, they should and would be detected by NOD32 or any good ITW scanner.

    So names would help.



    Technodrome
     
  22. Vampirefo

    Vampirefo Guest

    McAfee detects them, LOL.

    I am looking at the logs. So far, NOD missed these by name:

    Win32.Aldebaran.exe
    Win32.Chiton.exe
    Win32.CabInfector.exe
    Win32.Dislex.exe
    Win32.Alcaul.exe =W32/Alcon

    I am going to my son's soccer game; I'll give you the list when I get back. It takes time to go through the list and see which 13 NOD missed.
     
  23. Technodrome

    Technodrome Security Expert

    Joined:
    Feb 13, 2002
    Posts:
    2,140
    Location:
    New York
    These are not ITW viruses!



    Technodrome
     

    Attached Files:

  24. spy1

    spy1 Registered Member

    Joined:
    Dec 29, 2002
    Posts:
    3,139
    Location:
    Clover, SC
    Got some mail from my friend Vamp here:

    (No changes made by Benign)
     

    Attached Files:

  25. Metallica

    Metallica Guest

    Does that mean they are ItW now? :D
     