i am new and i have a noob question

Discussion in 'other anti-virus software' started by chris2busy, Sep 8, 2007.

Thread Status:
Not open for further replies.
  1. chris2busy

    chris2busy Registered Member

    Joined:
    Jun 14, 2007
    Posts:
    477
    Hello everyone! I'm pretty new to these issues and decided to read over these pages to get informed, since my confidential data is as precious to me as it is to anyone here, I guess.. ok.. enough with the intro..

    I've been looking over Andreas' tests and would like to ask a question. When a "detection rate" is stated, does he mean that all the antiviruses got the signatures of all these samples, and the test checks how thorough their scanning engines are? Or are samples just collected, and he counts how many of the antiviruses have signatures for each of those threats? I've been over his methodology as well, but it's way too full of terminology for a newbie like me.. any response is greatly welcome :)
     
  2. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    not quite sure what you're asking, I thought the results were pretty simple.

    1. There were 808,344 threats in the test
    2. percentages are given for how many each antivirus 'detected'
    3. removal is not tested
    4. detections could be either heuristic or signature based; the AV either detects the threat, or doesn't.
    5. They are samples collected/sent in that have been checked for malware code by automatic tools.
    6. Andreas does NOT use any single AV to check whether a sample is malware or not.

    i.e. Norton detected 798,627/808,344, missing 9,717 samples (98.80% detection).

    that's pretty much as simple as I can make it for you.
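    The arithmetic in that Norton example can be sketched out directly (sample counts taken from the post above):

```python
# Detection rate = detected samples / total samples, as a percentage.
total = 808_344      # threats in the test
detected = 798_627   # samples the AV flagged as malicious
missed = total - detected
rate = 100 * detected / total
print(f"missed {missed}, detection rate {rate:.2f}%")
# missed 9717, detection rate 98.80%
```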
     
  3. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    7,927
    Location:
    The land of no identity :D
    Samples are collected, analyzed and grouped into their respective sections. Care is taken that there are no samples of adware/spyware/riskware (since these are not covered under AV-Comparatives' regular tests), so additional analysis is performed for this.

    The term "signature" is pretty relative - for example, some AVs may detect several malicious samples under the same name (say, for example: "Worm.Blastme"), while others may detect these malicious samples under different names (eg: "Worm.Blastme.A", "Worm.Blastme.B", etc.). As such there is no AV in the world which can detect 100% of the malware out there. The goal of Andreas' (Clementi) test is to provide a picture of how many malicious samples are identified as being malicious by the respective AV products (provided they are updated to the latest definitions at the time of testing).

    Therefore, "detection rate" means a percentage approximation of the exact number of files/samples identified by the AV products as being infected by some kind of malware. This includes detection by both signatures and heuristics.

    So, the number of signatures is not a factor; what is represented is how much malware is actually detected by the combination of the signature and heuristic technologies in the various AV products. :)

    The above comments are true for the On-demand comparatives. The retrospective tests have a different methodology - Only the new samples collected during the period between the last test and the current one are used, and the AV products are kept updated only to the point of the previous On-demand comparative. For example, if the On-demand comparative of February 2007 had the products updated at 2nd February 2007, then the Retrospective comparative of May 2007 will also have the product updated only till the 2nd of February 2007. This is done to get a gauge on an AV's ability to protect from threats that it does not "know" about yet, i.e. it is assumed that the AVs are not able to protect against these threats since they were released and acquired after the day of updating of the AVs. :)
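    The retrospective setup described above amounts to a date filter: freeze the definitions at the previous on-demand test, then test only against samples that appeared after that date. A minimal sketch, using hypothetical sample records rather than the real test data:

```python
from datetime import date

# Hypothetical sample records: (filename, date first seen in the wild)
samples = [
    ("sample_a.exe", date(2007, 1, 15)),   # known before the definition freeze
    ("sample_b.exe", date(2007, 3, 10)),   # appeared after the freeze
    ("sample_c.exe", date(2007, 4, 22)),   # appeared after the freeze
]

# Definitions frozen at the date of the previous on-demand comparative
freeze = date(2007, 2, 2)

# Only samples first seen after the freeze enter the retrospective set,
# so the AV cannot "know" them by signature and must rely on heuristics.
retrospective_set = [name for name, seen in samples if seen > freeze]
print(retrospective_set)   # ['sample_b.exe', 'sample_c.exe']
```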
     
  4. chris2busy

    chris2busy Registered Member

    Joined:
    Jun 14, 2007
    Posts:
    477
    to make it clearer.. e.g. Norton didn't detect those 9,717 threats due to a lack of signatures (their labs never found that malware), or due to weakness of their scanner engine?
     
  5. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    they just didn't find the threats, no signatures for them etc. (or.... follow firecat's techie explanation below)

    engines don't find threats by themselves; an engine enables the scanning of certain files within a file, but if there are no signatures or heuristics for that type of threat, the AV will report the file as clean.

    an AV such as Kaspersky includes a PDM (Proactive Defense Module), so even if a file is undetected, if the threat tries to do anything to the user's machine, the PDM will pick up on it (99.9% detection).

    as you are a n00b: this is called a behaviour blocker, and you can also get these as a standalone product to run alongside any AV you may use (Norton AntiBot or CyberHawk (free)).

    I'm sure between firecat and myself, you understand, right? :)
    as ever, firecat always comes in with his technical spin on the topic.
     
  6. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    7,927
    Location:
    The land of no identity :D
    We can have many possible scenarios (some "simple" ones are listed below):

    1) The product couldn't detect those threats because the company's labs did in fact find those threats but did not add detection for them as they felt that these files were of low priority.

    2) The product couldn't detect some threats because maybe their scan engine cannot scan the file due to whatever reason, maybe their unpack engine is weak or something. Maybe an error during scanning due to some bug.

    3) The labs simply did not find this malware.

    4) A bug in the detection routine, signature and/or program code caused the file to either completely not be detected or to be not detected under specific circumstances. For example, some files may be detected on-demand but not real-time and vice versa.

    The reasons are not analysed in the testing process; what is shown is simply how much an AV does detect. The cause could be any or all of the ones above, in varying degrees. ;)
     
  7. chris2busy

    chris2busy Registered Member

    Joined:
    Jun 14, 2007
    Posts:
    477
    yes :) thank you all for your almost immediate responses :) firecat also helped, because in the beginning I thought the on-demand test was only a contest between the labs of each AV, but from his answer I got that it's not, since HIPS, heuristics and the number of engines are also included in this test.. thanks to both of you :)
     
  8. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    HIPS is not included in the test; I may have confused you there by mentioning it. I was just alerting you to this extra security in case you feel you need it.

    the tests: heuristics & signatures are both included in the on-demand tests.

    oh, and welcome to the forum :)
     
  9. chris2busy

    chris2busy Registered Member

    Joined:
    Jun 14, 2007
    Posts:
    477
    ow.. and what about Kaspersky? until recently it never had a heuristic (generic) analyser, just the PDM.. I thought that's how it was tested... Andreas also writes that KAV/KIS does not detect but prevents after the malware is executed o_O so behaviour analysers and HIPS should have been included all along? no?

    P.S: glad to be among you :p I joined earlier but was lurking until I learned a few things, before I go embarrassing myself :)
     
  10. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    Kaspersky 7 has new heuristics (see the retrospective test of AV-Comparatives).

    Kaspersky's PDM will find 99.9% of 'undetected' threats due to its behaviour blocking; if an undetected threat tries to do anything on the machine, the PDM will stop it, well... 99.9% of the time.

    you shouldn't worry about that, I've embarrassed myself quite a few times on here. I feel strange if I don't get something wrong on a daily basis, but at least there is always someone to correct you on here; this place is full of smart-ass gits (points towards firecat as one of them).
     
  11. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    7,927
    Location:
    The land of no identity :D
    Well, HIPS is not the same as heuristics. Heuristics in the traditional sense means looking at the code of each and every file scanned so as to identify any code that looks suspicious and behaves like malware. When such code is identified, a heuristic detection is reported.

    These days, heuristic engines are also able to "run" the malware in a very limited "virtual environment" (sort of like an OS-within-an-OS; OS = operating system [e.g. Windows XP, Vista], but extremely limited in its functions).

    Heuristic engines work whenever the file is scanned, either real-time or on-demand.

    Whereas a behaviour blocker or Proactive Defense Module (as Kaspersky calls it) works on-execution, i.e. when you run the file and it begins doing whatever it was meant to do. Whenever the file, after execution, does something that the program thinks is suspicious, the program immediately identifies this as a possible threat. :)

    As for the number of engines: the number of engines is indeed specified, but it is not of any practical use to the actual testing process. The goal is to find out how much malware was detected by the product, and the number of engines has little or nothing to do with this.

    For example, F-Secure has many engines: Kaspersky's engine, Orion, Libra and Pegasus (Norman Sandbox). It detects more than Kaspersky, maybe because its additional engines address the four factors I listed in my previous post, since they may not all suffer from the same problems. ;)

    At the same time, it does not have the highest detection rate, because the other engines do not contain a significant number of signatures compared to the main engine (Kaspersky). :)

    So, in the end, the number of engines does not always determine the overall detection rate of the product.
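    The diminishing-returns point above can be sketched as a set union: each engine detects a set of samples, the product detects their union, and a second engine whose detections mostly overlap the main one adds little. The engine names and numbers below are hypothetical, not real test data:

```python
# Each engine's detections modelled as a set of sample IDs (hypothetical numbers).
main_engine = set(range(0, 950))       # detects samples 0..949 out of 1000
extra_engine = set(range(900, 960))    # mostly overlaps the main engine
combined = main_engine | extra_engine  # union: what the multi-engine product detects

total = 1000
print(f"main engine alone: {len(main_engine) / total:.1%}")  # 95.0%
print(f"combined product:  {len(combined) / total:.1%}")     # 96.0%
```

    Despite the extra engine "detecting" 60 samples, only the 10 it uniquely covers move the overall rate, which matches the point that more engines do not automatically mean a much higher score.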
     
  12. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    7,927
    Location:
    The land of no identity :D
    As mentioned in my previous post, Kaspersky's PDM works on-execution, i.e. it acts when the file is actually run by the user, and not on-access (i.e. before actually opening/running the file) like real-time scanning. AV-Comparatives measures only the on-demand scan rate (if you don't know, an on-demand scan is when you load your AV and tell it to scan, say, your entire computer for viruses). In an on-demand scan, the PDM cannot be active.

    Kaspersky 7.0 introduced new heuristics technology, which is independent from the PDM module and works in real-time as well as on-demand scanning. :)
     
  13. chris2busy

    chris2busy Registered Member

    Joined:
    Jun 14, 2007
    Posts:
    477
    I'd say they do? It's most probable that they support a huge number of packers, so detection is superior to a single engine that misses samples because it cannot unpack them, and just reports the sample as clean because it cannot read inside it and treats the packed file as a file in itself.. P.S: I'd been using NOD32 for a while... no comment on its packer support compared to the product I use now.
     
  14. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    7,927
    Location:
    The land of no identity :D
    One can say it's probable, but that also depends on the experience and competence of the developers of the respective engines. ;)

    For example, if my custom engine unpacks 32 packers but all 32 are already supported by, say, BitDefender, which I already use in my product, then my custom engine will make little difference, except in cases where the BD engine has a bug in unpacking some file for whatever reason; then there is a chance my engine may unpack it. :)

    And then maybe I have a signature for some malware in my engine which is not detected by the BD engine. This is another case where an additional engine may help.

    But such cases are, in general, quite few, so the addition of a new engine may indeed increase the detection rate, but usually not by a significant margin over the "main" engine (i.e. BD in my imagined case, and KAV 6 in the case of F-Secure). Note that F-Secure uses the KAV 6.0 engine and hence does not have the new heuristic analyzer of KAV 7.0, which is why it scores less than KAV 7. However, it still manages a marginally better score than KAV 6.0, whose engine it uses. :)

    But.... if I use, say, the Rising AV engine together with the VirusBuster engine (both do not qualify for testing at AV-Comparatives due to low detection rates), then my product may reach the minimum score required to qualify for testing, but it is still not going to score 95% or above. This is a case where having more engines does not (really) make a product superior. It is indeed superior to the parent products (Rising and VirusBuster), but its overall score is quite average. :)

    Of course, multi-engine products have their own set of problems, the most common of which are somewhat slow scanning speeds and slightly higher use of system resources. If you can bear with that, then it's fine. ;)
     
  15. chris2busy

    chris2busy Registered Member

    Joined:
    Jun 14, 2007
    Posts:
    477
    yes, I noticed that :) and as of the last proactive test they would score up to 5 times more than Kaspersky proactively.. what also amazed me is that Kaspersky's first heuristic engine came in with very, very few false positives (saw it on the graph :p)
     
  16. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    7,927
    Location:
    The land of no identity :D
    Actually the last proactive test featured Kaspersky 6.0 with 9% proactive detection rate. Kaspersky 6.0 already had a heuristic analyzer, just that it was not very good and it was a bit old. F-Secure scored better than Kaspersky due to the presence of the custom engines as well as the Pegasus engine, which contains heuristic technology licensed from Norman. :)

    Kaspersky 7.0 just added a new analyzer which improved its detection rates a lot. Kaspersky 7's performance is tested in a separate whitepaper on the website. ;)
     
  17. chris2busy

    chris2busy Registered Member

    Joined:
    Jun 14, 2007
    Posts:
    477
    yep, I saw that as well.. it was pretty impressive actually
     
  18. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    7,927
    Location:
    The land of no identity :D
    Yes, I guess it was. :D

    Which product do you use now anyway? (out of curiosity) :D
     
  19. chris2busy

    chris2busy Registered Member

    Joined:
    Jun 14, 2007
    Posts:
    477
    KAV, of course :)
     
  20. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    7,927
    Location:
    The land of no identity :D
    I use AVG (Anti-Malware) as primary and BitDefender whenever I have a showstopper bug with AVG. :)
     
  21. chris2busy

    chris2busy Registered Member

    Joined:
    Jun 14, 2007
    Posts:
    477
    ow.. I'm very fond of the fast-rising AVG as well :) but I think we're about to go off topic here :D if any mod could, they should lock this topic, as it has served its purpose through the excellent help of Firecat and C.S.J!! thank you guys :)
     
  22. ronjor

    ronjor Global Moderator

    Joined:
    Jul 21, 2003
    Posts:
    57,794
    Location:
    Texas
    Since chris2busy has his questions answered to his satisfaction, we'll close this thread as requested.
     