Automated whole product testing of 5 programs (2010 versions) vs. over 14,000 samples

Discussion in 'other anti-virus software' started by MrBrian, Dec 6, 2011.

Thread Status:
Not open for further replies.
  1. MrBrian

    MrBrian Registered Member

    Joined:
    Feb 24, 2008
    Posts:
    6,032
    Location:
    USA
    From "Building an Automated Large-Scale Analysis System for Measuring the Effectiveness of AV-Software on the End Host" (hxxp://sel.bfh.ch/resources/thesis/linder11.pdf):

    14,200 malware executables were executed in an automated testing environment. These products were tested: Norton Internet Security 2010, Kaspersky Internet Security 2010, Sophos Endpoint Security 9.0, F-Secure Internet Security 2010, and Comodo Internet Security Plus. Total detection rates were 82.6%, 78.9%, 72.4%, 91.8%, and 97.8%, respectively (see p. 185).
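    As a rough back-of-the-envelope check (a sketch, assuming the quoted percentages apply uniformly to the full 14,200-sample set), the rates translate into approximate detected-sample counts like this:

    ```python
    # Convert the reported detection rates into approximate
    # detected-sample counts out of the 14,200 executables.
    rates = {
        "Norton Internet Security 2010": 0.826,
        "Kaspersky Internet Security 2010": 0.789,
        "Sophos Endpoint Security 9.0": 0.724,
        "F-Secure Internet Security 2010": 0.918,
        "Comodo Internet Security Plus": 0.978,
    }
    total = 14200
    counts = {product: round(total * rate) for product, rate in rates.items()}
    for product, detected in counts.items():
        print(f"{product}: ~{detected} of {total} detected")
    ```

    So the gap between the best and worst performers here is on the order of 3,600 samples.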

     
  2. Ranget

    Ranget Registered Member

    Joined:
    Mar 24, 2011
    Posts:
    846
    Location:
    Not Really Sure :/
    Looks really reliable.
     
  3. MrBrian

    MrBrian Registered Member

    Joined:
    Feb 24, 2008
    Posts:
    6,032
    Location:
    USA
    Some details from the thesis:

     
  4. Ade 1

    Ade 1 Registered Member

    Joined:
    Jun 21, 2006
    Posts:
    471
    Location:
    In The Bath
    Why are the 2010 versions being tested and not 2011/12?
     
  5. andyman35

    andyman35 Registered Member

    Joined:
    Nov 2, 2007
    Posts:
    2,336
    It's an interesting methodology, somewhat like the Bluepoint tests from a while back. Unfortunately I'm getting a 404 on the link posted, so I can't read all the details.

    @ Ade 1
    It would appear that the tests were run some time ago according to MrBrian's quotes.
     
  6. abels

    abels Registered Member

    Joined:
    Apr 14, 2007
    Posts:
    102
    Location:
    Danang, VN
    Comodo IS put in a very good performance.
     
  7. markusg

    markusg Registered Member

    Joined:
    Jun 10, 2009
    Posts:
    248
    Hmm, the samples were collected last year. Does this mean the test is a year old and was only published this year?

    It contains samples collected during one month starting by the date 8/18/2010. The sample set that we
    did receive has already been preprocessed by Trend Micro.
     
  8. cruelsister

    cruelsister Registered Member

    Joined:
    Nov 6, 2007
    Posts:
    977
    Location:
    Paris
    For those wondering why 2010 products and malware are being used, remember that this is a thesis (or, more properly, a dissertation if it was in support of a doctoral degree). The tests first have to be run (obviously) and the paper written. After the paper is finished it has to be presented to the student's mentor for review, followed by a presentation and defense in front of a thesis committee. The paper must pass this defense prior to publication.

    Trust me that this is a lengthy (and nerve-racking) process. A span of 12 months between the paper's completion by the student and publication is not out of line at all.
     
  9. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    7,927
    Location:
    The land of no identity :D
    I thought this kind of work was outside the purview of thesis research at doctoral level, but it's pretty cool if it is allowed. I think it may be a master's thesis rather than doctoral level. In any case, it can take up to 12 months to actually get published, regardless of how quickly the paper was completed/submitted/reviewed.
     
  10. cruelsister

    cruelsister Registered Member

    Joined:
    Nov 6, 2007
    Posts:
    977
    Location:
    Paris
    Totally agree with you, Firecat. A true PhD Dissertation has to provide new knowledge, and this paper hardly qualifies. If anything this is a BS thesis, or else a MS thesis from a school like Joe's College of Science and Knitting.
     
  11. EscapeVelocity

    EscapeVelocity Registered Member

    Joined:
    Apr 1, 2010
    Posts:
    368
    LOL!

    BS Thesis it is!
     
  12. smage

    smage Registered Member

    Joined:
    Sep 13, 2008
    Posts:
    377
    Good Comodo no longer needs AV-C then:)
     
  13. cruelsister

    cruelsister Registered Member

    Joined:
    Nov 6, 2007
    Posts:
    977
    Location:
    Paris
    It's amazing how much we learn when we actually read! I finally looked at the original paper- It is a Master's Thesis and was originally published in January 2011.

    And for any out there who have gone through this process, you will absolutely LOVE this comment: "Although we did not include statistics in our work, our measurements should be significant". That one has to be right up there with "The dog ate my Homework".
     
  14. MrBrian

    MrBrian Registered Member

    Joined:
    Feb 24, 2008
    Posts:
    6,032
    Location:
    USA
    I'm now getting this also.
     
  15. J_L

    J_L Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    8,516
    I'm surprised Comodo didn't openly object to anything here.
     
  16. MrBrian

    MrBrian Registered Member

    Joined:
    Feb 24, 2008
    Posts:
    6,032
    Location:
    USA
    Somebody else already posted this topic at Comodo forums.

    Since the paper apparently is no longer available, I should note that the author mentions that the good results for Comodo and F-Secure were at least partially due to sandboxing of unknown executables (Comodo) and blocking of suspicious executables (F-Secure).

    The paper indeed has a date of January 2011. I didn't post about it earlier because I didn't know about it earlier. Nonetheless, I thought the information might be interesting to some people.
     
  17. cruelsister

    cruelsister Registered Member

    Joined:
    Nov 6, 2007
    Posts:
    977
    Location:
    Paris
    To any that want to view the paper,

    http://webcache.googleusercontent.c...thesis/linder11.pdf &cd=1&hl=en&ct=clnk&gl=us

    One thing that should be noted about this paper is that the results are heavily biased toward the AM product with the strongest HIPS. Please note this statement (page 159) "We use outdated signature databases and samples that have been collected after the last update of the AV-software and their databases". Although not well worded, what this person did was update the signatures prior to the collection of the malware and then shut off automatic updates. So the sigs were at least a few days old, thus handicapping to an extent those products with a heavy reliance on signatures.

    Further, it seems the author makes the same mistake many still do: assuming that a piece of malware that shows up today in a honeypot MUST be zero day. Actually nothing could be further from the truth! As an example, this morning my group received a packet of "zero day" malware from one of the pots we use. A quick analysis of both the file time stamps and a crosscheck with our database showed only 40% of the files were unknown; the rest had made their first appearance anywhere from a few days before collection to over a year earlier (26-11-2010). Hardly zero-day!!

    Beating a dead horse even further, some of the files that seem to be new actually are not. Malware writers tend to be verbose when initially coding: they will add things like self-limiting loops, self-checks, etc. The reason for this is that in a day or so they can nop out a few lines of the non-malicious code, thus transforming something that may have been around for a while into a "new" entity (ransomware is notorious for this). This may fool a signature-based product, but something with a strong HIPS laughs at it.

    Finally, many thanks to Brian for the initial post! It was a fun read.
     
    Last edited: Dec 14, 2011
  18. MrBrian

    MrBrian Registered Member

    Joined:
    Feb 24, 2008
    Posts:
    6,032
    Location:
    USA
    You're welcome :). Thank you for the cache link.

    The signatures were last updated on August 18, 2010. The automated testing was done from October 12, 2010 to October 26, 2010. The reason for using outdated signatures is the author's desire to heavily test the on-execution parts of the products.
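    A quick sketch, using only the dates quoted above, shows how stale the signatures actually were by the time the automated runs happened:

    ```python
    from datetime import date

    last_update = date(2010, 8, 18)   # last signature update, per the thesis
    test_start = date(2010, 10, 12)   # start of automated testing
    test_end = date(2010, 10, 26)     # end of automated testing

    # Date subtraction yields a timedelta; .days gives the gap in whole days.
    lag_at_start = (test_start - last_update).days
    lag_at_end = (test_end - last_update).days
    print(f"Signatures were {lag_at_start} to {lag_at_end} days old during testing")
    ```

    So the signatures were roughly two months out of date, not just a few days, which makes the emphasis on on-execution protection even heavier.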

    The author did a preliminary test with 50 samples.
     
  19. m00nbl00d

    m00nbl00d Registered Member

    Joined:
    Jan 4, 2009
    Posts:
    6,623
    I'm pretty sure he would, had COMODO done really bad. :blink:
     
  20. pbust

    pbust AV Expert

    Joined:
    Apr 29, 2009
    Posts:
    1,173
    Location:
    Spain
    This is not true. Of course we do have on-execution behavioural analysis (TruPrevent has been integrated into our products since 2004!). I would like to see proof where we say we don't have behavioural analysis.
     