Enterprise Endpoint Testing Results (Jan 2011): Real World Malware Blocking

Discussion in 'other anti-virus software' started by King Grub, Mar 22, 2011.

Thread Status:
Not open for further replies.
  1. King Grub

    King Grub Registered Member

    Joined:
    Sep 12, 2006
    Posts:
    818
  2. tesk

    tesk Registered Member

    Joined:
    Aug 9, 2007
    Posts:
    100
    Having a hard time taking those results seriously. :rolleyes:
     
  3. xxJackxx

    xxJackxx Registered Member

    Joined:
    Oct 23, 2008
    Posts:
    8,644
    Location:
    USA
    Thanks, I needed a laugh. :D
     
  4. malexous

    malexous Registered Member

    Joined:
    Jun 18, 2010
    Posts:
    830
    Location:
    Ireland
    Last edited by a moderator: Mar 22, 2011
  5. dschrader

    dschrader AV Expert

    Joined:
    Mar 10, 2009
    Posts:
    54
    My AV-Test results trump your AV-Test results . . . . .

    Sigh.

    So why did 2 reports from the same tester (AV-Test) have 2 very different results? It is all in how you configure the test. This report was carefully configured to test Trend's strength - URL filtering - and not to test their weakness - malware detection.

    The Trend report is all about malicious URLs - no malware was delivered by email, USB, network share, CD/DVD . . . .

    Trend's first line of defence, and their strength, is URL reputation. They block known malicious URLs. Will they do as well with malware delivered by email or a drive-by download from a legitimate site that was hacked? This report doesn't say. The Trend product only had to detect 6 malware samples - because they had most of AV-Test's set of malicious URLs in their database. URL filtering is great, I'm all in favor of it. But it is not sufficient. Again, what if the malware came by network share, USB, email or from a legitimate site? What if the malware sample came from a site that Trend didn't have on their block list?

    The bottom line: this wasn't a test of malware detection, it was a test of known malicious URL detection. Was his sample set of URLs sufficient? Statistically significant? Where did he get it from? The report doesn't address those questions.

    Oddly, in a test of enterprise products, AV-Test chose to test Symantec's older SEP SBE (Small Business Edition) instead of testing Symantec's newly announced (though not yet shipping) SEP 12.1 for enterprises. He certainly knew of 12.1 - since he was testing it for us. But Andreas Marx never asked to use it in Trend's report.

    His other reports (done for my employer, Symantec) can be found here:
    http://www.symantec.com/content/en/...rces/b-real_world_testing_report_OR.en-us.pdf
     
  6. kmr1685

    kmr1685 Registered Member

    Joined:
    Aug 22, 2009
    Posts:
    62
    They need their own base to sell, you know. :D
     
  7. nosirrah

    nosirrah Malware Fighter

    Joined:
    Aug 25, 2006
    Posts:
    560
    Location:
    Cummington MA USA
    200 URLs represents less than 2% of my current live watch list and this does not include any 'stepping stone' URLs.

    I post this for a little perspective on what 200 may or may not represent.
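    For perspective, the arithmetic behind that figure can be sketched out (assuming "less than 2%" is taken at face value; the numbers are illustrative, not nosirrah's actual list size):

```python
# Sanity check: if a 200-URL test set is less than 2% of a live watch
# list, the full list must contain more than 200 / 0.02 = 10,000 URLs.
sample_urls = 200
max_share = 0.02  # "less than 2%"

implied_minimum_list_size = sample_urls / max_share
print(implied_minimum_list_size)  # 10000.0 - the list exceeds this
```

    In other words, the 200 URLs AV-Test used would cover only a small slice of the malicious URLs a single researcher tracks at any one time.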
     