Trend Micro IS Scores 100% on Matousec!!

Discussion in 'other firewalls' started by Scoobs72, Aug 30, 2011.

Thread Status:
Not open for further replies.
  1. Scoobs72

    Scoobs72 Registered Member

    Joined:
    Jul 16, 2007
    Posts:
    1,108
    Location:
    Sofa (left side)
    Following on from the Matousec discussion in the OA64bit thread, I thought I'd dig a bit deeper into the testing approach to take a look at why Matousec is such bad science. In Nov 2010 Matousec tried to justify the tests as relevant to real malware by analysing 20 malware samples. From these 20 samples it was found that only 30 of the 148 test methods were used by real malware.

    Despite having no evidence that the other 118 tests were relevant, Matousec concluded "We are sure that if larger set of malware was used we would see even more techniques that are implemented in our tests". Bad, bad science. No evidence at all. Just "we are sure". Yeah, right.

    Anyway, take these 30 methods used by real malware and take a look at how one of the "Not recommended" apps performed.
    Let's pick Trend Micro, which scored a pitiful 9%:

    • Well, it passed Yalta and Leaktest, which between them were used by 50% of the samples. That's good.
    • And it passed Autorun 3, the second most common method used by the malware
    • And it passed Autorun 1, the next most common method
    • Next up is Hostsblock which Trend passes also
    • Then Kernel 1, Kernel 2, and Svckill, but Matousec doesn't allow Trend to be tested against these because it doesn't pass his arbitrary "levels" testing approach.
    • Next is Jumper which Trend passes
    • Runner 1 & 2 next, along with Autorun 7, Inject 2, DNSTester, and a bunch of autoruns, but again Matousec doesn't allow them to be tested. So far, 100% for Trend.
    • Back to the tests Trend is allowed to run: Wallbreaker 4 and AWFT 1 are passed, but then one malware sample uses the DNSTest technique and Trend has its first fail. :(
    • Trend isn't allowed to be tested against any of the remaining methods.

    So by my reckoning we have 1 test failed out of all the methods used by malware in Matousec's own sample, yet Matousec awards Trend 9% and a "not recommended" status. And since 50% of the malware used Yalta or Leaktest, there's a good chance that Trend would have alerted on that malware anyway. So I'm going to take artistic licence and award it 100% against real malware rather than the 89% it got against the tests.

    Well done Trend!
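    As an aside, the arithmetic in the post above can be sketched in a few lines. The pass/fail values below are transcribed from the bullet list (only the tests Trend was actually allowed to run), not from Matousec's published data; this is an illustrative reconstruction, not the testers' own tally.

```python
# Pass/fail results for the malware-relevant tests Trend was allowed to run,
# as listed in the post above. DNSTest is the single reported failure.
allowed_results = {
    "Yalta": True, "Leaktest": True, "Autorun 3": True, "Autorun 1": True,
    "Hostsblock": True, "Jumper": True, "Wallbreaker 4": True,
    "AWFT 1": True, "DNSTest": False,
}

passed = sum(allowed_results.values())
total = len(allowed_results)
print(f"Pass rate on malware-relevant tests Trend could run: "
      f"{passed}/{total} = {100 * passed / total:.0f}%")  # 8/9 = 89%
```

    The 89% figure in the post matches this 8-of-9 tally; the gap between that and Matousec's published 9% is the whole point of the argument.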
     
    Last edited: Aug 30, 2011
  2. J_L

    J_L Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    8,516
    Re: Trend Micro Scores 100% on Matousec!!

    Are you sure this is the right category? Trend Micro isn't a firewall, and Matousec only tests HIPS features.
     
  3. Scoobs72

    Scoobs72 Registered Member

    Joined:
    Jul 16, 2007
    Posts:
    1,108
    Location:
    Sofa (left side)
    Re: Trend Micro Scores 100% on Matousec!!

    It should have read "Trend Micro Internet Security"; now amended. It would probably fit better in the Other Anti-Malware forum, as it's a generic discussion about HIPS and Matousec's relevance, but I was following on from the OA thread so as not to derail it further.
     
  4. lordraiden

    lordraiden Registered Member

    Joined:
    Jan 30, 2006
    Posts:
    3,080
    You are inventing the results based on a limited sample of 20 malware files, chosen at random with the only condition being that they were not detected by 2 AVs. If you want to prove something you need to give some evidence, not just invent an imaginary test and see what would happen. Try to get 1,000 malware samples, see which methods they use, test them, and then come back so there will be something to read that isn't invented, if you are able, of course.
     
    Last edited: Aug 30, 2011
  5. Baserk

    Baserk Registered Member

    Joined:
    Apr 14, 2008
    Posts:
    1,317
    Location:
    AmstelodamUM
    lordraiden, in your urge to write a reaction to Scoobs72's post (which is somewhat in jest, I think), you've given me an idea for a malware detection test with 1000 malware samples, scored just the way Matousec does it.

    Why not divide the 1000 samples into 100 detection levels of 10 samples each.
    A security product can only go through to the next level of 10 samples if it detects/passes all 10 samples in the current one.
    Now let's start: oops, product A fails at detecting sample number 8, so it can't pass to the next level, samples 11 to 20.
    So, just as in Matousec's scoring system, we can conclude that security product A failed to detect 993 out of 1000 samples.
    Ergo, product A has an abysmal detection rate of only 0.7%.
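    Baserk's level-gated scoring analogy can be made concrete with a short sketch. The level size of 10 and the single miss at sample 8 come from the post; the `level_gated_score` function name is my own invention for illustration.

```python
# Sketch of the level-gated scoring analogy: 1000 samples split into 100
# levels of 10, where a product only reaches the next level if it detects
# every sample in the current one.

def level_gated_score(detections, level_size=10):
    """Count the detections credited before the first miss halts the run.

    Under this gating scheme the first miss stops everything, so the
    credited total is simply the number of detections before that miss.
    """
    credited = 0
    for start in range(0, len(detections), level_size):
        for detected in detections[start:start + level_size]:
            if not detected:
                return credited  # later levels are never attempted
            credited += 1
    return credited

samples = [True] * 1000
samples[7] = False  # product A misses sample number 8

score = level_gated_score(samples)
print(f"Product A credited with {score}/1000 detections "
      f"= {100 * score / 1000:.1f}%")  # 7/1000 = 0.7%
```

    One early miss thus wipes out credit for 993 samples the product was never allowed to attempt, which is the distortion the analogy is pointing at.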
     
  6. Scoobs72

    Scoobs72 Registered Member

    Joined:
    Jul 16, 2007
    Posts:
    1,108
    Location:
    Sofa (left side)
    Once more, you miss both the point and the irony of the post. The irony is that in Matousec's attempt to justify their tests, they have shown exactly why the tests are worthless: an application they score at 9% protects you just as well against their own malware sample set as one they score at 100%.

    One has to wonder why Matousec would go to such great lengths to provide evidence that his tests are meaningless. Is the joke on us perhaps?!
     
  7. lordraiden

    lordraiden Registered Member

    Joined:
    Jan 30, 2006
    Posts:
    3,080
    Don't worry, I get the irony, and also the absurdity of your post.
    The application scores 9%, and then in a totally different, imaginary test covering only about 20% of the 148 methods (30 of 148, based on just 20 malware files), it scores whatever you have invented...
    Everybody who cares knows that Matousec's levels system can't be realistic for all products; anyway, it's written in the methodology, so I still don't get why you are writing something so obvious.

    That's not the point (this thread comes from another where we were discussing other things), and I don't care about products that haven't been fully tested, since they don't have a full HIPS; whether they score 71% or 9%, I really don't care. But a "HIPS" that is not able to block the simplest leaktests is not going to block the rest at a 100% rate, so you cannot make the analogy with malware engines.

    I understand what Scoobs72 means, but I don't understand why it is so difficult for some people (knowing that the Matousec test has many quirks) to separate the rubbish from the useful information and get an idea of how good a HIPS can be, also taking into account (in my case) the rest of the publicly available leaktests: Zemana, SpyShelter, GRC, even some malware.
    Matousec is just one part of my opinion about a HIPS, but it seems some people enjoy telling you how bad those tests are if you look at the table. I hope this can be useful for somebody, but I'm tired.
     
    Last edited: Aug 31, 2011
  8. Consoleman

    Consoleman Registered Member

    Joined:
    Aug 11, 2006
    Posts:
    15
    OP should have focused more on firewall leak tests, because this section is about firewalls, not AV. :p
     
  9. Scoobs72

    Scoobs72 Registered Member

    Joined:
    Jul 16, 2007
    Posts:
    1,108
    Location:
    Sofa (left side)
    Poster should read and understand the thread before posting.
     
    Last edited by a moderator: Sep 1, 2011