New RAP From VB 100 - MSE Better Than Avast???

Discussion in 'other anti-virus software' started by FreddyFreeloader, Sep 26, 2013.

Thread Status:
Not open for further replies.
  1. RejZoR

    RejZoR Lurker

    Joined:
    May 31, 2004
    Posts:
    6,426
    Then it's a pointless test (I admit, I haven't checked how they perform it). No client antivirus works on-demand only these days.
     
  2. Sprocket

    Sprocket Registered Member

    Joined:
    Dec 24, 2012
    Posts:
    75
    You raise a more general question - are the testing methodologies used by any of these various sites valid? I.e., are their results indicative of the effectiveness of these anti-malware products when used by typical users doing typical things on typical PCs? Microsoft has been saying that they are not - that these tests do not represent the threats that real users encounter, and therefore are not really all that useful. As Vojta pointed out earlier, Kaspersky and Symantec have been saying similar things. Of course, you can draw your own conclusions, and your opinion may differ.

    I have no dog in this race - I dabble with many AVs, and all of them have strengths and weaknesses. Frankly, I have had more real problems caused by antivirus programs misbehaving than by actual malware. If I had my druthers, I wouldn't use any of them. Unfortunately, the world being what it is, that's not an option. I am still looking for one that runs unobtrusively in the background and keeps stuff I don't want on my PC off my PC, without causing more problems than it solves. I am still looking...
     
  3. Behold Eck

    Behold Eck Registered Member

    Joined:
    Aug 23, 2013
    Posts:
    574
    Location:
    The Outer Limits
    Have you dabbled with UnThreat yet? Good result for it. :thumb:
     
  4. Inside Out

    Inside Out Registered Member

    Joined:
    Sep 17, 2013
    Posts:
    421
    Location:
    Pangea
    TBF there's nothing inherently wrong or pointless about detection tests in general. They just measure one important aspect of an AV, and aren't meant to be the be-all and end-all of AV testing. The execution may be lacking in most cases, including this one, but a detection test done well is not "inferior" to any other type of test.

    My take on them in bullet points:

    - most detection tests are flawed, but some can be reliable for what they are
    - some vendors do cheat for them, but it can be prevented
    - there's no excuse for consistently performing much worse than the competition, no matter what Microsoft and Symantec say :rolleyes:, as it's possible to do well in every test without cheating

    PS. I like the fact that Kaspersky enter every test anyway. :thumb:
     
    Last edited: Sep 29, 2013
  5. Noob

    Noob Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    6,491
    Thanks for the update!
     
  6. Macstorm

    Macstorm Registered Member

    Joined:
    Mar 7, 2005
    Posts:
    2,642
    Location:
    Sneffels volcano
    Agreed!
     
  7. ctrlaltdelete

    ctrlaltdelete Registered Member

    Joined:
    Oct 16, 2005
    Posts:
    318
    Location:
    NL
    MSE was not tested in the last test; it was Microsoft System Center Endpoint Protection, and its RAP score was not that good, as you can see here. See Microsoft System Center in that graphic?

    MSE was only tested once during the Feb-Aug 2013 period, and in that particular test the RAP was a piece of cake for most AVs. They all scored high, but the following RAP tests were harder and lowered the average score for almost all AVs, except MSE, because it was never tested again.
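    The selection effect described above can be sketched in a few lines of Python. The scores below are made-up illustrative numbers, not actual VB100 results; the point is only that averaging over the periods a product actually entered flatters a product that skipped the harder later tests:

    ```python
    # Hypothetical illustration (invented scores, not real VB100 data):
    # a product tested only in the easy first period keeps a high average,
    # while products tested in every period see their averages drop.
    quarterly_rap = {
        "ProductA": [95.0, None, None, None],   # entered only the easy period
        "ProductB": [94.0, 82.0, 78.0, 75.0],   # entered every period
    }

    def average_rap(scores):
        """Average only over the periods the product actually entered."""
        tested = [s for s in scores if s is not None]
        return sum(tested) / len(tested)

    for product, scores in quarterly_rap.items():
        print(f"{product}: {average_rap(scores):.2f}")
    ```

    Here ProductA averages 95.00 while ProductB averages 82.25, even though ProductA was never measured against the harder sample sets at all.
    
    
    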
     
  8. J_L

    J_L Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    8,738
    You can answer that yourself by reading their methodology. If you think on-demand scanning is more valid in real life than real-time protection, go ahead.
     
  9. Nebulus

    Nebulus Registered Member

    Joined:
    Jan 20, 2007
    Posts:
    1,635
    Location:
    European Union
    Maybe I am missing the point, but from a "Reactive" point of view (that means samples before the deadline, as their procedure says) MSE and Avast are on the same level. I don't quite understand what relevance the "Proactive" test has, because if it uses old signatures to detect unknown threats (and not some form of heuristics, for instance), then I think they are doing it wrong... It seems more like a lottery than real quantifiable testing.
     
  10. acr1965

    acr1965 Registered Member

    Joined:
    Oct 12, 2006
    Posts:
    4,995
    There are some benefits to these tests, despite the criticisms. Take Check Point, for example: their results are pretty abysmal, while relative unknowns such as Fortinet do fairly well. Consumers need somewhere to look in order to see how products stack up. And consumers need to rely on something besides the marketing schemes of security software companies, which are sometimes the equivalent of scare tactics. Of course, this test shows limited results and may not include all the factors consumers may want to consider. And these testing companies usually need to make a profit somehow or other. But between a test result and an online advertisement from a security vendor, I'll usually pay closer attention to the test results and dismiss the online ads altogether.
     