New RAP From VB 100 - MSE Better Than Avast???

Discussion in 'other anti-virus software' started by FreddyFreeloader, Sep 26, 2013.

Thread Status:
Not open for further replies.
  1. FreddyFreeloader

    FreddyFreeloader Registered Member

    Joined:
    Jul 23, 2013
    Posts:
    527
    Location:
    Tejas
  2. anon

    anon Registered Member

    Joined:
    Dec 27, 2012
    Posts:
    8,005
    [Quoted chart excerpts not preserved: Microsoft Security Essentials' results, from the bottom right of the RAP chart, plus a further quote.]

     
    Last edited: Sep 26, 2013
  3. FreddyFreeloader

    FreddyFreeloader Registered Member

    Joined:
    Jul 23, 2013
    Posts:
    527
    Location:
    Tejas
    MSE in red, but VB100 has tested MSE since 2009. Kinda suspect testing methods, eh? Not exactly a standard applied equally.
     
  4. Inside Out

    Inside Out Registered Member

    Joined:
    Sep 17, 2013
    Posts:
    421
    Location:
    Pangea
    Whatever, good job F-Secure. :thumb: Better red than dead.
    Always nice to see G-Data shown up. :argh:

    PS. Oh Avira, Ikarus. :(

    PPS. I know the results might be questionable, but there must be ones that aren't cheating, too.
     
    Last edited: Sep 26, 2013
  5. Thankful

    Thankful Savings Monitor

    Joined:
    Feb 28, 2005
    Posts:
    6,567
    Location:
    New York City
    You can see the effect of Emsisoft using BitDefender signatures. First time I've seen this test where they weren't in red. Ikarus, in red.
     
  6. anon

    anon Registered Member

    Joined:
    Dec 27, 2012
    Posts:
    8,005
    VB100 has tested MSE since 2009:
    http://www.virusbtn.com/vb100/archive/vendor?id=70

    but MSE changed over that time (MSE, the new MSE + Microsoft Windows Defender, etc.).
     
  7. FreddyFreeloader

    FreddyFreeloader Registered Member

    Joined:
    Jul 23, 2013
    Posts:
    527
    Location:
    Tejas
    VB100's testing methods compare products across different platforms and operating systems, which means there are both apples-to-apples and apples-to-oranges comparisons. In RAP comparisons of this sort, it's most definitely apples to oranges.
     
  8. BoerenkoolMetWorst

    BoerenkoolMetWorst Registered Member

    Joined:
    Dec 22, 2009
    Posts:
    4,872
    Location:
    Outer space
    Indeed, while Virus Bulletin may be very well known, it is imo not the best when it comes to comparative testing.
     
  9. vojta

    vojta Registered Member

    Joined:
    Feb 26, 2010
    Posts:
    830
    Check this post of mine about this type of test:

    https://www.wilderssecurity.com/showpost.php?p=2284329&postcount=696

    And this blog entry from Eugene Kaspersky that Inside Out posted in another thread:

    http://eugene.kaspersky.com/2011/10/18/the-holy-grail-of-av-testing-and-why-it-will-never-be-found/


    To put it shortly:

    -Microsoft says that "We used to have part of our team directed towards predicting test results and figuring out what might be in someone’s test. There’s always a cost to that. If they are doing that work, they are not looking at those threats that are affecting our customers. We always felt that was wrong. There’s something not right about that – we’re not doing the best job for our customers."

    -Symantec: "some vendors assign as many as a dozen engineers specifically to the task of ensuring good test scores." They refuse to keep doing so.

    -Kaspersky, "It just seems a pity that AV firms’ time and resources are spent – not on getting their products to do their jobs properly – but instead on going after the best ranking in the same old BS-testing; on getting a result no worse than those who hone their products – not to achieve real quality – but only to pass tests better than the rest... As a result not the best software gets to take the gold, and the other rankings also practically in no way correspond to real levels of provided protection. In short, it’s all just a lot of misleading nonsense, with the consumer getting conned."
     
  10. Inside Out

    Inside Out Registered Member

    Joined:
    Sep 17, 2013
    Posts:
    421
    Location:
    Pangea
    @vojta: To be fair, not everyone cheats. I wrote there "Kaspersky, for example, don't care", because they even put it on their blog to inform us, knowing it would be embarrassing if they didn't practice what they preach. KAV/KIS also gets good but not great results in detection tests, so IMHO they aren't cheating, and judging by other tests they shouldn't need to either. If that's true, it shows good results are achievable fair and square, and have already been achieved that way. Though I'd be much more suspicious of vendors with a history of gross incompetence, arrogance, misinformation, dependency on others (especially Bitenabler) and obsession with test results, e.g. G-Dud winning this year's AVC September file detection test.
     
    Last edited: Sep 26, 2013
  11. George942

    George942 Registered Member

    Joined:
    Oct 11, 2008
    Posts:
    13
    So if those tests don't reflect the quality of the product, how is a consumer supposed to know which product to pick?
    o_O
     
  12. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    Simple, take mine. Kees says it may be the best emerging product, and it's free. :thumb:
     
  13. Frank the Perv

    Frank the Perv Banned

    Joined:
    Dec 16, 2005
    Posts:
    882
    Location:
    Virginia, USA
    Test bashing is stupid.

    Different tests use different methodologies to assess performance.

    Tests are in one way or another based on real-world criteria.

    There are shortcomings and limitations to testing – just as there are shortcomings and limitations to testing in schools.

    Testing is great -- and is the basis for determining which products are better than others.

    That is all.

    .
     
  14. The Hammer

    The Hammer Registered Member

    Joined:
    May 12, 2005
    Posts:
    5,752
    Location:
    Toronto Canada
    Nothing to complain about concerning Avira.
     
  15. Kees

    Jeff, that was in the context . . . using 'only' core AV-engine technology for on-execution and URL filtering.

    For this core AV focus (and of course its improvement over the last three years), I think it is a great free choice when you:

    a) want a no-pop-up AV (nothing depending on user response, solely blacklist/AV-engine security).
    b) use some other layer (HIPS/FW/Sandbox/Policy) of security to minimise the risk of interference or incompatibility.

    Regards Kees
     
    Last edited by a moderator: Sep 27, 2013
  16. SLE

    SLE Registered Member

    Joined:
    Jun 30, 2011
    Posts:
    361
    For real-world and proactive tests that would be an argument; here it is not. For pure signature scanning (which is what RAP is), that's beside the point, because it doesn't matter for detection which platform you scan on.
     
  17. Inside Out

    Inside Out Registered Member

    Joined:
    Sep 17, 2013
    Posts:
    421
    Location:
    Pangea
    Re: New RAP From VB 100 - G-Data Better than Emsi, F-Secure, Qihoo o_O

    You can also see the results for only August here, which contradicts the title.

    I meant it isn't doing as well as usual.

    PS. Fortinet would have you mistake it for a BD knockoff. :thumb: The next best score after Trustport.

    PPS. All the figures and graphs are now up on comss. G-Data (3rd place) scoring higher in the August test than the BD clones with decent in-house engines like Emsi or F-Secure looks dubious.
     
    Last edited: Sep 29, 2013
  18. Inside Out

    Inside Out Registered Member

    Joined:
    Sep 17, 2013
    Posts:
    421
    Location:
    Pangea
    It's not always the same product from a given vendor that gets tested across the various operating systems.
     
    Last edited: Sep 28, 2013
  19. RejZoR

    RejZoR Lurker

    Joined:
    May 31, 2004
    Posts:
    6,426
    I start to doubt things when the most basic product you can imagine magically becomes better than another one that has several high-tech proactive features. If you check, MSE beats avast! on the proactive scale, which makes no sense at all. It has zero behavior analysis, zero URL blocking, zero reputation services, and all of this on an update cycle of one update check per 24 hours. It's just not logical on any level.
     
  20. J_L

    J_L Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    8,738
  21. Sprocket

    Sprocket Registered Member

    Joined:
    Dec 24, 2012
    Posts:
    75
    It's interesting to watch people's reactions when test results don't fit their preconceived notions. Rather than trying to explain how the test must be flawed when Microsoft does well, perhaps we should be congratulating Microsoft on their improved performance.
     
  22. J_L

    J_L Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    8,738
    So every other testing organization (AV-Comparatives, AV-Test, etc.), RAP's outdated methodology linked above, and this thread are all "preconceived notions" in your opinion as well.
     
  23. SLE

    SLE Registered Member

    Joined:
    Jun 30, 2011
    Posts:
    361
    If you know how VB's proactive test is designed, then it's not illogical. Proactive testing in that test is just scanning with old signatures. It's a purely signature-based test.
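
    To make that concrete, here is a minimal Python sketch of a RAP-style retrospective test. All hashes and dates below are made-up placeholders, not real VB100 data: signatures are frozen at a cut-off date, samples first seen before the cut-off count as reactive, samples first seen after it count as proactive, and "detection" is nothing more than a signature lookup.

        from datetime import date

        # Hypothetical signature database, frozen at the submission deadline.
        FREEZE = date(2013, 8, 1)
        signatures = {"hash_a", "hash_b", "hash_c"}  # placeholder hashes

        # Hypothetical samples: (hash, date the sample was first seen).
        samples = [
            ("hash_a", date(2013, 7, 12)),  # before the freeze -> reactive
            ("hash_b", date(2013, 7, 25)),  # reactive
            ("hash_c", date(2013, 8, 6)),   # after the freeze -> proactive
            ("hash_x", date(2013, 8, 5)),   # proactive, unknown to the frozen DB
        ]

        def detection_rate(group):
            # Pure signature lookup: no behaviour, URL or reputation layer runs.
            return sum(h in signatures for h in group) / len(group)

        reactive = [h for h, seen in samples if seen <= FREEZE]
        proactive = [h for h, seen in samples if seen > FREEZE]
        print(f"reactive:  {detection_rate(reactive):.0%}")   # 100%
        print(f"proactive: {detection_rate(proactive):.0%}")  # 50%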
     
  24. shadek

    shadek Registered Member

    Joined:
    Feb 26, 2008
    Posts:
    2,538
    Location:
    Sweden
    Ikarus was disappointing. That engine usually performs a lot better.
     
  25. vlk

    vlk AV Expert

    Joined:
    Dec 26, 2002
    Posts:
    621
    RejZoR, what's not logical about it?
    RAP is an on-demand-only test, so neither behavior analysis, nor URL blocking, nor reputation is taken into account (at least in the case of avast)... Just the traditional signature-based scan with some heuristics (pretty traditional as well)...

    Vlk
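
    To illustrate the point, here is a toy on-demand scanner in Python: a SHA-256 signature lookup plus one crude static heuristic, and nothing else. The digest and byte patterns are invented placeholders rather than real detection data; the sketch only shows which code path an on-demand test can exercise.

        import hashlib
        from pathlib import Path

        # Placeholder "signature database": SHA-256 digests of known-bad files.
        KNOWN_BAD = {"0" * 64}  # stand-in digest, not a real signature

        # Crude static heuristic: suspicious byte patterns, purely illustrative.
        SUSPICIOUS = [b"eval(unescape(", b"CreateRemoteThread"]

        def scan_file(path):
            # On-demand scan of one file: signature lookup first, then heuristic.
            # Nothing here executes the sample, fetches a URL or queries a
            # reputation cloud - those layers simply never run in this mode.
            data = path.read_bytes()
            if hashlib.sha256(data).hexdigest() in KNOWN_BAD:
                return "signature match"
            if any(p in data for p in SUSPICIOUS):
                return "heuristic match"
            return None

        # Scan every regular file in the current directory.
        for f in Path(".").iterdir():
            if f.is_file():
                verdict = scan_file(f)
                if verdict:
                    print(f"{f.name}: {verdict}")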
     