MRG Effitas Time to Detect Assessment Q2-Q3 2013

Discussion in 'other anti-virus software' started by IBK, Oct 20, 2013.

Thread Status:
Not open for further replies.
  1. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,819
    Location:
    Innsbruck (Austria)
  2. Macstorm

    Macstorm Registered Member

    Joined:
    Mar 7, 2005
    Posts:
    2,531
    Location:
    Sneffels volcano
    No Avira, no GData, no F-Secure ...hmm very interesting...
     
  3. SweX

    SweX Registered Member

    Joined:
    Apr 21, 2007
    Posts:
    6,429
    Thanks Andreas :thumb:
     
  4. Thankful

    Thankful Savings Monitor

    Joined:
    Feb 28, 2005
    Posts:
    3,736
    Location:
    New York City
    The initial detection rate graph does not match the raw data for Detect and Remediate Q2 2013.
     
  5. Frank the Perv

    Frank the Perv Banned

    Joined:
    Dec 16, 2005
    Posts:
    882
    Location:
    Virginia, USA
    Thanks Andreas.

    In initial detection rates:

    Kaspersky, Emsisoft & Bitdefender with their usual strong finishes.

    Avast, SoftSphere & Symantec did great.

    MBAM did well, but I'd like to see them do better.

    I'm sure some will be disappointed with the 11th place finish of SecureAnywhere.

    McAfee showing no signs of a comeback here.

    And SAS is a joke. As usual, the worst in its class, and the worst product tested. And in this test, the worst product in three different testing categories.
     
  6. TonyW

    TonyW Registered Member

    Joined:
    Oct 12, 2005
    Posts:
    2,634
    Location:
    UK
  7. Inside Out

    Inside Out Registered Member

    Joined:
    Sep 17, 2013
    Posts:
    421
    Location:
    Pangea
    All it would have achieved would've been to make rogueware look good.

    But yup, both Avira and F-Secure are fine.
     
    Last edited: Oct 21, 2013
  8. Noob

    Noob Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    6,468
    Thanks for the update.
    Interesting results.
     
  9. Feandur

    Feandur Registered Member

    Joined:
    Jun 15, 2005
    Posts:
    401
    Location:
    Australia
    What can I say, but, on 32-bit -

    DEFENSE WALL RULES !!!


    :D

    -cheers,
    feandur
     
  10. Windows_Security

    Windows_Security Registered Member

    Joined:
    Mar 2, 2013
    Posts:
    3,079
    Location:
    Netherlands
    Nice, a real-life test (internet & USB) which measures first protection (similar to VB RAP's proactive test) and time to protect (better than RAP's reactive test).

    So we have RAP, which is more or less useless because it is unclear how many times a program was tested, and it does on-access scanning instead of program execution (please correct me if I'm wrong).

    AV-Test has an on-execution zero-day real-world test of (I guess) malware collected in the last 4 weeks (does somebody know what time frame AV-Test uses?) and a larger on-access test with 'reference' samples from the last 4 weeks.

    AV-Comparatives has separate tests: an on-execution real-life test and a massive on-access file test.

    So we have (testing at frequent intervals):

    0-day real life = MRG (initial) & AV-Test (real world) & AV-Comparatives (real life)
    Time to respond = MRG
    On-access 'in the wild' file test = AV-Test
    On-access 'wild + zoo' file test = AV-Comparatives


    Are there also tests of URL-blocking effectiveness that run on a regular basis?
     
  11. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,819
    Location:
    Innsbruck (Austria)
    we have not had a zoo set for some time (we use prevalence); that's why the scores are in general so high.
     
  12. Windows_Security

    Windows_Security Registered Member

    Joined:
    Mar 2, 2013
    Posts:
    3,079
    Location:
    Netherlands