MRG Effitas 360 Assessment & Certification Q3 2016

Discussion in 'other anti-virus software' started by Dark Star 72, Nov 8, 2016.

  1. Dark Star 72

    Dark Star 72 Registered Member

    Joined:
    May 27, 2007
    Posts:
    778
    Q3 2016 report now out

    https://www.mrg-effitas.com/wp-content/uploads/2016/11/MRG-Effitas-360-Assessment-Q3-2016.pdf

    Should get the fanboys going :D
     
  2. plat1098

    plat1098 Guest

    Well, as a Microsoft "fanboy": these results are DISMAL. Its rating doesn't seem to fluctuate much either. Time to look in the piggy bank for a decent alternative, no?
     
  3. Thankful

    Thankful Savings Monitor

    Joined:
    Feb 28, 2005
    Posts:
    6,555
    Location:
    New York City
    Would love to see the latest Microsoft prevalence-based analysis showing what malware actually ends up on computers.
     
  4. Triple Helix

    Triple Helix Specialist

    Joined:
    Nov 20, 2004
    Posts:
    13,269
    Location:
    Ontario, Canada
    Damn, I'm jumping ship now. Not! LOL
     
  5. ArchiveX

    ArchiveX Registered Member

    Joined:
    Apr 7, 2014
    Posts:
    1,501
    Location:
    .
    https://www.mrg-effitas.com/wp-content/uploads/2016/11/MRG-Effitas-360-Assessment-Q3-2016.pdf
    Interesting Results... :cool:
     
  6. itman

    itman Registered Member

    Joined:
    Jun 22, 2010
    Posts:
    8,592
    Location:
    U.S.A.
  7. Sveta MRG

    Sveta MRG Registered Member

    Joined:
    Aug 16, 2009
    Posts:
    209
  8. shadek

    shadek Registered Member

    Joined:
    Feb 26, 2008
    Posts:
    2,538
    Location:
    Sweden
    Kaspersky is acing this test too. I'm very impressed with the ransomware results, since ransomware is known to be very hard to block 100%.

    Malwarebytes is falling behind as the months fly by, test by test. They were performing so well in the past.
     
  9. Mover

    Mover Registered Member

    Joined:
    Oct 1, 2005
    Posts:
    180
    One thing that can be said, regardless of which source you look at: Kaspersky consistently scores well.
     
  10. cavehomme

    cavehomme Registered Member

    Joined:
    May 19, 2010
    Posts:
    137
    Location:
    Alps
    Oh dear, Webroot has failed for 2 consecutive quarters. I guess I am not entirely surprised, because it matches my own limited test results with the latest zero-day malware: trojan downloaders delivered as Word docs, PDF files, etc.

    I have raised these observations on several threads and over at Webroot, but it's like water off a duck's back: flat denial of any problem. I still use Webroot and I hope for it to be the best product, but it is not, and it is getting worse, as evidenced by my own results, other people's, and of course these independent test agencies.

    Another surprise for me is Bitdefender, which in AV-Test and AV-Comparatives comes out jointly with Kaspersky but here, and in my own tests, is mediocre.

    Unfortunately Kaspersky slows down my PC too much, and there are other issues such as the MITM problems from it trying to secure the browser, so I am still not a fan.
     
    Last edited: Nov 21, 2016
  11. guest

    guest Guest

    There are differences between the PDF file "MRG-Effitas-360-Assessment-Q3-2016.pdf" as released 2016-11-07 and a newer revision dated 2016-12-02.
    Each graph was updated, and if I now look at the new "Grade of Pass" (I'll only show security products whose "Grade of Pass" changed):

    These products switched from Level 2 to Level 1 (=Improvement) (Level 1="All Threats detected ...")
    • SurfRight Hitman Pro
    • Watchdog Anti-Malware
    • Zemana Anti-Malware

    These products switched from Failed to Level 2 (=Improvement) (Level 2="At least 97% of threats detected ...")
    • Bitdefender Internet Security
    • Symantec Norton Security
    • Trend Micro Maximum Security

    These products switched from Level 2 to Failed (=Deterioration) (Failed="Security product failed to detect all infections ...")
    • Panda Internet Security
    • ThreatTrack Vipre Internet Security
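
    For anyone who wants to repeat the comparison themselves, here is a minimal Python sketch. The products and grade changes are the ones listed above; the script itself (the dictionaries, the level ordering) is only my own illustration, not anything MRG publishes:

        # Compare "Grade of Pass" between the two PDF revisions.
        LEVELS = {"Level 1": 1, "Level 2": 2, "Failed": 3}  # lower is better

        nov_revision = {
            "SurfRight Hitman Pro": "Level 2",
            "Watchdog Anti-Malware": "Level 2",
            "Zemana Anti-Malware": "Level 2",
            "Bitdefender Internet Security": "Failed",
            "Symantec Norton Security": "Failed",
            "Trend Micro Maximum Security": "Failed",
            "Panda Internet Security": "Level 2",
            "ThreatTrack Vipre Internet Security": "Level 2",
        }
        dec_revision = {
            "SurfRight Hitman Pro": "Level 1",
            "Watchdog Anti-Malware": "Level 1",
            "Zemana Anti-Malware": "Level 1",
            "Bitdefender Internet Security": "Level 2",
            "Symantec Norton Security": "Level 2",
            "Trend Micro Maximum Security": "Level 2",
            "Panda Internet Security": "Failed",
            "ThreatTrack Vipre Internet Security": "Failed",
        }

        for product, old in nov_revision.items():
            new = dec_revision[product]
            if new != old:
                verdict = "Improvement" if LEVELS[new] < LEVELS[old] else "Deterioration"
                print(f"{product}: {old} -> {new} ({verdict})")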
     
  12. Rasheed187

    Rasheed187 Registered Member

    Joined:
    Jul 10, 2004
    Posts:
    17,546
    Location:
    The Netherlands
    LOL, you might say it's dangerous that certain people on this forum are actually promoting Win Defender. I'm sorry, but those are really bad results; the same goes for MBAM.
     
  13. Minimalist

    Minimalist Registered Member

    Joined:
    Jan 6, 2014
    Posts:
    14,881
    Location:
    Slovenia, EU
    I don't think it's dangerous; IMO it's a decent AV. Dangerous would be advising regular users not to use an AV at all.
    Whether a system gets compromised or not has, IMO, more to do with people's computing habits than with the AV they are using.
     
  14. itman

    itman Registered Member

    Joined:
    Jun 22, 2010
    Posts:
    8,592
    Location:
    U.S.A.
    It just so happens that I saved the original Nov. report. There are differences in scores for every vendor in the "In The Wild 360/Full Spectrum Test" between the original report and the one currently posted. The most notable score changes were for Panda and AVG.

    A couple of possible reasons:

    1. The original report was in error.
    2. MRG retested using different malware samples.
    3. MRG retested using a different methodology.

    The only one that knows the reason for sure is MRG. If the discrepancy is of concern, you should contact them directly and ask why.

    -EDIT- I will add that the proper thing to do is to note on the current report that it is a correction/revision of the original one. However, it appears this procedure is "alien" to many web publications.

     
    Last edited: Feb 5, 2017
  15. Rasheed187

    Rasheed187 Registered Member

    Joined:
    Jul 10, 2004
    Posts:
    17,546
    Location:
    The Netherlands
    You are kidding me, right? It can be the difference between a nasty ransomware infection and none at all. So I would always advise using a top-quality AV.
     
  16. Minimalist

    Minimalist Registered Member

    Joined:
    Jan 6, 2014
    Posts:
    14,881
    Location:
    Slovenia, EU
    Yes it can, but the likelihood that the AV's detection is the deciding factor when it comes to infections (at least in the long run) is, IME, not big.
     
  17. guest

    guest Guest

    It could be No. 1 or No. 2, but only MRG knows the real reason.
    And four weeks after the initial release of the PDF, with major changes to the "certification awards", I would have expected a note in the file.
    But there is none. It's like a "silent" revision.
     
  18. ExtremeGamerBR

    ExtremeGamerBR Registered Member

    Joined:
    Aug 3, 2010
    Posts:
    1,351
    I wouldn't trust any AV. Using only an AV and nothing more is dangerous.

    WD + SmartScreen + Edge or Chrome is a very effective combination.
     
  19. clubhouse1

    clubhouse1 Registered Member

    Joined:
    Sep 26, 2013
    Posts:
    1,124
    Location:
    UK
    Zemana is drubbing MBAM.
     
  20. itman

    itman Registered Member

    Joined:
    Jun 22, 2010
    Posts:
    8,592
    Location:
    U.S.A.
    That is not what the test report's detection rates showed, even after 24 hours:

    In The Wild 360 test: 64.9% (Initial) + 17.4% (w/24 hrs) = 82.3% Total

    Ransomware: 61.5% (Initial) + 11.5% (w/24 hrs) = 73.0% Total

    Financial malware: 74.3% (Initial) + 13.5% (w/24 hrs) = 87.8% Total

    This report also casts doubt on Microsoft's claims of a fast response in detecting malware and loading its identification into SmartScreen's reputation database.
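
    Just to make the arithmetic behind those totals explicit, here is a quick Python sketch; the percentages are the ones quoted above, the framing is my own:

        # Recompute the totals quoted above: the report lists an initial
        # detection rate plus what was additionally caught within 24 hours.
        rates = {
            "In The Wild 360 test": (64.9, 17.4),
            "Ransomware": (61.5, 11.5),
            "Financial malware": (74.3, 13.5),
        }
        for test, (initial, within_24h) in rates.items():
            print(f"{test}: {initial}% + {within_24h}% = {initial + within_24h:.1f}% total")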
     
  21. itman

    itman Registered Member

    Joined:
    Jun 22, 2010
    Posts:
    8,592
    Location:
    U.S.A.
    There is also another possible reason for the discrepancy between the initial and current report that I forgot to mention.

    It is a standard but not widely publicized practice for the AV labs to contact test participants after initial testing and allow them to take corrective action on any discrepancies found. Another test is then performed using the same methodology as the initial test, with those results being treated as final.

    The above is in marked contrast to the procedure employed by AV-Test in its past testing of MSE. After protest by Microsoft, AV-Test developed a unique test methodology to accommodate MSE's preference for detecting the most prevalent malware in existence. Many in the AV testing community viewed this accommodation as "controversial."
     
    Last edited: Feb 9, 2017
  22. guest

    guest Guest

    You mean that the samples AV-C uses are the most widespread ones, and not just random?
     
  23. itman

    itman Registered Member

    Joined:
    Jun 22, 2010
    Posts:
    8,592
    Location:
    U.S.A.
    Their current File Detection test methodology is:

    File Detection Tests

    Aims of the test and target audience

    The File Detection Test assesses the ability of antivirus programs to detect malicious files on a system. Whilst the test only assesses one antimalware feature of the programs, this feature is important for a security solution. This is because it can identify malware attacks from sources other than the Internet, and detect malicious files already present on the system.

    Test Procedure

    We install each antivirus program on its own physical PC, and update the signatures. The malware sample files are then scanned using the program’s standard scanning procedure, and the number of detections is recorded. The PCs remain connected to the Internet during the test, meaning that the security programs can use any cloud features provided by their manufacturers.

    Typically, more than 100,000 malware samples are used for the test. These are prevalent malicious files of all types that have been recently collected, i.e. within a period of a few weeks or months before the test is performed.

    -EDIT- My quote on this is: "Old malware never dies. Instead of gently fading away, it resurfaces in a new, tweaked version." Many AV vendors employ "generic" signatures to detect these "new and improved" versions. Eset, for example, uses the term DNA signatures for such detection.
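
    To make the idea concrete, here is a toy Python sketch of what a "generic" signature does; the pattern and sample bytes are invented, and this is of course nothing like Eset's actual DNA engine:

        import re

        # A "generic" signature matches a family trait that survives minor
        # tweaks, rather than hashing one exact file.
        # Pattern and samples are invented for illustration only.
        GENERIC_SIG = re.compile(rb"EVIL_LOADER_v\d+\.\d+")

        malware_samples = {
            "original.bin": b"...EVIL_LOADER_v1.0...",
            "tweaked.bin": b"...EVIL_LOADER_v1.1...",  # the "new and improved" version
        }

        detected = [name for name, data in malware_samples.items()
                    if GENERIC_SIG.search(data)]
        print(f"Detected {len(detected)} of {len(malware_samples)} samples: {detected}")
        # Both variants hit the same signature, which is the point.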

     
    Last edited: Feb 9, 2017
  24. Rasheed187

    Rasheed187 Registered Member

    Joined:
    Jul 10, 2004
    Posts:
    17,546
    Location:
    The Netherlands
    Of course there is some truth in this, but we can't be sure without large-scale statistics. So I still wouldn't recommend Win Defender just because it might be "good enough". I also wonder how the new MBAM v3 would perform, since they now advertise it as a full AV replacement.
     
  25. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    We did not change the test to "accommodate" Microsoft. You likely mean the commissioned Microsoft report, which is based solely on Microsoft data and calculations (the test was the same; just other calculations etc. were applied). For data based on various telemetry sources, the report/results of the FDT are valid. As later results of our FDT matched various recalculations of prevalence, there is currently no further need for separate reports with only one prevalence source.
    P.S.: Most testers focus on prevalent malware. If you look e.g. at AV-Test, they call it "Detection of widespread and prevalent malware discovered in the last 4 weeks", and AMTSO also suggests using prevalent samples seen in the real world.

    Their "protest" in 2013 was towards another lab - we were already focusing on prevalent samples and had experience in handling those metrics, which is why MS chose us to run some experiments (e.g. the report based solely on MS prevalence, etc.).
     