AV-Comparatives: Real-World Protection Test February-June 2018

Discussion in 'other anti-virus software' started by anon, Jul 13, 2018.

  1. anon

    anon Registered Member

    Joined:
    Dec 27, 2012
    Posts:
    7,983
  2. xxJackxx

    xxJackxx Registered Member

    Joined:
    Oct 23, 2008
    Posts:
    8,626
    Location:
    USA
    I really don't feel that something that got 111 false positives belongs near the left of the chart, regardless of the fact that it didn't miss anything. It's hard to miss anything when you detect everything.
     
  3. bigwrench9

    bigwrench9 Registered Member

    Joined:
    Oct 28, 2009
    Posts:
    148
    Agreed!

    Also, pretty sad when 99% gets you last place (June, K7).
     
    Last edited: Jul 13, 2018
  4. mekelek

    mekelek Registered Member

    Joined:
    May 5, 2017
    Posts:
    518
    Location:
    Hungary
    Looking at the score from WD, you can tell the samples were at least a week old.
    Yeah, F-Secure hasn't fixed DeepGuard yet; it's still an FP machine.
     
  5. Spartan

    Spartan Registered Member

    Joined:
    Jun 21, 2016
    Posts:
    1,424
    Location:
    Dubai
    When I had F-Secure Antivirus installed, even basic installers like those in the Adobe CC 2018 suite gave me trouble. Say I was installing Adobe Illustrator CC 2018: the installer would hang at 99% forever and never finish until I paused the protection, and then it would finish.

    I don't like using any security product that gets in the way of my workflow, and certainly not with something as basic as installing a popular app such as Adobe's.
     
  6. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    This assumption is wrong... maybe you are not aware that the fresher the samples are, the more likely reputation systems are to classify them as suspicious and therefore score higher (and some products cause more FPs because of that).
     
  7. mekelek

    mekelek Registered Member

    Joined:
    May 5, 2017
    Posts:
    518
    Location:
    Hungary
    If the samples were "fresh" as you claim, Panda would be way higher in FPs than F-Secure, and WD would be under 50% in detection.
    It's sad if you actually count SmartScreen's "not classified" popup as a WD detection...
     
    Last edited: Jul 13, 2018
  8. itman

    itman Registered Member

    Joined:
    Jun 22, 2010
    Posts:
    8,592
    Location:
    U.S.A.
    Well, for starters, this is AV-C's first semi-annual summary for 2018. In other words, results averaged over the last 6 months.

    WD, regardless of the samples used during that period, ended up with 22 out of 1080 samples (i.e. 1.9%) requiring a user decision. That is way too high for an average user. Compound that with 19 false positive detections.

    When I saw that AV-C gave both WD and ESET two-star ratings is when I went outside and threw up.
     
    Last edited: Jul 13, 2018
  9. Rasheed187

    Rasheed187 Registered Member

    Joined:
    Jul 10, 2004
    Posts:
    17,546
    Location:
    The Netherlands
    I see that WD scored quite high on "user dependent"; no wonder, because SmartScreen blocks ALL unknown apps. It's not actually an AV, what a joke. In my eyes it should be marked as a fail because it couldn't decide whether something was malware or not.
     
  10. TairikuOkami

    TairikuOkami Registered Member

    Joined:
    Oct 10, 2005
    Posts:
    3,418
    Location:
    Slovakia
    The chart shows Feb-June by default; select June only and "user dependent" is 0.0%, compromised 0.5%.

    This test shows it more accurately. :shifty:
     

    Attached Files:

  11. mekelek

    mekelek Registered Member

    Joined:
    May 5, 2017
    Posts:
    518
    Location:
    Hungary
    Yep, that's more like it.
     
  12. itman

    itman Registered Member

    Joined:
    Jun 22, 2010
    Posts:
    8,592
    Location:
    U.S.A.
    If we're going to get into the "nitty gritty" in regard to WD's test scores, it performed very well in May and June. This helped to increase its 6-month average score. If WD repeats its May and June performances over the next 6 months, when the next AV-C semi-annual report is issued it might warrant a serious look.

    As far as I am concerned, this is just one AV lab's determination of its performance. All AV lab tests need to be examined in aggregate to make an accurate determination of whether WD has actually become a serious contender to the other third-party AV solutions.

    Then there is the issue of "AV lab test bagging," where a vendor allocates considerable resources to ensure its test scores show significant improvement. Microsoft certainly has the resources for such activity. If a tested vendor engages in it, all its test scores show is that it is proficient at detecting AV lab test samples.
     
  13. mekelek

    mekelek Registered Member

    Joined:
    May 5, 2017
    Posts:
    518
    Location:
    Hungary
    Or it's just flat-out paid-for PR.
     
  14. itman

    itman Registered Member

    Joined:
    Jun 22, 2010
    Posts:
    8,592
    Location:
    U.S.A.
    I will also add that I don't have the disdain many have for WD's use of Win 10's native SmartScreen as its reputation scanner. In that regard, it is pretty good.

    My criticisms of it are:

    1. It only scans files that have "the mark of the web" (i.e. an Internet-download marker) associated with them. There are methods to strip that classification, either during the download itself or after the file has been downloaded (see the sketch after this list).

    2. It runs as an unprotected medium-integrity process. As such, it can be disabled by malware. Later Win 10 versions detect that state and will automatically reload it, but malware only needs milliseconds to load its payload.
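    As an illustration of point 1, here is a minimal Python sketch of how that mark works (assuming Windows/NTFS; the helper name is just for the example). Browsers attach a Zone.Identifier alternate data stream to downloaded files, and a file without that stream gets no SmartScreen reputation check.

    Code:
    import sys

    def motw_zone(path):
        """Return the ZoneId from the file's Zone.Identifier stream, or None."""
        try:
            # The ':Zone.Identifier' suffix addresses the NTFS alternate data
            # stream that browsers write when a file is downloaded.
            with open(path + ":Zone.Identifier", "r") as ads:
                for line in ads:
                    line = line.strip()
                    if line.startswith("ZoneId="):
                        return int(line.split("=", 1)[1])
        except OSError:
            pass  # no stream (or not NTFS) -> no mark of the web
        return None

    if __name__ == "__main__":
        zone = motw_zone(sys.argv[1])
        if zone is None:
            print("No mark of the web - SmartScreen will not check this file.")
        else:
            print("ZoneId =", zone, "(3 = Internet zone)" if zone == 3 else "")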
     
    Last edited: Jul 14, 2018
  15. bigwrench9

    bigwrench9 Registered Member

    Joined:
    Oct 28, 2009
    Posts:
    148
    TairikuOkami said:
    This test shows it more accurately. :shifty:

    Yep! :argh:
     
  16. xxJackxx

    xxJackxx Registered Member

    Joined:
    Oct 23, 2008
    Posts:
    8,626
    Location:
    USA
    Yep. I have personally set up Group Policy to stop files from being marked with this. I'm not sure if it is still being marked via NTFS metadata, but in the past it was, and it was certainly easy to strip off.
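    If it is still the Zone.Identifier NTFS alternate data stream (which it was at the time), stripping it is a one-liner. A minimal Python sketch, assuming Windows/NTFS; the path is only a placeholder:

    Code:
    import os

    target = r"C:\path\to\downloaded-setup.exe"  # placeholder path

    try:
        # Deleting "<file>:Zone.Identifier" removes only that stream, which is
        # exactly what strips the mark of the web from the file.
        os.remove(target + ":Zone.Identifier")
        print("Mark of the web removed.")
    except FileNotFoundError:
        print("No Zone.Identifier stream - the file was not marked.")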
     
  17. bellgamin

    bellgamin Registered Member

    Joined:
    Aug 1, 2002
    Posts:
    8,102
    Location:
    Hawaii
    Even though preceded by the word "or", this innuendo seems to be a serious accusation against a long-time Wilders friend & member. I, for one, don't believe this gossip for a minute.
     
    Last edited: Jul 14, 2018
  18. mekelek

    mekelek Registered Member

    Joined:
    May 5, 2017
    Posts:
    518
    Location:
    Hungary
    You might want to read what I responded to; it was about AV tests generally, which are most of the time paid for by AV companies.
    "provided by Sophos", "provided by Norton"

    Also, it isn't far-fetched that these tests are rigged...
    Like that Polish AV test that always has the only Polish-based AV scoring first, a product that is literally Bitdefender signatures and nothing else.
     
  19. itman

    itman Registered Member

    Joined:
    Jun 22, 2010
    Posts:
    8,592
    Location:
    U.S.A.
    SE Labs does not charge for its public tests:
    https://selabs.uk/en/reports/consumers

    Also, as noted above, it is standard practice to have vendors review test results prior to publication. Depending on the AV lab involved, the vendor is allowed to correct the deficiency, be retested, and not be penalized for the initial failure. Is this really a "real world" simulation? You make the call.
     
    Last edited: Jul 14, 2018
  20. bellgamin

    bellgamin Registered Member

    Joined:
    Aug 1, 2002
    Posts:
    8,102
    Location:
    Hawaii
    You posted your comment in a thread that is clearly titled as concerning tests by AV-Comparatives.

    They taught me about this ploy in pre-law. It's backing up one unsubstantiated negative inference with another unsubstantiated negative inference, a sub-species of "guilt by association."
     
  21. Osaban

    Osaban Registered Member

    Joined:
    Apr 11, 2005
    Posts:
    5,614
    Location:
    Milan and Seoul
    Avira's results are excellent, and I have no reason to doubt AV-Comparatives' integrity.
     
  22. The Hammer

    The Hammer Registered Member

    Joined:
    May 12, 2005
    Posts:
    5,752
    Location:
    Toronto Canada
    They're on the level.
     
  23. imdb

    imdb Registered Member

    Joined:
    Nov 2, 2011
    Posts:
    4,208
    @ Trend Micro users:
    Could you please rate it on a scale from 0 to 10 for its Windows 10 performance?
     
  24. anon

    anon Registered Member

    Joined:
    Dec 27, 2012
    Posts:
    7,983
    What a joke!
    Most of the time there is no need to put the money on the table if it's already given under the table...
     
    Last edited: Jul 15, 2018
  25. shmu26

    shmu26 Registered Member

    Joined:
    Jul 9, 2015
    Posts:
    1,549
    SmartScreen doesn't support checking the reputation of script files, and Windows Defender at default settings is generally not so great against FUD (fully undetectable) scripts. I am guessing that this is the cause of WD's great downfall in the MRG-Effitas test.

    If you just block common script interpreters such as wscript (you can use NVT SysHardener, NVT OSArmor, SRP, or whatever you like), then WD will do much better, especially if you go ahead and enable the ASR rules.
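    As one illustration of the "block the script interpreters" idea (this is not SRP or the NVT tools mentioned above, just the simplest built-in switch), Windows Script Host can be turned off system-wide with a single registry value. A minimal Python sketch, assuming Windows, an elevated prompt, and the commonly documented REG_DWORD form of the 'Enabled' value:

    Code:
    import winreg

    # Disable Windows Script Host so wscript.exe / cscript.exe refuse to run
    # .vbs / .js scripts. Set 'Enabled' back to 1 (or delete it) to undo.
    KEY_PATH = r"SOFTWARE\Microsoft\Windows Script Host\Settings"

    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                            winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "Enabled", 0, winreg.REG_DWORD, 0)

    print("Windows Script Host disabled.")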
     