AV-Comparatives Heuristic / Behaviour Test 2014

Discussion in 'other anti-virus software' started by FleischmannTV, Jul 4, 2014.

Thread Status:
Not open for further replies.
  1. FleischmannTV

    FleischmannTV Registered Member

  2. ReverseGear

    ReverseGear Guest

Avast not there?
     
  3. Rohugh

    Rohugh Registered Member

    "This test is currently an optional part of our public main test-series, that is to say, manufacturers can
    decide at the beginning of the year whether they want their respective products to be included in the
    test."


    Obviously Avast chose not to be tested for it.
     
  4. Inside Out

    Inside Out Registered Member

    Don't get why Avira didn't participate.
     
  5. FreddyFreeloader

    FreddyFreeloader Registered Member

    Thanks for the share!
     
  6. SweX

    SweX Registered Member

    Thanks :thumb:
     
  7. zfactor

    zfactor Registered Member

Agreed, was a bit disappointed not to see them included.
     
  8. RejZoR

    RejZoR Lurker

Maybe they did participate, but decided not to get their scores published. This way they check their software, but they don't make the results public.
     
  9. Rohugh

    Rohugh Registered Member

    If they did participate, and the results were good then they would surely publish them - AV companies like to blow their own trumpet. However, if the results were bad ...................... (just a thought) ;)
     
  10. IBK

    IBK AV Expert

    no, they must decide this beforehand. We do not allow this afterwards.
     
  11. Der Alte

    Der Alte Registered Member

When will we see Webroot in a test again, any news on that front?
     
  12. FOXP2

    FOXP2 Guest

It should be noted that the Lavasoft and Forticlient results here display what is most likely referred to by AVC as proactive generic protection (page 6). This is probably the best that can be expected from free software.

    "Cloud Based Behavior Scanning" is present in Fortinet's Managed Client.¹

Lavasoft's Ad-Aware "Advanced Protection" is provided in Pro Security and Total Security.² I question their decision to volunteer the free version in a test so heavily weighted by the feature-laden sets of their competitors' full-blown $uite$. But then perhaps we should appreciate such boldness in light of those who chose not to submit their free versions at all, or even their paid versions (ahem, cough, cough), Avira and Avast.

    Obviously, the bar is raised given the enhancements provided by proactive heuristic and/or behavioral protection.

    I don't know if Tencent is free. But, who cares? :)

    ¹ forticlient-admin-50.pdf
    ² http://lavasoft.com/products/ad_aware.php

    Cheers.
     
  13. Why would you want Webroot in a test again?

    Every time Webroot is tested, the Webroot Kool-Aid drinkers whine and cry and make excuses.


    .
     
  14. RejZoR

    RejZoR Lurker

Yes, but they can still get tested, right? So, they simply decided not to publish the results even before the test was done, but they still participated. Considering they are still working on NG technology, I somehow understand why they have decided to do so.
     
  15. Der Alte

    Der Alte Registered Member

@ FtP Because I am a curious kind of guy :)
     
  16. LoneWolf

    LoneWolf Registered Member

    I don't think so..
    The way I understand it is that they must agree to have their results published regardless of how the software does, good or bad, in order to be tested at all.
     
  17. SweX

    SweX Registered Member

    Yes that's how I understand it too. :doubt:
     
  18. Rompin Raider

    Rompin Raider Registered Member

    Thanks for the post...
     
  19. Triple Helix

    Triple Helix Specialist

@IBK May I ask why so many vendors are not being tested, as the list is getting short? And this is not about Webroot SecureAnywhere; most of us know why they don't get tested anymore. I'm just very curious about it, looking at the 2013 test and now the 2014 test.

    Thanks,

    Daniel
     
  20. zfactor

    zfactor Registered Member

Same here, the list seems to get shorter and shorter each year. I was really hoping to see a few that are absent.
     
  21. RejZoR

    RejZoR Lurker

I don't think so. I think they have to decide whether they want to publish the scores before the test is conducted. If they say NO and they score the best of all participants, they can't publish them even though they were the best out of everyone. Or, if they say YES and they score super poorly, they still have to publish the scores. This way you can't have only nice scores. But if you're unsure about the outcome, you can say no and not be published at all, yet still be tested for your own internal reference. I see no reason why saying "don't publish our scores" means they no longer have the right to be tested. But that also removes their right to use the obtained scores for anything but their internal company use.
     
  22. Minimalist

    Minimalist Registered Member

I don't think that it works like this. If this were true, Symantec would probably still have their software tested. I guess we'll have to wait for @IBK to clarify.
     
  23. ArchiveX

    ArchiveX Registered Member

    ESET 90%

    Emsisoft 85%

    ;)
     
  24. anon

    anon Registered Member

Sometimes it's necessary to explain the self-explanatory to those who find conspiracies everywhere...
     
  25. IBK

    IBK AV Expert

RejZoR is correct. (Although it is not the case for this specific test, we do indeed test some more products internally; also from some vendors that some people think do not want to be tested.)
@ zfactor: this kind of test is likely one of the most difficult to pass. Some may have decided to avoid the trouble of getting embarrassed by results that they can't even remotely predict/guess, or they know they have FPs and would fail due to that, or they do not have effective proactive behavioural protection and rely more on cloud blacklisting; some marketing departments/vendors may not want to take part in any test where results are likely to be under 99% (or worse than their main competitors), etc. There can be many reasons; to users such reasons may be told a bit differently (e.g. unimportant test, wrong/biased test, forgot to participate, etc., or in other marketing language that puts/keeps the product in good light).
Kudos should go to the vendors that are not heavily marketing-driven and who participate publicly in tests (even if they know that their product may still need some further improvements), and who use the feedback/results to improve their product and/or to see which areas need to be fixed [internal testing does not give vendors as much feedback as participating publicly].
     