AV Comparatives Sep 2013 File Detection Test

Discussion in 'other anti-virus software' started by Inside Out, Oct 14, 2013.

Thread Status:
Not open for further replies.
  1. Inside Out

    Inside Out Registered Member

    Joined:
    Sep 17, 2013
    Posts:
    421
    Location:
    Pangea
    http://www.av-comparatives.org/wp-content/uploads/2013/09/avc_fdt_201309_en.pdf

    - "Macstorm's trio" detected the most samples (in the English version only). I thought CloseGap barely detects/detected anything, but these test results show otherwise. However, considering how some vendors have drastically "improved" their results using dubious methods in the past, its sudden "improvement" seems fishy. :blink:

    - Nice to see Kaspersky on par with BD. :thumb:

    - Disappointing that Emsi's own engine didn't give it an edge over BD. Looks like they just focus on dynamic rather than static detection.

    - ESET's results in this kind of test haven't been very good lately, but users still seem happy with its detection. Maybe ESET is in the right (see the first bullet point). :doubt:

    - Avast o_O

    - Good to see AVC not give in to Symantec's excuses. :thumb:

    - But come on, no one threw up an obscene number of false positives, yet a lot of high performers were demoted as usual.

    - I'd like to see Ikarus tested in a single-product review this year.

    - The false alarm PDF gives a 404. :rolleyes:

    - The Chinese version is more comprehensive than the English one, though (Tencent and Kingsoft between Avira and G-Data, Qihoo surprisingly not above BD).
     
    Last edited: Oct 14, 2013
  2. avman1995

    avman1995 Registered Member

    Joined:
    Sep 24, 2012
    Posts:
    944
    Location:
    india
    Well, it would be much more interesting if the testers ran the samples with the whole AV package present.

    This isn't a real-life scenario, so I'll take it with a pinch of salt. ESET is still a very good suite.

    Speaking of Avast, they have Evo-gen and FileRepMalware running in real time, NOT on demand, and these days those are likely to detect a lot more than the normal virus database will. :)

    Not to forget that, though those samples may be present in the real world, this type of direct binary testing doesn't signify much. What if the proactive module or web guard would have caught the rest in real time?
     
  3. vojta

    vojta Registered Member

    Joined:
    Feb 26, 2010
    Posts:
    830
    "AV Comparatives Sep 2013 Copy&Paste Hashes From VirusTotal Test", according to Kaspersky:

    http://eugene.kaspersky.com/2011/10/18/the-holy-grail-of-av-testing-and-why-it-will-never-be-found/

    "To win in these tests you don’t have to run faster than the bear. Instead you just suck up to the source of the malware used by the most famous testers (and these sources are well-known – VirusTotal, Jotti, and the malware-swappers in different AV companies), and then detect everything that all the others detect; that is, if a file is detected by competitors, to simply detect it using MD5 or something similar. No in-depth research and superior technologies to combat real life attacks are needed."
     
  4. anon

    anon Registered Member

    Joined:
    Dec 27, 2012
    Posts:
    8,006
    Last edited: Oct 14, 2013
  5. Noob

    Noob Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    6,491
    Wow, seems like Emsisoft is taking the FP issue seriously; they even got one fewer than BD. :rolleyes:

    Anyway, congrats to the usual top scorers. I also wonder why they tested Symantec; I thought they withdrew from all tests.
     
  6. Inside Out

    Inside Out Registered Member

    Joined:
    Sep 17, 2013
    Posts:
    421
    Location:
    Pangea
    Never implied I trust it that much, but some don't cheat.
     
    Last edited: Oct 14, 2013
  7. King Grub

    King Grub Registered Member

    Joined:
    Sep 12, 2006
    Posts:
    818
    If you are going to keep testing Symantec in the tests they aren't interested in participating in (static on-demand tests), why do you leave them out of the tests they are interested in participating in ("real-world" tests)? It looks like you want them to look generally bad (even if that isn't the case, it comes across that way).
     
  8. siketa

    siketa Registered Member

    Joined:
    Oct 25, 2012
    Posts:
    2,718
    Location:
    Gaia
    What happened to Symantec? :eek:
     
  9. Inside Out

    Inside Out Registered Member

    Joined:
    Sep 17, 2013
    Posts:
    421
    Location:
    Pangea
    @7anon: Edited the OP to include the Chinese version, sorry.

    PS. And the false positive problem isn't that bad, really.

    If they hadn't run away from a test they supposedly didn't care about in the first place, at least they could have proven that Norton is a good product despite this flaw. Serves them right.
     
    Last edited: Oct 14, 2013
  10. King Grub

    King Grub Registered Member

    Joined:
    Sep 12, 2006
    Posts:
    818
    They only provide a limited definition set these days, the "Smart definitions". You used to be able to select either Smart or Full definitions, but now only Smart is available, a reduced set compared to the Full. Norton does not have a good on-demand scan detection rate; it relies on all components of the program working together for prevention rather than on static detection. Download those missed samples or run them, for example, and Norton would most likely score much higher, with Insight Protection/Download Intelligence and SONAR kicking in, respectively.

    Which is why I think it would be interesting to see Symantec in a real-world test by AV-Comparatives as well, to complement the good results they always get in that type of test from, for example, AV-Test. It just seems weird to me that the only type of test AV-Comparatives includes Symantec in is the very one they obviously do not consider important and don't want to participate in (knowing, of course, that they will do rather badly in it).
     
  11. Noob

    Noob Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    6,491
    If I'm not mistaken, Symantec wanted to participate in SOME tests and AV-Comparatives didn't agree, so they ended up withdrawing from all tests. The reason is that they knew they would perform poorly in the on-demand tests.
     
  12. siketa

    siketa Registered Member

    Joined:
    Oct 25, 2012
    Posts:
    2,718
    Location:
    Gaia
    Thanks, KG!
     
  13. anon

    anon Registered Member

    Joined:
    Dec 27, 2012
    Posts:
    8,006
    Last edited: Oct 14, 2013
  14. tomazyk

    tomazyk Guest

    I agree with you.
    Here is the explanation in the report:
    I guess, since there were frequent requests from readers, Symantec will be included in other tests too? Otherwise this is just bad-mouthing the product.
     
  15. Inside Out

    Inside Out Registered Member

    Joined:
    Sep 17, 2013
    Posts:
    421
    Location:
    Pangea
    @7anon: Webroot had 428 FPs in one test, so Avira's and G-Data's counts are reasonably low for this sample size. I have other misgivings about the latter's in-house engine, though.
     
    Last edited: Oct 14, 2013
  16. anon

    anon Registered Member

    Joined:
    Dec 27, 2012
    Posts:
    8,006
  17. Inside Out

    Inside Out Registered Member

    Joined:
    Sep 17, 2013
    Posts:
    421
    Location:
    Pangea
    Since we're not going to convince each other, let's just agree to disagree.
     
  18. Blueshoes

    Blueshoes Registered Member

    Joined:
    Feb 13, 2010
    Posts:
    226
    Yeah, but guess what. Come on, Norton, step up to the plate. Just use some bandwidth and give full sigs JUST like ALL the others do, and you will have better on-demand results. Problem solved. I say keep running this on-demand test till Norton is a member of the 100% crowd and not the 2% crowd. Screw the "smart definitions" idea.
     
  19. rinem

    rinem Registered Member

    Joined:
    Mar 11, 2012
    Posts:
    156
    So you want the testing organisations to tell the AVs what to do?
    I must say I admire Symantec for sticking to their strategy.

    http://www.pcmag.com/article2/0,2817,2424118,00.asp

    Very interesting.
     
    Last edited: Oct 14, 2013
  20. SnowFlakes

    SnowFlakes Registered Member

    Joined:
    Jun 29, 2011
    Posts:
    194
    Are there people who still believe in this independent AV-Comparatives test?
    :eek:
    LoL
     
  21. m0use0ver

    m0use0ver Registered Member

    Joined:
    Jun 30, 2011
    Posts:
    81
    Call it a form of flirtation and ongoing courtship. We only see the tip of the iceberg in public, and that is when a relationship is on the rocks and the couple is sparring in public.

    No test outfit does anything for free without an agenda, and if they did, there would be a logical reason beyond the publicly stated "our end customers ask us to". More clients = more $'s; it's as simple as that.

    The fact is no AV testing organization is unbiased, as they rely on the $ trail to keep their business models afloat. You don't burn your paying customers; it's not good business practice, after all, and they wouldn't be your customers for long.

    Have to chuckle when a vendor does badly in a test and the testing organization then offers to work with them. Translated into actuality: the vendor enters into a service contract with the testing organization, becoming the paying customer, with the testing organization the paid provider.

    Whenever a provider tests a client, the playing field has been warped beyond much meaningful use. Does anyone genuinely believe clients and non-clients will be treated equally?

    100% detection rates... 99% detection rates... yeah, wake up and smell the coffee.

    These companies have more people working in support for product failures than they do in R&D and research (to prevent them). The simple reason is they all fail at some point, in spite of the marketing hype and the so-called respectable testing organizations' assistance in perpetuating the myth in exchange for $'s (or euros, or whatever denomination they bill in).
     
  22. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    Yes, we are working on it. For this year, only the FDT and performance test results (in which Symantec does well, btw) are published on our website; more is to be expected next year.
     
  23. tomazyk

    tomazyk Guest

    OK! Thank you for your quick answer. :thumb:
     
  24. Sher

    Sher Registered Member

    Joined:
    Oct 19, 2005
    Posts:
    366
    Location:
    Pakistan
    Symantec stole the show, no doubt. Take a bow, Symantec! :D
     
  25. Osaban

    Osaban Registered Member

    Joined:
    Apr 11, 2005
    Posts:
    5,616
    Location:
    Milan and Seoul
    Trend Micro is 10th (14 FPs) in the false positive results and 7th (98.2%) in the total detection rates; as a result it gets Advanced+.

    Avira is 11th (20 FPs) in the false positive results and 1st (99.7%) in the total detection rates; as a result it gets just Advanced.

    I find the criteria of assessment completely flawed, as anybody seeing these results would pick Avira as the real winner. Now, we know Trend Micro performs very well in dynamic tests, but these awards are not fair.
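
    For what it's worth, the awards seem to follow a simple rule: the detection rate places a product in a cluster, and too many FPs demote it one award level. A rough sketch consistent with the two results above (the thresholds here are my guesses, not AVC's published numbers):

    Code:
        def award(detection_rate, false_positives):
            """Hypothetical reconstruction of the award logic: the
            detection rate sets the base award, and 'many' FPs demote
            it one level. Cluster boundaries and the FP cutoff are
            assumptions chosen only to match the two quoted results."""
            levels = ["Tested", "Standard", "Advanced", "Advanced+"]
            if detection_rate >= 98.0:      # assumed top cluster
                base = 3
            elif detection_rate >= 95.0:    # assumed Advanced cluster
                base = 2
            elif detection_rate >= 90.0:    # assumed Standard cluster
                base = 1
            else:
                base = 0
            if false_positives > 15:        # assumed "many FPs" cutoff
                base = max(base - 1, 0)     # demote one award level
            return levels[base]

        # Both land in the same top detection cluster; only FPs differ.
        print(award(98.2, 14))  # Advanced+ (Trend Micro)
        print(award(99.7, 20))  # Advanced  (Avira, demoted for FPs)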
     