AV-Comparatives: Real-World Protection Overall Report - March to June 2014

Discussion in 'other anti-virus software' started by anon, Jul 17, 2014.

Thread Status:
Not open for further replies.
  1. SLE

    SLE Registered Member

    Joined:
    Jun 30, 2011
    Posts:
    361
    No, I don't blame you, and I didn't look at whose pack it was (it was an EAM test). I rarely watch YouTube trash tests.

    Read the report. Read the methodology papers on AV-C's homepage; then you'll know more about their sources and can give valid information about that in the future :)

    Mhm, what?? Why?? YouTube tests are nothing beside what I posted above.

    Why do you think testing companies work closely with vendors, M$ and other partners and use things like prevalence data? Why do you think they use special software to really track system changes (and not just some scanners...)? A minimal sketch of that idea is at the end of this post. ...

    Gossip like yours really destroys what people work for and are often trained for. (And no, I don't claim their tests are perfect and without failures - but they are still far more professional than what happens on those private playgrounds. Every vendor and every real researcher will tell you the same.)

    Show that they do... ;-) AV-C can with prevalence data, big vendors can with prevalence data - and you?
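    To make the idea concrete, here is a minimal sketch (mine, not AV-C's actual tooling) of what "tracking system changes" means in practice: snapshot the machine's files before a sample runs, snapshot again afterwards, and diff. The watched directories are illustrative assumptions; real labs also monitor the registry, processes and network traffic.

    # Minimal system-change tracker: hash every file before and after a
    # sample detonates, then diff the two snapshots. All paths are
    # illustrative assumptions for a Windows test machine.
    import hashlib
    import os

    WATCHED_DIRS = [r"C:\Windows\System32", r"C:\Users"]  # hypothetical watch list

    def snapshot(dirs):
        """Map every readable file path to a SHA-256 of its contents."""
        state = {}
        for root_dir in dirs:
            for root, _subdirs, files in os.walk(root_dir):
                for name in files:
                    path = os.path.join(root, name)
                    try:
                        with open(path, "rb") as f:
                            state[path] = hashlib.sha256(f.read()).hexdigest()
                    except OSError:
                        pass  # skip locked or permission-denied files
        return state

    def diff(before, after):
        added    = [p for p in after if p not in before]
        removed  = [p for p in before if p not in after]
        modified = [p for p in after if p in before and after[p] != before[p]]
        return added, removed, modified

    before = snapshot(WATCHED_DIRS)
    # ... detonate the sample on the isolated test machine here ...
    after = snapshot(WATCHED_DIRS)
    added, removed, modified = diff(before, after)
    print(f"{len(added)} added, {len(removed)} removed, {len(modified)} modified")

    A scanner verdict tells you what the product thinks; a diff like this tells you what actually happened on the system, which is the point SLE is making.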
     
  2. 3x0gR13N

    3x0gR13N Registered Member

    Joined:
    May 1, 2008
    Posts:
    850
    Find that info on any reputable site and post a link to the quote, because I sure can't find it. It's the first time I've heard any reputable member state that. :)
    Ok, but my point remains the same.
     
  3. malware1

    malware1 Registered Member

    Joined:
    May 26, 2014
    Posts:
    133

    http://www.av-comparatives.org/wp-content/uploads/testing-methodology.pdf
    page 7
    They get money.

    I never said YouTube reviews are good. But they're still much better than the tests done by AV-C.

    ?
     
    Last edited: Jul 19, 2014
  4. 3x0gR13N

    3x0gR13N Registered Member

    Joined:
    May 1, 2008
    Posts:
    850
    ...which doesn't give any AV vendor an advantage over another - please read the next paragraph. And it's just a small subset of their sources; read the info before that sentence. The Real-World Protection Test (what this thread is about) doesn't receive samples from AV vendors: http://www.av-comparatives.org/wp-content/uploads/2014/07/avc_prot_2014a_en.pdf, page 6
     
  5. SLE

    SLE Registered Member

    Joined:
    Jun 30, 2011
    Posts:
    361
    See the word "also" - all the main sources are listed before it. Your selective quoting is quite unfair, and your statement in the previous post wasn't true, because you suggested all samples are from vendors.

    And? It's their work (they live off it) and a service. I also take money for my own statistical analyses - but that doesn't have any influence on the results.

    Why?

    :) So no answer and no sort of proof of why your tested files should be relevant - proof which AV-C has. Mhm, bad.
     
  6. malware1

    malware1 Registered Member

    Joined:
    May 26, 2014
    Posts:
    133
    Did I say 'only'?

    The samples are fresher.
    I don't understand how files collected by me are not relevant. They're active in the wild; what else is needed?
     
  7. 3x0gR13N

    3x0gR13N Registered Member

    Joined:
    May 1, 2008
    Posts:
    850
    No, but you implied it. Anyway, we're going in circles, I'm sure the mods love that. :)
     
  8. SLE

    SLE Registered Member

    Joined:
    Jun 30, 2011
    Posts:
    361
    Says who? Especially the WPDT uses relatively fresh samples (active links!).

    Because you can't show that they affect users, or whether only you find them. You can't say whether they are active. You can't say how prevalent they are...
    And so you can't say anything about "in the wild". I hope you understand it now.

    Every professional tester can do all of those things, and so those tests meet some relevance criteria.

    btw.: I hope at least you can determine whether your files are really malicious by more than scanning results, because enough examples exist (esp. in the VT area) where many vendors detect some files... but they aren't malware at all ;-) (see the sketch of such a relevance check below)

    Ok - I'll close the circle from my side. I think the arguments are clear, and so at least we stay close to the topic, the AV-C test.
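    For illustration, a minimal sketch of such a relevance check - not AV-C's actual criteria; the data sources, fields and thresholds are all assumptions. The idea: a sample should only enter a test set if telemetry shows real users encountering it and its malicious behaviour is confirmed dynamically, not just by detection counts.

    # Hypothetical relevance filter: prevalence telemetry plus observed
    # sandbox behaviour, because scan-engine hit counts alone are not
    # proof of maliciousness.
    from dataclasses import dataclass

    @dataclass
    class Candidate:
        sha256: str
        prevalence: int         # users who encountered the hash (vendor telemetry)
        vt_detections: int      # scan-engine hits - not sufficient evidence alone
        dropped_files: int      # files written during a sandbox run
        made_autorun_entry: bool

    def is_test_relevant(c: Candidate) -> bool:
        seen_in_the_wild = c.prevalence >= 10       # illustrative threshold
        behaves_maliciously = c.dropped_files > 0 or c.made_autorun_entry
        # High vt_detections with no behaviour is exactly the
        # "detected, but not malware at all" case mentioned above.
        return seen_in_the_wild and behaves_maliciously

    samples = [
        Candidate("ab01...", prevalence=450, vt_detections=38,
                  dropped_files=3, made_autorun_entry=True),
        Candidate("cd02...", prevalence=0, vt_detections=41,
                  dropped_files=0, made_autorun_entry=False),
    ]
    print([is_test_relevant(s) for s in samples])  # [True, False]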
     
  9. taleblou

    taleblou Registered Member

    Joined:
    Jan 9, 2010
    Posts:
    1,349
    I suggest you set the antivirus to aggressive and enable "Hardened Mode"; this alone will flag all files that are unknown or questionable to Avast (not in Avast's whitelist). I found nothing can pass it. It automatically sandboxes them, with the option for you to add a file to the whitelist if you know it's good.

    So anything that gets past the AV will be caught by Hardened Mode. Panda Cloud, in my test, does not have this.
     
  10. Rompin Raider

    Rompin Raider Registered Member

    Joined:
    May 6, 2010
    Posts:
    1,254
    Location:
    Texas
  11. Noob

    Noob Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    6,491
    Well, at least he collects samples and sends them to the different vendors. I think that's something positive; no need to bash AV-C or malware1.
     
  12. avman1995

    avman1995 Registered Member

    Joined:
    Sep 24, 2012
    Posts:
    944
    Location:
    india
    I agree with SLE. YouTube tests are 90% rubbish. They take no effort whatsoever to separate the PUPs/adware/duplicate files/broken stuff from the really malicious ones, and secondly, a lot of the tests are performed by biased people who then call themselves "experts" or pretend to be one, which gives a false sense of security.

    AV-C, yeah, I know they are far from perfect. Plus, they use only URLs, so anything like Panda will get 100% because of the aggressive web-blocking layer they have.

    I questioned one YouTube tester last week, and he started defending himself and tried to throw mud at me to protect himself. LOL, Mr. Noob Security channel!!
     
  13. FreddyFreeloader

    FreddyFreeloader Registered Member

    Joined:
    Jul 23, 2013
    Posts:
    527
    Location:
    Tejas
    Here's what I love from AV-Comparatives' testing methodology page:

    #10 - Vendors will get several weeks to peer-review the results, and if needed - we will correct the results. :isay:
     
  14. 3x0gR13N

    3x0gR13N Registered Member

    Joined:
    May 1, 2008
    Posts:
    850
    What are you implying - that AVC will modify the results to please an AV vendor even if there's no objective reason for it?
    All that means is that AVC will ask for vendors' help in finding any objective mistakes in the results and correct them accordingly... not that AV vendor A showered them with money under the table to modify the results in their favor.
     
  15. SweX

    SweX Registered Member

    Joined:
    Apr 21, 2007
    Posts:
    6,429
    There's no point in trying to explain; the expert testers are in the MT malware hub, where no one executes the samples - that's the right way of testing. :thumb: :rolleyes:
     
  16. SLE

    SLE Registered Member

    Joined:
    Jun 30, 2011
    Posts:
    361
    I like quality management too; it's good and speaks for responsible testing.
     
  17. ronjor

    ronjor Global Moderator

    Joined:
    Jul 21, 2003
    Posts:
    163,883
    Location:
    Texas
    Another off topic post removed.
     
  18. FreddyFreeloader

    FreddyFreeloader Registered Member

    Joined:
    Jul 23, 2013
    Posts:
    527
    Location:
    Tejas
    AV-Comparatives - Panda is the #1 AV on Earth. Here's your chance, all you AV-Comp white knights, to ride to the rescue. :D
     
  19. Frank the Perv

    Frank the Perv Banned

    Joined:
    Dec 16, 2005
    Posts:
    882
    Location:
    Virginia, USA
    AV Comparatives is great. And they keep on getting better.

    Many of those who complain about test techniques have not bothered to read them, and/or don't understand them.
     
  20. century

    century Registered Member

    Joined:
    Oct 13, 2007
    Posts:
    92
    This time around, AV-Comparatives roped in Symantec and G DATA for testing, for informational purposes. The real-world test is exactly the kind of test Symantec believes should be universal, as it exercises the whole product. Symantec would have earned Advanced+ in this test, with a very high detection rate and no false positives.

    Glad to know that Norton would have gotten a top ranking in this test. It will reinforce the faith of Norton users. Thanks for the information.
     
  21. guest

    guest Guest

    In my opinion, every test (pro or not) in the world is wrong anyway, since there is no standard methodology available, and only a fool takes them for granted; if people took them just as an informative snapshot of a specific moment, on a specific system, with a specific methodology, we would avoid these soon-to-be troll threads.

    Some people believe those "pro" test labs to be the one and only truth, and YouTube testers a bunch of unskilled, wannabe-famous pseudo-techies.
    Some others believe they are biased and manipulated, and that the YouTube ones are closer to the truth.
    Others, like me, believe in none of them, except what they see with their own eyes on their own machines.

    Now give me enough funds and I can create a test lab that shows every AV to be a joke or an awesome product; I just have to "orient" my methodology to fit my "needs" - you see my point...

    The final note is that we don't know what happens behind the curtains of those test labs and can only rely on the info they give us. I've always wondered why those labs never do a live session of their tests...

    Thanks
     
  22. m0use0ver

    m0use0ver Registered Member

    Joined:
    Jun 30, 2011
    Posts:
    81
    Dude, you are not far from hitting the proverbial nail on the head.

    Let me recall one recent piece of news to support this.

    Symantec pronounced AV dead and, in a total break from their advertising blurb over the years, admitted that their detection rates were sub-50%.

    This is a vendor finally being honest for once, and boy, does that conflict with their test results from so-called respectable testing entities over the years.

    The fact of the matter is that test results are no more than purchased marketing material in most cases.

    The AV test industry (and it is an industry) is symbiotic with the AV industry as a whole.

    No vendor will pay for bad advertising material, so it has never been in any testing outfit's interest to paint a true picture, especially if that picture is anything less than flattering for its paying clients.

    So, to stay on topic and reference this particular testing outfit:

    Until they publish a league table of their customers and their respective purchased levels of service alongside any test data, all of it has to be taken with a pinch of salt.

    As with all things in this industry, understanding the money trail allows for a clearer perspective on what is going on.

    AV testing results = advertising material for vendors = you get what you pay for. (Rinse and repeat.)
     
  23. Fabian Wosar

    Fabian Wosar Developer

    Joined:
    Aug 26, 2010
    Posts:
    838
    Location:
    Germany
    Except that every vendor pays the same in AVC's case. So how do you explain the drastic rating differences? How do you explain that we, a very small company with fewer than 30 employees, are somehow able to compete with large behemoths like Kaspersky, for example, even though we clearly have far less money available to throw at AVC?
     
  24. m0use0ver

    m0use0ver Registered Member

    Joined:
    Jun 30, 2011
    Posts:
    81
    Seriously, Fabian, Emsi is one of the best performers in the industry and should be accepted as such, but there is not a drastic difference between best in class and worst when $'s change hands.

    That said, I think the pertinent questions would be:

    How many samples do you as a company contribute to AVC's collection of test samples? Do you use a live feed to supply them in real time, and what percentage would that make up of a given test bed?

    Also, let's not forget that part of the service contract is a service where AVC helps you improve your results over time - after all, it makes for better advertising material for your website, sales team, etc.
     
  25. Fabian Wosar

    Fabian Wosar Developer

    Joined:
    Aug 26, 2010
    Posts:
    838
    Location:
    Germany
    It may look that way, but keep in mind that you can't just participate in those tests if you are a new product, either. AVC does not accept low-performing products to begin with. I think the minimum criterion was at least an 80% detection rate. So naturally the difference between competing products is already limited to a 20% variance. However, that is still a huge variance.

    None. That being said, AVC participates in some of the same industry-wide exchanges that we do, for example MUTE. Whether or not they select URLs or samples we shared on such feeds is unknown to us, or to any company for that matter. Given the volume of URLs and files exchanged through those feeds, it would also be extremely difficult for anyone to poison the feed for an unfair advantage (a rough calculation below illustrates the scale).

    Which consists of providing us with the malware we missed or the false positives we had, as well as additional logs if we need them, after the test is done.
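    A back-of-the-envelope sketch of that point, with made-up numbers - the feed volume, vendor contribution and test-set size are assumptions, not AVC or MUTE figures:

    # If a shared feed carries ~100,000 URLs a month, one vendor contributes
    # 500 of them, and the tester draws ~600 test cases uniformly from the
    # feed, how many of that vendor's own URLs land in the test on average?
    FEED_SIZE     = 100_000  # assumed monthly feed volume
    VENDOR_SHARE  = 500      # assumed URLs contributed by one vendor
    TEST_SET_SIZE = 600      # roughly one real-world test's sample count

    expected_hits = TEST_SET_SIZE * VENDOR_SHARE / FEED_SIZE
    print(f"Expected vendor-contributed samples in the test: {expected_hits:.1f}")
    # ~3 of 600 cases (about 0.5%) - far too few to swing a rating, which
    # is why poisoning the feed yields no meaningful advantage.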
     