AV-Comparatives Retrospective / Proactive Test May 2011 released!

Discussion in 'other anti-virus software' started by clayieee, May 25, 2011.

Thread Status:
Not open for further replies.
  1. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    ...and the retrospective test report of AV-C describes quite well what it tests and what not.

    P.S.: a few days ago I predicted that in some years vendors might start asking to opt out of WPDT tests because they score lower than other vendors. I was wrong about the timing: it happened today, not in some years. One vendor already asked about opting out of the Real World Dynamic Test (WPDT).
     
  2. Matthijs5nl

    Matthijs5nl Guest

    I already have got two beta licenses :D. Using one for some testing, using the other one on my laptop.
     
  3. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    8,251
    Location:
    The land of no identity :D
    Every single type of test is important, not just what vendors say. IMO, the retrospective tests have their rightful place, as one cannot hope to remain connected 24/7, especially in developing countries. There will always be occasional problems with the internet connection, and the internet is not the one and only way malware spreads. There are things like LANs, Wi-Fi networks, and whatnot.

    In such a scenario, over-dependence on the cloud is never a good thing. If some vendor wants to opt out, so be it. But the testers should not be obliged to test according to the interests and strengths/weaknesses of some vendor.

    Heck, all this discussion makes me think of a kid taking exams: the best performer is still the one with a decent score in all the subjects, regardless of who does well in subject A or B......o_O
     
  4. bellgamin

    bellgamin Registered Member

    Joined:
    Aug 1, 2002
    Posts:
    8,102
    Location:
    Hawaii
    I can readily understand why any vendor of security software would opt-out from tests that are not appropriately structured for assessing that vendor's particular product.
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

    I would be disappointed to see any spreading of the "Matousec syndrome" -- whereby there is a strong inference that organizations which opt-out of a given test (or refuse to pay to be tested) have something to hide. This inference places subtle pressure on organizations to pay for participation in tests that are not fully applicable to their security software.

    "Company 'A' Opted-out" should not become a test organization's implied synonym for "Company 'A' has something to hide."
     
    Last edited: Jun 20, 2011
  5. Spooony

    Spooony Registered Member

    Joined:
    Apr 30, 2011
    Posts:
    514
    Look at AVG. They didn't do that well in most of the tests, but they haven't complained (yet). I think what they do is take a test like this and say, "Guys, look at how poorly our product did. We have some shortcomings; let's look at the weak areas and fix our product."

    So I think vendors are looking at it the wrong way. It's a great way to see where your shortcomings are and how to improve your product. In the real world those shortcomings are going to be exposed, and users are going to switch to another vendor; they change them like underpants these days. So instead of moaning about it and sulking and not wanting to play anymore, take the positives out of the situation. Okay, we didn't do well. In which areas? Okay, people, let's fix it. Let's get a better rating and improve in the areas where we did badly.
     
  6. qakbot

    qakbot Registered Member

    Joined:
    Aug 25, 2010
    Posts:
    380
    I wonder who that was. I guess we will find out in the next test; someone will be conspicuously absent. I have my guesses. You have to wonder why any company in its right mind would pull out of a REAL-WORLD test.

    They are basically saying, "If you test me using a concocted, outdated test, I look great, but in the real world I SUCK."
     
  7. toxinon12345

    toxinon12345 Registered Member

    Joined:
    Sep 8, 2010
    Posts:
    1,200
    Location:
    Managua, Nicaragua
    Indeed, I would expect a product that does well in the retrospective tests to also have good protection in the dynamic tests.

    The opposite is not always true:
    a product with good protection in dynamic tests does not necessarily have good heuristics.
     
  8. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    ...or both (low in both the retrospective and the dynamic test).
    Bottom line: it seems that only a test in which the vendor's product scores well is considered a good test by that vendor.
     
  9. Baserk

    Baserk Registered Member

    Joined:
    Apr 14, 2008
    Posts:
    1,321
    Location:
    AmstelodamUM
    I wonder which one will pull out.
    Looking at the stats from January-June 2011, there seem to be five contenders/worst performers: Webroot, AVG, McAfee, K7 and PC Tools. link
    Likely one of those...
     
  10. toxinon12345

    toxinon12345 Registered Member

    Joined:
    Sep 8, 2010
    Posts:
    1,200
    Location:
    Managua, Nicaragua
    maybe McAfee or K7 :)
     
  11. Noob

    Noob Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    6,491
    There's an upcoming test? :D
     
  12. Nevis

    Nevis Registered Member

    Joined:
    Aug 28, 2010
    Posts:
    812
    Location:
    255.255.255.255
    Correct. Earlier it was McAfee, but now all vendors show only the tests in which they scored well on their websites and treat the other tests as less important.
     
  13. Noob

    Noob Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    6,491
    I guess anyone would do that :rolleyes:
     
  14. Nevis

    Nevis Registered Member

    Joined:
    Aug 28, 2010
    Posts:
    812
    Location:
    255.255.255.255
    That is my point. Those vendors later claim that others have shown incomplete results, and so on, when they do the same thing themselves.
     
  15. qakbot

    qakbot Registered Member

    Joined:
    Aug 25, 2010
    Posts:
    380
    I think this is a totally irresponsible statement coming from you, Andreas. I don't think you can make such a statement globally across all vendors. I for one hold certain vendors in much higher regard than others, and I buy their stories because they make technical sense, and your tests don't.

    You are welcome to peddle your tests as long as some magazines will buy them, but don't try to convince the technical community that there is any technical merit to doing an on-demand retrospective test.
     
    Last edited: Jun 22, 2011
  16. qakbot

    qakbot Registered Member

    Joined:
    Aug 25, 2010
    Posts:
    380
    While this may be true, there are some totally valid reasons for refusing to participate in a completely bogus retrospective test. It tells me NOTHING about how my product will perform when faced with real-world infections.

    Retrospective testing is like testing how safe US gold is in Fort Knox by testing how strong the bars on the safe inside the vault are, by subjecting them to a laser cutter. By doing so, you've effectively ignored ALL the other layers of security: the 50-inch steel shell on the vault, the electronic surveillance around the vault, the physical security of the building in which the vault is contained, the heavily armed security guarding the Fort, the security checks leading up to the Fort, the special clearance to get past the main gate.

    So doing a retrospective test that bypasses all these layers, akin to just testing how those bars hold up to a laser cutter, is about the DUMBEST thing I have heard of.
     
  17. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    @qakbot:
    I wrote product and not products; I wrote vendor and not vendors.
    Please avoid overinterpreting things.

    AV-C does not sell any reports to magazines. If you want to hear just one side of a story, you are free to do so.

    Anyway, speaking in general about the forum and looking at what and how other users (like you) write: why would it be irresponsible for me to say something, but OK for someone else to write his opinion? Do I have to put a long disclaimer under each post? There are only a few forums left (including Wilders) where I sometimes participate, and I would like to be a forum user like anyone else (and no, I am not going to create fake accounts to hide my identity). If I write an "official" statement, it will be noted/seen as such; otherwise I am just here contributing to the forum.
     
  18. bellgamin

    bellgamin Registered Member

    Joined:
    Aug 1, 2002
    Posts:
    8,102
    Location:
    Hawaii
    Andreas, you cannot cease to be who you are -- a highly regarded professional tester. That your comments on security matters carry added weight cannot be avoided -- it is the nature of holding high status in any given field of endeavor. Thus, no matter what you may wish, your forum comments carry the potential to enhance or diminish your perceived objectivity, and to increase or decrease a security company's reputation and profitability.

    If I talk stink about a given security program, it is just another half-baked opinion by someone who is known to be little more than a neophyte & a kibitzer when it comes to security matters. On the other hand, if I were to publicly criticize someone in my own field, where I am still a fairly well-known emeritus (as is actually the case), then that would be a far different matter. I would certainly be called to account and challenged -- not only to prove my allegations but also to prove my lack of personal bias.

    Andreas, you are well-known and highly admired in your field. Your comments in this forum will NEVER be regarded as those of "just another poster." Live with it.
     
  19. MarkKennedy

    MarkKennedy Registered Member

    Joined:
    Jun 16, 2011
    Posts:
    19
    In a sense, "real world" tests are the best form of retrospective test: current products against current threats. Let me use all the weapons in my arsenal against the weapons in yours and see what happens. No layers, no messing around with the infection vectors, no monkeying around with the defs, no monkeying around with the cloud.

    This scenario plays out daily on millions of machines. This is the most relevant test. There are other relevant tests too (pre-infected cleanup, cleanup of machines previously compromised), but tests that attempt to focus on a particular layer are less relevant. Do you care how well the seat belts work in your car, or how safe your car is in an accident (or, more importantly, whether your car can help you avoid the accident in the first place)?

    If a new, revolutionary protection product came into the market today, a properly conducted real-world test would require no modification in order to test it. Run the product and attempt to infect the system. If it blocks the attempt it does not matter what technology or layers the product used, it did its job. This allows the security companies to focus on protecting our customers instead of ways to score well on an artificial test.
     
  20. MarkKennedy

    MarkKennedy Registered Member

    Joined:
    Jun 16, 2011
    Posts:
    19
    That is one way to look at it. But do you want vendors spending resources fixing problems that are not indicative of what real users are seeing on real machines? If we fail in a Real World test, you can be damn sure that we will want to know why, and if it is a problem with the technology (and not the test) then we will certainly fix it. We want to focus on what will help actual users in actual situations. We want tests that reflect that same reality.

    If I test your car and tell you that it performs badly on unpaved roads in winter climates, and you live in Los Angeles, you are not going to be so quick to have the car company spend time and resources addressing those problems, especially when that time and those resources could be spent giving the car better fuel economy and safety.

    The impression of some seems to be that security vendors have limitless resources. Would that it were so.
     
  21. MarkKennedy

    MarkKennedy Registered Member

    Joined:
    Jun 16, 2011
    Posts:
    19
    Come on Andreas, you can do better than that. We score well on plenty of your tests that we do not consider to be good. Same for the other testers. And I am fairly certain same for the other vendors.
     
  22. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    @Mark: you know very well that I am not speaking about Symantec in this case ("both" does not apply to you).
     
    Last edited: Jun 22, 2011
  23. pbust

    pbust AV Expert

    Joined:
    Apr 29, 2009
    Posts:
    1,176
    Location:
    Spain
    I too thought this was an inappropriate comment, Andreas. It is unfortunate that testers such as yourself promote the idea that vendors trash tests in which they don't perform well. You know very well this is not true; it is the exception rather than the rule. On the other hand, the damage that such a comment does to the already delicate relationship between user perception of good/bad tests, testers, and vendors is definitely neither positive nor constructive.
     
  24. firzen771

    firzen771 Registered Member

    Joined:
    Oct 29, 2007
    Posts:
    4,815
    Location:
    Canada
    I wouldn't say his claim is completely unfounded. Many vendors in the past (like Dr.Web and others ;)) have withdrawn from AV-C after seeing consistently low scores, and then miraculously used the excuse that "we don't agree with the methodology" or "our product focuses on 'real' threats, which is why it doesn't perform well in tests". Well, we only really have the vendor's word to go on in that scenario, don't we? I think I'll trust a tester who has no allegiance to a particular product over a vendor with an obvious and completely understandable bias.

    Obviously I'm not saying all vendors will always drop out of any test they do badly in, but vendors have definitely done this in the past, and there is every reason to assume they would.
     
  25. Zyrtec

    Zyrtec Registered Member

    Joined:
    Mar 4, 2008
    Posts:
    534
    Location:
    USA
    Andreas,

    Now that you are here, I would like to ask you a quick question [I know it's off-topic but I wanted to ask it anyway].

    Why are your test results [AV-C] so different from those published by AV-Test?
    Is it something in your approach, the way you test, the number of samples, or what?

    I myself use ESET NOD32 v4.2, and it performs wonderfully [except for a few FPs lately] in your tests, but it performs so badly in AV-T's tests.

    Thanks in advance for your reply, and my apologies for going a bit off-topic.

    Kind regards,

    Carlos
     