AV_Test: The Ultimate Endurance Test?

Discussion in 'other anti-virus software' started by silverfox99, Dec 21, 2012.

Thread Status:
Not open for further replies.
  1. Fuzzfas

    Fuzzfas Registered Member

    Joined:
    Jun 24, 2007
    Posts:
    2,753
    So has Twister. 99,98% at VB100 actually. Right up with the top dogs and better in RAP than Comodo 1 year ago.

    http://www.filseclab.com/en-us/news/12011113.htm

    Of course i still wonder why i never saw such perfect percentages even on old, publicly available malware, but i guess i was always catching it on a bad day and managing to find malware too rare for the guys at VB to find. :D :D

    P.S.: Is there ANY antivirus that hasn't passed VB100 at some point with flying colours?
     
    Last edited: Dec 24, 2012
  2. The Red Moon

    The Red Moon Registered Member

    Joined:
    May 17, 2012
    Posts:
    4,101
    The only real tests i take into account are the ones on my own computer, and i feel Comodo provides adequate protection. I hardly come across malware (touch wood o_O ), but i'm sure CAV will grab it sure enough.

    Everybody seems to change their security suites based on test results, which i find peculiar as they can vary day by day.
    I'm just sticking with something that works for me. :thumb:
     
  3. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    8,251
    Location:
    The land of no identity :D
    VB100 is the benchmark - if you don't pass it, don't sell an anti-malware product. :p

    As far as RAP goes, I am observing that it seems to favour products with a higher update frequency, which means it's probably a test of how quickly a vendor reacts to ITW malware rather than a "regular" on-demand test.
     
  4. Fuzzfas

    Fuzzfas Registered Member

    Joined:
    Jun 24, 2007
    Posts:
    2,753
    For me, VB100 is less than a benchmark. It's a certification that everyone can get in order to use it as a marketing symbol, by putting the badge on their website, and that's it. Which is also the reason companies even pay to participate in tests, or finance tests in magazines etc. They want the "award" to put on the website.
     
  5. Narxis

    Narxis Registered Member

    Joined:
    Jun 10, 2009
    Posts:
    477
    I don't use or care about COMODO; I just want to see some results and not fanboy opinions. The same goes for Symantec too, but at least they were in these tests until 2011.

    I don't care if the results are bad or good; I just want to see them because it's interesting, that's all. I never dumped an AV just because of tests. I'm currently using 4 different AVs: Kaspersky is on my main PC, Norton IS is on my notebook, MSE is on my older notebook and Avast is on my older PC. So I really don't change just because of test results.
     
  6. The Red Moon

    The Red Moon Registered Member

    Joined:
    May 17, 2012
    Posts:
    4,101
    Nothing wrong with appreciating a good piece of software.o_O :cautious:
     
  7. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    8,251
    Location:
    The land of no identity :D
    It's what I was hinting at :p
     
  8. Noob

    Noob Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    6,491
    Hahahaha they are the "Must have" testing organizations approval. :D
     
  9. PJC

    PJC Very Frequent Poster

    Joined:
    Feb 17, 2010
    Posts:
    2,959
    Location:
    Internet
    Since Tests are done in a Lab, their Results are Lab Results...:rolleyes:

    -Do you have any "Real-World" Results to share?
    -Results that are better than the Lab Results of AV-Comparatives and AV-Test?

    If my kids take an F on a Test, I will try to explain What
    caused such a Poor Performance.

    Most importantly, I will do Whatever is Necessary to Improve
    their Skills so that they score much Higher next time!

    I will improve my Kids' Test Performance
    instead of Throwing Mud at the Tester!

    Participating in the Tests of AV-Comparatives and AV-Test and then,
    severely Questioning their Methods/Results is Not Decent/Honest at all! :thumbd:
     
    Last edited: Dec 25, 2012
  10. er34

    er34 Guest

    Which is not real world, right? :thumb:

    Everybody here has such results - their own results/experience.
    What I am trying to say is that everybody needs to evaluate programs on their own and not care about test-lab results. How certain products perform, how the vendor acts, the vendor's reputation and abilities, very recent history - these are far more important than some non-real-world lab tests.

    Me, too. I'd do the same if it were necessary - if the methodology were like real-world tests. Let's imagine you are right - the tests themselves are fine and correct - but they are still just lab tests, which can't show all of a product's protection/repair mechanisms. Tests are done by automated systems, which is far from how the average Joe uses their computer and becomes infected.

    COMODO questions not only the methods but the way these organizations act and call themselves independent. From what Melih posted and what I have seen in other forums, what they do is pure business. And when there is business and a lot of money, there is no independence.

    Why do you think Symantec left - their results were not bad at all. Why do you think Microsoft does not participate in all tests done by these organizations? Have you thought about why Panda wasn't part of Virus Bulletin tests for so many years (they later returned because of marketing, but that's another story)?
     
    Last edited by a moderator: Dec 25, 2012
  11. PJC

    PJC Very Frequent Poster

    Joined:
    Feb 17, 2010
    Posts:
    2,959
    Location:
    Internet
    Melih is not exactly the most reliable figure in the Security Software Industry...
    Remember? Also, HERE, HERE, and HERE.
    Symantec "sponsored" Tests at Dennis Technology Labs and NSS Labs!
    AV-Comparatives and AV-Test do Not operate like this...
    Judging from MSE Scoring in the latest Tests of AV-Comparatives and AV-Test...:rolleyes:
    They returned just 'because of Marketing', ah?
    What a "convenient" explanation...:rolleyes:

    BTW, I haven't seen complaints from Bitdefender, Kaspersky, F-Secure, and G Data for example.
    It is the Poor Performers that always Complain, not the Top Performers of AV-Comparatives and AV-Test!
    -Do you have the Knowledge and Professional Expertise of Andreas Clementi and/or Andreas Marx?

    -Do you have the Infrastructure of AV-Comparatives and/or AV-Test?

    You have systematically tried to diminish the Reliability
    of AV-Comparatives and AV-Test for obvious reasons...:blink:

    At the same time, you have FAILED to present:

    -a more Reliable AV-Testing Methodology than the one of AV-Comparatives and AV-Test!
    -more Reliable AV-Testing Results than the ones of AV-Comparatives and AV-Test!

    Before constantly Criticizing the Reliability of AV-Comparatives and AV-Test,
    you need to establish some Reliability of your own! :rolleyes:

    P.S.
    AV-Comparatives offers REAL-World Test Results!
    http://chart.av-comparatives.org/chart1.php#
     
    Last edited: Dec 25, 2012
  12. er34

    er34 Guest

    I want to note that I am not a fanboy or supporter of COMODO, but someone who looks at the picture from a third-party point of view. COMODO and Melih might not be the most reliable figures in the industry, BUT he is the one who does not care that much about damaging his reputation - he has many times told the truth when others remained silent.

    Other vendors have done the same. Symantec left AVC before because:
    - their products (despite all the problems, their products are modern, their lab is big and stable enough, and the company has resources) do not excel at static scanning, and static scanning of files is something that can lower their reputation because of s****d users who care very much about tests
    - static scanning of tons of files is far from the real-world scenario in which people actually get infected


    Really? You really think this? o_O

    They do the job and the tests for free? AVC doesn't want 8000 euros for participating in the main tests?

    "AV-Test commissioned test" (type this into Bing and you'll see all vendors have paid AV-Test for such tests)


    http://www.kaspersky.com/images/AV_Test_Real_World_Testing_Report-10-139045.pdf

    Microsoft does have the money to pay, but I think they don't want to, because their product does not fit the LAB requirements of AVC's automated systems, just like COMODO. Additionally, Microsoft's security strategy (I have posted about this before) includes protection from different layers and different Windows technologies. Testing organizations are very limited, and they do not test all Microsoft technologies against cyber threats.

    The sooner you realize that the LAB is different from the REAL WORLD (in every aspect of life), the better for you. And the REAL WORLD is what really matters. Microsoft might come last in many tests, but its market share and popularity continue to grow. People use their products - there must be some reason for that, right?

    What a true explanation! You can find Panda blog posts from before they were part of these tests (for example for VB), and then later they returned. But nothing has changed in the methodology that VB or many others use. So what is the reason? By participating in such tests they get bigger popularity. No matter if you are number one or number last - popularity is what matters - there is no bad advertisement.

    And you never will. These vendors pay organizations for marketing - the vendors know better than me and you what the real world is. As long as they get unrealistically good results and good marketing, nothing else matters. Microsoft also does not complain (despite being a poor performer - your words, not mine), but their marketing strategy is completely different.

    Do you know me?
    No.

    Do you know who I am ?
    No.

    Do you know what knowledge and expertise I have?
    No.

    So why do you try to judge me? What I have, I keep for those who need to know it - not you. I will not make good marketing for myself with my expertise, and I will not use it against others. And this has nothing to do with my statements - everybody has the right to express his/her opinion, valid or not. I consider my statements valid.


    I am not trying to diminish anybody's reliability. I am just saying that such lab tests performed by today's testing organizations are not that real-world, and their results must be taken with a pinch of salt. Even the labs themselves state that the tests are not the only thing that matters. However, some people think the tests are all that really matters and nothing else.


    I don't need to do that.

    No, these are not real-world tests. This methodology is far from what most computer-illiterate users would do and far from how they get infected. Real-world users run either old or not-updated versions of protection software, some lack any at all, some have unpatched software, and many run with different product configurations. Additionally, there are many other protections and modules that organizations currently do not test and do not take into consideration, such as Windows technologies, browser protections, and, most important of all, the human behind the machine and his/her reactions.
     
  13. PJC

    PJC Very Frequent Poster

    Joined:
    Feb 17, 2010
    Posts:
    2,959
    Location:
    Internet
    -Search results from Bing...(I'm Not Surprised at all; how original...:rolleyes:)
    -AV from Microsoft (MSE)...
    -Internet Browser from Microsoft (IE)...

    'Microsoft does Everything Perfect', ah?...:rolleyes:

    If that were true, then Mozilla, Google, and the rest of the 3rd-Party Vendors (Security and otherwise)
    wouldn't have survived!

    If you equate the Tests of AV-Test with the ones of NSS Labs and Dennis Labs,
    then you certainly don't know much about AV Testing...:rolleyes:
    Do you have any Proof against Bitdefender, Kaspersky, F-Secure, and G Data or
    is this just another "Big-Talk" of yours?

    AV-Comparatives Real-World Protection Test
    http://chart.av-comparatives.org/chart1.php#
    I am Not surprised at all that even this kind of Test is Not Accurate enough for you...:rolleyes:
    The only Test you would consider "Accurate" would be the one in which MSE could be on Top...:rolleyes:
    But such a Test does Not exist...:rolleyes:
    You are in No position to know what Percentage of Users run Old or Not Updated Security software!
    I wouldn't call *STUPID* the users (HERE and HERE)
    who care about static-scanning tests (like the ones performed by AV-Comparatives/AV-Test)!

    Especially, when the Criticizers of these Tests have FAILED to offer more Reliable Tests!

    To end this issue (since you continue Throwing Mud at AV-Comparatives and AV-Test
    and at the AV Vendors that Scored High in these Tests):

    1. Setup your OWN AV-Testing Method.
    2. Publish your Results.
    3. Convince the Security software Community that your work is
    more Reliable than that of AV-Comparatives and AV-Test.


    Since you cannot (despite the Knowledge and Expertise you claimed to have...:rolleyes: )
    present something better than
    what AV-Comparatives and AV-Test have offered
    then,...:thumbd:
     
    Last edited: Dec 26, 2012
  14. Syncman9

    Syncman9 Registered Member

    Joined:
    Jul 28, 2004
    Posts:
    113
    Location:
    UK
    This thread caught my eye, and while I can understand why the OP proffered his view, I for one disagree with it.

    Personally, I think it's fairly clear that AV tests can never be true real-world tests, not just because they're done in a lab but because each person's setup is different, their knowledge and experience are different, etc. However, these tests do offer some insight, and in such an active and fluid environment as PC anti-malware, people need some yardstick to measure overall performance.

    I also would never trust the marketing blurb from an AV company; I want something more tangible to help me make that decision. Sure, I can test the software on my own PC, but that won't give me much of an indication of how well I'm really protected - and granted, a lab test won't give me that either. It's just that, for me, something is better than nothing. The more information I have to hand, the better informed a decision I can make.

    So finally, I think it's better to be somewhat supportive of these companies rather than overly critical of them. I would certainly sooner trust them and their methods than self-professed 'experts' posting YouTube tests.
     
  15. Fuzzfas

    Fuzzfas Registered Member

    Joined:
    Jun 24, 2007
    Posts:
    2,753
    IMHO, no private testing centre has the resources and money to run a truly "scientific", "statistically accurate", "real-world test" on a "world scale".

    However, the existing tests are an indicator of the more or less relative strength of products, under the specific conditions and sampling of each organization, which can't be known in depth but must be assumed to be as good as possible.

    The problem arises since the start, with sampling.

    - How do you make the samples representative of the world's malware, since you can't possibly be certain of gathering 100% of the world's malware?
    - Do you build the malware pool according to the prevalence and incidence of different malware types, and how does this vary across geographical areas? For example, can you positively establish that the prevalence of backdoors is, say, 23% of all malware and reflect that in your pool, or do you randomly add to the pool whatever malware you get your hands on?
    - How big will your pool be, and how much of the older malware in it now has so little incidence as to be almost insignificant? The larger the pool of "old" malware, the higher the products will score (good for AV companies).
    - Should you give the same weight to detections of malware that is hardly prevalent as to malware with much wider distribution? Because in the real world, more people will encounter variants of a rapidly spreading malware than an old malware that survives only in a small niche, which most users will hardly ever encounter and which may even be geographically isolated.
    - How much weight will you give to zero-day detection compared to the rest of the samples? Or will you not include them at all in the same test? Ideally, to determine this, you must know how much infection impact zero-day malware has on users. If, say, 20% of users are infected by "fresh" malware, maybe you should adjust the pool to reflect that, being "real world".
    - Will you test dynamically, statically, or both? Will you allow products to use all their defences, or must they be "on equal features"?
    - How will you do quality control, and how many of the testing steps are standardized in order to avoid bias?
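    The pool-composition point above can be sketched with a toy calculation. All the per-category detection rates and pool shares below are made up purely for illustration; the point is only that the same scanner scores differently depending on how the sample pool is weighted:

```python
# Hypothetical per-category detection rates for one product
# (illustrative numbers, not measurements of any real AV).
detection = {"rootkits": 0.99, "trojans": 0.98, "scripts": 0.80}

def overall_score(pool_mix):
    """Weighted-average detection rate for a given pool composition.

    pool_mix maps category -> share of the test pool (shares sum to 1).
    """
    return sum(detection[cat] * share for cat, share in pool_mix.items())

# Same product, two different pool compositions:
mix_a = {"rootkits": 0.40, "trojans": 0.40, "scripts": 0.20}  # script-light pool
mix_b = {"rootkits": 0.10, "trojans": 0.30, "scripts": 0.60}  # script-heavy pool

print(f"script-light pool: {overall_score(mix_a):.1%}")  # 94.8%
print(f"script-heavy pool: {overall_score(mix_b):.1%}")  # 87.3%
```

    Nothing about the scanner changed between the two runs; only the sampling did, which is exactly the bias the bullet points describe.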


    References:

    http://en.wikipedia.org/wiki/Sampling_(statistics)

    http://en.wikipedia.org/wiki/Prevalence

    http://en.wikipedia.org/wiki/Standard_deviation

    http://en.wikipedia.org/wiki/Confidence_interval

    http://en.wikipedia.org/wiki/Bias_(statistics)

    http://en.wikipedia.org/wiki/Statistical_significance

    The links above cover basic statistical principles, used all the time in scientific tests in various fields. Although i am rusty with them now (i did them in the 1st year of university and never liked them), if you don't see them in statistical research, it's not even close to "scientific". In health-related tests, the above is only the start: you then have protocols defined from "above" to follow, and organizations like the FDA and WHO must review everything and confirm that what you did was a truly "statistically significant" test and not some pharmaceutical-company-funded fraud or half-baked drug, made just to make $ at the expense of patients.
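    As a rough sketch of why the confidence-interval link matters here: a headline detection percentage means little without the sample size behind it. The numbers below are hypothetical, and the interval used is the simple normal-approximation (Wald) one, which is only a reasonable approximation for large samples:

```python
import math

def wald_interval(detected, total, z=1.96):
    """Approximate 95% confidence interval for a detection rate
    (normal approximation; adequate for large sample counts)."""
    p = detected / total
    margin = z * math.sqrt(p * (1 - p) / total)
    return max(0.0, p - margin), min(1.0, p + margin)

# The same 90% headline score carries very different certainty
# depending on how many samples were actually tested:
lo, hi = wald_interval(90, 100)
print(f"90/100:     {lo:.3f} - {hi:.3f}")   # wide interval, ~0.841 - 0.959
lo, hi = wald_interval(9000, 10000)
print(f"9000/10000: {lo:.3f} - {hi:.3f}")   # narrow interval, ~0.894 - 0.906
```

    Two labs reporting "90%" and "93%" from small pools may therefore not be measuring any real difference at all.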


    I don't think any private lab has the resources to do tests at such a high level and, honestly, i don't think they would have the will either. Because i have the suspicion that the vendors' tests wouldn't end up at 99,98% success, but much lower. And vendors don't like to pay to see a "77%" "real-world" result, for example. What can you do with a 77% logo on your website? People, in order to buy an antivirus, want to see the "VB100", not a meagre 80%. And if vendors don't pay, labs will close. As with all symbiosis, one needs the other.


    But in the absence of something better, the current tests are at least a relative indicator. The above also explains how different labs can show different results without necessarily "cheating on the test". Cheating is relative to the sampling you do... Suppose a testing lab gets an order from X Antivirus to run a test. Let's say that X Antivirus is particularly good at catching rootkits and trojans but bad at virus scripts, and is particularly good with malware circulating in the US. If you want to boost said antivirus, you increase the percentage of rootkits and trojans of US origin in the pool and reduce the virus scripts. The result will technically be "correct" in execution, meaning the product will flag a higher score without the scanners' results being altered. But it will not be statistically significant, because of the links above. :D That's also why the FDA in the USA or the WHO exist: so that "private labs" won't do the testing of other private companies using arbitrary protocols...
     
    Last edited: Dec 25, 2012
  16. BoerenkoolMetWorst

    BoerenkoolMetWorst Registered Member

    Joined:
    Dec 22, 2009
    Posts:
    4,873
    Location:
    Outer space
    If you read the report you'll see that they do use outdated and vulnerable software on the testing machines.
     
  17. PJC

    PJC Very Frequent Poster

    Joined:
    Feb 17, 2010
    Posts:
    2,959
    Location:
    Internet
    Spot on! :thumb:

    Awesome! :thumb:
     
    Last edited by a moderator: Dec 30, 2012
  18. jo3blac1

    jo3blac1 Registered Member

    Joined:
    Sep 15, 2012
    Posts:
    739
    Location:
    U.S.
    Jesus, why do people argue so much about these tests? Obviously there is no single test that can reflect the actual performance of a product. At the same time, you can't just discredit what these guys have done. I've been looking over their procedures and testing methods, and they look pretty solid. For someone to completely dismiss these tests is complete insanity. But to argue that AV A is better than B because it scored 10% higher is not much better.
     
  19. Cutting_Edgetech

    Cutting_Edgetech Registered Member

    Joined:
    Mar 30, 2006
    Posts:
    5,694
    Location:
    USA
    I hope Eset gets its removal and protection scores up in Version 6! I do believe tests such as these have merit and should be taken seriously.
     