Dennis Technology Labs: PC Anti-Virus Protection 2011

Discussion in 'other anti-virus software' started by Pleonasm, Sep 8, 2010.

Thread Status:
Not open for further replies.
  1. guest

    guest Guest

    I have already told you, about three times, that I don't care about your test. You can't prove anything: how can we know that all of your malware samples were really detected when you tested them? They might be detected now but perhaps weren't X days ago, so next time make a video.
    If you don't want to believe anything about my test, you can find many videos on YouTube.

    I didn't lie. I tested the latest beta, and none of the changes between it and the final version were related to any improvement in detection; all of the new and old features were fully working.
    According to you, the difference between the final version and the latest beta of Norton 2011 must be abysmal. Have you seen the changelog?


    Since I tested the latest Norton 2011 beta at the end of August, there can't be big differences in detection rates compared with the final version.
    I have shown you video proof of malware from MDL not being detected by Norton 2010 and the Norton 2011 beta, using fewer than 10 samples, but it seems that isn't enough for you.



    I don't even know what you are arguing about here. You are defending Norton as if you worked for them, but nobody attacked Norton. I just said that a testing group like Dennis Labs should change the sources of its malware, since it is easy to find malware not detected by Norton (as with any other AV), yet they have tested Norton three times with the same 100% result: Norton 2010 twice (even though you can easily find Norton 2010 failing against MDL malware on YouTube) and Norton 2011 once. Is this possible? Maybe. But if any random person on YouTube using malware links can bypass Norton 2010 and the 2011 beta, they must be doing something wrong.

    You can keep trolling and putting words in my mouth that I didn't say; you do it quite well.

    You should read more carefully next time. I have been talking about this since the 2010 version, because it relates to two other tests sponsored by Norton and run by Dennis Technology Labs in which Norton scored 100%. If you don't like it, don't read it.
     
    Last edited by a moderator: Sep 9, 2010
  2. guest

    guest Guest

    @begemot64

    You can keep trolling and putting words in my mouth that I didn't say; you do it quite well.

    https://www.wilderssecurity.com/showpost.php?p=1745508&postcount=2

    This is what I said.
    So it is written very clearly from the start that I tested the 2011 beta. I can't understand why you have invented the claim that I tested the final version, or that I tested it with 10 samples from MDL. You must be a bad troll.

    And then this
    https://www.wilderssecurity.com/showpost.php?p=1745753&postcount=22
    It is a recommendation I made to another person so he could draw his own conclusions. I didn't say 10 samples, I didn't say that I tested the 2011 final release, and I didn't say to take samples only from MDL; all of that is your invention, troll.

    My English is not perfect, but it seems that you have some kind of problem when you read: either you have an extremely bad memory or you are a troll.
    Next time, remember that everything you write on a forum remains, so think about that before you start trolling and lying.

    You can apply this to yourself:
     
    Last edited by a moderator: Sep 9, 2010
  3. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    Yes, I agree. The AV-Comparatives whole product dynamic test is, in my opinion, the most enlightening test conducted by that organization.
     
  4. Baserk

    Baserk Registered Member

    Joined:
    Apr 14, 2008
    Posts:
    1,321
    Location:
    AmstelodamUM
    guest & begemot64,

    This kind of back and forth bickering is becoming extremely uninteresting.

    I'm not saying that either of you is right or wrong, and I don't care who calls whom an idiot or a troll, but for -insert random deity-'s sake, give it a rest.

    Or continue this via pm, email, chat, chitchat or whatever appropriate channel. Please!

    I don't want to offend either one of you, but let's discuss the 'merits' of this particular 40-sample 'test'; that's difficult enough, imo.
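    [As an aside on the statistics of a 40-sample test: even a perfect 40/40 score carries wide error bars. A rough sketch, using only the special all-samples-detected case of the exact Clopper-Pearson bound (the function below is illustrative, not part of any lab's methodology):

    ```python
    import math

    def clopper_pearson_lower(x, n, alpha=0.05):
        """Exact lower confidence bound for a binomial detection rate.
        When x == n (every sample detected), the bound simplifies to
        (alpha/2) ** (1/n); other cases need the inverse beta CDF."""
        if x == n:
            return (alpha / 2) ** (1.0 / n)
        raise NotImplementedError("only the x == n case is sketched here")

    # 40/40 detections still only guarantees ~91% at 95% confidence.
    lower = clopper_pearson_lower(40, 40)
    print(f"95% lower bound for 40/40 detections: {lower:.1%}")
    ```

    In other words, a 100% result on 40 samples is statistically consistent with a true detection rate anywhere above roughly 91%, which is one reason small-sample tests get argued over.]
    
    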
     
  5. begemot64

    begemot64 Registered Member

    Joined:
    Jul 28, 2010
    Posts:
    71
    I'm quite happy to continue by PM, for the sake of the thread. Let's get back to discussing the results of the 2011 test, in which products such as NIS 2011 (not NIS/KIS/whatever-else 2011 beta, and not 2010) were tested:

    Pleonasm's point that it is self-destructive for a security firm, especially one as big as Symantec, to willingly engage in deception is a very valid one.

    There has always been a fine line between marketing/spin and deception: it's easy to build a test set whose methodology favours a certain product, and to give your own "angle" on the results.

    On the other hand, Symantec is one of the biggest security vendors, and given that consumer products form only part of its business portfolio, commissioning a dubious test is simply not worth ruining its reputation over.

    With all sponsored tests, it's a good idea to carefully read the methodology chosen, and take results in a wider context - especially in context of other tests that have been carried out. Testing security suites is always methodologically difficult, and different tests will always give slightly (or sometimes significantly) different results. What matters is the general trend of similar tests.
     
  6. andyman35

    andyman35 Registered Member

    Joined:
    Nov 2, 2007
    Posts:
    2,336
    That pretty much echoes my sentiments on the issue.

    I really can't see why both parties would take such a huge risk to their respective reputations for such a negligible gain. It's not as if Symantec are going to double sales of NIS on the strength of this test.
     
  7. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    In addition, it would be pointless for Symantec to ‘influence’ the result of a single anti-malware comparative test, because other tests would contradict the fraudulent findings. Thus, a strategy of ‘test manipulation,’ for it to be successful, would need to be widespread across testing organizations and across time. Implementing it would require a highly coordinated and comprehensive plan to deceive the public, which makes the likelihood of its existence negligible, in my opinion.
     
  8. woody79_06

    woody79_06 Registered Member

    Joined:
    Sep 8, 2010
    Posts:
    3
    I don't put much stock into any of these tests.

    These tests do not evaluate the products in a real-world usage scenario, which IMO makes them marketing rubbish.

    A better test, for example, would be:

    hand the software out to 10 home users (who are not computer-savvy), have them install it and run it over a six-month period, and then check the computers afterwards for malware.

    The product has to be tested in real-world usage scenarios for the test to deliver real, measurable results. Until that is done, these tests are just marketing garbage and prove nothing.
     
  9. clocks

    clocks Registered Member

    Joined:
    Aug 25, 2007
    Posts:
    2,787

    That would be great, but it is completely unrealistic.
     
  10. JerryM

    JerryM Registered Member

    Joined:
    Aug 31, 2003
    Posts:
    4,306
    I tend to agree. However, I would question the results of such a test since the average home user might very well click on the wrong thing, and download some rogue.
    Of course one could argue that the AV should catch it, but I have noticed that the best ones sometimes fail to prevent a rogue. For that reason I use another layer, MBAM.

    I have to conclude that tests such as AV-Comparatives', and maybe this one, are as complete and useful as can realistically be done.

    Regards,
    Jerry
     
  11. woody79_06

    woody79_06 Registered Member

    Joined:
    Sep 8, 2010
    Posts:
    3
    How do you figure?

    Nielsen puts boxes on people's TVs to get channel ratings... it's a completely realistic test.

    The reason they won't do it is that they would have a hard time controlling the results.

    These tests are rubbish.

    They do not follow the scientific method or the basics of sound modelling, which makes them garbage with skewed results. Who cares which product wins these disingenuous tests; it still makes them garbage.


    They could very easily hand out a six-month trial to people and have them run it with a data-collection tool for that period. They won't do that, though, because they wouldn't get the results they want.

    If these tests followed the scientific method, that would be fine, but they don't, and I rule them rubbish on that basis alone.

    You can't get truly accurate results from any test or hypothesis without following the scientific method; you have to test these products under real-world usage scenarios.

    These tests are like testing the four-wheel-drive capability of a bicycle against a Dodge Ram. Can we get any more disingenuous?

    As I said before, I couldn't care less which product wins what, but these testing methods are severely flawed beyond measure, and anyone who bases a decision on flawed data is making a bad decision.
     
  12. woody79_06

    woody79_06 Registered Member

    Joined:
    Sep 8, 2010
    Posts:
    3
    True, but isn't "clicking on the wrong thing" a function of the software? Isn't that a real-world testing scenario? It is. AVC and the others produce skewed results.

    No AV is going to block everything; let's be realistic here.

    However, a test like the one I mentioned would show us how much protection these products actually give in the real world. Tests like AVC's offer a false sense of security, because they are not real-world tests.

    I figure that if every vendor participated in a test like the one I described, 100 users over a six-month period, I bet the "real" malware detection/block rate would be somewhere in the high 20% range, or, if we're lucky, the low 30s. That's the real world, but the Symantecs, McAfees, and others don't want "you" to know that.
     
  13. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    The critical flaw in this approach, in my opinion, is that you have no guarantee that some of the users would not ‘manipulate’ their Internet usage and behavior so as to make a specific product look good or bad, thereby making the test results highly questionable. If there were a way, however, to mask or hide the name of the product from the user in order to conduct a 'blind' test, then such a bias would be overcome.

    In essence, the idea you are suggesting is known in the scientific community as a “randomized controlled trial” (see here for a description of this methodology).
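    [The random assignment and blinding that such a trial relies on can be sketched in a few lines; this is a toy illustration under assumed names, not any testing lab's actual procedure:

    ```python
    import random

    def randomize_assignments(users, products, seed=None):
        """Randomly assign each user to one product arm, and build blinded
        labels ("Product A", "Product B", ...) so that reports can be
        graded without revealing which AV is which until unblinding."""
        rng = random.Random(seed)
        arms = {p: f"Product {chr(65 + i)}" for i, p in enumerate(products)}
        assignments = {user: rng.choice(products) for user in users}
        return assignments, arms

    assignments, arms = randomize_assignments(
        ["user1", "user2", "user3", "user4"],
        ["NIS 2011", "KIS 2011"],
        seed=42,
    )
    print(assignments, arms)
    ```

    The blinded labels address exactly the bias Pleonasm raises: a user who doesn't know which product they are running cannot deliberately make it look good or bad.]
    
    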
     
  14. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    The results would, I suspect, vary widely according to the extent to which users engage in ‘safe computing’ practices (e.g., not opening attachments from junk emails, etc.). In my own case, I have not had a single malware infestation in about five years, due to the use of a high-quality anti-malware product supplemented by a dose of common sense.
     
  15. andyman35

    andyman35 Registered Member

    Joined:
    Nov 2, 2007
    Posts:
    2,336
    Very true :thumb:
     
  16. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    FYI: “Dennis Labs recently had its real world security testing methodology reviewed by the Anti-Malware Testing Standards Organization (AMTSO) and received 100% compliance” (see here).
     
  17. bellgamin

    bellgamin Registered Member

    Joined:
    Aug 1, 2002
    Posts:
    8,102
    Location:
    Hawaii
    Now if only some outfit would review the reviewing methodology of AMTSO. :cautious: ;) :shifty: o_O
     