New Detection Test - Dennis Labs

Discussion in 'other anti-virus software' started by dschrader, Oct 29, 2009.

Thread Status:
Not open for further replies.
  1. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    Let me re-enter. Norton pays for a test, the test indicates Norton did well, so Norton uses it. Vendors pay IBK to test their product and submit a report to them. IBK does not test that product against other products, just against malware. Then, based on the results, it is the vendor's choice what to do with his findings. They paid for it, so it is their right.

    The problem is when you pay someone to test against other products instead of just your own product's ability to fight malware. It will always look devious. Doing it for internal purposes is fine, but like I said in my first post, if the results had been different we would not even be having this conversation.

    Keep in mind, the OP works for Symantec and even jokes in the post that testing paid for by a vendor is suspect.

    It was presented wrong to this site, plain and simple. It doesn't matter whether it was accurate or not, because that went out the door.
     
  2. Zombini

    Zombini Registered Member

    Joined:
    Jul 11, 2006
    Posts:
    469
    I am sure that if Symantec had done this and the free product had done poorly, someone would have said, "What do you expect... it's a free product. How dare they compare free with paid. They should have tried the paid version... blah blah."

    It never ends.
     
  3. Zombini

    Zombini Registered Member

    Joined:
    Jul 11, 2006
    Posts:
    469
    This I agree with. You simply don't go around patting yourself on the back; it's bad form.
     
  4. dcrowe0050

    dcrowe0050 Registered Member

    Joined:
    Sep 1, 2009
    Posts:
    378
    Location:
    NC
    I quit reading the test results as soon as I saw that Symantec was testing 6 full security suites against 4 free and standalone AVs; this is absolutely stupid. I haven't read this whole thread, I just skimmed through it, but what is the motivation behind this? I don't understand how there was even a question whether or not to test the full paid versions. This is a very shady test. I don't know how this was presented, or if it was presented like this in other forums, but from what I've read in this thread Norton has been successful in one thing: damaging their reputation.

    Also, am I the only one who has never heard of Dennis Technology Labs? I try to stay up to date with all the testing, and there are a bunch of testers out there, but I am positive that I have never heard of Dennis Technology Labs. Maybe they are also paid for by Norton??
     
    Last edited: Oct 30, 2009
  5. EliteKiller

    EliteKiller Registered Member

    Joined:
    Jan 18, 2007
    Posts:
    1,138
    Location:
    TX
    I don't think that it's stupid, since the average end user doesn't understand the differences between a free vs. paid antivirus/suite. However, I feel that the paid counterpart should have been included so that one could see the difference. What is stupid is how people continue to discredit a test when their favorite vendor is not in the top 3.
     
  6. dcrowe0050

    dcrowe0050 Registered Member

    Joined:
    Sep 1, 2009
    Posts:
    378
    Location:
    NC
    (Unless I'm missing something, EliteKiller.) This does nothing to help average users understand the differences between the two; all this shows is that you can pay for a security suite and get better protection than with these standalone free AV products. And yes, I agree the paid counterpart should have been included.
     
  7. Macstorm

    Macstorm Registered Member

    Joined:
    Mar 7, 2005
    Posts:
    2,642
    Location:
    Sneffels volcano
    Very true :thumb:

    First their famous "unbiased" PCMag reviews, and now this Dennis Labs one o_O... OMG!!!
     
  8. JamesFrance

    JamesFrance Guest

    It would also be reasonable to complain if your favourite security product was not included in the test. Symantec chose the participants, didn't they?
     
  9. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    I also meant to say that IBK does not get paid by the vendor. The vendor pays AV-Comparatives. IBK works for peanuts. ;)
     
  10. Inspector Clouseau

    Inspector Clouseau AV Expert

    Joined:
    Apr 2, 2006
    Posts:
    1,329
    Location:
    Maidenhead, UK
    :D

    You know I try to stay out of commenting on AV tests. It just raises my blood pressure and ruins, on average, one keyboard per three replies, depending on how bad the test was.

    You simply can't test AV detection (and especially cleaning!) as a one-man show, or even with a small group of people. It's completely impossible.

    There are so many facts (and yes, they ARE facts) that play into this "game" that it's far more than just a "flawed" testing method.

    The first problem starts with the testing organization itself. Do they have experienced people? If so, experienced to what level? Being able to download virus database updates and click "scan" on a folder that contains malware is NO SKILL AT ALL. That's also the reason why we recently see so many "hobby AV tests". It takes no skill at all, and you can easily get malware today.

    However, the REAL problem lies in the PRIORITIES of specific malware. The tester controls WHAT is being scanned/tested, and here lie the first potential flaws. If you test with old malware and Product B scores badly, that doesn't mean Product B protects you less than Product A, which won the test. It may actually protect you much better, but you don't know, because your test set was outdated.

    The next problem is so-called "history malware": malware that IS old but still appears "in the wild" (Netsky, for example). So you have to include specific "old" malware in your tests as well, because if a product missed it, the customer would most likely get infected, even given the fact that it's old. That leads to the next problem: the AV tester NEEDS TO KNOW what's going on CURRENTLY in the malware world. He needs to know, for example, that the big "problems" currently are numerous versions of ZBOT-based backdoors.

    That boils down to MAINTAINING test sets. And it's NOT ok just to add new files to an existing "scan folder."
    It also boils down to maintaining a proper whitelist / FP set. That alone takes a huge amount of resources, because you simply can't copy files you merely think must be clean into this FP set. Some AVs will pick up some files because they might be bundled with adware that other AVs don't report.

    The next question is: how many AV testing guys can reverse engineer malware to the extent that they can argue with professional reverse engineers from AV companies? Honestly, I don't know ANY. Why is that important? Because if some AV company complains that file 123.vxe shouldn't be in the test set, you must be able to explain WHY it is in the test set. You simply can't remove it because 2 or 3 companies said "it shouldn't be there". If you do that, then you're relying on decisions made by the very same companies that you test.

    Reverse-engineering skill is also essential(!) for cleaning tests, for example, because you need to verify, after the cleaning process is done, whether the files in question are still malicious or not. Some products just kill the jump (or call) into the malicious code and call that "cleaning". That basically works (because the virus code never gets activated); however, other products may still report this file as infected because they find fragments of the "dead" virus code inside the file. Who is to blame now? The product that cleaned wrong? Or the product that produces an FP? And is it even an FP? I know the answer. DO YOU KNOW THE ANSWER?
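    [Editor's note: the "killed jump" scenario above can be sketched with a toy byte-level model. All byte patterns here are invented for illustration, not real malware, and the scanners are deliberately naive.]

```python
# Toy model: a "cleaner" that only overwrites the jump into appended
# virus code, versus a signature scanner that still finds the dead body.

HOST = b"\x55\x89\xe5HOSTCODE\xc9\xc3"      # pretend host program
VIRUS_BODY = b"VIRUSFRAGMENT"               # pretend appended virus code
JMP_TO_VIRUS = b"\xe9\x10\x00\x00\x00"      # pretend near-jump into it

infected = HOST + JMP_TO_VIRUS + VIRUS_BODY

def clean_by_killing_jump(data: bytes) -> bytes:
    """Replace the jump with NOPs; the virus body stays in the file."""
    return data.replace(JMP_TO_VIRUS, b"\x90" * len(JMP_TO_VIRUS))

def signature_scan(data: bytes) -> bool:
    """Naive signature match on a fragment of the virus body."""
    return VIRUS_BODY in data

cleaned = clean_by_killing_jump(infected)

print(signature_scan(infected))   # True: original file is detected
print(signature_scan(cleaned))    # True: fragment still present, so a
                                  # second scanner reports an "infection"
print(JMP_TO_VIRUS in cleaned)    # False: the dead code can never run
```

    The file is behaviorally harmless after cleaning, yet still matches a signature, which is exactly the ambiguity a tester without reverse-engineering skill cannot resolve.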


    It would be the same if you wanted to test, let's say, cars. To test a car's maximum speed you need a long, secured environment (a test track). If you don't have one, then you can't provide such details beyond "according to the vendor it's 164.5 mph". They can tell you ANYTHING.

    Last but not least: even huge companies with hundreds of employees have difficulty keeping track of current malware. How on earth will you do that with, let's say, 5 or 6 guys as an "AV Testing Lab"? Tell me?
     
  11. steve1955

    steve1955 Registered Member

    Joined:
    Feb 7, 2004
    Posts:
    1,384
    Location:
    Sunny(in my dreams)Manchester,England
    Isn't what actually matters to the end user "how well a product protects", not how much experience the actual tester has? As long as they are following instructions laid down by someone who is experienced. Pretty sure some of the testing on very important medical and scientific breakthroughs wasn't done by highly trained people, just staff following instructions. And some smaller companies with fewer staff actually do things better than large companies with many: there is less passing of the buck in smaller companies. After all, how big was the Skunk Works section of Lockheed?
     
  12. firzen771

    firzen771 Registered Member

    Joined:
    Oct 29, 2007
    Posts:
    4,815
    Location:
    Canada
  13. dcrowe0050

    dcrowe0050 Registered Member

    Joined:
    Sep 1, 2009
    Posts:
    378
    Location:
    NC
    Yes, the Avast blog was great. Did you see the two Dennis Labs pictures? That's good stuff.

    Who ever heard of Dennis Technology Labs before this test was done?
     
  14. markcc

    markcc Registered Member

    Joined:
    Apr 8, 2008
    Posts:
    185
    Location:
    Michigan, usa
    Inspector

    You are way beyond my pay grade in this discussion; however, just because a company is very large does not mean it does the best job. Quite the opposite, in lots of cases. Take GM, for example: not many people think they made the best cars in their segments, and they went bankrupt. Look at all the engineering teams they have. What did it get them? Out of business! Smaller firms often do the best jobs, and I believe it is the same way in the AV business. steve1955 had a great example with the Skunk Works.

    As a computer user, I want the AV to keep the stuff off my computer. It needs to detect whatever malware we may come across today. There needs to be a good way to test real-world scenarios that is relevant today and tomorrow, and I don't care how big the organization is.
     
  15. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    thanks IC.

    In the end, the OP actually has gotten what he wanted, when this thread was started.:cautious:
     
  16. Inspector Clouseau

    Inspector Clouseau AV Expert

    Joined:
    Apr 2, 2006
    Posts:
    1,329
    Location:
    Maidenhead, UK
    The problem is the quantity of malware. The problem is lack of resources. Do you guys have any idea how many NEW malware samples arrive every day, on average?

    It's FAR more than 5,000 samples! You have to have automated processes in place to deal with that. You have to have people who guide those automated processes and handle everything that can't be processed automatically (and there has been a lot of such stuff recently).

    You get NOWHERE with only 2 or 3 guys available for that.
     
  17. steve1955

    steve1955 Registered Member

    Joined:
    Feb 7, 2004
    Posts:
    1,384
    Location:
    Sunny(in my dreams)Manchester,England
    Isn't Norton/Symantec the biggest AV company? Therefore, by your reckoning, shouldn't they also be the best? I thought there was an agreement between AV companies to share info on malware, so in the end aren't all researchers pooling resources?
     
  18. IonSurge

    IonSurge Registered Member

    Joined:
    Oct 31, 2009
    Posts:
    2
    I think you're completely missing the point. I believe he was speaking about the testing company, not the vendor, and the fact that testing companies need to be familiar with the malware being tested to give truly accurate results as to whether the malware was cleaned from the system. IC specifically mentions that one product may remove part of the bad code from an infected file, making the file no longer a threat. Some other product can scan that file and see there is still some fragment of bad code in it, and even though the file is no longer an actual threat, it can be detected as one. If the tester is unfamiliar with what is actually going on here, then they may believe this is a missed threat, where in reality it's actually a false positive. This is just one example, and I believe I understood it correctly.
     
  19. Zombini

    Zombini Registered Member

    Joined:
    Jul 11, 2006
    Posts:
    469

    Sureeeee.. everyone is paid by someone.
     
  20. Zombini

    Zombini Registered Member

    Joined:
    Jul 11, 2006
    Posts:
    469
    The detailed testing methodology in the PDF is unlike anything I have seen before, and I am surprised that you think the guys who executed that methodology are incompetent.

    Here are a few of the details that you need to pay attention to:

    How many testers do you know of who even set up the system in a way that clearly indicates they know what vulnerabilities and drive-by downloads are? Most testers install XP SP3, and that's it. No vulnerable software, etc.

    Once again, attention to detail: they didn't use VMware VIX APIs to launch IE. Instead they made it as realistic as possible, by typing in the URLs just like a normal user would.

    Once again, unbelievable detail on exactly how they tested.

    and I can go on..

    So if someone thinks this was done by a one-man shop by a guy who doesn't have a clue, all I can say is that you are on the wrong newsgroup.

    Here is a challenge: point me to even one online reference of a tester that lists their testing methodology in such detail.

    Clearly Dennis Labs is ahead of the curve in their test methodologies. Good job, Dennis!!
     
  21. acr1965

    acr1965 Registered Member

    Joined:
    Oct 12, 2006
    Posts:
    4,995
  22. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    I could understand if someone complained (regardless of the results) about the sample size etc., but for the rest, the test method is not bad (= much better than what can be found in most magazines). Basically it is an approach discussed and proposed at AMTSO (where vendors, testers, and reviewers work together on test methods etc., even if sometimes there is not 100% agreement by all). How the basic test system works was also shown there. I would have expected that readers would be happy to finally see such a test made public...
    Btw, Simon was voted onto a board of AMTSO; he is not an unknown person (OK, I also did not recognize DTL when I first read it, but I had heard about PC Pro).
    "Simon Edwards is a computer security journalist. He joined Dennis Publishing in 1995 and is currently features editor of Computer Shopper, the UK's largest consumer computer magazine. In addition he runs technical operations at Dennis Technology Lab, which specialises in testing security software. He is a founder member of AMTSO."
    So, could someone please summarize what is wrong with the test (based on facts) and at the same time also write how it should have been done right? This would lead to a more productive discussion.
     
  23. progress

    progress Guest

    Interesting test; what happened to Avira and Panda? :oops:
     
  24. markcc

    markcc Registered Member

    Joined:
    Apr 8, 2008
    Posts:
    185
    Location:
    Michigan, usa

    I completely agree with IBK. He said it much better than I did.
     
  25. Saraceno

    Saraceno Registered Member

    Joined:
    Mar 24, 2008
    Posts:
    2,405
    I haven't read the test, only the comments in this thread. Seen plenty of tests. ;)

    I think what some have said is that when you're paid to test samples, the tester could use only the samples the 'star product' did well against.

    For example, if Avast were paying someone to test their program, then to make it favourable to Avast the tester could publish 40 of the samples on the Avast blog and show how Avast and, say, Emsisoft did well against these threats while none are detected by Symantec, through signatures at least. So Symantec would do poorly.

    So the question is asked: if the samples aren't just randomly selected through browsing the internet and watching a regular person interact with websites through Google, then the small sample chosen by a paid company, designed to show off the strengths of a product, can always be questioned.

    The tester could leave out the threats Symantec missed: just not include them in the test, and keep the ones it did well on. Now, in case you're wondering, I personally still think Symantec provides some of the best detection; I just don't think small paid tests are the way to go.
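    [Editor's note: the cherry-picking effect described above can be sketched numerically. The detection sets below are purely hypothetical: two products that each catch 90% of threats, just not the same ones.]

```python
import random

random.seed(0)
threats = range(1000)

# Hypothetical: each product detects 90% of threats, but different subsets.
detects_a = set(random.sample(threats, 900))   # the "star product"
detects_b = set(random.sample(threats, 900))   # the competitor

def detection_rate(detected, test_set):
    """Fraction of the test set that a product detects."""
    return sum(t in detected for t in test_set) / len(test_set)

# Fair test: 40 threats drawn at random.
fair_set = random.sample(threats, 40)

# Cherry-picked test: 40 threats chosen from those A detects but B misses.
biased_set = random.sample(sorted(detects_a - detects_b), 40)

print(detection_rate(detects_a, fair_set))    # roughly 0.9
print(detection_rate(detects_b, fair_set))    # roughly 0.9
print(detection_rate(detects_a, biased_set))  # 1.0
print(detection_rate(detects_b, biased_set))  # 0.0
```

    With a fair random sample both products look comparable; with a sample selected from A's wins, A scores 100% and B scores 0%, even though their overall detection is identical.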
     