New Detection Test - Dennis Labs

Discussion in 'other anti-virus software' started by dschrader, Oct 29, 2009.

Thread Status:
Not open for further replies.
  1. Zombini

    Zombini Registered Member

    Joined:
    Jul 11, 2006
    Posts:
    469
    I agree..
     
  2. dcrowe0050

    dcrowe0050 Registered Member

    Joined:
    Sep 1, 2009
    Posts:
    378
    Location:
    NC

    I agree.

I think they are going in the right direction with the test; it's much better methodology than some. But for one thing, why did they test full security suites against free standalone AVs? This makes no sense to me at all. Plus, this test was presented all wrong. I don't doubt that Symantec is one of the best choices in security, and I don't doubt that Simon is knowledgeable and has a good reputation, but to me (and I may be wrong, it has happened before) why not pay a known company, someone that most people have heard of and trust for the most part, to do the testing instead of someone that not many people have heard of? This just kind of casts a shade over the whole results for some. Like I said, though, I like the idea of these tests and I hope that others continue to improve and build on them.
     
  3. steve1955

    steve1955 Registered Member

    Joined:
    Feb 7, 2004
    Posts:
    1,384
    Location:
    Sunny(in my dreams)Manchester,England
You're missing the point: what I said was in reply to what IC said about the number of staff at a company relative to how good that company can/should be. He seemed to be saying that only a large company can do things correctly and thoroughly, testing and developing.
     
  4. RejZoR

    RejZoR Lurker

    Joined:
    May 31, 2004
    Posts:
    6,426
That's not exactly true. It depends how many of them are in the R&D and virus lab departments. Symantec is a great example of that: they have thousands of employees, but most of them aren't in the virus labs; they're in marketing, PR and so on. It's also important that the staff is qualified enough, because if you have a bunch of wannabe coders, you won't get far.
     
  5. Inspector Clouseau

    Inspector Clouseau AV Expert

    Joined:
    Apr 2, 2006
    Posts:
    1,329
    Location:
    Maidenhead, UK
Where did I say that? Having a smaller group has nothing to do with performing worse quality-wise. It's just a matter of how much you can process, because some of the malware needs human interaction to verify specific things that you can't find out in a fully automated system. It's self-explanatory: the more there is, the more you struggle with a small team. With a small team you could (theoretically) just CRC everything that comes in, fully automated. HOWEVER, that is not how you VERIFY whether it's really malware or not. So basically, making detections with a smaller team is "much" easier than VERIFYING detections/samples.
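The "just CRC everything that comes in" intake IC mentions can be sketched in a few lines. This is a hypothetical illustration of automated fingerprinting, not any vendor's actual pipeline; the file names and payloads are invented:

```python
import zlib

def auto_fingerprint(samples):
    """Fully automated step: CRC32 every incoming sample.

    This tells you whether an identical file has been seen before,
    but says nothing about whether the file is really malware --
    that verification still needs a human analyst.
    """
    seen = {}
    for name, data in samples:
        crc = zlib.crc32(data) & 0xFFFFFFFF  # mask to an unsigned 32-bit value
        seen.setdefault(crc, []).append(name)
    return seen

samples = [
    ("sample_a.exe", b"MZ payload"),
    ("sample_b.exe", b"MZ payload"),  # exact duplicate of sample_a
    ("sample_c.exe", b"MZ variant"),
]
index = auto_fingerprint(samples)
print(len(index))  # 2 distinct fingerprints: duplicates collapse, variants don't
```

The point of the sketch is the asymmetry IC describes: collapsing exact duplicates scales effortlessly, but deciding whether each distinct fingerprint is actually malware does not.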
     
  6. Inspector Clouseau

    Inspector Clouseau AV Expert

    Joined:
    Apr 2, 2006
    Posts:
    1,329
    Location:
    Maidenhead, UK
Yes. And on top of that, keep in mind that Symantec, with lots of employees (compared to the smaller AV companies), has other products as well. They are NOT just a standalone AV company. Actually, their AV team is rather small compared to the overall company size.
     
  7. pbust

    pbust AV Expert

    Joined:
    Apr 29, 2009
    Posts:
    1,176
    Location:
    Spain
    Heiya Andreas,

I posted some questions regarding the methodology which are still unanswered (posts #6, #13 and #14). I'm not questioning the other issues being discussed (sponsorship, sample selection, etc.), just technical details of the methodology. The most important is that in the test the products supposedly had real-time Internet access for updates and cloud scanning.
However, the methodology also mentions the following:
My point is that cloud-scanning technologies vary a lot. McAfee's, for example, is based on DNS queries, while ours is based on HTTP. Obviously, a caching HTTP proxy in offline mode can affect HTTP cloud scanning while having no effect on DNS-based cloud scanning.

All I'm asking is whether they *verified* that our products' HTTP cloud-scanning technology was working behind a caching proxy in offline mode. Unless they actually reversed the cloud-scanning HTTP communication, this is hard to verify, as a replayed response is not the same as a real-life response.

I'm also asking what type of setup it was (which proxy and which configuration, exactly) so that we can run some tests of our own to see whether this setup actually allows HTTP-based cloud-scanning communication to perform as it should.
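The DNS-vs-HTTP distinction can be illustrated with a small sketch. This is a purely hypothetical protocol with invented hostnames and a toy verdict store, not Panda's or McAfee's actual implementation: a caching HTTP proxy that goes offline replays a stale verdict for an HTTP-based lookup, while a DNS-based lookup never passes through it at all.

```python
# Hypothetical sketch -- not any vendor's real protocol -- of why an HTTP
# caching proxy in offline mode can distort HTTP-based cloud lookups
# while leaving DNS-based lookups untouched.

def dns_cloud_query(file_hash, resolver):
    """DNS-style lookup: the hash is encoded as a subdomain.
    An HTTP caching proxy never sees this traffic."""
    return resolver(f"{file_hash}.cloud.example-av.test")

def http_cloud_query(file_hash, transport):
    """HTTP-style lookup: travels through whatever HTTP proxy is configured."""
    return transport(f"http://cloud.example-av.test/lookup?h={file_hash}")

class CachingProxyOffline:
    """Caches responses while online, then replays them verbatim when
    offline -- even if the real cloud verdict has since changed."""
    def __init__(self, live_backend):
        self.cache = {}
        self.live = live_backend
        self.online = True

    def __call__(self, url):
        if self.online:
            self.cache[url] = self.live(url)
        return self.cache.get(url, "no-verdict")

# The live cloud only learns the file is malicious after an update:
verdicts = {"abc123": "clean"}
live = lambda url: verdicts.get(url.split("h=")[1], "unknown")
resolver = lambda name: verdicts.get(name.split(".")[0], "unknown")

proxy = CachingProxyOffline(live)
print(http_cloud_query("abc123", proxy))    # "clean" (cached while online)

proxy.online = False
verdicts["abc123"] = "malware"              # cloud verdict changes

print(http_cloud_query("abc123", proxy))    # still "clean" -- stale replay
print(dns_cloud_query("abc123", resolver))  # "malware" -- DNS bypasses the proxy
```

This is exactly why replayed responses differ from real-life responses: the HTTP product silently receives an outdated verdict, while the DNS-based product is unaffected by the proxy setup.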
     
  8. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    Hi Pedro,

    yes, your concerns/questions are valid points. (->PM ;))

    Have you already tried to ask DTL directly or asked only the topic starter?
     
  9. pbust

    pbust AV Expert

    Joined:
    Apr 29, 2009
    Posts:
    1,176
    Location:
    Spain
    For now only to the OP.
    But you're right, I'll ask DTL :)
     
  10. guest

    guest Guest

Panda is always complaining about tests. Why doesn't Panda participate in AV-Comparatives, so we could see the "real" performance?
     
  11. pbust

    pbust AV Expert

    Joined:
    Apr 29, 2009
    Posts:
    1,176
    Location:
    Spain
    I don't think that's a very fair comment.

    We participate in AV-Test, ICSA, CheckMark, PCSL, PCMag, PCWorld and other tests where we're included.

    AVC and Panda are talking about this as well.
     
  12. steve1955

    steve1955 Registered Member

    Joined:
    Feb 7, 2004
    Posts:
    1,384
    Location:
    Sunny(in my dreams)Manchester,England
You didn't say it outright, but you seemed to be intimating that a small company with only a few employees stood no chance of testing products thoroughly or correctly. Historically, I suppose, large companies normally start off as small companies, and some of them have grown not by corporate takeovers but by actually being good at what they do. In fact, in some fields large companies do use the services of small specialist companies precisely because they are VERY good at what they do: Mercedes couldn't get the SLR to perform or handle how they wanted, so they turned the problem over to McLaren, hardly a large concern but very specialised in what they do. For all we know, this testing lab may employ very highly skilled people who more than make up for their lack of numbers.
     
  13. guest

    guest Guest


Yes, but the most representative is AV-Comparatives.

AV-Test: as far as I know they work for magazines; I have never seen a good test from them.

ICSA and CheckMark: beautiful labels that all the AV vendors have.

PCMag, PCWorld...: not very serious tests, very similar to Dennis Labs.

So AV-Comparatives, why not?
     
  14. ronjor

    ronjor Global Moderator

    Joined:
    Jul 21, 2003
    Posts:
    163,838
    Location:
    Texas
    Vendors can participate in whatever tests they desire.

    At any rate, this thread is headed off topic so let's get back on the track.
     
  15. Someone

    Someone Registered Member

    Joined:
    Jan 18, 2008
    Posts:
    1,106
What do you mean by AV-Test not having a good test?
     
  16. Inspector Clouseau

    Inspector Clouseau AV Expert

    Joined:
    Apr 2, 2006
    Posts:
    1,329
    Location:
    Maidenhead, UK
    Based on your opinion.

Yes, they do tests (only the detection test and reaction test) for magazines. It's up to the magazine/reporter what they make of them; usually they add their own opinion to such tests. AV-Test has no influence on that.

    Yes.

So basically you're saying that Dennis Labs is not serious by comparing PCWorld tests with it. I disagree. Even given the fact that Dennis Labs' testing is not "perfect", it's one of the better (if not the best) starts in AV testing that I've seen recently. (One reason why I didn't jump in here earlier and complain about the test!)

    I don't understand the question.
     
  17. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    Thank you, Page42, for your civility and professionalism. I respect and welcome all viewpoints, but especially those that are different. It’s how I learn, which is the motivation for my participation in this forum.

    Someone once said, “If two people have the same perspective, then one of them is redundant.” :)

    • When a pharmaceutical manufacturer is introducing a new medication, they pay for the clinical trials. Thus, are all drug safety tests suspect? I think not.
    • When an automotive manufacturer conducts car safety crash tests, they pay for the testing. Thus, are all automotive safety ratings suspect? I think not.
    Et cetera....

    Yes, I enthusiastically agree. However, to be clear, “being good” isn’t an excuse for not “being great.” Hopefully, Symantec and other vendors and testing organizations will build upon this framework and improve it further in the months to come. Doing so will “raise the bar,” and spark further competition and innovation in the industry.
     
  18. elapsed

    elapsed Registered Member

    Joined:
    Apr 5, 2004
    Posts:
    7,076
Flawed logic: they are paying to test that their products are up to official standards; they are not paying to test their products against market competition.

Now sure, after the test is done you can say, "Our product scored 4/5, which is better than product Y, which scored 3/5 when they did the test." But again, the point is that those tests were completely individual, with no "VS" or "against". The meaning of those tests was health and safety; the meaning of the AV test was completely competition-based, since it involved more than their own product.
     
  19. subset

    subset Registered Member

    Joined:
    Nov 17, 2007
    Posts:
    825
    Location:
    Austria
This comparison is as flawed as the comparison test.
They didn't pay for a test of their own program; they paid for a comparison test with rival products.
There is a difference between an independent test and a bought test.
IMHO this unfair practice is much more important than a seemingly valid testing method.

The security software industry is young and most often has no, or only very weak, ethical standards.
The same applies to the security software testing industry: many get a lot of money from the industry, publishers etc., but all are independent. :blink:

    How about AMESO (Anti-Malware Ethical Standards Organization)
    Board of Directors: Michel de Montaigne, Voltaire, Immanuel Kant, Karl Popper.

    Cheers
     
  20. steve1955

    steve1955 Registered Member

    Joined:
    Feb 7, 2004
    Posts:
    1,384
    Location:
    Sunny(in my dreams)Manchester,England
Perhaps, to get away from the bias, or presumed bias, of these tests, it's time for all the large anti-malware companies to get together and ALL fund the setting up of a testing facility, which could be monitored by reps from any/all companies to make sure that no favouritism is shown to any product by the testing methods employed.
     
  21. Fajo

    Fajo Registered Member

    Joined:
    Jun 13, 2008
    Posts:
    1,814

Yes, and when they start disagreeing they can whack one another with wiffle bats, as that would probably be more effective than having any of those guys in a room together.
     
  22. Hugger

    Hugger Registered Member

    Joined:
    Oct 27, 2007
    Posts:
    1,003
    Location:
    Hackensack, USA
    Originally Posted by Pleonasm

    * When a pharmaceutical manufacturer is introducing a new medication, they pay for the clinical trials. Thus, are all drug safety tests suspect? I think not.
    * When an automotive manufacturer conducts car safety crash tests, they pay for the testing. Thus, are all automotive safety ratings suspect? I think not.

    There is government oversight of drug manufacturers and the testing done. In the USA it's the FDA.
    Also, car manufacturers must meet or exceed government standards. And the cars are tested for crash worthiness by the insurance companies.
    There are documented cases of improper testers being prosecuted.
    That doesn't seem to be the case for the computer security industry. We're probably better off not having the government involved, but perhaps there should be some sort of strict enforcement in the industry.
    Just a thought.
    Hugger
     
  23. Fly

    Fly Registered Member

    Joined:
    Nov 1, 2007
    Posts:
    2,201
    I agree. The 'golden standard' seems to be av-comparatives.
Objective. Limited, in the sense that it does not test HIPS components and other aspects, but it's the best objective test. Feel free to disagree. :p
     
  24. bwoirhaye

    bwoirhaye AV Expert

    Joined:
    Sep 24, 2002
    Posts:
    5
Disclaimer: I'm a Symantec employee, so some of you should automatically discount anything I have to say.

    Reacting to the "AV-comparatives as a gold standard" comment:

The last two AV-Comparatives tests, the August 2009 comparative and the October 2009 malware removal test, both showed Symantec in the top tier. These are not Symantec-sponsored tests.

In the comparative, of those with the Advanced+ rating, only G Data scored higher than Symantec. This is the traditional flat-file scanning test, with all the characteristics that have been discussed to death.

In the malware removal test, Symantec was one of the three (along with eScan and Microsoft) to score "good" in both malware removal and leftover removal.

    Symantec's products are not perfect, but they are very capable of testing well in both sponsored tests and independent tests.
     
  25. FanJ

    FanJ Updates Team

    Joined:
    Feb 9, 2002
    Posts:
    4,652

    Hi Brendon,

    So very good to see you again here !!!
     