new av-comparatives nov-09

Discussion in 'other anti-virus software' started by Subgud, Nov 28, 2009.

Thread Status:
Not open for further replies.
  1. IceCube1010

    IceCube1010 Registered Member

    Joined:
    Apr 26, 2008
    Posts:
    963
    Location:
    Earth
    That's where your firewall/HIPS combo comes into play. If the crack is doing something suspicious (beyond simply making a program skip the CD check), then the HIPS should pick it up. This is the formula I've been using for the last 3 years or so. But you never know; like you said, VT might come up 0/41 on an executable that is genuinely malicious. Hopefully the HIPS can stop it.

    Ice
     
  2. MalwareDie

    MalwareDie Registered Member

    Joined:
    Dec 8, 2006
    Posts:
    500
    With over half of the tested products detecting over 50% of the test set, I wouldn't be too surprised if 60% becomes the minimum percentage mark for Advanced+ next year. I think the range for Standard should be 20-40%.
     
  3. Fly

    Fly Registered Member

    Joined:
    Nov 1, 2007
    Posts:
    2,201
    I think it's a good test.

    Objective, not sponsored like some other tests.
    No AV vendor is forced to participate.

    HIPS, in-the-cloud detection, etc. are not tested, nor does AV-Comparatives claim it's a complete assessment of the AVs.

    Just take the test for what it is.
     
  4. Osaban

    Osaban Registered Member

    Joined:
    Apr 11, 2005
    Posts:
    5,614
    Location:
    Milan and Seoul
    I have personally never thought Symantec makes a bad product, even though I gave them (God forbid) $150 back in 2004 for nothing. What I can't stand, and I say it openly, is that they are prepared to doctor any test in order to make a competitor look inferior for their own gain.

    When Pleonasm mentioned Dennis Technology Labs, that test showed Norton detecting 100% of the samples and everybody else struggling (Avira was second to last, can you believe it?). They could be a little more elegant considering their wealth, but their "bully" attitude puts them in a David-and-Goliath situation in just about any promotional bout.
     
  5. hamzah95

    hamzah95 Registered Member

    Joined:
    Jun 22, 2009
    Posts:
    108
    Even when Symantec was a resource hog, I still used them for as long as 6-8 months and never got infected. Their products were, and always will be, great. But since they are a huge company, their view of competitors as inferior doesn't bring much improvement to their own products.
    NAV, for example, is really light compared to their older products, but its proactive detection is still weak.
     
  6. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    The identical situation applies to me as well. I make critical comments on the methodology of a test, if in my opinion it is deserved, even when my preferred anti-virus product scores well in that test.

    We’re getting off-topic, but I find this unsupported conjecture to be troublesome. Everyone is entitled to their own beliefs, but not to present those beliefs as fact.

    Comments on this topic should, however, be posted in this thread.
     
  7. shanep

    shanep AV Expert

    Joined:
    Sep 10, 2008
    Posts:
    54
    Hi Hamzah,

    The test in question did not test the SONAR2 proactive detection engine in NIS/NAV.

    Shane
    Symantec Corp.
     
  8. Fajo

    Fajo Registered Member

    Joined:
    Jun 13, 2008
    Posts:
    1,814
    :eek: That's about all I can say!
     
  9. Osaban

    Osaban Registered Member

    Joined:
    Apr 11, 2005
    Posts:
    5,614
    Location:
    Milan and Seoul
    You were the one who linked Dennis Technology Labs in the first place. There is no conjecture or personal belief here: the fact is that nobody sponsors a lab to test several AVs, the sponsor's own included, without expecting a favorable outcome.

    You may not like AV-Comparatives' methodology, but Symantec was not the only company "penalized" for not being online. My laptops have caught more than 200 pieces of malware in 4 years, all of them offline, from USB flash drives.

    Finally, we should all agree that AV-Comparatives is an independent lab, unless somebody can prove otherwise.

    Let me reiterate that I believe Norton has come a long way since 2004, but their marketing department uses deceitful tactics which, in the end, backfire instead of helping build a better image of the company.
     
  10. hamzah95

    hamzah95 Registered Member

    Joined:
    Jun 22, 2009
    Posts:
    108
    Thanks for your answer, but when I was testing it against new malware, I executed the samples, and if they weren't detected by the AV engine, there were no pop-ups from SONAR. :(
    But I must admit that Norton's AV engine is really good against malware that is a week old, or even just more than a day old.
     
  11. shanep

    shanep AV Expert

    Joined:
    Sep 10, 2008
    Posts:
    54
    Hi Hamzah,

    I assume you were testing SONAR2 which is in NIS/NAV 2010 vs SONAR which is in NIS/NAV 2009. SONAR2 is a completely different engine. If you can PM me the source of your malware I'd be happy to test out some samples.

    Thanks,

    Shane.
     
  12. hamzah95

    hamzah95 Registered Member

    Joined:
    Jun 22, 2009
    Posts:
    108
    I tried the malware out in Norton AntiVirus 2010. A lot of stuff was missed, but I remember now that SONAR didn't ask a question; it blocked one sample directly by itself. That was just one piece of malware, though.
     
  13. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    In the future, you may wish to be more careful when stating “facts.”

    For example, consider this test sponsored by Trend Micro and this test sponsored by Sophos and this test sponsored by Kaspersky Lab and this test sponsored by AVG and this test sponsored by McAfee....

    A company sponsoring a test does not mean that the test results are necessarily invalid, contrary to whatever you may choose to believe.

    I think you’re missing the point. The issue isn’t about a product being “penalized” -- the issue is the extent to which the test mirrors the real-world experiences of users. Attempting to infer the actual relative strengths and weaknesses of an anti-virus product based upon testing in such an artificial environment as that used by AV-Comparatives is problematic. Thus, for purposes of understanding the genuine level of malware protection provided by an anti-virus product, this AV-Comparatives test has no bearing.
     
  14. Osaban

    Osaban Registered Member

    Joined:
    Apr 11, 2005
    Posts:
    5,614
    Location:
    Milan and Seoul
    The fact remains that in each test you list, the sponsor comes out as the best player; they are just pure ads for the company paying for them. AV-Comparatives is different, as is Virus Bulletin.
     
  15. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    So, the anti-virus vendor that is rated highest in a test is potentially guilty of having achieved that honor through deceit? Let’s think about this problem differently....

    For purposes of illustration, consider this proposition: I assert that the only reason why AVIRA did well in the detection of malware in this AV-Comparatives test is that AVIRA secretly bribed AV-Comparatives. Do you find this assertion absurd? I do, and I hope you do as well. However, also consider this: the quality and the quantity of the evidence supporting my assertion about AVIRA is identical to the quality and the quantity of the evidence supporting your assertion that Symantec “influenced” the outcomes of anti-virus tests that it financed. If you believe the latter and if you wish to be intellectually honest and consistent, then you are compelled to also believe the former, because each is based on precisely the same unfounded speculation. My expectation is that reason will prevail, and both assertions will be (correctly) seen as preposterous.

    :)
     
  16. firzen771

    firzen771 Registered Member

    Joined:
    Oct 29, 2007
    Posts:
    4,815
    Location:
    Canada
    Oh wow, you've completely missed what he was saying, haven't you? It says the test was funded by Symantec, and Symantec was overwhelmingly on top. AV-C isn't sponsored by any one vendor... there's your difference...
     
  17. Osaban

    Osaban Registered Member

    Joined:
    Apr 11, 2005
    Posts:
    5,614
    Location:
    Milan and Seoul
    Pleonasm, this can go on forever. If you show me a test financed by Symantec where Symantec is number one, it becomes at best a promotional stunt for the company in question. As you have pointed out with your links, it is common practice to sponsor tests where the sponsor comes out as the best player, so in this regard I owe Symantec an apology.

    AV-Comparatives have said that the next round of tests will include "real world" scenarios, so we shall see what happens.

    Post #28
     
  18. gaslad

    gaslad Registered Member

    Joined:
    Feb 18, 2007
    Posts:
    117
    Location:
    Toronto, Ontario
    Wow!

    Five different AV vendors sponsor the same independent tester, Cascadia, to compare themselves against the competition, and in every case the sponsor comes out on top.

    Any rational person would question the independence of such a tester and write off the conclusions, even if valid, as a highly selective presentation of the results meant to put the sponsor in its best light. Spin marketing at its finest (a.k.a. "data mining")!

    AV-C is at least transparent in its methodology, warns of its test limitations, and AFAIK is free of the taint of sponsorship. Whether or not on-demand scanning/detection has any relevance is a legit question; I look forward to their dynamic tests, which they say are "coming".
     