PassMark: Consumer Antivirus Performance Benchmarks 2011

Discussion in 'other anti-virus software' started by Pleonasm, Sep 8, 2010.

Thread Status:
Not open for further replies.
  1. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
  2. guest

    guest Guest

    Another one sponsored by Norton (4 in one day) I am losing my faith :D
     
  3. Nevis

    Nevis Registered Member

    Joined:
    Aug 28, 2010
    Posts:
    812
    Location:
    255.255.255.255
    Just because Norton scored best doesn't mean the test is sponsored by Symantec.

    Now you guys are even doubting AV-Test, etc.; then God help your trust.
     
  4. begemot64

    begemot64 Registered Member

    Joined:
    Jul 28, 2010
    Posts:
    71
    Yes, sponsorship is annoying, as we'd all love to see some independent tests. However, there's no getting away from the fact that NIS2011 is one of the lightest, if not the lightest, suites out there. Just because a test is sponsored does not mean that its results aren't valid.

    More likely, Symantec realises that their massive overhaul needs to be marketed, and from their point of view, rightfully so. Good choice to commission PassMark; they are generally thorough.
     
    Last edited: Sep 8, 2010
  5. Boyfriend

    Boyfriend Registered Member

    Joined:
    Jun 7, 2010
    Posts:
    1,070
    Location:
    Pakistan
    Thanks, Pleonasm, for the update. I do not believe in such sponsored tests.
     
  6. guest

    guest Guest

    I didn't invent anything; it is written in the report (at the end) that Norton paid PassMark for this test, and that Norton gave PassMark some of the test scripts used for the tests.

     
  7. begemot64

    begemot64 Registered Member

    Joined:
    Jul 28, 2010
    Posts:
    71

    So what if the test was sponsored, and Symantec gave scripts?

    Its results correlate with AV-Comparatives' performance tests, and with a number of other tests. Are you suggesting this test is not trustworthy, in the context of the other tests that we have seen?

    Several independent organisations drawing the same conclusion is pretty powerful evidence, wouldn't you agree?
     
  8. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    Very good work Eset.:thumb:
     
  9. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    Yes, it is called “convergent validity.”
     
  10. codylucas16

    codylucas16 Registered Member

    Joined:
    Nov 17, 2009
    Posts:
    267
    Based on the test dates and version numbers of some of the software, this test is completely worthless, to me anyway.
     
  11. clocks

    clocks Registered Member

    Joined:
    Aug 25, 2007
    Posts:
    2,789

    Really? These seem like pretty recent builds, from what I saw.
     
  12. Martijn2

    Martijn2 Registered Member

    Joined:
    Jul 24, 2006
    Posts:
    321
    Location:
    The Netherlands
  13. Nevis

    Nevis Registered Member

    Joined:
    Aug 28, 2010
    Posts:
    812
    Location:
    255.255.255.255
  14. vlk

    vlk AV Expert

    Joined:
    Dec 26, 2002
    Posts:
    621
    This gives them the ability to e.g. initially do the test on a larger set of products, and then (after seeing the results) only have the final report include the products that actually scored worse than Norton.

    It's just my guess but looking at the list of tested products, I just find it weird... where are the other products that were included in the previous PassMark tests?
     
  15. JoakimM

    JoakimM Registered Member

    Joined:
    Jun 19, 2009
    Posts:
    51
    Location:
    Lomma, Sweden
    One can wonder why they left out MSE. Could it perhaps be because it doesn't impact performance at all and, IMHO, probably would have aced this kind of test? Just a thought...

    Regards,
    Joakim
     
  16. begemot64

    begemot64 Registered Member

    Joined:
    Jul 28, 2010
    Posts:
    71
    My point was not that the methodology used was good (it clearly isn't; an independent test would have been far superior, and this definitely looks like marketing), but that in the context of other performance reports, Symantec's performance in this test does not look out of place (despite them submitting scripts). I can't comment on other vendors, though. What are your thoughts?
     
  17. clocks

    clocks Registered Member

    Joined:
    Aug 25, 2007
    Posts:
    2,789

    I doubt it.
     
  18. 3GUSER

    3GUSER Registered Member

    Joined:
    Jan 10, 2010
    Posts:
    812
    This test is sponsored by a single company. However, think about the other "independent" tests sponsored by all the vendors who participate.

    We as end users can never be sure that a vendor didn't pay a person or an organisation to get better results. For example, AV-Comparatives: we know the vendors all pay, but how much they pay remains unknown to the public. It is also not clear whether they all pay the same amount of money, or whether some pay more to get more privileges / higher results.
     
    Last edited: Sep 9, 2010
  19. the Tester

    the Tester Registered Member

    Joined:
    Jul 28, 2002
    Posts:
    2,854
    Location:
    The Gateway to the Blue Hills,WI.
    Smells like an advertisement for Symantec to me.
     
  20. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    Correct. However, it would be in the vendor’s own best interest not to do so, because the short-term ‘gain’ would be more than offset by the long-term ‘pain’ resulting from lost business due to a damaged reputation, if the activity were discovered.
     
  21. 3GUSER

    3GUSER Registered Member

    Joined:
    Jan 10, 2010
    Posts:
    812
    Oh, that would be behind the scenes; only the vendor and the testing org. would know.
     
  22. subset

    subset Registered Member

    Joined:
    Nov 17, 2007
    Posts:
    825
    Location:
    Austria
    Symantec funded this farce, it's plain stupid advertisement.
    I bet they also pay for laughter when their CEO tells a joke. :p

    Cheers
     
  23. PassMark

    PassMark Registered Member

    Joined:
    Jan 14, 2007
    Posts:
    8
    Just thought I would add my 2 cents, as one of the report's authors.

    We do benchmark testing for companies who commission it. It is understandable to think that because company X paid for a report, it will be biased in their favour.

    However we do performance testing for about a dozen different security companies. Can't name them all as we are under NDAs.

    The majority of the reports never get made public. They are presented to the company that commissioned the report for their internal use to improve their marketing and technical development.

    So the reports that do get made public are typically only the ones that favour the company who commissioned the report. Unfortunately this can give rise to a perception of bias, as the public doesn't see all the reports that paint the commissioning company in a poor light.

    The test scripts referred to above are just DOS scripts to copy files between folders, or unzip/zip a large zip file from the command line.
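    A copy-timing script of that kind is easy to approximate. Here is a minimal, illustrative sketch (my own; not PassMark's actual scripts, and the file count and sizes are arbitrary) that times a recursive file copy, which could be run with and without an AV product installed to compare the overhead:

    ```shell
    # Illustrative file-copy timing sketch (hypothetical; not PassMark's scripts).
    # Creates a set of sample files, times a recursive copy, reports elapsed time.
    SRC=$(mktemp -d)
    DST=$(mktemp -d)

    # Create 50 small sample files (real tests would use much larger file sets).
    i=1
    while [ "$i" -le 50 ]; do
        head -c 1024 /dev/urandom > "$SRC/file_$i.bin"
        i=$((i + 1))
    done

    # Time the copy with second resolution (real benchmarks use finer timers).
    start=$(date +%s)
    cp -r "$SRC" "$DST/copy"
    end=$(date +%s)

    count=$(ls "$SRC" | wc -l)
    elapsed=$((end - start))
    echo "Copied $count files in $elapsed seconds"

    # Clean up the temporary directories.
    rm -rf "$SRC" "$DST"
    ```

    Running the same script on a clean machine and again with the product under test installed, then comparing the elapsed times, is the basic idea; an unzip/zip variant works the same way with a large archive.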

    The reason for using scripts like this is that Neil J. Rubenking (the well-known reviewer at PCMag.com) also tests using similar methods, and the companies all want to rate well in Mr Rubenking's tests.

    Mr Rubenking has had a significant impact on how the industry looks at AV performance. And it is not surprising (to us) that Norton also did OK in the PCMag testing.

    The rest of the tools are public: our AppTimer, the Microsoft xperf tools, and the third-party HttpWatch.

    So pretty much anyone with a few weeks of spare time should be able to reproduce the results, and I would encourage people to do their own testing to confirm the results.
     
  24. qakbot

    qakbot Registered Member

    Joined:
    Aug 25, 2010
    Posts:
    380

    Are you even aware of who PassMark is? If Symantec had rigged the test, they would have won ALL the tests. Notice that they lost quite a few tests but overall came out on top.
     
  25. subset

    subset Registered Member

    Joined:
    Nov 17, 2007
    Posts:
    825
    Location:
    Austria
    Why? Who pays the piper calls the tune...
    Passmark http://www.passmark.com/forum/showpost.php?p=9460&postcount=7

    Passmark will show you whatever the customer wants you to see.

    Cheers
     