Security Products Performance Benchmarking (Sep 2010)

Discussion in 'other anti-virus software' started by guest, Sep 9, 2009.

Thread Status:
Not open for further replies.
  1. guest

    guest Guest

    Last edited by a moderator: Sep 9, 2009
  2. Ade 1

    Ade 1 Registered Member

    Joined:
    Jun 21, 2006
    Posts:
    471
    Location:
    In The Bath
    How long did those tests take, just to tell us what we already knew?!

    I guess some people have too much time to burn.
     
  3. subset

    subset Registered Member

    Joined:
    Nov 17, 2007
    Posts:
    825
    Location:
    Austria
    From the PDF... (Page 20)

    Cheers
     
  4. dawgg

    dawgg Registered Member

    Joined:
    Jun 18, 2006
    Posts:
    818
    Still looks like a good, in-depth test.
    Not sure about some of the benchmark tests though; my comments/opinions below...

    4 - there's such a short time period involved that it doesn't matter, especially considering the number of times the average user opens the UI.
    5 - performance can be seen with other tests rather than idle memory usage, which is only a number; that doesn't matter, time matters.
    8 - you install it a few times a year; it's not the end of the world if it takes a couple of minutes longer.
    9 - nothing to do with performance, and doesn't show the performance impact on the computer, although Trend and G-Data are considerably high relative to the others.
    10 - nothing to do with performance and quite irrelevant - as long as the registry keys are cleaned up after uninstalling and don't cause any problems, most users don't care.


    Edit: and no Avira? - this is going to be horrific for some users here
     
    Last edited: Sep 9, 2009
  5. salmol

    salmol AV Expert

    Joined:
    Sep 10, 2009
    Posts:
    2
    It seems that they tested F-Secure IS 2009 and not IS 2010.
     
  6. raven211

    raven211 Registered Member

    Joined:
    May 4, 2005
    Posts:
    2,567
    Correct - easily seen in the summary.
     
  7. flyrfan111

    flyrfan111 Registered Member

    Joined:
    Jun 1, 2004
    Posts:
    1,229
    Wow, the company that funded the test WON!
     
  8. dawgg

    dawgg Registered Member

    Joined:
    Jun 18, 2006
    Posts:
    818
    Would you prefer it if they came second? :D
     
  9. firzen771

    firzen771 Registered Member

    Joined:
    Oct 29, 2007
    Posts:
    4,815
    Location:
    Canada
    I would've preferred it if it wasn't funded by them, OR if the people who fund it weren't allowed to participate (which wouldn't really work out).
     
  10. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    The one who funds such a test usually keeps the right to decide what happens with the report, i.e. whether it gets published or not. As the outcome of the test was good for the funder, of course they decided to have it published. That's why you will almost always see the one who funded the report coming out first (or very well). It's not because the results are faked in favor of the funder; in this case it's because the funder decides what gets published.
     
  11. firzen771

    firzen771 Registered Member

    Joined:
    Oct 29, 2007
    Posts:
    4,815
    Location:
    Canada
    Hmm, so does that happen at AV-C as well?
     
  12. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    Very true, IBK - if you pay for it, you'd better believe they have a say. This may very well be a somewhat valid test, and everyone is so focused on who won and who paid for it. What I find interesting is number 2, Eset. If you take the tests as valid, they did an outstanding job.
     
  13. dawgg

    dawgg Registered Member

    Joined:
    Jun 18, 2006
    Posts:
    818
    Ignore Symantec's results then and only look at the others... (unless you think Symantec may have fixed the tests so that Eset came second and McAfee came last) - go figure, none of us knows this for certain. I personally take this test as a more-or-less reliable indicator.

    ... and someone's going to have to fund it, especially if it's going to be released into the public domain for free - I guess the AV vendors have the most interest in it.
     
  14. firzen771

    firzen771 Registered Member

    Joined:
    Oct 29, 2007
    Posts:
    4,815
    Location:
    Canada
    Well, how does AV-C run? They seem to be pretty reliable and unbiased. I'm actually kind of curious, since I'm not really sure how AV-C does it.
     
  15. dawgg

    dawgg Registered Member

    Joined:
    Jun 18, 2006
    Posts:
    818
    I think their pricing structure is more reliable - it's a fee the tested AVs have to pay.
     
  16. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    IBK has stated many times in the past that yes, individual testing for a specific product does come down to what the vendor decides in the end. And that is totally the way it should be. When it comes to the overall tests, yes, there is a pricing structure, but it doesn't influence the end results. It can't. Vendors agree to this up front and understand the value of having a testing organization such as his as a useful tool for the future. You do not have to be first. If, through his tests, you demonstrate that your product takes from the results what needs to be done, and you are improving your product's ability, even in small increments, that shows you are serious about creating a solid product.

    Hence, those who have dropped out in the past only proved that they failed at being able to do so, regardless of what others may tell you. Look at Norman. I have more respect for them for sticking it out and openly looking for ways to build a better product.
     
  17. Macstorm

    Macstorm Registered Member

    Joined:
    Mar 7, 2005
    Posts:
    2,642
    Location:
    Sneffels volcano
    unbelievable :rolleyes: :D
     
  18. renegade08

    renegade08 Registered Member

    Joined:
    Aug 26, 2008
    Posts:
    432
    Guys, this is a performance test.
    Who cares about: scan speed of solid state drives, installation time, installation size, file format conversion ( :blink: ).

    If this test was supposed to make us laugh, well, it did. (No disrespect to the testers (who were doing their job) or the test itself.)
    I don't think this deserves so much attention, IMHO.
     
  19. smage

    smage Registered Member

    Joined:
    Sep 13, 2008
    Posts:
    378
    It seems that Avira is so good that testers have to change the rules or simply remove the product from the test so that other products get a chance to claim first place.
     