AV_Test: The Ultimate Endurance Test?

Discussion in 'other anti-virus software' started by silverfox99, Dec 21, 2012.

Thread Status:
Not open for further replies.
  1. silverfox99

    silverfox99 Registered Member

    Joined:
    Jul 14, 2006
    Posts:
    204
  2. henryg

    henryg Registered Member

    Joined:
    Dec 13, 2005
    Posts:
    342
    Location:
    Boston
    I wonder how Agnitum's Outpost Security Suite would have performed in their testing?
     
  3. siketa

    siketa Registered Member

    Joined:
    Oct 25, 2012
    Posts:
    2,718
    Location:
    Gaia
    ...or CIS 6?
     
  4. silverfox99

    silverfox99 Registered Member

    Joined:
    Jul 14, 2006
    Posts:
    204
    The summary on page 8 of the report gives the aggregate scores from the 10 separate tests run between January 2011 and October 2012, covering the three AV-Test categories of Protection, Repair and Usability, as follows:

    1. Bitdefender Internet Security: Score 5.6/6.0
    2. Kaspersky Internet Security: Score 5.4/6.0
    3. F-Secure Internet Security: Score 5.2/6.0
    4. Symantec: Norton Internet Security: Score 5.0/6.0
    5. G-Data Internet Security: Score 4.9/6.0
    .
    .
    15. ESET Smart Security: Score 3.8/6.0

    The AV-Test Awards 2012 will be announced on 31 January. Not sure if the results for 2012 will be much different from last year:
    http://www.av-test.org/en/test-procedures/award/2011/
     
  5. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    8,251
    Location:
    The land of no identity :D
    IMHO 22 months is way too long a period to discuss "endurance", as between major releases a lot of things change for many products; some for the better and some for the worse! That's why people generally don't buy licenses beyond 2 years for any product :)

    Sure, protection rates might stay consistent but usability and repair can vary wildly between version releases :)
     
  6. silverfox99

    silverfox99 Registered Member

    Joined:
    Jul 14, 2006
    Posts:
    204
    Very true. Interestingly though, if you look at the Top 5 based on the 22 months, they are all still 'top tier' AVs today, even looking over, say, the last 6-12 months of tests. You are right that this kind of long-term endurance test misses recent events, e.g. Trend's meteoric rise to the top/100% protection in some recent tests.
     
  7. CogitoTesting

    CogitoTesting Registered Member

    Joined:
    Jul 4, 2009
    Posts:
    901
    Location:
    Sea of Tranquility, Luna
    I think the time frame has to do with consistency (as you alluded to in your post), and with consistency comes reliability: two qualities that should be the hallmark of any good security system. To me, a 22-month period is a good thing. :)

    Thanks.
     
  8. er34

    er34 Guest

    ESET is neither the worst in usability or protection, nor are BitDefender/Kaspersky the best in protection and usability.
     
    Last edited by a moderator: Dec 23, 2012
  9. xxJackxx

    xxJackxx Registered Member

    Joined:
    Oct 23, 2008
    Posts:
    8,645
    Location:
    USA
    Agreed. I would like to see a longer time period. If any particular company can't pull off a reasonably good product for more than a few months then they need to be exposed for that. These vendors are selling security software. They need to be good at it. For more than 6 months at a time.
     
  10. manak

    manak Registered Member

    Joined:
    Aug 12, 2012
    Posts:
    78
    Poor result ESET...
     
  11. Macstorm

    Macstorm Registered Member

    Joined:
    Mar 7, 2005
    Posts:
    2,642
    Location:
    Sneffels volcano
    As pure "protection" is what matters to me, G Data, F-Secure and Bitdefender would narrowly be my top products :cool:
     
  12. The Hammer

    The Hammer Registered Member

    Joined:
    May 12, 2005
    Posts:
    5,752
    Location:
    Toronto Canada
    Don't see any reason to change what I'm using. Just one set of tests by one org out of many.
     
  13. Rompin Raider

    Rompin Raider Registered Member

    Joined:
    May 6, 2010
    Posts:
    1,254
    Location:
    Texas
     
  14. The Hammer

    The Hammer Registered Member

    Joined:
    May 12, 2005
    Posts:
    5,752
    Location:
    Toronto Canada
     
  15. Rompin Raider

    Rompin Raider Registered Member

    Joined:
    May 6, 2010
    Posts:
    1,254
    Location:
    Texas
     
  16. RejZoR

    RejZoR Lurker

    Joined:
    May 31, 2004
    Posts:
    6,426
    It would be nice if Comodo would start participating in any damn real world tests. :rolleyes: Like they don't have any confidence in their own program...
     
  17. Noob

    Noob Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    6,491
    Once again BD and KIS on top.
    Good job! :thumb:
     
  18. VectorFool

    VectorFool Registered Member

    Joined:
    Oct 21, 2012
    Posts:
    280
    Location:
    India
    They've taken to improving the scan speeds while detection takes a backseat.
     
  19. The Red Moon

    The Red Moon Registered Member

    Joined:
    May 17, 2012
    Posts:
    4,101
    I could not disagree with you more! In recent years the Comodo AV has been steadily increasing its detection rate and is certainly as good as anything else out there.
    I personally do not take any notice of silly test results as these can vary by the day.
    Regards to you.
     
  20. PJC

    PJC Very Frequent Poster

    Joined:
    Feb 17, 2010
    Posts:
    2,959
    Location:
    Internet
    No big surprises...:cool:
    Expected results for Bitdefender, Kaspersky, and F-Secure. :thumb:
     
  21. Narxis

    Narxis Registered Member

    Joined:
    Jun 10, 2009
    Posts:
    477
    +1

    They should participate in AV-Comparatives too.
     
  22. er34

    er34 Guest

    Why do you put so much importance on these tests? They are just TESTS, nothing but 'lab results' (not real world) from a single moment.

    If you see COMODO participating in these tests, will this make you use them more or will it change your opinion about them? If you are a parent and your child gets an A on a school test, or gets an F, will this change your opinion of and attitude towards your kid? I doubt it. So why do you guys care so much about tests - I really can't understand, really. :thumbd:
     
  23. er34

    er34 Guest

    It's not that they don't have confidence in their product. The reason is that if they paid the testing organizations the money, it would be the worst-spent money ever. Do you remember the details of the "scandal" between COMODO and AVC some time ago?

    The above are AVC's words. It is all lab, automated stuff, no real-world testing. So why should COMODO pay for lab things? They have their pluses, but testing organizations and their tests are very limited: they do not show real-world human usage and they cannot demonstrate COMODO's strengths, nor Dr.Web's power, nor Microsoft's strengths, nor some other products' strengths.
     
  24. Fuzzfas

    Fuzzfas Registered Member

    Joined:
    Jun 24, 2007
    Posts:
    2,753

    Because when you fear something (infection), you seek protection. And when you are unable to tell with absolute certainty which protection is better, you take whatever there is out there. Something is better than nothing.

    Obviously, the tests are only a part of the truth. No wonder zero-day tests are either never done or done separately, because they would "ruin" the overall scores, which in most tests are close to 100%. :D Or you can pick samples from an area that the company financing the test is best at and rank it 1st, even without cheating during the tests. In the end, much depends on how much you want to believe in a test. :D For example, in PCSL tests, Twister was doing over 90%. In my zero-day samples, it got something like 50% in one and 3 out of 18 in another. If you go to Twister's site, it has a nice "VB100 with 98.8% detection!". That does seem much better than "50% success!", doesn't it? :D Or put more Russian samples in the test and chances are a Russian antivirus will score higher. And well, if antiviruses were as good as the 98-99% success rates suggest (and mind you, this has been going on for years), there wouldn't be many infected people. Oddly enough, I remember the XP days, when most security fora had to make a dedicated "HijackThis Log" section, where unfortunate users would flock to submit their logs and find "salvation". And these were actually computer-savvy people! The real "average Joes" wouldn't bother to register. They were the legions of users who were permanently infected and learnt to live happily with their new viral friends.

    As for Comodo, I understand why it doesn't want to participate. From info posted in the past on this forum, participation costs something in the five figures. I doubt Comodo's AV has evolved so rapidly as to be up there with products whose engines have been worked on for many more years. Melih likes to be no. 1. He doesn't mind paying when he is no. 1 (Matousec). But he does mind paying to be in the low or middle of the pack. Besides, as you say, Comodo was born as a firewall with HIPS, and it also has this sandbox that can't be tested. Why pay for a test that invalidates D+, which would effectively be very powerful in a dynamic test but would probably be judged an "unfair comparison" to the others?
     
    Last edited: Dec 24, 2012
  25. The Red Moon

    The Red Moon Registered Member

    Joined:
    May 17, 2012
    Posts:
    4,101
    Comodo AV has been tested at VB100 and achieved a very good percentage.:thumb:
     