AV-Comparatives June (May 2007) Results (Retrospective / Proactive Tests)

Discussion in 'other anti-virus software' started by AshG, May 29, 2007.

Thread Status:
Not open for further replies.
  1. pilotart

    pilotart Registered Member

    Joined:
    Feb 14, 2006
    Posts:
    377
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)

    I am pleased to see such detailed analysis of AntiVirus performance.

    For accuracy, settings should be adjusted to maximum;
    the user can then decide what settings suit a particular system and habits of use.

    I am not one to request additional tasks,
    but before any analysis of reduced settings there may be other,
    more relevant characteristics that could be reported.
     
  2. Hangetsu

    Hangetsu Registered Member

    Joined:
    Jan 9, 2006
    Posts:
    259
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)

    I actually disagree with this. Rather than blame the user for not knowing how to configure the AV, I'd put the pressure on the AV vendor to make the software secure by default. Why make the user run additional steps to make the system secure? I think the opposite direction (i.e. make the user take steps to make the system less secure) would be the way to go.
     
  3. Escalader

    Escalader Registered Member

    Joined:
    Dec 12, 2005
    Posts:
    3,710
    Location:
    Land of the Mooses
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)

    June 4, 2007

    Statistical Choice Algorithm to Select Anti-Virus Products

    Introductory Comments

    After obtaining permission on March 1, 2007, and advice from a moderator, I decided to contribute this algorithm to try to reduce the negative effects of brand loyalty and emotional issues when our favorite AV doesn't do as well as before once new results are published over time. I feel that human factors have tended to cause unnecessary concern, fear, defensiveness and product switching.

    This statistical algorithm uses all available independent AV-Comparatives test results to select top Anti-Virus products as candidates for your PC. When reading this, please note the footnotes. Note as well that I now include the latest AV-Comparatives report, whereas back in March I excluded it to avoid bias.

    For the record, I am using BitDefender AV 10, and in this analysis it did not survive! Any errors in logic, data used or other silliness in this post are mine and only mine. IBK has nothing to do with my use or misuse of their excellent work.

    Statistical Algorithm

    (1) Equaling or exceeding the median performance per test on “new” parasites is the first selection filter; only those products are considered further. This assumes that heuristic ability should be given first priority. There were 7 tests from 2004 to 2007. Products that were able and willing to be tested over all 7 tests are selected for further consideration; this assumes that doing consistently well over time is important, as it avoids “jumping” from product to product as new tests occur. 7 products meet that criterion. Of those 7, 2 equaled or exceeded median performance over all 7 tests and 1 did so over 6 out of 7 tests. (See the attached Proactive table.)

    This left 3 products, in rank order, NOD32 Anti-Virus, BitDefender Prof. + and Dr. Web.

    (2) Next, performance in the on-demand tests was used as the second filter. This assumes that on-demand ability ranks second in priority. As in step 1, only products in all 6 tests are considered, and equaling or exceeding the median performance is the criterion. Again, this avoids product jumping based on one test, to say nothing of the resources needed every time users change products and climb new learning curves. 8 products were in all tests. Of those 8, none exceeded the median over all tests; 3 did over 5 tests, 1 over 4 tests and 1 over 3, or just 50% of the tests. (See the attached On-Demand table.)

    Keeping only the products that were at or above the median in the latest test left 3 products, in rank order: KAV Personal Pro, NOD32 Anti-Virus and Norton Anti-Virus.

    (3) The results of steps 1 and 2 are examined; the only product on both lists is NOD32. (A small code sketch after step 7 illustrates this filtering.)

    (4) If your existing product is NOD 32 Anti-Virus, relax and enjoy life and don’t bother with the remaining steps. Those steps are suggested only if a user has no AV product or if their current product has developed bad performance habits.

    (5) Determine the required O/S, CPU, RAM and HDD, record scan speed, and trial each product's footprint on your PC to break ties (if any) and to uncover any unresolved compatibility issues with your system BEFORE investing $!

    (6) During the trial periods, test each vendor's support group(s) with similar questions and track their response time and the quality of solutions offered.

    (7) Include visits to public forums to determine unedited vendor reputation and any unknown product benefits and issues. If ties remain, use price as a final selection variable; of course, free wins over paid.
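
    To make steps 1 to 3 concrete, here is a minimal Python sketch of the filtering. The product names and scores below are invented placeholders, NOT actual AV-Comparatives figures; the point is only the logic: keep products present in every test, keep those at or above the per-test median in enough tests, then intersect the proactive and on-demand survivor lists.

    Code:
    from statistics import median

    # Hypothetical per-test detection rates (made-up data, not AV-Comparatives results).
    # Each product maps to one score per test; a product that missed a test has a
    # shorter list and is dropped by the "tested in all tests" rule.
    proactive = {                       # 7 retrospective/proactive tests
        "Product A": [55, 60, 58, 62, 61, 59, 63],
        "Product B": [40, 42, 45, 50, 52, 54, 56],
        "Product C": [70, 68, 66, 64, 60, 58, 57],
    }
    on_demand = {                       # 6 on-demand tests
        "Product A": [97, 96, 98, 97, 98, 99],
        "Product B": [99, 98, 97, 98, 99, 99],
        "Product C": [95, 94, 96, 95, 96, 97],
    }

    def survivors(results, num_tests, min_hits):
        """Keep products tested in every test that equal or exceed the
        per-test median in at least min_hits of those tests."""
        tested_all = {p: s for p, s in results.items() if len(s) == num_tests}
        kept = set()
        for product, scores in tested_all.items():
            hits = sum(
                scores[i] >= median(s[i] for s in tested_all.values())
                for i in range(num_tests)
            )
            if hits >= min_hits:
                kept.add(product)
        return kept

    # Step 1: proactive filter, step 2: on-demand filter, step 3: intersection.
    step1 = survivors(proactive, num_tests=7, min_hits=6)
    step2 = survivors(on_demand, num_tests=6, min_hits=5)
    print("Candidate products:", step1 & step2)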



    Complete step 5: determine the required O/S, CPU, RAM and HDD, record scan speed, and trial the footprint of each product you select on your PC to break ties (if any) and to uncover any compatibility issues with your system BEFORE investing $! This table is not completed, because you fill it in for your system.

    Columns: Product | Required O/S | CPU | RAM | HDD | On-demand Scan Speed | False Positives


    Complete steps 6 and 7: test the vendor support groups with similar questions, visit public forums for vendor reputation and issues, and track response time and the quality of solutions offered. This table is NOT completed; only hints as to how you could fill it in are given.

    Columns: Product | Vendor Support Reputation | Benefits/Issues | Vendor Response Time | Quality of Responses | Price

    The results in the 2 tables are used to break ties or eliminate products that fail to meet your expectations for quality.
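
    As an illustration only of how steps 5 to 7 might be recorded and used, here is a small Python sketch. The field names and values are invented, not prescribed by the methodology; the point is just that your own trial measurements, rather than the published tests, break the remaining ties (step 7 says free wins over paid).

    Code:
    from dataclasses import dataclass

    @dataclass
    class TrialRecord:
        # Fields mirror the two tables above; the values are whatever YOU
        # measure during the trial on YOUR system (these entries are invented).
        product: str
        ram_mb: int            # resident memory footprint observed in the trial
        scan_minutes: float    # on-demand scan time of a fixed test folder
        false_positives: int
        support_hours: float   # time the vendor took to answer a test question
        price_usd: float       # 0.0 for a free product

    trials = [
        TrialRecord("Product A", ram_mb=45, scan_minutes=12.5,
                    false_positives=1, support_hours=6, price_usd=39.0),
        TrialRecord("Product B", ram_mb=60, scan_minutes=11.0,
                    false_positives=1, support_hours=24, price_usd=0.0),
    ]

    # Tie-break rule from step 7: lowest price wins, so free beats paid.
    best = min(trials, key=lambda t: t.price_usd)
    print("Tie broken in favour of:", best.product)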

    Conclusion

    Whatever method you use to select AV products, it should be consistent, fact-based and not altered after results emerge that disappoint you. The method I use may not suit you, and you may disagree with it. Fine, create your own, but stay with it.

    My own license/subscription to BitDefender expires this September. When the August on-demand results come out, I will update the data, run my algorithm and decide accordingly. If the results match the last test, it is likely I will go with NOD32.
     

    Attached Files:

  4. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)

    I think we have made this a hell of a lot more complicated than it really is. It befuddles me to see some asking if their security looks good when they have like 50 security applications. But yet we tell them what to tweak here and there. Some very good people here have said things over time that should be written in stone when it comes to wisdom. And yours truly is the first that needs to read this. We have assisted in creating a glut of products that have great potential and means, but the reality is we have opened Pandora's box in making it harder for the legit to secure our PCs.

    1. Safe browsing - don't play with dynamite and you won't risk getting blown up.

    2. Pick a long-term AV - if they have been around for a while, then they have the dedication to continue making their product better.

    3. Commitment - buy your license and keep it. Duh, trjam? Ask yourself when it last let you down, and if it did, was the outcome so bad it caused you to endure true hardship? I don't think so.

    4. Work together with vendors, testers and users to give constructive feedback as to what you, the consumer, feel your product needs to add or change.

    It is simple, and from this day forward I plan to live by this. The reality, again, is that the nasties in this world may be our own selves.
     
  5. SteveS335

    SteveS335 Registered Member

    Joined:
    Jan 16, 2007
    Posts:
    43
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)

    Hello Escalader,

    I appreciate that you have gone to a lot of effort to produce your "Algorithm", but I must say I feel there is a flaw in step 1, which obviously affects all further steps.

    Your approach also assumes that no new versions of the products' engines or heuristic capabilities have been released since 2004. If a product that was once mediocre and is now vastly improved is scored in this way, the result doesn't reflect the current status of the product in relation to others, and vice versa! There is no reason to assume that consistency since 2004 has any bearing on current or future results.

    Personally, I would consider only the latest results as the current "field of play". I feel there's no point in resting on past laurels, or damning on past failings.

    Next time it may be different - or maybe not!

    Some food for thought,

    Cheers,

    Steve
     
  6. Escalader

    Escalader Registered Member

    Joined:
    Dec 12, 2005
    Posts:
    3,710
    Location:
    Land of the Mooses
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)

    Hi trjam:

    I'm having trouble with "And yours truly is the first that needs to read this."

    Are you recommending something for me to read? I'd be happy to! Or are you saying you think I have made the problem worse? :doubt:

    Anyway, I agree with your points 1, 2, and 4. On 3 I agree conditionally.

    For example, with BD in my case, if it falls off, well, that's the way it goes. Nobody should stay with a declining product that is consistently under the median test after test. (BD is not in that state yet, or at all, but compared to NOD32, well, the results speak for themselves.) Staying put does not prove they are improving enough. It's a competitive world, and the user (that's us) should benefit from that! :D
     
  7. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,102
    Location:
    North Carolina USA
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)

    Yours truly refers to trjam. ;)
     
  8. midway40

    midway40 Registered Member

    Joined:
    Jul 24, 2006
    Posts:
    1,257
    Location:
    SW MS, USA
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)

    I have to agree about finding an AV that works for you and stick with it. Since last August I have had full (paid) versions of Avira, NOD32, F-Secure, Trend Micro, and now Norton (I was using the free aVast! before Avira). I really didn't have any major problems with these AV's. The reason I left F-Secure was that it didn't have a Vista version until just last week. I want to settle down and it seems Norton is running great on my computer so I am going to settle down with it come hell or high water. I am tired of "AV-hopping".

    Unless some future version really screws my computer up my avatar will remain the same for a long time.
     
  9. Access Denied

    Access Denied Registered Member

    Joined:
    Aug 8, 2003
    Posts:
    927
    Location:
    Computer Chair
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)

    AMEN. I installed Avira on my fresh install of Vista and it's running great. I have been following this thread mostly for humor; I am in no mood to change anything on Vista, 'cept maybe a software firewall (I already have hardware) at a later time and in another thread, lol. :D
     
  10. Escalader

    Escalader Registered Member

    Joined:
    Dec 12, 2005
    Posts:
    3,710
    Location:
    Land of the Mooses
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)


    Ah, I see. :cool:
     
  11. Escalader

    Escalader Registered Member

    Joined:
    Dec 12, 2005
    Posts:
    3,710
    Location:
    Land of the Mooses
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)

    Hi Steve:

    I see what you are saying but it is more of a different view than a flaw. ;)

    If you want to try the latest tools, great, but that is NOT my plan. Edit: What I mean is that my assumption is not that no improvements have been made, but that I want AV products that have held their heads high over all testing periods. Naturally, if I switch this fall I will install the latest version.

    Next time I will be dropping the oldest test and replacing it with the newest.
    In time, if the new guys are still in business, they will have a chance to pass the consistency test. I'm interested in the best of breed that have a history of paying dividends, like a stock!

    The whole point is to avoid hopping on the latest hot product. They are still on the list; statistical methods don't damn or praise.

    In the end it is up to the user on their PC. I'm just offering a method that looks at the whole record, not just one point-in-time selection.

    It is work to do it, but that's okay if it helps some stabilize their setups. :D
     
    Last edited: Jun 7, 2007
  12. Mele20

    Mele20 Former Poster

    Joined:
    Apr 29, 2002
    Posts:
    2,495
    Location:
    Hilo, Hawaii
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)


    I agree with Steve and disagree with your methodology entirely. A user should, IMO, ALWAYS be watching for AVs with improvement, AVs resting on past laurels, and AVs at the beginning of a downward spiral. If I went with your methodology I'd still be using Norton, and that would be ridiculous if for no other reason than that Symantec considers all users to be pirates and traded their USA support for utter crap in India. Your methodology takes none of the many things that are crucial into consideration. It doesn't follow the FIRST AND MOST IMPORTANT RULE for choosing an AV: what AV works best on YOUR system and has at least "good" support (preferably better than good). To discover that, one has to test a bunch of AVs, and that testing is never over, because what works best this year may have been mediocre last year and next year may be a disaster on your machine. Surely you don't expect someone to stay with an AV because your test says they should when that AV has damaged the system, or slows it dramatically, or the support is crap, or the GUI is so poor that the user's eyes burn and sting and they get headaches because of it.

    Sorry, but I think your methodology is worthless. Plus, why is it wrong to change AVs? What is wrong is changing AVs for the WRONG REASONS. That was what Stefan was trying to convey, not that it is inherently wrong to change or to change often. The ONLY thing wrong with changing is doing so for the wrong reasons. To me, you are making a gigantic mistake in leaving BD because your algorithm says you should, but to each their own. As for NOD32 being the best... you have really got to be kidding! I should have my internet connection slowed by more than 50% because your algorithm says it's the best... yeah... really. I am amazed that you spent time on something so worthless instead of spending time testing various AVs in their latest state to see which was best for you at this point in time.
     
  13. Escalader

    Escalader Registered Member

    Joined:
    Dec 12, 2005
    Posts:
    3,710
    Location:
    Land of the Mooses
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)

    Hi Mele20:

    Thanks for taking the time to read the post and comment. I agree completely that it took some effort. I'm also glad it got you and others thinking; I didn't intend to upset anybody, since the whole idea is to reduce emotion in the process. That was the goal I had in mind. It is not "wrong" to change AVs; I never said that. Just use all the tests over time, not just the last one. I'm considering AVs myself, but using a logical and neutral method to do it.

    Of course, any poster can say the methodology is worthless. I don't agree, since it made you think and post, and that in itself is not worthless. I am open to improvement ideas, but your post doesn't offer any. No one can stop posters from trying and testing the latest AVs as often as they like.

    I prefer to use IBK's work and approach to select, then test. You must have tested many AVs, so why not summarize the findings and publish them here against your own set of selection variables? I would be interested in reading them, since there may be a factor or two I could use in my next upgrade to the methodology.

    It looks like you got part way through the work and then decided to let me have it. :eek: You did that! :cool:

    BD, well, I still have it, and who knows, the next tests may elevate it higher and I will stay with it. It is a good product. If I do go to NOD32, I will carry out steps 5, 6 and 7 and trial it first on my PC. I have other AV needs, such as email IN/OUT scanning, that would have to be checked out as well. I left these out since they may not apply to everyone, e.g. some use email only on their ISP and don't download it at all.

    Please re-read and think about the following sections of the methodology again.


    (4) If your existing product is NOD 32 Anti-Virus, relax and enjoy life and don’t bother with the remaining steps. Those steps are suggested only if a user has no AV product or if their current product has developed bad performance habits.

    (5) Determine the required O/S, CPU, RAM and HDD, record scan speed, and trial each product's footprint on your PC to break ties (if any) and to uncover any unresolved compatibility issues with your system BEFORE investing $!

    (6) During the trial periods, test each vendor's support group(s) with similar questions and track their response time and the quality of solutions offered.

    (7) Include visits to public forums to determine unedited vendor reputation and any unknown product benefits and issues. If ties remain, use price as a final selection variable; of course, free wins over paid.

    Complete step 5: determine the required O/S, CPU, RAM and HDD, record scan speed, and trial the footprint of each product you select on your PC to break ties (if any) and to uncover any compatibility issues with your system BEFORE investing $! This table is not completed, because you fill it in for your system.

    Columns: Product | Required O/S | CPU | RAM | HDD | On-demand Scan Speed | False Positives


    Complete steps 6 and 7: test the vendor support groups with similar questions, visit public forums for vendor reputation and issues, and track response time and the quality of solutions offered. This table is NOT completed; only hints as to how you could fill it in are given.

    Columns: Product | Vendor Support Reputation | Benefits/Issues | Vendor Response Time | Quality of Responses | Price

    The results in the 2 tables are used to break ties or eliminate products that fail to meet your expectations for quality.

    Conclusion

    Whatever method you use to select AV products, it should be consistent, fact-based and not altered after results emerge that disappoint you. The method I use may not suit you, and you may disagree with it. Fine, create your own, but stay with it.

    My own license/subscription to BitDefender expires this September. When the August on-demand results come out, I will update the data, run my algorithm and decide accordingly. If the results match the last test, it is likely I will go with NOD32.
     
  14. AshG

    AshG Registered Member

    Joined:
    May 7, 2005
    Posts:
    206
    Location:
    East TN
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)

    I'm curious - NOD32 slows down your internet connection? I find through my personal experience that it causes the least page-load lag of any of the major programs. I wonder what is causing it to behave aberrantly in your specific setup.
     
  15. MalwareDie

    MalwareDie Registered Member

    Joined:
    Dec 8, 2006
    Posts:
    500
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)

    Maybe the HTTP scanner was causing slowdowns on his computer.
     
  16. Access Denied

    Access Denied Registered Member

    Joined:
    Aug 8, 2003
    Posts:
    927
    Location:
    Computer Chair
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)

    I have a close friend who used to have NOD, and IMON slowed him down considerably (dial-up internet). He now uses NAV 2007 with no slowdown. (I recommended it, since I have it on my other PC and it's great on there.)
     
  17. Mele20

    Mele20 Former Poster

    Joined:
    Apr 29, 2002
    Posts:
    2,495
    Location:
    Hilo, Hawaii
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)

    Yep. I didn't read all of it. :eek: But I did just now. :) And you certainly did get me to think. ;) If your method works for you and may help some others, then that is good. I don't feel compelled to be so "scientific" about it. I read a lot about the various AVs. I have tried many of them (although not one right after another as a real reviewer would do) and used many of them longer than a trial, but I can't always speak to personal use of the very latest versions, so sometimes something important may have changed from the version I used to the current one. So, if I am in the market for a different AV because I don't like my current one, then I will test the latest versions of the AVs that I would consider having on my computer.

    For instance, I mentioned NOD32 slowing my internet connection a lot... well, that was back when the IMON HTTP scanner was new... the result today might be different. Same with Kaspersky 2006: I couldn't use the HTTP scanner, as my speed was cut by 60%. Since the experience with KAV was recent and dramatic, I suspect that NOD32 would also still have that drastic effect, but I can't know for sure unless I tested the current version. I'm not particularly eager to do that, because there are other things I don't care for with NOD32, and I have no interest in testing KAV 2007 because I have been informed that the damage it does to files, as evidenced by chkdsk problems, still exists. I have damage that did not go away when I uninstalled KAV 2006, so I have to rule it out as an AV for me. I already know pretty much which AVs I could tolerate, so using your methodology doesn't seem sensible for me, but I can see that for some it might be quite useful (especially after I read the rest of your steps). I cannot be dispassionate (scientific in choice) about software that is used so much, and your methodology attempts that.
     
  18. Martijn2

    Martijn2 Registered Member

    Joined:
    Jul 24, 2006
    Posts:
    321
    Location:
    The Netherlands
  19. Inspector Clouseau

    Inspector Clouseau AV Expert

    Joined:
    Apr 2, 2006
    Posts:
    1,329
    Location:
    Maidenhead, UK
  20. dan_maran

    dan_maran Registered Member

    Joined:
    Aug 30, 2004
    Posts:
    1,053
    Location:
    98031
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)

    I noted that while reading the PCWorld report.
    These magazine companies need to understand the testing methodology before they go to print.
    Maybe any magazine that is going to cite Andreas' work should have him proofread the article first, as this is not the first time his results have been used "out of context".

    Here is a link to the computerworld article:
    http://www.computerworld.com.au/index.php/id;767914886;fp;4;fpid;78268965

    I read it the other day but forgot to post about it. :)
     
    Last edited: Jun 7, 2007
  21. MalwareDie

    MalwareDie Registered Member

    Joined:
    Dec 8, 2006
    Posts:
    500
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)

    I disagree with this: "The collection of malware used can span years."
     
  22. Inspector Clouseau

    Inspector Clouseau AV Expert

    Joined:
    Apr 2, 2006
    Posts:
    1,329
    Location:
    Maidenhead, UK
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)

    Yup. It's even getting worse. Many magazine authors - with no clue at all about professional security product testing - just paint their own frame around results they don't understand but got from somewhere.

    The best recent example is the German magazine c't. Guess what - they even create new malware variants (and admit it!) to test heuristics. Most of the testers don't even know WHAT THEY ARE REALLY DOING. But it sounds "cool & important" to create new malware for their own tests.

    Btw my weblog is back. New URL is in my signature.
     
  23. dan_maran

    dan_maran Registered Member

    Joined:
    Aug 30, 2004
    Posts:
    1,053
    Location:
    98031
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)

    Hence the need for proper information before printing; this can give users a false sense of security.

    Nice to see your weblog back and in full swing with the banned testers list :)

    I really cannot believe that people would create a "new" piece of malware after the whole stink that was caused by the Consumer Reports piece. o_O
     
  24. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    8,251
    Location:
    The land of no identity :D
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)

    Far from it, I have seen a few people interested in modifying existing malware to create new variants to test the heuristics of AV products. None of them have actually done anything yet, but...

    That doesn't mean I agree with such methodology though.
     
  25. Miyagi

    Miyagi Registered Member

    Joined:
    Mar 12, 2005
    Posts:
    426
    Location:
    None
    Re: AV-Comparatives June Results (Retrospective / Proactive Tests)

    Thank you for launching your new blog Mike! :)
     
Thread Status:
Not open for further replies.