Discussion in 'other anti-virus software' started by honeybunny, Aug 31, 2005.
NOD 2.51.8 placed behind Symantec's Norton 11.0.
But it is a great score by NOD32, a good improvement on the last On-Demand test. Well done, Eset!
Indeed, pretty good improvement. So we can rely on Kaspersky, Norton and NOD32.
Very interesting results
Norton is heavy, try to avoid it
This is very subjective. Please try to stay on topic.
There is also the additional consideration of how well an AV provides
proactive, zero-hour protection. I prefer an AV that does well on both.
I tried it (2005) last week; it wasn't what I consider to be heavy. I personally would not buy one of the big three (Norton, McAfee & Trend Micro), because I prefer to support smaller vendors, but I was surprised when I tried it. I hadn't tried Norton since 2002-2003. It's certainly not in the Dr.Web category, but it's better than earlier versions, and the detection doesn't seem to be bad according to the last AV-Comparatives.
Well, the beta of NAV2006 was even better. I can't say for speed (since I ran it in Virtual PC), but memory usage was lower than anytime before.
Far more should be considered than this test when choosing an AV. Things such as heuristics, interface, how light/heavy the product is, and ease of use as well as features.
This test should be but a tiny portion of the overall picture when choosing an AV. It really annoys me when people base everything off this single test, conducted by a non-commercial organization. Please, people, don't put so much stock into one test site; it does the industry no good!
Nice, I see you also read the main site of AV-Comparatives:
"If you plan to buy an Anti-Virus, please visit the vendor's site and evaluate their software by downloading a trial version, as there are also many other features (e.g. firewall, scriptblocker, etc.) and important things (e.g. compatibility, graphical user interface, speed, language, price, update frequence, etc.) for an Anti-Virus that you should evaluate by yourself. Even if quite important, the detection rate is just one aspect that you should consider when buying Anti-Virus software."
And also BitDefender and McAfee
I expected six AVs to perform well in this test, and they did do well, though one of them surprised me a bit (Dr.Web - I expected it to have a better score).
I disagree on several points you try to make, and agree on selection criteria.
You fail to prove that non-commercial organizations providing independent testing services are not up to commercial snuff. In fact, the results are made available about two weeks before being released to the public, to allow the commercial organizations whose products are undergoing testing to comment and refute. Very fair, IMHO.
It does do the industry good to have fair and independent evaluations. The AV-Comparatives.org website publishes their test methodology - now what commercial organization wants to share that, I ask you?
I understand your annoyance with people who may make their choice on just this test, but it is not one test, but a series of tests - over 400,000 samples! A very good sample size, one should think!
When one compares the results achieved by AV-Comparatives.org against the drivel published in terms of opinions of so-called experts in the industry, well, there is no comparison! While it is wise to consider the opinions of experts, their point of view should be backed up by data they are willing to share, and bounced against the experience and opinions of other experts in a peer review.
The point is that the unenlightened public (who need all the help that can be given them to stay safe) have no point of reference or standard by which to judge various AV products, let alone the other products necessary to batten down the security hatches on a computer. Until such a standard exists, I could not agree more with your point of view that things such as heuristics, interface, how light/heavy the product is, and ease of use as well as features also need to be considered when purchasing an industrial-strength product. Perhaps other comparison metrics need to be added to the evaluation methods already published by AV-Comparatives.org - I'm sure they would be willing, given a standard method/procedure against which to compare, and the resources to do so.
One other point - the industry and nastyware are moving targets - no one product can afford to stand still - and just how does a commercial organization measure improvements in its product line against the industry of competitors? Some do, some don't, and others just do marketing! A valuable service has been provided by AV-Comparatives.org, both to the industry and to the buying/non-buying public, and it has increased our understanding on a level playing field against its test set. Thanks, AV-Comparatives.org!
Even though resource usage for 2006 was lower than 2005, it is still not considered "light".
My experience is that their on-demand scanning seems to differ hugely from on-access (i.e., many viruses are only detected when scanning on demand), although I am not sure if this is still the case for 2005/2006.
NOD is much lighter on resources, and has better heuristics. On paper, NOD looks too good to be true.
McAfee did better than NOD32 at Windows and macro viruses... I thought NOD32 was the BEST at detection of Windows viruses... :/
but it did a great job!
lotuseclat79 and SDS909 - you both make good points - but miss something that I think is really quite important: each individual and/or company has their own criteria. For instance:
My own organisation's criteria:
We handle a LOT of bounce messages from SPAM and viruses/worms, because I run mailservers in our hosting environment - this makes zero-day a major issue for me. Many others don't have this problem, so for them zero-day is LESS IMPORTANT (not insignificant).
I wanted to see activity on staff's machines - i.e., the enterprise features are important.
I wanted to password-protect the settings so that protection cannot be turned off.
I wanted it to be LESS memory/CPU intensive than our previous solution (a mixture of Norton 2003/2004)...
Cost was also an issue...
Those were MY COMPANY's requirements - and once I'd shortlisted products, the one that we chose became the clear leader... but that's the process WE used... every company/individual must make up their own mind - and as pointed out - results in tests are only PART of the complete picture...
I am disappointed by NOD32! ...they could have behaved better and could have been number one, but it seems the ESET staff don't want this, and they know what I am talking about!!
But then again, what's the basic reason people use AV programs in the first place? To catch viruses and other "nasties". With that said, I'd think that while detection rates shouldn't be the only reason to buy/use a particular AV, they should be a major part (along with heuristics and the like) of everyone's criteria, since that is basically why you are using an AV in the first place. No?
As for the results at AV-Comparatives (referenced in SDS909's first post), IMO they hold more credence to me than most commercial sites, for two reasons: one, AV-Comparatives shares their testing methodology, so we can see whether it's biased or not; and two, commercial sites can easily be influenced by outside forces, especially money/advertising and the like, and are usually not as forthcoming about how they carried out a test.
I agree that the main reason to run protection is to be protected, but like I said, everyone's criteria are different, and are slanted by different factors - if cost were more of an issue, perhaps we'd run a freeware solution. Protection is the biggest reason; the other reasons are all secondary to that fact - but nevertheless, ALL of our factors played SOME part in the decision. Back when we opted to move FROM Norton, one of the biggest factors that turned our heads towards NOD32 in the first place was that it played nicely with Windows XP SP2 - and no one else did at the time, not Norton, not McAfee. It was incredible to us that, running Norton fully paid up, we were being told that we had no valid AV solution - and we heard it almost hourly from clients too... I forgot that little gem when posting earlier!
OK, but whose criteria do you base this on? A few hobby test sites? VB? Word of mouth from security professionals? Or trial and error and experience in real-world circumstances?
For most of my applications, how light an AV operates, and how spartan/minimal it is, are every bit as important as raw detection. But once again, protection based on what criteria? My own, of course.
Generally I throw an AV on an exposed honeypot and watch its performance for a few months. If I'm happy with its ability to catch malware, and it is light enough, it gets pushed to my networked computers. I pay almost no attention to actual test data from hobby or commercial test houses, because their data really wouldn't apply to my circumstances (and in some cases is biased or flawed).
As such, Dr.Web and VBA32 are the AVs most used on our network.
Now, I don't think NOD users expected these results. I hate to say I told you so, but I'm sure the NOD users will make up some excuse for such poor results. I bet they're crying right now, lol.
I don't see why they have to make any excuses for the test results they got. They are very respectable, and NOD still remains a very good antivirus.
Tip: NOD users, ditch the NOD and put that free Norton back on that came with your Dell PCs, lol.