Discussion in 'other anti-virus software' started by kiko78, Apr 8, 2012.
good result for comodo...
Shadowserver percentages are always so weird. Like Kaspersky with a detection rate of 9%.
actually this is strange
Not to mention F-Secure with a 1 (!) % detection rate. Has to be some messed up testing protocol.
the test and the result are simply rubbish and weird.......
They are testing AVs against zero-day malware. The results are a useful indicator of the relative strength of a given AV's heuristics & behavior blocker, over & above its signature-based main engine.
I do love it when test results are published; it's always the same: if a member's favourite/chosen product gets a poor result, the test is flawed/badly run/rubbish; if their favourite/chosen product does well, then it is a good test with accurate results!
True, different tests, different outcomes.
I always look at AV-Comparatives for an indication of database coverage (wild and zoo variants). The AV-Test protection stat gives some nice indications of last month's zero-day and 90-day in-the-wild stats. The Shadowserver results give an indication of first-detection variances between AV companies (also called zero-day, but in theory first-detection stats). I use all three, because together they cover the life cycle of a malware really well.
In the wild ------------------------------ Zoo virii --------------------------------------
Zero day ------ First detection ---------- Older than three months --- Older than a year
AV-Test ------- Shadow Server ------------ AV-Comparatives
Protection stats are averages, and the daily stats show how a number-one AV vendor can have mediocre results on a given day. To get a better impression I always sum the month, 60-day, 90-day and year averages. It gives me a RAP-like indication (the RAP results themselves are useless, since we have no idea what the results are based on; some companies marked in red in the RAP test only have 1 result included, while worse-scoring companies may have results based on months of RAP-test data).
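To make the "sum the averages" idea above concrete, here is a minimal sketch in Python. The function name and the detection-rate numbers are made up purely for illustration; real figures would come from the published stats of each test.

```python
# Hypothetical sketch of combining an AV engine's detection rates across
# several time windows into one RAP-like score, as described above.

def rap_like_score(rates):
    """Average detection rates (percent) across time windows."""
    return sum(rates.values()) / len(rates)

# Made-up example numbers for one engine (month, 60-day, 90-day, year):
engine_rates = {"30d": 62.0, "60d": 58.5, "90d": 55.0, "365d": 49.5}

print(round(rap_like_score(engine_rates), 2))  # → 56.25
```

Averaging over the four windows smooths out the day-to-day variance the post mentions, so a single unlucky day does not dominate the impression of an engine.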
When looking for a free AV, these vendors are the best choices: Avast, AVG, Avira, Panda for the western hemisphere, and Qihoo and possibly RoboScan for the eastern hemisphere.
I would not trust any test by anyone who concludes that Kaspersky is at 9%, and I have not used it for several years.
I'll admit that when I first looked at these tests, I wondered just how significant they are. What initially gave me pause about dismissing them is that the organization is non-profit and all-volunteer. And they appear to have great credentials.
So I looked deeper and decided they were testing something different than other tests. Tests are done on new threats, of the types not yet in conventional AV databases. In essence, the tests measure only the heuristic capabilities of AV engines, as applied to actual threats.
What the tests mean as a practical matter to someone like me may be very little, because the chance that I'm going to encounter one of these brand-new threats before it gets into my AV software's database is very, very low; near zero. But in terms of promoting good heuristic AV engine construction, these tests are likely extremely important.
Can anyone comment on whether they actually run the samples? It looks like they are downloading the samples from the web, but I can't tell if they are executing them.
That's some chart indeed.
Likewise here: on first read, the sheer number of threat sections makes it appear somewhat daunting. However, different tests always yield differing results and can prove extremely useful for the vendors.
The chances of you encountering those exact same zero-days is, well, zero... because by definition they wouldn't be zero-days then. However, the chance of you encountering some zero-day threat is very high. You don't have to go looking for them; they will find you... it's just a matter of time.
Therefore I think these tests are very relevant to show how the products react to zero-day threats in general.
And the reasoning behind it is "Because I'm worth it".
Using the same technique, I can say that these tests are the most representative ever, and the best in the world.
The percentages are not weird; they are just testing the AV engine, not the whole product.
Comodo and Ikarus have slightly more FPs than the average, so it could be normal for them to be among the best AV engines against 0-day malware.
The Kaspersky AV engine (not Kaspersky AV or Kaspersky Internet Security) is at 25% on Windows; the 9% is for Linux.
well you can say that ...
I still stand by my statement......the test and result are simply rubbish and weird......
Yes of course xD, but it smells like trolling if you don't give an argument. If you don't like what you see, or you don't understand it, that does not mean it is wrong.
why would I argue over something rubbish?....unless I'm nuts....it's just not worth it......
Regardless of what any test says, I will choose the one that runs well on my system, not the one with the best test score.
On that criterion, try Iolo or Quick Heal; both run very light on most systems.
No thanks, I will stick with ESET AV since I got a free license good until June 2013.
Hey Blackcat, when you mentioned Iolo, did you mean their System Shield product or something else?
Another option is Notepad -- light as a feather & never has a false positive.