Several issues from the recent Windows 2000 test have been cleared up: http://www.virusbtn.com/news/2007/12_12.xml?rss=
Void, the more I hear about VB tests and their results and mistakes, the more I dismiss the test completely! IBK, you have been quiet lately; what have you been up to?
I do get the bad vibe in these forums about the VB100. What are some of the issues that make it unrepresentative of real-life situations? I've heard criticism of the WildList. Some people have written that 'in the wild' is misleading and that the list does not include the viruses actually in circulation. Is this true?
I noticed that the reason given for mistakes is always the same: the submitted product came with an older detection file, or some settings in the product were disabled. Can't they check these things before they start the test?
If that is true, it gives you a pretty good idea of how professionally these people run their tests.
IMO, yes. These vendors have come to realize the marketing significance of these tests, and the effort they put into their products to pass the VB100 tests may not reflect the effort they put into detecting viruses outside of those included in the VB100 test set. It's possible for an AV product to pass all the VB100 tests but still have mediocre virus detection.
Even that I'm not sure about. Like any test, it shouldn't be taken too seriously. Although VB are genuine, the test isn't very... erm... challenging. If an AV vendor really wants to pass it, it can. This makes me think VB are losing a little credibility, or that some vendors just don't care about it as much now.
Mostly because of false positives. It's not all that hard to detect everything; it's much harder to detect ONLY actual malware and not clean files that may look similar.
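A toy sketch of that point, using made-up byte "signatures" and files (not a real scanner, and not how any actual AV engine works): an over-broad signature trivially "detects everything", but the same over-broadness flags a clean file too, while narrower signatures keep detection without the false positive.

```python
# Hypothetical samples and signatures, purely for illustration.
malware_samples = [b"EVIL-PAYLOAD-xyz", b"EVIL-DROPPER-abc"]
clean_file = b"EVIL is just a word in this harmless readme"

broad_signature = [b"EVIL"]                              # catches everything...
narrow_signatures = [b"EVIL-PAYLOAD", b"EVIL-DROPPER"]   # tuned to avoid FPs

def scan(data, signatures):
    """Flag `data` as malicious if any signature substring matches."""
    return any(sig in data for sig in signatures)

# Broad rule: 100% detection, but a false positive on the clean file.
print(all(scan(m, broad_signature) for m in malware_samples))  # True
print(scan(clean_file, broad_signature))                       # True (FP!)

# Narrow rules: same 100% detection, no false positive.
print(all(scan(m, narrow_signatures) for m in malware_samples))  # True
print(scan(clean_file, narrow_signatures))                       # False
```

The asymmetry is the whole point: loosening a signature only ever helps the detection number, while the cost shows up in false positives, which is exactly what VB100 penalizes.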