SC Magazine antivirus test and review!

Discussion in 'other anti-virus software' started by Technodrome, Jan 31, 2003.

Thread Status:
Not open for further replies.
  1. Technodrome

    Technodrome Security Expert

    Feb 13, 2002
    New York
    BitDefender Professional
    eTrust AntiVirus
    F-Secure AntiVirus Suite
    Kaspersky Anti-Virus
    McAfee Active Virus Defense Small Business Edition
    OfficeScan Corporate Edition
    Panda AntiVirus Small Business Edition
    RAV AntiVirus Desktop for Windows
    Sophos AntiVirus
    Symantec AntiVirus Corporate Edition
    Vet AntiVirus

    Test methodology

    Test 1 ran every product against each and every virus on the September 2002 in-the-wild (ITW) list, with detection rates defining their final score.

    Test 2 expected each solution to disinfect all of the viruses from Test 1 that were capable of being disinfected, without damaging the infected object in the process. After restoration to its former state, the data had to be readable or the application had to run as it had previously.

    Test 3 used a total of 1,012 polymorphic infections, all of which the solution had to detect to satisfy the stringent test criteria.

    Test 4 bombarded the anti-virus products with a smaller number of viruses that had either recently been dropped from the ITW list or appeared in its subsidiary category: infections that may have been detected in only one specific region, or reported by only a single observer. Although these posed no great threat at the time of testing, they are the viruses that often become a problem later and subsequently end up on the ITW list, making them pertinent to the test criteria.

    Test 5 looked at how the products coped with our macro library. This has been reduced to around 50 percent of its former size to allow for the recent drop-off in macro viruses, discarding older samples in favor of newer ones.

    Test 6 considered the length of time each solution took to scan a set of clean files within a designated directory, recorded in seconds. This indicates the likely impact on a system during scanning and may affect your choice when considering an anti-virus solution for your SME.
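    For anyone wanting to reproduce a Test 6-style timing on their own machine, a minimal sketch follows. The scanner invocation here is a placeholder, not any product's actual command line; a harmless stand-in command is used so the sketch runs anywhere.

    ```python
    import subprocess
    import sys
    import time

    def timed_scan(command):
        # Hypothetical helper: run a scan command and return elapsed wall-clock seconds.
        start = time.perf_counter()
        subprocess.run(command, check=False)  # placeholder for the product's scanner CLI
        return time.perf_counter() - start

    # Stand-in command instead of a real scanner, so the sketch is runnable anywhere:
    elapsed = timed_scan([sys.executable, "-c", "pass"])
    print(f"Scan of the test directory took {elapsed:.1f} seconds")
    ```

    In practice you would point the command at the product's command-line scanner and the directory of clean test files, and average several runs to smooth out disk-cache effects.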


  2. Firefighter

    Firefighter Registered Member

    Oct 28, 2002
    To Technodrome from Firefighter!

    Hi, that test seems to be a "winner takes all" test. Only the quite small producers (BitDefender, RAV and NOD32) fell short on the performance issues!

    That makes me suspicious: when 9 of 12 products get a full 5 stars, the bar must have been set quite low! For purely statistical reasons, a meaningful test can't have 9 of 12 winners!

    Maybe it's better to wait for that hopefully independent "biggest ever" test, which will be published later in spring 2003!

    Personally, I trust most in, then Virus Bulletin, and maybe that "dark horse" will come along in the future. All three are said to be independent, but you still have to be a bit suspicious if one "takes it all" test after test!

    In the end it's always a question of making money, so it's better to be aware, and to remember the rules of statistics! :)

    “The truth is out there, but it hurts”
