Antivirus daily test

Discussion in 'other anti-virus software' started by kiko78, Apr 8, 2012.

Thread Status:
Not open for further replies.
  1. kiko78

    kiko78 Registered Member

    Joined:
    Aug 1, 2008
    Posts:
    106
  2. King Grub

    King Grub Registered Member

    Joined:
    Sep 12, 2006
    Posts:
    814
    Shadowserver percentages are always so weird. Like Kaspersky with a detection rate of 9%. o_O
     
  3. kiko78

    kiko78 Registered Member

    Joined:
    Aug 1, 2008
    Posts:
    106
    Actually, this is strange.
     
  4. King Grub

    King Grub Registered Member

    Joined:
    Sep 12, 2006
    Posts:
    814
    Not to mention F-Secure with a 1 (!) % detection rate. Has to be some messed up testing protocol.
     
  5. ams963

    ams963 Registered Member

    Joined:
    May 3, 2011
    Posts:
    5,965
    Location:
    Parallel Universe
    The test and the results are simply rubbish and weird :thumbd:.......
     
  6. bellgamin

    bellgamin Very Frequent Poster

    Joined:
    Aug 1, 2002
    Posts:
    5,648
    Location:
    Hawaii
    They are testing AVs against zero-day malware. The results are a useful indicator of the relative strength of a given AV's heuristics & behavior blocker, over & above its signature-based main engine.
     
  7. steve1955

    steve1955 Registered Member

    Joined:
    Feb 7, 2004
    Posts:
    1,384
    Location:
    Sunny(in my dreams)Manchester,England
    I do love it when test results are published; it's always the same: if a member's favourite/chosen product gets a poor result, the test is flawed/badly run/rubbish; if their favourite/chosen product does well, then it is a good test with accurate results!
     
  8. Kees1958

    Kees1958 Registered Member

    Joined:
    Jul 8, 2006
    Posts:
    5,857
    True, different tests give different outcomes.

    I always look at AV-Comparatives for an indication of database coverage (wild and zoo variants). The AV-Test protection stat has some nice indications of last month's zero-day and 90-day in-the-wild stats. The Shadowserver results give an indication of first-detection variances between AV companies (also called zero-day, but in theory first-detection stats). I use all three, because together they cover the life cycle of a piece of malware really well.

    In the wild ---------------------------- Zoo virii ------------------------------
    Zero day ------- First detection ------- Older than three months / older than a year
    AV-Test -------- Shadowserver ---------- AV-Comparatives

    Protection stats are averages, and the daily stats show how a number-one AV vendor can have mediocre results on a given day. To get a better impression, I always sum the month, 60-day, 90-day and year averages. It gives me a RAP-like indication (the RAP results themselves are useless, since we have no idea what the results are based on; some companies marked in red in the RAP test have only 1 result included, while worse-scoring companies may have results based on months of RAP-test data).
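    The "sum the period averages" idea above could be sketched like this. Note the numbers and stat names here are made up for illustration; they are not actual Shadowserver or AV-Test fields:

    ```python
    # Hypothetical detection-rate averages (%) per vendor over several periods.
    stats = {
        "Vendor A": {"30_day": 62.0, "60_day": 70.0, "90_day": 75.0, "year": 88.0},
        "Vendor B": {"30_day": 55.0, "60_day": 66.0, "90_day": 72.0, "year": 90.0},
    }

    def rap_like_score(periods: dict) -> float:
        """Combine the period averages into one RAP-like indication."""
        return sum(periods.values()) / len(periods)

    # Rank vendors by their combined score, best first.
    for vendor, periods in sorted(stats.items(),
                                  key=lambda kv: rap_like_score(kv[1]),
                                  reverse=True):
        print(f"{vendor}: {rap_like_score(periods):.2f}")
    ```

    Averaging over several periods like this smooths out the single-day swings mentioned above, which is the whole point of combining the stats.
    
    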

    Conclusion
    When looking for a free AV, these vendors are the best choices: Avast, AVG, Avira and Panda for the western hemisphere, and Qihoo and possibly RoboScan for the eastern hemisphere.

    Regards, Kees
     
    Last edited: Apr 8, 2012
  9. Atul88

    Atul88 Registered Member

    Joined:
    Dec 8, 2011
    Posts:
    259
    Location:
    India
    Very True!!!:D :D
     
  10. JerryM

    JerryM Registered Member

    Joined:
    Aug 31, 2003
    Posts:
    4,221
    I would not trust any test by anyone who concludes that Kaspersky is at 9%, and I have not used it for several years.
    Jerry
     
  11. S.B.

    S.B. Registered Member

    Joined:
    Jan 20, 2003
    Posts:
    150
    I'll admit that when I first looked at these tests, I wondered just how significant they are. What initially gave me pause about dismissing them is that the organization is non-profit and all-volunteer. And they appear to have great credentials.

    So I looked deeper and decided they were testing something different than other tests. The tests are done on new threats, of the types not yet in conventional AV databases. In essence, they are tests of the heuristic capabilities of AV engines only, as applied to actual threats.

    What the tests mean as a practical matter to someone like me may be very little, because the chance that I'll encounter one of these brand-new threats before it gets into my AV software's database is very, very low; near zero. But in terms of promoting good heuristic AV engine construction, these tests are likely extremely important.

    __
     
  12. Narxis

    Narxis Registered Member

    Joined:
    Jun 10, 2009
    Posts:
    477
    +1:ouch:
     
  13. qakbot

    qakbot Registered Member

    Joined:
    Aug 25, 2010
    Posts:
    380
    Can anyone comment on whether they actually run the samples? It looks like they are downloading the samples from the web, but I can't tell if they are running them.
     
  14. EASTER

    EASTER Registered Member

    Joined:
    Jul 28, 2007
    Posts:
    5,632
    Location:
    U.S.A. (South)
    That's some chart indeed.

    Likewise here: on first read, given the magnitude of the threat sections, it appears somewhat daunting. However, different tests always yield differing results and can prove extremely useful for the vendors.
     
  15. qakbot

    qakbot Registered Member

    Joined:
    Aug 25, 2010
    Posts:
    380
    The chance of you encountering those exact same zero-days is, well, ZERO... because by then, by definition, they wouldn't be zero-day :) However, the chance of you encountering some zero-day threat is very high. You don't have to go looking for them; they will find you. It's just a matter of time.

    Therefore I think these tests are very relevant for showing how the products react to zero-day threats in general.
     
  16. lordraiden

    lordraiden Registered Member

    Joined:
    Jan 30, 2006
    Posts:
    3,067
    And the reasoning behind it is "Because I'm worth it" :D

    Using the same technique, I can say that this test is the most representative ever, and the best in the world :D

    //

    The percentages are not weird; they are just testing the AV engines, not the whole products.
    Comodo and Ikarus have slightly more FPs than average, so it could be normal for them to be among the best AV engines against 0-day malware.

    The Kaspersky AV engine (not Kaspersky AV or Kaspersky Internet Security) is at 25% on Windows; the 9% figure is for Linux.

     
    Last edited: Apr 11, 2012
  17. ams963

    ams963 Registered Member

    Joined:
    May 3, 2011
    Posts:
    5,965
    Location:
    Parallel Universe
    Well, you can say that :D ...

    I still stand by my statement... the test and results are simply rubbish and weird :thumbd::thumbd:......
     
    Last edited: Apr 11, 2012
  18. lordraiden

    lordraiden Registered Member

    Joined:
    Jan 30, 2006
    Posts:
    3,067
    Yes, of course xD, but it smells like trolling if you don't give an argument. If you don't like what you see, or you don't understand it, that does not mean it is wrong.
     
  19. ams963

    ams963 Registered Member

    Joined:
    May 3, 2011
    Posts:
    5,965
    Location:
    Parallel Universe
    Why would I argue over something rubbish?... Unless I'm nuts... it's just not worth it......
     
  20. Dark Shadow

    Dark Shadow Registered Member

    Joined:
    Oct 11, 2007
    Posts:
    4,553
    Location:
    USA
    Regardless of what any test says, I will choose one that runs well on my system, not based on test scores.
     
  21. ams963

    ams963 Registered Member

    Joined:
    May 3, 2011
    Posts:
    5,965
    Location:
    Parallel Universe
    +1....:thumb:
     
  22. Blackcat

    Blackcat Registered Member

    Joined:
    Nov 22, 2002
    Posts:
    4,010
    Location:
    Christchurch, UK
    By that criterion, try Iolo or Quickheal; both run very light on most systems ;) :blink: :p
     
  23. Dark Shadow

    Dark Shadow Registered Member

    Joined:
    Oct 11, 2007
    Posts:
    4,553
    Location:
    USA
    No thanks, I will stick with ESET AV, since I got a free license good until June 2013.
     
  24. kdcdq

    kdcdq Registered Member

    Joined:
    Apr 19, 2002
    Posts:
    657
    Location:
    Southwestern Massachusetts
    Hey Blackcat, when you mentioned Iolo, did you mean their SystemShield product, or something else? o_O
     
  25. bellgamin

    bellgamin Very Frequent Poster

    Joined:
    Aug 1, 2002
    Posts:
    5,648
    Location:
    Hawaii
    Another option is Notepad -- light as a feather & never has a false positive. :ninja:
     