On the way to better testing

Discussion in 'other anti-virus software' started by TonyW, Feb 2, 2010.

Thread Status:
Not open for further replies.
  1. TonyW

    TonyW Registered Member

    Joined:
    Oct 12, 2005
    Posts:
    2,741
    Location:
    UK
  2. JimIT

    JimIT Registered Member

    Joined:
    Jan 22, 2003
    Posts:
    1,035
    Location:
    Denton, Texas
    That is a very interesting post.

    Too bad that the vast majority will read it and not realize the significance.

    Thanks for posting the link.
     
  3. Templar

    Templar Registered Member

    Joined:
    Nov 5, 2009
    Posts:
    114
    Very good post indeed.
     
  4. skokospa

    skokospa Registered Member

    Joined:
    Apr 1, 2009
    Posts:
    177
    Location:
    Srbija
Judging by what is written there, it seems the criticisms of how some companies have participated in the VB tests were right!
     
  5. dawgg

    dawgg Registered Member

    Joined:
    Jun 18, 2006
    Posts:
    818
Good points - it essentially confirms what most of us presumed and makes valid points - but in the end, many AVs will still continue this because others have added it to their database.

Adding malware files which have been detected by more than 10 AVs is far easier and quicker than checking the samples - so static tests will still continue... and a high % = big bucks, as we see on this forum every time a new test is released.

    It is still a step in the right direction. Publicising the flaws is the only way to educate people and make a change.
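The shortcut described above - adding any sample that enough other scanners already flag, without analysing it - can be sketched as a toy filter. Everything here (engine names, the threshold constant, the report shape) is hypothetical and only illustrates why the approach is cheap, and why it would propagate another vendor's false positives unchecked:

```python
# Hypothetical sketch (no vendor's real pipeline) of consensus-based
# sample triage: auto-add a sample to the database once more than 10
# engines detect it, with no manual analysis of the sample itself.

CONSENSUS_THRESHOLD = 10  # assumed cut-off, taken from the post above

def should_auto_add(verdicts: dict[str, bool],
                    threshold: int = CONSENSUS_THRESHOLD) -> bool:
    """Return True if strictly more than `threshold` engines flag the sample."""
    detections = sum(1 for flagged in verdicts.values() if flagged)
    return detections > threshold

# Example: a multiscanner report mapping engine name -> detected?
report = {f"engine_{i}": (i < 12) for i in range(20)}  # 12 of 20 engines flag it
print(should_auto_add(report))  # True: 12 > 10, so it would be added unchecked
```

Note that if one of those 12 verdicts is a false positive copied from a reputable lab, the same logic spreads the FP to every vendor leaning on consensus.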
     
  6. Get

    Get Registered Member

    Joined:
    Nov 26, 2009
    Posts:
    384
    Location:
    the Netherlands
Kaspersky obviously didn't do well in this sort of test :D .
     
  7. dawgg

    dawgg Registered Member

    Joined:
    Jun 18, 2006
    Posts:
    818
I was thinking so as well for the Feb 2010 AVC test... but didn't say anything as I don't have a clue... and neither do you :p :argh:
     
  8. BlueZannetti

    BlueZannetti Registered Member

    Joined:
    Oct 19, 2003
    Posts:
    6,590
    When you get down to it, both sides of the debate (vendors and testers) are in an arms race of mutually assured destruction, and that arms race is based on hard to refute points.

    A larger scale test is better, right? That means more samples to be tested, so many that there's no conceivable way to validate every sample. It's a logistical impossibility.

    A larger signature database is better, right? That means harvesting with a broader net. Tired of getting grief over that one missed sample that users seem destined to harvest and immediately publicize to the blogs and forums? Deal with the false positives after the fact.

Why do you think the A vs. B and multiscanner results drivel are not allowed here? Both approaches helped fuel the type of insanity described in the piece. Yet it is an insanity that many users and pundits simply seem unable to let go of. In part, that's because there's no reasonable replacement at the moment, aside from placing some level of trust in the vendor(s) you've chosen to deal with. At times, we really don't have a firm basis upon which to assign that trust. By the same token, I also fail to see the origins of some levels of active distrust that are reflected in comments out there.

    Blue
     
  9. Get

    Get Registered Member

    Joined:
    Nov 26, 2009
    Posts:
    384
    Location:
    the Netherlands
I know. Actually I found the blog post perfectly sound, but I couldn't resist :oops: .
     
  10. guest

    guest Guest

  11. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    It is important to recognize that the criticism of anti-virus testing described in the article On the way to better testing is properly and primarily focused upon static testing procedures. Hopefully, the discipline will evolve within the very near future to recognize the near worthlessness of such tests and instead seek to emulate and improve upon the new whole-product dynamic testing methods used by AV-Comparatives, AV-Test and Dennis Technology Lab.
     
  12. dawgg

    dawgg Registered Member

    Joined:
    Jun 18, 2006
    Posts:
    818
I already suspected Ikarus of doing this months ago.
Funny the way Antiy-AVL keeps sticking a .gen at the end of the detection name, trying to make it look like they have very good generic detections! - a cheap move if this is the case!

Agree, and it is already evolving. The tests by AV-Test and AVC at the end of last year are already showing movement in the right direction.

Still, the AVs doing this are doing nothing wrong, essentially. Rather than seeing it as a method to boost static AV test scores, one can also see it as a method to boost actual detections, protecting users.

Some questions do arise about the implications for the AV industry if many AVs do it, though - AVs leeching off other AVs. If one AV spends time and resources to analyse a file and create a detection, why should another get it for free? It goes to show that the more reputable virus labs are doing the work and incurring the costs, while the other labs take their hard work away. On top of this, there is the possibility of false positives from reputable AVs being spread across other AVs.

I also do not like the idea of my AV detecting something just because another does. I prefer it to check the sample first, as that is where my confidence and trust in my AV's detections comes from.
     
    Last edited: Feb 3, 2010
  13. PC__Gamer

    PC__Gamer Registered Member

    Joined:
    Dec 26, 2009
    Posts:
    526
I guess all those companies who didn't cheat and lie for results have finally arrived with their heads held high.
     
  14. dawgg

    dawgg Registered Member

    Joined:
    Jun 18, 2006
    Posts:
    818
Is it a cheat/lie though? The AV does detect it, and the more it detects, the more it protects users (assuming they are not FPs!). I'm sure some AVs have added thousands of detections by using methods like this, and in the end, that protects their users from thousands more malware samples than they *may* otherwise have missed, or only detected at a later date through actual analysis.

Copying the detections of others can be considered a cheaper way of adding detections - although it is not good for the industry and can increase FPs.
     
  15. progress

    progress Guest

I knew that some AVs detect 103% - that's why I like VB100 more than other tests :)
     
  16. rothko

    rothko Registered Member

    Joined:
    Jan 12, 2005
    Posts:
    579
    Location:
    UK