Relevance of false positives

Discussion in 'other anti-malware software' started by emsisoft, Apr 20, 2010.

Thread Status:
Not open for further replies.
  1. emsisoft

    emsisoft Security Expert

    Joined:
    Mar 12, 2004
    Posts:
    328
    Location:
    Nelson, New Zealand
    In a discussion about the latest MRG test, I found an interesting statement from IBK that I want to discuss separately to avoid hijacking the other thread:

    I fully agree with that AMTSO principle.

    But it contains a big variable: "measured in a balanced way"

    I think the actual question is: How important is a false positive in relation to an undetected malware sample? Can any fair relation be expressed in the numbers of a test result at all?

    There are arguments against: a single missed malware sample can destroy the user's data completely. Do you think an infected user would rate the importance of a wrongly flagged good file the same as that of the virus that destroyed their data? A false positive in quarantine can usually be restored easily once the detection is fixed (provided it's not an essential system file, of course).

    What's a balanced relation between missed detections and wrong detections? Is it 1:1, 1:10, 1:100 or 1:1000?

    Who defines that relation for a trusted and valuable antivirus comparative?
     
  2. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    The appropriate relation can be defined by the tester, but a report should also contain an FP test (even if listed separately), so that users are informed in a "balanced way".
     
  3. emsisoft

    emsisoft Security Expert

    Joined:
    Mar 12, 2004
    Posts:
    328
    Location:
    Nelson, New Zealand
    I agree; an FP test should be included and stated separately.

    But most testers publish final rankings or some kind of awards that are calculated by some scheme. Do you, as a respected tester, agree that these rankings are always a subjective view of the test results?
     
  4. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    No, I do not agree :p. If, e.g., VB's rules state that one FP is enough to deny their award, then that is an objective criterion they set and apply.
    Of course vendors with a lot of false alarms would agree with you and try to downplay the relevance of FPs, but that is not an objective perspective. So rankings are useful for readers, especially as some vendors would otherwise just promote their high detection rates and hide the fact that they have, e.g., quality/FP problems.
     
  5. emsisoft

    emsisoft Security Expert

    Joined:
    Mar 12, 2004
    Posts:
    328
    Location:
    Nelson, New Zealand
    Well, the question for the users remains:

    Which product is better?

    * Product A, which has 99.9% detection combined with, let's say, 3 FPs,

    or

    * Product B, which missed thousands of samples but had no FPs?


    As you can imagine, I'm not really happy with the 'awards' thing as it is used by many testers now. VB100 is a very extreme example with its zero-FP policy, but nearly all well-known comparatives tend to give some kind of award or final ranking that factors FPs into the calculation. And they all have one thing in common: users rely on their final rankings to make up their minds about products' quality.

    That means the public view of a product is shaped by a tester's subjective view of what "balanced way" means.

    People rarely ask the important question: which one is best for MY personal needs? E.g., if I can live with the fact that a product has more FPs, I can get better protection with the tool that has the best detection, at the cost of some FPs of course.
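    For illustration only, here is a minimal Python sketch of how the tester's choice of miss-to-FP weighting decides which of these two products "wins". The sample counts and rates below are assumptions that roughly mirror the A/B example above, not any tester's published methodology.

    Code:
    # Hypothetical illustration: how the chosen miss:FP weight flips a ranking.
    # All numbers are assumptions mirroring the Product A / B example above.

    SAMPLES = 100_000  # size of the hypothetical malware test set

    products = {
        "Product A": {"detection_rate": 0.999, "false_positives": 3},
        "Product B": {"detection_rate": 0.970, "false_positives": 0},  # "missed thousands"
    }

    def penalty(stats, fp_weight):
        """Total penalty: each miss costs 1, each FP costs fp_weight misses."""
        misses = (1.0 - stats["detection_rate"]) * SAMPLES
        return misses + fp_weight * stats["false_positives"]

    for fp_weight in (1, 10, 100, 1000):  # the 1:1 ... 1:1000 ratios from this thread
        winner = min(products, key=lambda name: penalty(products[name], fp_weight))
        print(f"FP weighted as {fp_weight} miss(es): winner = {winner}")

    With these made-up numbers, Product A wins at 1:1, 1:10 and 1:100, and the ranking only flips to Product B at a 1:1000 weighting, which is exactly the subjectivity being discussed.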
     
  6. Night_Raven

    Night_Raven Registered Member

    Joined:
    Apr 2, 2006
    Posts:
    388
    And that's exactly why the standards should be high. A false positive may be acceptable for a user who is familiar with computer security, but it is unacceptable for the average user. And it's the latter that most of these "unfair" websites try to cater to.
    Computer-security-savvy users can question the malignancy of a detection and decide to investigate further without jumping to conclusions. Therefore, false positives are somewhat acceptable for them.
    Average users, on the other hand, trust the security software of their choice and will do what it tells or suggests them to do. After all, that's the reason the software was installed in the first place. Some users also fall into a state of panic when an alert about malicious code appears on their desktop. These people's state of mind is "OK, delete it, do what needs to be done, just keep me safe". Many average users also like their product to be configured to do its job silently. For these people, false positives can be as dangerous as real threats.

    Also, you seem to be ranting with the presumption that false positives are always (very) minor and that missed/detected threats (depending on the product) are major. What if it were the other way around? What if the sample your product detected, and another one missed, is not a major threat but a minor joke program, and the false positive your product produced turned out to be quite major, breaking the OS or some third-party software?
    That's why FPs are potentially as dangerous as real threats.

    You/your company is already known for presenting data in a VERY subjective manner, and now you criticise others for not abiding by your idea of right and wrong. I find this extremely hypocritical.
    Instead of whining here and there, why not try to work on your software? Or did you choose whining because it's easier?
     
  7. progress

    progress Guest

    An FP can be as dangerous as real malware :thumbd:
     
  8. innerpeace

    innerpeace Registered Member

    Joined:
    Jan 15, 2007
    Posts:
    2,121
    Location:
    Mountaineer Country
    That simply is not true! Has an FP ever maxed out your credit cards, wiped your bank account, stolen your identity, copied personal files, logged your keystrokes, etc.? Nope!

    Besides that, people should have images and backups by now. External hard drives, thumb/flash drives, online backups and other media are very affordable so there should be no excuses when it comes to recovering from any "catastrophic" event.
     
  9. The Hammer

    The Hammer Registered Member

    Joined:
    May 12, 2005
    Posts:
    5,752
    Location:
    Toronto Canada
    I'll have to go with Night Raven on this one.
     
  10. progress

    progress Guest

    No, an FP can only kill the whole system :cautious:
     
  11. mark.eleven

    mark.eleven Registered Member

    Joined:
    Oct 27, 2006
    Posts:
    81
    Location:
    Island of Sodor
    A few years back, due to an FP from Kaspersky, I hosed my XP installation by deleting svchost.exe, which was wrongly identified as malware. That was before I spent time here at Wilders.

    So an FP can be as dangerous as a missed detection. It depends very much on the experience of the user. For most users here, I think the highest detection is much preferred even at the expense of more FPs, but this may not necessarily apply to the general public.
     
  12. emsisoft

    emsisoft Security Expert

    Joined:
    Mar 12, 2004
    Posts:
    328
    Location:
    Nelson, New Zealand
    I'd say the importance of an FP depends heavily on the file. If a critical Windows file is wrongly deleted, yes, that has a major impact on the user, similar to a damaging virus.

    But what if it's just 'any' other file? And how important is the fact that the user can usually put detected files in quarantine instead of removing them permanently? A click on "Restore" is not as time-consuming as restoring the whole system, including user data, after damage caused by a missed virus, is it?

    Presuming that the antivirus software does not delete critical Windows files, what's the fair relation between a missed sample and a false positive?

    I'm not sure there can be any correct answer to that question at all. In my opinion, neither 1:1 nor 1:10000 is right, as it depends on the user and the files.
     
  13. Kees1958

    Kees1958 Registered Member

    Joined:
    Jul 8, 2006
    Posts:
    5,857
    I tend to agree with EmsiSoft.

    The severity of an FP is relative to its impact. After all, classical contingency management always sets the likelihood of an event against the impact of that event happening.

    Likewise, an FP wiping out Explorer or svchost has a serious impact, while quarantining my red-eye-reduction freeware (as an FP) has only a minor impact.

    I understand that it is really hard to balance this into some sort of equation/formula, but the current information behind FPs is too limited to give it much value.
     
  14. Night_Raven

    Night_Raven Registered Member

    Joined:
    Apr 2, 2006
    Posts:
    388
    Having some FPs is OK. There is no product out there that is free of FPs; every product produces FPs sometimes. In EMSISOFT's case, however, I think the FPs are too many. I also think there is a tendency to use high detection rates as a justification for FPs, which is pretty spurious. All (or at least most) vendors try to keep the detection rate as high as possible with as few FPs as possible. EMSISOFT, it seems to me, cares about the former but not about the latter. It's as if you think like this: "we have had no/few major FPs so far, so it's safe to assume we aren't going to have an increased number in the future". As a respectable security company (which you no doubt consider yourself to be), it's important to think ahead and not just react to situations and fix problems as you go along. I might be wrong, of course, but that's how it looks through my eyes.

    Just because most FPs aren't major doesn't mean it's OK to have them. Even if a certain FP doesn't break the OS, it could break another software product or simply remove an important document or standalone file. What if this software/file is important to the person and is job-related, for example? For a business user, a computer with a broken OS is just as bad as a computer with a working OS but broken important software, as both hinder the work process. It's not just system-file FPs that can be "bad".
     
  15. emsisoft

    emsisoft Security Expert

    Joined:
    Mar 12, 2004
    Posts:
    328
    Location:
    Nelson, New Zealand
    Of course, every AV's goal is to keep the number of FPs as low as possible. Emsisoft included.

    Let me quickly describe the reason for FPs:
    With more than 10,000 new malware signatures each day, the AV industry has reached a level where it is completely impossible to create signatures by hand. Every AV uses some kind of automation to create signatures. Some rely on the detections of others (no, that's not a myth), some use heuristics and real-time analysis systems that cost CPU power and time, and some still mainly rely on massive manpower because they failed to implement good automation (only a few can still afford that, and the results are usually not the best). Most vendors use a mix of these methods plus some specialized techniques.

    As you can imagine, when adding 10,000 signatures a day, there will be some FPs. That's a fact for ALL vendors. Today, the main question is: how fast can they find and fix the FPs? A company with 500+ employees will most likely be faster than one with 15.

    Each vendor has its own tricks to keep the FP rate low: massive whitelist databases of known-good files that are scanned around the clock, user feedback and submissions, several filters based on technical analysis, etc.
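
    As a minimal Python sketch of the whitelisting idea just mentioned (assuming a database of SHA-256 hashes of verified clean files; the set contents and function names are hypothetical, not any vendor's actual implementation):

    Code:
    # Hash-based whitelisting: suppress an alert when the file is known good.
    # The whitelist entry below is only a placeholder (hash of an empty file).
    import hashlib
    from pathlib import Path

    KNOWN_GOOD_SHA256 = {
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    }

    def sha256_of(path: Path) -> str:
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def should_alert(path: Path, signature_hit: bool) -> bool:
        """Raise an alert only if a signature matched AND the file is not whitelisted."""
        return signature_hit and sha256_of(path) not in KNOWN_GOOD_SHA256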

    You can believe me, we're doing our best to keep the number of FPs low, and it's a serious problem for us. We have been developing several new strategies during the last year to improve our scanner, with success: VB confirmed that Emsisoft had only 1 FP. That's one too many, but for us a great result. By the way, it was a Java SDK file, not a critical one. Critical Windows files, for example, are protected in Emsisoft scanners.

    Enough about Emsisoft. The discussion was initially about the general problem of FP weighting in tests, not about Emsisoft. It was not my intention to put Emsisoft in a shiny light; I just think this topic is worth discussing in general, as the concept of a "balanced way" has some flaws in my opinion.
     
  16. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    It seems to me that the vocabulary of and concepts from signal detection theory should be leveraged by anti-malware product testing organizations, to better analyze and describe the tradeoff between true and false positives.
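
    Building on that suggestion, here is a minimal Python sketch of how signal-detection-style metrics (hit rate, false-alarm rate, and d-prime) could be reported for a tested product. The counts below are invented purely for illustration, not taken from any real test.

    Code:
    # Hypothetical signal-detection summary for one anti-malware product.
    # All counts are invented for illustration.
    from statistics import NormalDist

    malware_samples = 10_000
    clean_files = 500_000

    detected_malware = 9_990   # true positives
    flagged_clean = 3          # false positives

    hit_rate = detected_malware / malware_samples     # sensitivity / TPR
    false_alarm_rate = flagged_clean / clean_files    # FPR

    # d' (d-prime): separation of the "malware" and "clean" score distributions,
    # assuming equal-variance Gaussians as in classic signal detection theory.
    d_prime = NormalDist().inv_cdf(hit_rate) - NormalDist().inv_cdf(false_alarm_rate)

    print(f"hit rate = {hit_rate:.4f}, false-alarm rate = {false_alarm_rate:.6f}, d' = {d_prime:.2f}")

    A single d' figure summarizes the detection/FP tradeoff, though it still hides where a tester chooses to draw the award threshold.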
     
  17. btman

    btman Registered Member

    Joined:
    Feb 11, 2006
    Posts:
    576
    I think if there were a mathematical formula to weigh FPs against detections for the awards in each test, it would be a lot simpler :p

    In the case of VB100 it would obviously be rare, but if every tested program but one produced a false positive, should the rest be considered 'failures'?

    With the VB100, I never care if my AV misses the odd award. But at the same time, if the 15 or so products that get the award can add 10,000 signatures a day without producing a false positive, or, like ESET's VB100 history (and trust me, I'm no fanboy of their product), can combine high detection with never getting a false positive in that test, why can't other anti-malware companies tweak their methods to do the same?

    Emsisoft, I love EAM (I guess that'll be the new acronym lol). I've had my own issues with FPs before, and I've also had stuff only you guys detected for me. I hate your huge updates when I'm cleaning an old machine, but when I need thorough cleaning it's a great program.

    I know FPs are as simple as quarantine, update, auto re-scan quarantine. But a ton of home users hit delete and never look back, get hit hard by an FP, get pissed at the software that deleted the file, then uninstall it and download something different that's usually inferior in detection, just because they aren't getting popups about viruses or false positives. They think they're 100% safe because they still have an anti-virus or anti-malware program running (even if they just swapped to a rogue lol). People are ignorant that way; when I do cleanings for friends and family, it's sad how often I see good programs trashed because of a previous false detection.

    It's a tough market to be in, let alone compete in lol.
     
  18. firzen771

    firzen771 Registered Member

    Joined:
    Oct 29, 2007
    Posts:
    4,815
    Location:
    Canada
    To me, FPs mean nothing. As long as my AV isn't flagging EVERY file on my computer as a virus, I'm cool with it having a higher-than-average number of FPs, as long as, in turn, it has higher-than-average detection ability as well.
     
  19. JRViejo

    JRViejo Super Moderator

    Joined:
    Jul 9, 2008
    Posts:
    97,808
    Location:
    U.S.A.
    I believe that with today's McAfee debacle (W32/Wecorl.a 0-day?), we are going to see more AV vendors paying attention to their FPs than ever before.
     
  20. Brocke

    Brocke Registered Member

    Joined:
    Mar 16, 2008
    Posts:
    2,306
    Location:
    USA,IA
    Yup, I agree. :thumb:
     
  21. tobacco

    tobacco Frequent Poster

    Joined:
    Nov 7, 2005
    Posts:
    1,531
    Location:
    British Columbia
  22. bellgamin

    bellgamin Registered Member

    Joined:
    Aug 1, 2002
    Posts:
    8,102
    Location:
    Hawaii
    I put my $$ on aggressive security apps. False Negatives make me madder than a cat that has had his butt sandpapered & dipped in kerosene. :argh:

    Therefore, I do not want to see security apps sacrifice aggressive detection on the altar of lowering FPs.

    On the other hand, I DO believe that aggressive anti-malware apps (such as Emsisoft's) must do a BETTER job of protecting their users from the sort of disaster that a FP can & sometimes does engender -- as witness the McAfee example cited by JRV.

    Perhaps a "balanced approach" for an aggressive anti-malware application might include such steps as the following:

    1- Offer variable levels of aggressiveness, with the default level being "lowest FPs".

    2- Make it difficult to increase aggressiveness level above the default level. E.g., bury the ability to increase aggressiveness under a check-box with an off-putting label such as "For advanced users & security technicians only".

    3- Quarantine nasties instead of deleting them outright. Permanent deletion should NOT be enabled until AFTER a nasty has been quarantined for some minimum number of days (say, 5 days); a sketch of this idea follows after the list.

    4- Always offer the option to set a system restore point when something is going to be quarantined.

    5- Auto-set a system restore point when quarantining nasties with file extensions such as .sys, .dll, .ini, etc. Make it difficult (but not impossible) for the user to negate the setting of the restore point.
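
    For what it's worth, here is a minimal Python sketch of the hold-before-delete idea in step 3, assuming each quarantine entry records when it was quarantined; the class, field names and the 5-day default are purely illustrative.

    Code:
    # Sketch of step 3: deletion is refused until an item has sat in quarantine
    # for a minimum hold period. Names and the 5-day value are illustrative.
    from dataclasses import dataclass
    from datetime import datetime, timedelta
    from typing import Optional

    MIN_HOLD = timedelta(days=5)  # "say 5 days"

    @dataclass
    class QuarantineEntry:
        original_path: str
        detection_name: str
        quarantined_at: datetime

    def can_delete(entry: QuarantineEntry, now: Optional[datetime] = None) -> bool:
        """Permanent deletion is allowed only after the minimum hold period."""
        now = now or datetime.now()
        return now - entry.quarantined_at >= MIN_HOLD

    # Example: an entry quarantined two days ago cannot yet be deleted.
    entry = QuarantineEntry(r"C:\Tools\redeye.exe", "Trojan.Generic (suspected FP)",
                            datetime.now() - timedelta(days=2))
    assert can_delete(entry) is False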
     