Anti-Malware Test Lab - Testing of proactive antivirus protection II

Discussion in 'other anti-virus software' started by Thankful, Mar 16, 2009.

Thread Status:
Not open for further replies.
  1. Jin K

    Jin K Registered Member

    Joined:
    Apr 9, 2008
    Posts:
    105
If I understand you right, F-Secure is using the old Kaspersky engine 6; that's why it scored lower.
     
  2. The Hammer

    The Hammer Registered Member

    Joined:
    May 12, 2005
    Posts:
    5,752
    Location:
    Toronto Canada
    Thank you.:D
     
  3. dw2108

    dw2108 Registered Member

    Joined:
    Jan 24, 2006
    Posts:
    480
No Rising, no Twister; both need to get involved if they want the confidence of larger purchasing markets in the "WildTest" West.

    Dave
     
  4. andyman35

    andyman35 Registered Member

    Joined:
    Nov 2, 2007
    Posts:
    2,336
I have to agree with that point. Although I'm a huge fan of Avira, it is annoying to have to manually disable the 'detection' of no-CD cracks all the time, even when Avira knows them to be safe.

    There's nothing illegitimate about using these patches on software that has been legally purchased, as anyone who's had the kids scratch a brand new £40 game CD will attest.
     
  5. dawgg

    dawgg Registered Member

    Joined:
    Jun 18, 2006
    Posts:
    818
    Most relevant to this test,
    1. They both use different heuristics.

    In addition, more generally,
    2. There is a time lag between when Kaspersky publishes an update and when F-Secure publishes it.
    3. F-Secure has other scanning engines, so it can detect some things which Kaspersky doesn't.
     
    Last edited: Mar 17, 2009
  6. RejZoR

    RejZoR Lurker

    Joined:
    May 31, 2004
    Posts:
    6,426
Actually, F-Secure has its own virus lab, so they can add detections for malware before Kaspersky Lab does.
     
  7. dawgg

    dawgg Registered Member

    Joined:
    Jun 18, 2006
    Posts:
    818
    Yes, as per point 3.
     
  8. lodore

    lodore Registered Member

    Joined:
    Jun 22, 2006
    Posts:
    9,065
As far as I know, the latest F-Secure products only contain F-Secure's own in-house engines plus AVP version 6.
    F-Secure's in-house engines are as follows:
    BlackLight = for rootkits
    Hydra
    Pegasus (might be from Norman actually); I think it's part of DeepGuard. Not sure if it's still used for DeepGuard 2.
     
    Last edited: Mar 17, 2009
  9. RejZoR

    RejZoR Lurker

    Joined:
    May 31, 2004
    Posts:
    6,426
Actually, the other engines don't have much to do with this.
    They can add signatures to the KAV engine itself.
     
  10. lodore

    lodore Registered Member

    Joined:
    Jun 22, 2006
    Posts:
    9,065
Why would they add signatures to the KAV engine when they can add signatures to their own engine?
    Maybe one day they will use their own engines only.
     
  11. Sergey Ilyin

    Sergey Ilyin Registered Member

    Joined:
    Mar 25, 2008
    Posts:
    12
    Location:
    Moscow, Russia
Thanks for your comments!

    Just one thing I would like to draw your attention to: the results show that the heuristic components of most antivirus products were considerably enhanced. Most vendors took essential steps to improve their software; look at this diagram.

    http://www.anti-malware-test.com/files/euristic_test2_eng_5.PNG

    The most impressive progress in heuristic detection was shown by AVG. Good job!
     
  12. TechOutsider

    TechOutsider Registered Member

    Joined:
    Sep 26, 2008
    Posts:
    549
"Proactive antivirus protection" is rather broad. In its most extreme form, it means not owning a computer. I'd like to see better terminology. Another way it can be taken is attempting to stop malware from getting on the system in the first place, rather than having a folder full of nasties and having an AV scan that particular folder.
     
  13. BrendanK.

    BrendanK. Guest

    Bitdefender dropped since 2007 :/
     
  14. vijayind

    vijayind Registered Member

    Joined:
    Aug 9, 2008
    Posts:
    1,413
  15. BrendanK.

    BrendanK. Guest

  16. dawgg

    dawgg Registered Member

    Joined:
    Jun 18, 2006
    Posts:
    818
    Maybe they haven't dropped, but the samples have simply changed.
    Still very good results and that's all that matters I guess :)
     
  17. dw2108

    dw2108 Registered Member

    Joined:
    Jan 24, 2006
    Posts:
    480
When I look at interpreted test results, there are questions, because not all my AVs are freebies:

    What can I afford on my "newer" PC?

    And,

    Which FREE AVs can best protect those older PCs?
     
  18. Arup

    Arup Guest

Avira and Avast both offer excellent free versions, and they will do very well on your PC.
     
  19. docfleetwood

    docfleetwood Registered Member

    Joined:
    Apr 6, 2004
    Posts:
    36
Avira: 71% detection and 0.13% false positives

    F-Secure: 44% detection and 0.03% false positives

    These aren't signatures but detecting new/unknown threats.

    This means that Avira detects 27/100 MORE actual unknown/new threats than F-Secure while falsely alerting 1/1,000 more times.

    How are these two even in the same medal category? How can you suggest that detecting 27/100 more is somehow overshadowed by 1/1,000 more false positives?

    In fact, Avira detects 10/100 more than the Gold award winners whilst still having only ~1/1,000 more false positives.

    The award categories are effectively based on false positive rates and that seems a bit unfair given the low false positive rates of all of them (except Sophos - holy yikes).

    I've got no horse in this race - I'm just saying that analyzing the math doesn't suggest the same conclusion as the award categories do.
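The arithmetic above can be checked with a short sketch. A minimal illustration, using only the figures quoted in the post (the award categories themselves are the test's, not reconstructed here):

```python
# Detection and false-positive rates quoted in the post above, in percent.
avira = {"detect": 71.0, "fp": 0.13}
fsecure = {"detect": 44.0, "fp": 0.03}

# Detection gap in percentage points, i.e. extra new threats caught per 100.
detect_gap = avira["detect"] - fsecure["detect"]

# False-positive gap in percentage points, converted to alerts per 1,000 clean files.
fp_gap_per_1000 = (avira["fp"] - fsecure["fp"]) * 10

print(f"Avira detects {detect_gap:.0f}/100 more new threats")      # 27/100
print(f"at the cost of {fp_gap_per_1000:.0f}/1,000 more FPs")      # 1/1,000
```

This just confirms the post's trade-off as stated: 27 extra detections per 100 unknown samples versus roughly 1 extra false alarm per 1,000 clean files.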
     
  20. dawgg

    dawgg Registered Member

    Joined:
    Jun 18, 2006
    Posts:
    818
I agree, but also, what I may have awarded them may differ from what you or anyone else would award.
    The award status is rather subjective, and it's up to each person what they want to award them; Anti-Malware just chose that. The same goes for any award which includes subjective views, has more than one variable, or requires 'weighting' of any sort.
    We're free to make our own "awards" if we want and say what we think they should get, just like anti-malware.ru did. It's up to people whether they trust the award (care about it) or not.

    People can "analyze the maths" however they want; it's highly unlikely most of the analyses will be exactly the same.

    At the end of the day, we have the statistics for detections/FPs, and we're all intelligent enough to make up our own minds about what we think of the results.
     
    Last edited: Mar 21, 2009
  21. dawgg

    dawgg Registered Member

    Joined:
    Jun 18, 2006
    Posts:
    818
Anti-Malware team, in the methodology (step 5+), you mentioned you updated the AVs after one week and did a repeat scan.
    What was that for?
    What is the final %, then: test step 3, step 7, or a combination of the two?
     
  22. risl

    risl Registered Member

    Joined:
    Dec 8, 2006
    Posts:
    581
One thing is very annoying: when the AV suggests that some file is suspicious and might be dangerous, it is counted as a false positive. For example, Dr.Web might detect something as "Probably xxxx", label it as suspicious, and then people say it made a false positive. The file probably only does, or is, something that malware usually is.

    My point is: if some clean file is detected as suspicious because it shares characteristics with malware files, it isn't a "false detection" but only a suggestion that it is something that should be looked at carefully. That's not a bad thing.
     
  23. andyman35

    andyman35 Registered Member

    Joined:
    Nov 2, 2007
    Posts:
    2,336
Good point. As the saying goes, "better safe than sorry" ;) FPs are inevitable with any good proactive detection, given the similarities between many legitimate apps and malware.
     
  24. risl

    risl Registered Member

    Joined:
    Dec 8, 2006
    Posts:
    581
    The difference is:

1) A clean file is detected by a specific signature and then stated to be malware for sure. For example, "Trojan.Asdf.Blah.34576", etc.

    2) A clean file is detected by heuristic methods and then labeled as potentially dangerous based on its characteristics. For example, "Probably Backdoor.Trojan", etc.

    Nr. 1 is a false positive, nr. 2 is excellent security.
     
  25. Stefan Kurtzhals

    Stefan Kurtzhals AV Expert

    Joined:
    Sep 30, 2003
    Posts:
    702
According to German law, using tools to bypass copy protection is illegal. So why should we invest resources in illegal software while there is lots of undetected real malware that we need to work on? Fix one FP on a crack, or in the same time add a new generic detection that catches 500-50,000 samples?

    You use a no-CD patch one time; why not disable the guard while you patch the game and then delete the patcher?

    Also, quite a few products do add specific (generic) detections for cracks/keygens, McAfee or Dr.Web for example.
     