Jotti's Tests

Discussion in 'other anti-virus software' started by JerryM, May 25, 2006.

Thread Status:
Not open for further replies.
  1. JerryM

    JerryM Registered Member

    Joined:
    Aug 31, 2003
    Posts:
    4,306
    In the results for specific malware that Jotti's shows, I note that one of the best AVs, NOD32, does not seem to have the detection rate of some of the others. BD appears to be one of the best there.
    I also note that KAV misses more than I would expect.

    Is there some reason that NOD misses more than BD when the heuristics of NOD are the best?

    This is not intended to start a comparison as such, but my impression is that it does not square with AV Comparatives.
    I wish there were some record keeping of Jotti's results so that over a period of time we could see the success of the various AVs (a rough sketch of such logging follows below).

    I do not question the validity of Jotti's and believe they are independent and that the results are factual.

    Jerry
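    (No such record keeping existed at the time; purely as an illustration, here is a minimal Python sketch, with a made-up log file name and hand-entered verdicts, of how one could log per-scanner results from each Jotti's submission and tally detection rates over time.)

        # Minimal sketch only: "jotti_log.csv" and the verdict dictionaries are
        # hypothetical stand-ins; Jotti's offered no API, so verdicts would have
        # to be copied from the results page by hand.
        import csv
        import os
        from collections import defaultdict
        from datetime import date

        LOG_FILE = "jotti_log.csv"  # hypothetical local log

        def log_scan(sample_name, verdicts):
            """Append one row per scanner verdict for a submitted sample."""
            new_file = not os.path.exists(LOG_FILE)
            with open(LOG_FILE, "a", newline="") as f:
                writer = csv.writer(f)
                if new_file:
                    writer.writerow(["date", "sample", "scanner", "detected"])
                for scanner, detected in verdicts.items():
                    writer.writerow([date.today().isoformat(), sample_name,
                                     scanner, int(detected)])

        def detection_rates():
            """Tally detection rate per scanner across all logged submissions."""
            hits, totals = defaultdict(int), defaultdict(int)
            with open(LOG_FILE, newline="") as f:
                for row in csv.DictReader(f):
                    totals[row["scanner"]] += 1
                    hits[row["scanner"]] += int(row["detected"])
            return {s: hits[s] / totals[s] for s in totals}

        # Example usage with made-up verdicts:
        # log_scan("sample.exe", {"NOD32": True, "BitDefender": True, "KAV": False})
        # print(detection_rates())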
     
  2. flyrfan111

    flyrfan111 Registered Member

    Joined:
    Jun 1, 2004
    Posts:
    1,229
    Jotti uses the Linux version of each scanner, and not all the functions/options that are available in the Windows versions are available on Linux.
     
  3. RejZoR

    RejZoR Lurker

    Joined:
    May 31, 2004
    Posts:
    6,426
    Jotti doesn't test anything. He probably keeps a track record for himself, but in general it's just a scan service, nothing else.
     
  4. pykko

    pykko Registered Member

    Joined:
    Apr 27, 2005
    Posts:
    2,236
    Location:
    Romania...and walking to heaven
  5. JerryM

    JerryM Registered Member

    Joined:
    Aug 31, 2003
    Posts:
    4,306
    Thanks, All, for the replies. That helps.
    Jerry
     
  6. kjempen

    kjempen Registered Member

    Joined:
    May 6, 2004
    Posts:
    379
    I also seem to recall something Marcos mentioned about NOD32's heuristics not working properly at Jotti's? Also, you can never know if the software is properly updated or configured (for example, I remember a time when BitDefender at Jotti's didn't really catch anything while BitDefender at VirusTotal did).

    Bottom line is, you can't really trust any scan service 100 percent. And it shouldn't be used as a "reliable test site".
     
  7. Stefan Kurtzhals

    Stefan Kurtzhals AV Expert

    Joined:
    Sep 30, 2003
    Posts:
    702
    You won't believe what kind of binary crap (damaged samples, non-malware etc.) people upload to Jotti and VirusTotal - it's little wonder that NOD32 doesn't catch those files.

    Simply do not use Jotti and VirusTotal to "measure" the quality of an antivirus product.
     
  8. TonyW

    TonyW Registered Member

    Joined:
    Oct 12, 2005
    Posts:
    2,741
    Location:
    UK
    So how do you measure the quality of such products?
     
  9. pykko

    pykko Registered Member

    Joined:
    Apr 27, 2005
    Posts:
    2,236
    Location:
    Romania...and walking to heaven
    Watching independent tests like av-comparatives.org and virusbulletin.com, and trialing the products for yourself. :)
     
  10. Stefan Kurtzhals

    Stefan Kurtzhals AV Expert

    Joined:
    Sep 30, 2003
    Posts:
    702
    In no specific order:

    - reaction time for ITW and non-ITW malware, zero-day exploits
    - static unpacking (how many, how reliable against small modifications?)
    - x86 emulation (handling of highly complex polymorphic/metamorphic malware)
    - x86 emulation (generic unpacking, speed, available for all platforms?)
    - heuristics (Win32, script, etc.)
    - variant detection
    - script handling (emulation? VBA, VBS, JS, see Feebs)
    - unpacking of archives (reliable (exploits!), speed, archive/mail bombs, platforms?)
    - speed of scanning (clean and infected files)
    - detection of exploits (= correct file format parsing & analysing)
    - how exact the malware identification is (if the product bothers to identify at all)
    - memory usage of the scan engine
    - size of the pattern updates, size of the engine updates
    - behaviour blocker (false positives? end-user-friendly?)
    - handling of active malware in memory
    - cleaning of infected systems (infections with multiple malware handled?)
    - same detection quality for on-demand and on-access
    - system load of on-access scanning
    - stability (there were lots of exploitable weaknesses in AV archive handling)
    - available for many platforms?
    - outbreak detection (e.g. on gateways)

    Alas, there is no published test that measures all these things :-(
     
  11. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    8,251
    Location:
    The land of no identity :D
    Not to mention ease of use, customer satisfaction, tech support, engine vulnerability, upgrade frequency (not update frequency), price etc. etc. :p
     
  12. Marcos

    Marcos Eset Staff Account

    Joined:
    Nov 22, 2002
    Posts:
    14,456
    I 2nd that lots of crappy files are uploaded to Jotti's - far more than to VT. What's more, there are some problems with AVs allocating too much memory at Jotti's, which means that some scanners do not even scan particular samples and are subsequently shown as not detecting them. It is also common that the samples submitted are actually installers which are not detected, but the malicious files inside are detected upon extraction.
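    (As a rough illustration of Marcos's last point, and not anything Jotti's itself does: the sketch below unpacks a ZIP-packaged installer and scans the extracted files with a command-line scanner; ClamAV's clamscan is used purely as a stand-in for whichever scanner is at hand.)

        # Illustrative sketch only: clamscan stands in for any command-line scanner.
        # The point is that scanning the extracted payload can succeed where
        # scanning the container/installer itself does not.
        import subprocess
        import tempfile
        import zipfile

        def scan_extracted(archive_path):
            """Unpack a ZIP archive and scan its contents recursively."""
            with tempfile.TemporaryDirectory() as tmp:
                # Real tooling should unpack untrusted archives in a sandbox.
                with zipfile.ZipFile(archive_path) as zf:
                    zf.extractall(tmp)
                # clamscan exit codes: 0 = nothing found, 1 = something detected
                result = subprocess.run(["clamscan", "-r", "--infected", tmp])
                return result.returncode == 1

        # Example (hypothetical file name):
        # print(scan_extracted("suspicious_installer.zip"))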
     
  13. ronjor

    ronjor Global Moderator

    Joined:
    Jul 21, 2003
    Posts:
    163,851
    Location:
    Texas
  14. SDS909

    SDS909 Registered Member

    Joined:
    Apr 8, 2005
    Posts:
    333
    I always notice VBA32 picking up everything sent to Jotti's. VBA32 seems to detect almost any real-world stuff I run into as well, so I am pleased to use the product.
     
  15. DjMaligno

    DjMaligno Hispasec/VirusTotal

    Joined:
    Feb 22, 2005
    Posts:
    63
    Location:
    Spain
    Quoting Jotti: "You're free to (mis)interpret these automated, flawed statistics at your own discretion. For antivirus comparisons, visit AV comparatives."
     
  16. TairikuOkami

    TairikuOkami Registered Member

    Joined:
    Oct 10, 2005
    Posts:
    3,432
    Location:
    Slovakia
    AV Comparatives is the only company which publishes "independent" tests; who knows if they are accurate when there are no other tests to compare against? Jotti is Linux based, but it is scanning Windows files, and even if some AVs do not yet have a signature for a sample, it at least shows their reaction time, which is important for zero-day exploits. Anyway, any known AV is good.
     
    Last edited: May 26, 2006
  17. Marcos

    Marcos Eset Staff Account

    Joined:
    Nov 22, 2002
    Posts:
    14,456
    If an AV flagged all files on your disk, including malicious files, would it actually be the best just because it detected all threats?

    You must distinguish between legit and malicious files, functional and non-functional files, archives/installers and unpacked files, etc.

    Regarding Jotti's results, many scanners don't even get to scan, as the previous scanners have allocated too much memory. Sometimes submitting an undetected file again will show it as detected.
     
  18. Severyanin

    Severyanin AV Expert

    Joined:
    Mar 19, 2006
    Posts:
    57
    Simply do not use any comparative results that do not fit into your vision of the universe. Just use the product you like. :)
     
  19. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    :D Seems to be Severyanin's day-to-day practice :p (just kidding) ;)
     
  20. Durad

    Durad Registered Member

    Joined:
    Aug 13, 2005
    Posts:
    594
    Location:
    Canada
    It's hard when somebody does not want to change his mind, no matter whether there are arguments or not. :)

    In my experience with Jotti and VT, the following scanners have good detection:

    - DrWeb
    - Kaspersky
    - NOD32
    - VBA32
    - AntiVir

    All others miss a lot.
     
  21. TAP

    TAP Registered Member

    Joined:
    Aug 17, 2004
    Posts:
    344
    Besides the corrupted samples, which make such multi-engine online scanners less reliable and probably have nothing to do with the real-world capability of AV scanners, I'd say the scanner builds themselves are another important factor that makes such results less reliable: some of the scanners are older versions compared to their latest desktop/server engine versions, and this causes different results.

    For example, the version of the avast! engine on VirusTotal is 4.6.695, but the latest official version of the avast! engine for the desktop is 4.7.827. I have some malware samples that my avast! on the desktop detects but the avast! engine on VirusTotal does not, and I'm sure there are lots of such samples in my malware collection.
     

