VBA32 and KAV, same bases?

Discussion in 'other anti-virus software' started by Mack Jones, Aug 29, 2005.

Thread Status:
Not open for further replies.
  1. SDS909

    SDS909 Registered Member

    Joined:
    Apr 8, 2005
    Posts:
    333
    Either I'm misunderstanding something, or your tests seem biased. Are you saying you provide samples to a "chosen few" companies, and the rest are left to fend for themselves?

    If this is the case, then I must summarily discount your entire test results in my own mind.

    I've tested VBA32 to be 95%+ effective against Win/Win32 threats. The only thing I can't do is certify them (yet). Firefighter's results seem identical to mine, yet I have no clue what his random samples contain... Also, I've been apprised of a test where VBA32 was run against a large set of threats (80k+), and scored 96% on that as well. I'm also privy to Jotti showing VBA32 as extremely superior in detection.

    Is something amiss or am I just misunderstanding what is going on here?
     
  2. .....

    ..... Registered Member

    Joined:
    Jan 14, 2005
    Posts:
    312
    IBK tests against sets of 400,000+ samples. ;)
     
  3. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,856
    Location:
    Innsbruck (Austria)
    Bah, again you compare the 'Jotti test' (or your tests [no offence intended]) with av-comparatives. Do not worry, nothing is wrong with av-comparatives. VBA32 has access to Jotti's samples. The first thing they do is add everything they get from Jotti every day; those companies that are faster at adding the things uploaded on Jotti will look better to users who visit Jotti and upload the same file, e.g., the next day, because it will show as detected. But this does not mean that, against a very large test-set of malware (I am not speaking about spyware etc.), a scanner would detect most things the way it detects the files uploaded on Jotti. Also, VBA32 has very sensitive heuristics; if, even with them enabled to the maximum, they do not actually reach 85% on my test-set, I highly doubt they will in 6 months; maybe in 12 months. Another thing I often read is that you say scanner XYZ would detect all 'real' ITW samples (I wrote 'real' because some here use this term with their own definition), but in reality most of those scanners would usually not even reach 100% (or 99%) detection of the ITW samples according to the WildList.
    BTW, I could also say: e.g., VBA32 detected at least, say, 330,000 samples from the av-comparatives test-set, so why, if you used only 80,000, did it not reach 100% in your test? (You see, a nonsense question) ;). As I said, if I looked at Win32 threats only, the score would be even lower. But now please stop trying to get details of the VBA32 test results from me - they are not published and not for public discussion. :D The VBA32 heuristic is cool; they will probably score well in the retrospective test.

    back to the topic: VBA32 and KAV, same bases ?
    No, they do not use the same bases. VBA32 (and also some other companies, e.g. Ikarus) looks at which scanners already detect the sample and chooses a name from the proposed ones; and as the KAV naming scheme is 'nice' (and KAV detects most things), most names they use are similar or identical to those from KAV, and some other names will be the same as those some other scanner uses. My 2 cents...
     
    Last edited: Aug 30, 2005
  4. Blackcat

    Blackcat Registered Member

    Joined:
    Nov 22, 2002
    Posts:
    4,024
    Location:
    Christchurch, UK
    IBK, can I ask how you can make this prediction about VBA32, or in fact any of the AVs that you test? Do you have a crystal ball :D ?
     
  5. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,856
    Location:
    Innsbruck (Austria)
    Sure. I know more or less how much other companies are able to improve in a given period, and from the current score of VBA32 I guess that they will not be able to improve that much within 6 months. But who knows, maybe they will surprise us all...

    EDIT: the "sure" referred to "can I ask", not to "crystal ball", which was added later ;)
     
  6. Blackcat

    Blackcat Registered Member

    Joined:
    Nov 22, 2002
    Posts:
    4,024
    Location:
    Christchurch, UK
    Can you expand on this? Are you constantly testing the selected AVs against your malware collection?

    I would have thought it would be difficult to predict 12 months ahead, particularly if a company decides to add an extra large amount of malware definitions to their database.

    Conversely, if a particular AV has dropped in detection, is this a reflection of them slowing down in adding malware to their database? Or simply a slight change in the malware composition of your collection?
     
  7. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,856
    Location:
    Innsbruck (Austria)
    Yes, they are constantly watched. In the test reports there is always a simplified overview of added samples, which may be of interest to you.
    The reason is probably the first one you said.
    Another thing I can already guess: next year most AVs will score very well (due to the new versions they will release and the improvements they are going to add); but it is only a guess...
    The results for August 2005, which I am going to release soon, are valid and show how the scanners performed on 5 August against the test-set of 5 August.

    P.S.: I saw your other post about Dr.Web; do not worry, I am quite sure they will easily be able to reach Advanced again in February 2006 (just a guess, as always). [P.S. Sometimes my guesses are wrong; for example, I always try to guess how the scanners will perform before I start the tests, but so far I was never able to predict the ranking exactly - so I usually enjoy the surprise of the results too ;)]
     
  8. err...

    err... Guest

    @IBK
    I think that was not the question. The question was whether VBA32 (and the other AVs that score too poorly) get your test-set samples in order to improve.
    It seems unfair to give your collection only to the "better" competitors, while the underdogs don't have a (realistic) chance to keep up.
     
  9. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,856
    Location:
    Innsbruck (Austria)
    I am sorry, but if you find the conditions unfair, you have to talk with the AV companies (well, not you; the AV company that finds it unfair should discuss it with the other AV companies). It is not in my hands what the various AV companies do with their samples.
    BTW: I do not find it unfair. I find it absolutely fair and OK so far.
    BTW2: the sentence that VBA32 has access to Jotti's samples was not related to the previous sentence; it was there to explain the sentence that follows it.
    BTW3: what you (or FF, or anyone else) can do is send your samples to the AV companies.
     
    Last edited: Aug 30, 2005
  10. SDS909

    SDS909 Registered Member

    Joined:
    Apr 8, 2005
    Posts:
    333
    And 300,000 of those are DOS viruses. Your point?
     
  11. SDS909

    SDS909 Registered Member

    Joined:
    Apr 8, 2005
    Posts:
    333
    No, it isn't fair, and it *IS* in your hands... You are slanting and biasing your test in favor of certain specific AVs. As a supposed "AV Expert", you yourself should know that this nearly completely invalidates the results of your testing!

    I cannot begin to fathom the ridiculousness of providing samples to "select" AV companies and then saying "the others will just have to find them their own way". Worse, you are doing harm to companies, and seem unconcerned with the financial impact of your published tests on the companies that aren't privileged to receive your samples.

    I have to ask, what do these "special" companies have to do to receive your care packages? Pay you? Certainly you wouldn't be the first "hobbyist" AV tester to have been exposed as biased, or worse....

    At the very least, my examination of your tests and procedures seems to have exposed a specific and admitted bias - a conspiracy, if you will - and I don't like what you are doing one bit.
     
  12. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,856
    Location:
    Innsbruck (Austria)
    Uff, how can I discuss a thing with you if you do not have a clue about these things (and do not start to offend me if you do not know about them)?
    Believe me, it is fair. Those that participate do not get anything they do not already have. Those that do not participate, and feel that they do not have those samples, have to ask the other AV companies whether they can get the samples from them (if they do not get them already); they cannot get them from me. And if they are not allowed to get them from AV companies, there will be good reasons for that (no, not financial reasons or competition), but you seem not to be aware of that.
    I think that _you_ are doing harm to companies and users with your homemade tests, as you do not give anyone the chance to check whether your sample set is valid or not.
    BTW: why should they pay me for something they already have? I of course do not get anything for that.
    I have now explained everything. 'Nice' to see that I get attacked for _no reason_ by a hobby AV tester just because I was friendly enough to say "No, it does not reach the 85%." Maybe I should not tell the public everything that could be of interest, and be inactive in forums like most other testers are..., because even if I explain everything, someone will always misunderstand it (and someone can get everything wrong if he wants to).
     
    Last edited: Aug 31, 2005
  13. Firefighter

    Firefighter Registered Member

    Joined:
    Oct 28, 2002
    Posts:
    1,670
    Location:
    Finland
    We have seen somewhere, even here at Wilders, those detection rates for the AVs included in Jotti's. If I remember right, the top 3 were:

    1. Kaspersky

    2. Vba32

    3. DrWeb

    Have I misunderstood something about Jotti's statistics? I understood that ONLY those FIRST SCANNED SAMPLES were included in Jotti's statistics, not those that were scanned again.

    It may also be true that some scanners do not have the best available settings to detect malware in Jotti's; indeed, I have found that Vba32 installed on my PC, with the best possible heuristics and the adware scanning option ON, detected some files that Jotti's Vba32 couldn't. But still, Vba32 was the second best there in Jotti's. :oops:

    Best regards,
    Firefighter!
     
    Last edited: Aug 31, 2005
  14. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,856
    Location:
    Innsbruck (Austria)
    Firefighter, I think you know my personal hidden opinion: every test (even the worst one) shows something and can be of interest, as long as the one who interprets it knows how to weight it and understands what it shows. I think I posted in the past a statistic on how the AVs on Jotti usually score, so that it is not necessary to build an overview by counting single samples. Your test, Jotti's site, virusp, etc. all show something, but they cannot be compared with other testing sites, just as I cannot compare av-comparatives with Virus Bulletin, CheckVir, ICSA, etc. (they all show slightly different aspects in some way).

    EDIT: found it (the usual results of the Jotti scanners):
    Kaspersky ~83%
    VBA32 ~65%
    BitDefender ~63%
    Dr.Web ~63%
    NOD32 ~56%
    AntiVir ~54%
    ArcaVir ~52%
    Fortinet ~48%
    ClamAV ~40%
     
    Last edited: Aug 31, 2005
  15. .....

    ..... Registered Member

    Joined:
    Jan 14, 2005
    Posts:
    312
    But ~180,000 of them are non-DOS and other-OS samples :rolleyes:
     
    Last edited: Aug 31, 2005
  16. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,856
    Location:
    Innsbruck (Austria)
    And the DOS category is the category where VBA32 scores best (compared to VBA32's scores in the other categories) ;)
     
  17. BlueZannetti

    BlueZannetti Registered Member

    Joined:
    Oct 19, 2003
    Posts:
    6,590
    At least when I read some of the provisos in the latest www.av-comparatives.org report, the situation seems quite fair. The specific language states that:
    • Vendors supplying samples will receive missed samples after the test if they give permission for their samples to be shared with other participants
    • Those not supplying samples will receive only up to 2,500 samples chosen by the tester
    • No samples are provided and no results are listed if detection is below 85% on the on-demand test zoo samples
    This seems exceedingly fair. To get something, you have to give something, and that something must be of value (this is where the 85% floor comes into play).

    Blue
     
  18. err...

    err... Guest

    @IBK
    That is imho a critical point. Can you really guarantee that? How can you be sure? Above (post #23) you just said, "they usually have them already". Now what, "usually" or "generally/always"?

    Furthermore: do those companies which do not fulfill your conditions at least get a chance to know which samples they missed, in order to get above 85%?

    I hope you don't mind me asking - but I am sure you understand, that we (your "audience") are not satisfied with a simple "believe me it is fair" ... ;)


    @BlueZannetti
    Well, "fair" in terms of getting/giving... perhaps.
    BUT in terms of an objective, fair and non-biased AV test... not really... :(
     
  19. BlueZannetti

    BlueZannetti Registered Member

    Joined:
    Oct 19, 2003
    Posts:
    6,590
    The discussion has meandered somewhat off the nominal target of this thread, so if it looks to continue on these lines, that discussion should be split off and continued as a separate topic.

    That noted, any test of capability, be it an AV-test or a high school mathematics exam administered to a nervous student, is a partial snapshot.

    Good tests provide as objective a metric of capability as feasible. In my personal estimation, the workers at www.av-comparatives.org do an incredible job of getting closer to that ideal than anyone else I know of.

    Save for all vendors unilaterally depositing their samples into a public repository for anyone to sample - and this will never happen, since at that point one is, by definition, funding one's own competitors - this seems as close to ideal as possible.

    If anyone else has a better approach, I would suggest they start a new thread, outline it in detail, and explain how it would function in a commercial environment.

    Blue
     
  20. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,856
    Location:
    Innsbruck (Austria)
    I have to say "believe me" because it is easier than trying to explain things that only the ones responsible for the virus labs can understand and know. By "usually" I mean "nearly all samples".
    Those that do not get 85% are a) not included in the test (and I now start to think that in future I should not again publish the test results of a non-included product); b) given more detailed results, and, if they ask, even more detailed results than they got (but it costs me more time to prepare the documentation for them, and it is not really useful data for them).
    Yes, I understand that you ask and want to know whether everything is done correctly. If you search a bit, you will find me asking about the testing procedures of others too, because I am also always a bit curious and do not believe everything. The conditions under discussion here were introduced by the AV companies after long discussions about how to make it fair; I am only the one who applies them. That is why I see nothing to discuss here, because I know all the reasons and am, after that discussion, convinced that it is absolutely fair. I know I ask much if I just say "believe me", but I cannot do more for you. Of course you are free not to believe me, even if that makes me sorry, because I really put in every effort to deliver fair procedures and results.
    BTW, I know more or less (I do not have all the PGP keys) who gets which samples from whom, and who also gets them again from me after the test; that is why I know that the published comparatives are fair. Who you should not believe too much are some AV companies with statements like "we detect all known viruses", or that say "we do not score well in this test because the tester is an #*?§! and did not send us samples" (yes, I heard a rumor that someone said this in the past, and the most interesting thing is: THEY GOT ALL SAMPLES! But they do not want to say publicly that it is their fault, so they say the test is ****; usual behaviour). Also, you should not believe some reviews you find on the net; last year I found out that some journalists work not only for magazines but also for some AV companies, which shocked me a bit (for that and other reasons). So, it is OK if you do not trust everything you read, but I think you can trust me (currently the tests are free of fee; only donations are accepted), and no company has influence on the test results.
     
  21. SDS909

    SDS909 Registered Member

    Joined:
    Apr 8, 2005
    Posts:
    333
    You, sir, are mistaken; I'm not a "homemade" tester. I do this for a living. I have the faculty to evaluate samples myself; in the rare circumstances when I can't, I have people that can and do.

    That is exactly my point.

    What I do not like is that IBK seems to have a general disdain for people he views as "hobbyists", doesn't like sharing information, and dismisses them with "You are a hobbyist, you don't understand." IBK says he doesn't get paid, so that tells me that he himself is indeed a hobbyist, possibly masquerading as a professional test organization. Since he has no certification and publishes no industry trade papers, we must assume our assumptions are correct.

    No need to beat a dead horse, but I think at the least Av-Comparatives results should be treated with a skeptical eye, as are all hobby tests.
     
  22. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,856
    Location:
    Innsbruck (Austria)
    Let me guess: you work for an AV company. I too am able to evaluate samples myself, and yes, I also have people here who are better than me at analyzing samples and who do that.
    Just because you get paid, it does not mean that your tests are better or fairer. I am not masquerading as anything, and if you have problems with av-comparatives, say that *you* have them, not "we".
    About no certification: I have already visited various AV companies, where I showed that I am able to do the work.
    About papers: currently I am quite busy, but my colleagues and I are planning to write some papers for Virus Bulletin in future. I will also be at the Virus Bulletin conference this year; if you want to discuss with me there, you are welcome.
    BTW, show me your certification and your papers; I do not see anything from you. I only know that you were an AV distributor in the past, but no more, which is why I had to assume you are a home AV tester. And it is not true that I do not help home AV testers; you can ask FF, I helped him a little bit to clean up his collection in my free time.
     
  23. Honyak

    Honyak Registered Member

    Joined:
    Jul 19, 2004
    Posts:
    346
    Location:
    Deep South
    I have to admit that, while I have always appreciated and looked forward to the results of IBK's tests, in the following thread https://www.wilderssecurity.com/showthread.php?t=92147&page=1&pp=25
    it appears that IBK was speaking on behalf of Eset until Don Pelotas called this to his attention.
    But nonetheless, I appreciate your efforts.
     
  24. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,856
    Location:
    Innsbruck (Austria)
    That was a misunderstanding; I thought it had been cleared up in the meantime. I was defending tests, not products. I did not speak on behalf of anyone.
     
  25. Don Pelotas

    Don Pelotas Registered Member

    Joined:
    Jun 29, 2004
    Posts:
    2,257
    Actually, that did not come out well, and I have apologized to Andreas in a PM conversation about this, so now the apology is official!

    I did think some could see his posts as if he were pro-NOD (and I mean seen as, not that he actually was in their corner!), but the way I went about it was perhaps not the best (never post late at night, when brainfog has set in ;) ).

    Just for the record, I like Andreas's tests; there are not many good ones around with a large number of signatures. :)
     