Discussion in 'other anti-virus software' started by IlyaOS, Oct 9, 2006.
I can assure that Stefan has the required level of technical professionalism to make such comments.
1. Let's keep the discussion technically oriented.
2. When an acknowledged expert (or two) in a field points out potential experimental limitations of an approach that I'm following, I usually step back to consider the points that they are making and adjust accordingly, even if I am an expert in the same field. That is a recommendation I would strongly encourage in this case.
...and that makes you a valuable person and separates you from pumpkins. And to be honest, I didn't expect anything else from you.
By the way, I really like pumpkins - especially in pumpkin pie! Yumm!
The difference here is that those pumpkins are innocent and don't mind if you hold them under cold water for a few minutes...
I did not want to reply to this thread/test before and just watch how it evolves, but yes, my thoughts are similar to what Stefan and Mike said (and it is nice of them to explain it so exhaustively) - what they say are facts and can be believed (if they do not know it, who else?)
This one: https://www.wilderssecurity.com/member.php?u=2154
or this one: https://www.wilderssecurity.com/member.php?u=54768
or this one: https://www.wilderssecurity.com/member.php?u=44144
EDIT: Ah, I forgot the ESET folks who are lurking around here as well - sorry, Marcos.
And by the way, why is ESET behind BitDefender? They FOR SURE do more unpacking than BitDefender! Anyway, off to VB now - have a nice discussion here, and please keep it technical.
I made that comment because there are some inconsistencies which at once made me think that. When you are in the AV business for a long time, you get a feeling for that. As Mike said, why was ESET rated so strangely? Why were heuristic detections not rated for other products? It seems the programs were rated with different measures.
The funny thing is, KAV doesn't need anyone to publish "tests" like this one. They have an excellent, top-rate scan engine, there is no doubt about it. So do others.
Yup. Besides, it might even do business damage to the better-scoring products to be listed in such amateurish tests. And yes, of course I agree with Stefan that there is no need to prove once again, in a "test" like this, that KAV has a solid AV engine. There's absolutely no doubt about it. But spoiling other AV companies' work (as you might notice, F-Prot was not even included, so I am basically defending our own "opponents" here) without knowing how to perform this test correctly is just childish and amateurish. This should give something to think about.
Well, I meant: who, if not those who work closely on AV development and know in depth how the products work?
I'm not so sure that ESET unpacks more than BitDefender. It may have been true in the past, but now that BD has a generic unpacker as well as a VB emulator, hmm...
As for commenting on this test, I completely trust the Inspector's comments. The method of testing was not optimal IMO.
.ru is Russian, not Romanian (.ro is Romania).
I realised my mistake now. Spent too many hours in front of this box and misplaced the "u" for an "o". Damn! Sorry.
Inspector, I know you are more advanced in technology aspects than me and I don't want to dispute. But explain to me just two things:
1. Why can KAV and some other AVs detect packed samples without any problems? Do they have unique technology?
2. It was our first test of packer support, so we'll correct the methodology, and your expert assistance would be very useful. What do we have to change, from your point of view, to get correct, appropriate results?
Anyway, thanks for the criticism even if I don't agree with it.
To everybody: please post all your questions, suggestions and feelings about this test, your opinion is very important!
No, it's because these companies spent some time optimizing their unpack engines to support more types of packers.
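A toy sketch of what an unpack engine buys you (all names here are invented, and the one-byte XOR "packer" is a stand-in for illustration only; real engines emulate native code): with an unpack step, one signature on the plain payload is enough, no matter how the sample was packed.

```python
# Toy illustration of generic unpacking before signature matching.
# All names are hypothetical; real AV engines emulate x86 code,
# not this invented one-byte-key format.

SIGNATURE = b"EVIL_PAYLOAD"  # signature taken on the *unpacked* body

def pack(payload: bytes, key: int) -> bytes:
    # "Packer": 1-byte header (the key) + XOR-encoded body.
    return bytes([key]) + bytes(b ^ key for b in payload)

def scan_naive(sample: bytes) -> bool:
    # Signature-only scanner: looks at the raw bytes as they are on disk.
    return SIGNATURE in sample

def scan_with_unpacker(sample: bytes) -> bool:
    # Scanner with an unpack step: undo the packing, then match.
    key, body = sample[0], sample[1:]
    unpacked = bytes(b ^ key for b in body)
    return SIGNATURE in sample or SIGNATURE in unpacked

malware = b"...EVIL_PAYLOAD..."
packed = pack(malware, key=0x5A)

print(scan_naive(packed))          # False: signature hidden by packing
print(scan_with_unpacker(packed))  # True: unpacking restores the body
```

The same unpack routine covers every sample packed this way, which is why time spent on the unpack engine pays off across many packers at once.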
I'm not sure that such a test can be reliably performed, as there is no way to determine whether any vendor has specifically added detection for the packed variants of the malware used for the test (unless of course you start decompiling the database and looking at the signatures, but that is illegal).
EDIT: Maybe one thing you can try is using the absolute latest version of the packers to pack your infected files. There would be a high probability that vendors using specific signatures for packed variants of malware have not yet updated their databases to detect files packed with the new version of the packer. This wouldn't be a problem for AVs with good unpack engines, however. But even this method probably has flaws and is not a sure-shot fix.
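The suggestion above can be illustrated with the same kind of toy model (a hypothetical format: a one-byte XOR key followed by the encoded body): a signature taken on the packed bytes of one packer version stops matching as soon as the version, here the key, changes, while generic unpacking keeps working.

```python
# Toy model: a signature on the *packed* bytes breaks when the packer
# changes; generic unpacking does not. The format is invented.

PAYLOAD_SIG = b"EVIL_PAYLOAD"

def pack(payload: bytes, key: int) -> bytes:
    return bytes([key]) + bytes(b ^ key for b in payload)

malware = b"...EVIL_PAYLOAD..."
packed_v1 = pack(malware, key=0x11)   # "old" packer version
packed_v2 = pack(malware, key=0x22)   # "new" packer version

# Vendor without an unpack engine: signature on the v1 packed bytes.
packed_sig = packed_v1[1:13]

def scan_packed_sig(sample: bytes) -> bool:
    return packed_sig in sample

def scan_generic_unpack(sample: bytes) -> bool:
    key, body = sample[0], sample[1:]
    return PAYLOAD_SIG in bytes(b ^ key for b in body)

print(scan_packed_sig(packed_v1), scan_packed_sig(packed_v2))          # True False
print(scan_generic_unpack(packed_v1), scan_generic_unpack(packed_v2))  # True True
```

So repacking with the newest packer version filters out packed-variant signatures, but as noted, it still can't tell you whether a detection came from a real unpack engine or from a quickly-added signature on the new packed form.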
Yeah, I have some doubts about the superiority of NOD32 over BitDefender 10.
The BD10 engine is brutal in its latest iterations, that's a fact. Maybe it still lags in terms of speed, but unpacking is certainly at a very, very high level.
As far as AV tests go, I view all of them with skepticism, even though the AV product I currently use scored well on your tests. It seems your test dethroned some of the sacred cows; then the demi-gods (security experts) weighed in, and the first thing brought into question was the testing methodology; then the flock follows. This is the standard scenario in security newsgroups with any new AV test by an unknown. On a more personal note, your tests were interesting, flawed or not. I really don't know, only what I read in the forums...
IMHO, the questions you might want to answer when testing how AVs handle (re)packed samples are the following (I'm summing up some of the arguments developed by other contributors):
1/ Does the antivirus provide a specific/static unpacker for a given packer?
. Depends on the packer
2/ Are the malware signatures picked in such a way that unpacked malware can be detected by signature?
a. Depends on the malware (different signatures use different references)
b. Depends on the packer (the structure of the sample after unpacking may be different from its structure before packing).
3/ Are genuine files (e.g. notepad.exe) flagged as suspicious when packed with a given packer?
. Depends on the packer
4/ Is the antivirus able to emulate through the whole unpacking code (can be useful for generic unpacking and might be necessary for heuristic detection)?
. Depends on the packer
5/ When a sample is found packed ITW, does the AV also provide a signature for the unpacked sample?
. Depends on the malware and on the packer
In your test, you tried (and, I think, failed) to answer question 1/ without taking into account the answers to questions 2-5.
+ For answering question 2b, you need to understand how the packer operates, either by reverse engineering it or by comparing the initial sample with the executable obtained after manual unpacking.
+ Answering question 2a is difficult: you would need to use, as a reference, a packer that significantly alters the file structure and that the AV claims to support, and throw a large number of packed malware samples at it --> the answer to question 2 is not a simple yes/no, but a percentage per packer. And if you want this percentage to be meaningful, you need to use a large test bed and/or to choose your samples carefully.
Easy test. Such behavior (flagging every packed sample) is to be considered bad for a desktop AV, except if the packer's author specifically claims that it has been designed to fool AVs.
Difficult test, although AVs like BitDefender or Norman could be tested by checking for a "BehavesLike" or "Sandbox" detection.
Needs access to a good collection of ITW malware, and a real understanding of what/how to unpack (see question 2).
Conclusion: testing how an AV supports runtime packers, without reverse-engineering it, is not an easy task. I wish you good luck with your next test(s).
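As a small sketch of how the per-packer percentage from point 2a could be tallied (the packer names and scan outcomes below are made up; in a real test they would come from running an on-demand scanner over the packed test bed):

```python
# Sketch: the answer to question 2 is a detection percentage per packer,
# measured over many samples, not a yes/no. Scan outcomes here are
# invented placeholders for real scanner results.

from collections import defaultdict

def detection_rate_per_packer(results):
    """results: iterable of (packer_name, detected: bool) pairs."""
    hits, totals = defaultdict(int), defaultdict(int)
    for packer, detected in results:
        totals[packer] += 1
        hits[packer] += detected
    return {p: 100.0 * hits[p] / totals[p] for p in totals}

# Example with made-up scan outcomes for two packers:
observed = [
    ("UPX", True), ("UPX", True), ("UPX", False), ("UPX", True),
    ("FSG", True), ("FSG", False), ("FSG", False), ("FSG", False),
]
rates = detection_rate_per_packer(observed)
print(rates)  # {'UPX': 75.0, 'FSG': 25.0}
```

A meaningful percentage, as noted above, needs a large and carefully chosen sample set per packer, not a handful of files.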
Thank you Tweakie for your summary of the issues.
As always a good instructional post on Anti-viruses and testing.
This is for Stefan Kurtzhals. I tried to send you an IM, but your box is full. Is there any other way I can contact you? I have a question about AntiVir Premium, and I don't want to hijack this thread.
Oops, cleaned up the inbox.
Thank you Stefan. I just sent you an IM.
Tweakie and Firecat: Thank you, these are very useful comments for us.