c't magazine 1/2005 AV test

Discussion in 'other anti-virus software' started by halcyon, Dec 28, 2004.

Thread Status:
Not open for further replies.
  1. Ianb

    Ianb Registered Member

    Joined:
    Nov 26, 2004
    Posts:
    232
    Location:
    UK
  2. halcyon

    halcyon Registered Member

    Joined:
    May 14, 2003
    Posts:
    373
    I'm not saying you are not right, but...

    for allegations like that you have to show PROOF.

    You can't just go on claiming people have been bribed and that the results are suspect, unless you show proof.

    Please, what is your proof on this matter?

    I'd very much like to know :)

    regards,
    halcyon

    PS I have no vested interest either way. I just want to know if the results are even somewhat trustworthy or not. However, unless somebody gives me proof, I will believe they are of the generally high standard that c't magazine's articles usually meet (not always/not all of them). Also, it would be nice if the person posting allegations wasn't doing it under a 'guest' moniker :)
     
  3. ,.-

    ,.- Guest

    @firefighter

    "I'm 100 % sure that my trojan like samples aren't 100 % clean as you wrote, but if a Kaspersky engined av scored over 99 % and some other about 90 % and Ewido 92.1 %, that's enough for me."

    In my opinion, this is not the correct conclusion: Kaspersky does not merely detect dangerous malware but also harmless clients etc. Just because Kaspersky detects something you cannot expect another scanner (which does not search for harmless clients) to detect it as well.

    Whether a scanner needs to detect harmless clients, DOS or Linux malware etc. can depend on whether you want to use it on a private desktop system or in a corporate environment.

    Unfortunately, most testers do not properly inform the public about the details of the test. They simply say that a scanner did not detect x% of the malware samples. This may be misleading because many users with a Windows desktop are not interested in the detection of Linux malware or clients etc.

    Last but not least, there is another problem: Many testers have not properly structured their test archive. They have not separated harmless clients from dangerous servers and, therefore, they are unable to perform a test which solely covers dangerous servers. I feel that this is the real reason why no "server-only" tests are performed. The arguments generally presented by testers (like: "AV scanners should also detect clients because you can do malicious things with them") are just an excuse in order to distract from the fact that the test archive is not properly structured.
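
    To illustrate the point: a properly structured archive lets a tester report per-category figures instead of a single aggregate number. Below is a minimal Python sketch; the directory layout, the names and the detection_rate helper are purely hypothetical and not taken from any real tester's setup.

    import os

    # Hypothetical layout: the test archive is split by category so that
    # "server-only" or "Win32-only" figures can be reported separately.
    ARCHIVE = {
        "win32_servers": "samples/win32/servers",
        "win32_clients": "samples/win32/clients",
        "dos_malware":   "samples/dos",
        "linux_malware": "samples/linux",
    }

    def detection_rate(detected_files, category_dir):
        """Share of the files in one category that the scanner flagged."""
        total = sum(len(files) for _, _, files in os.walk(category_dir))
        hits = sum(1 for f in detected_files if f.startswith(category_dir))
        return 100.0 * hits / total if total else 0.0

    # Per-category report instead of one aggregate number, e.g.:
    # for name, path in ARCHIVE.items():
    #     print(f"{name}: {detection_rate(detected, path):.1f} %")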
     
  4. Firefighter

    Firefighter Registered Member

    Joined:
    Oct 28, 2002
    Posts:
    1,670
    Location:
    Finland
    I think I have posted several times that I don't collect DOS and Linux malware.

    Although I may have some harmless trojan-like samples, I'm absolutely sure that the vast majority of them are perfectly capable of infecting my WinXP Home system. Every time someone runs a test of his/her own, I see that kind of opinion. The main question is: which AV is the one that really doesn't detect harmless trojan-like samples? The answer is quite clear: there is no such AV.

    Best regards,
    Firefighter!
     
  5. RejZoR

    RejZoR Lurker

    Joined:
    May 31, 2004
    Posts:
    6,426
    There is no such thing as a harmless trojan. Every trojan/worm/virus/backdoor is harmful no matter what it does. Even a simple spy toolbar is harmful in my book, since it changes the start page and monitors surfing habits...
     
  6. halcyon

    halcyon Registered Member

    Joined:
    May 14, 2003
    Posts:
    373
    I read a little bit more about Andreas Marx:

    1. Head of virus testing at University of Magdeburg. Runs av-test.org with some others.

    2. Active contributor to several security lists, incl. Bugtraq

    3. Well respected in the professional anti-virus world (not just hobbyist forums, but actual professionals in the field). Even Paul Wilders here says he is "trustworthy"

    4. Participant in the Wildlist, VirusBulletin activities, conferences, etc. His skills are being peer-reviewed by other professionals in the field.

    Also, the otherwise excellent c't may have published a few technically flawed articles in its lifetime, but to say that they are rigging a win for some third-party anti-virus company seems simply laughable to me. I mean, what would they gain from this?

    To me, based on this information, it seems just ludicrous to suggest that the test in question is "paid for" or "not trustworthy", unless some hard facts are brought to the table.

    Unless somebody who has actually read the full article can come up with technical proof of how the test was invalid, I think there is very little reason to believe that it wasn't a high-quality, trustworthy test.

    Innuendo, guesses and slander are not proof.

    Again, I have no vested interest either way. I'm just a subscriber to c't. But I hate disinformation, and that is what this discrediting looks like to me, because no proof has been offered for the accusations.

    friendly regards,
    halcyon

    Some random Refs: https://www.wilderssecurity.com/showthread.php?t=8912&page=1&pp=25
    https://www.wilderssecurity.com/archive/index.php/t-2920.html
    http://www.wildlist.org/R_email.htm
    http://conference.eicar.org/conference-old/2002/program/other/ShowDay11_ndb.htm
    http://www.virusbtn.com/conference/vb2002/report/
     
  7. ,.-

    ,.- Guest

    @firefighter

    "I think that I have posted several times that I don't collect DOS and Linux malwares."

    I did not talk about your archive but about the really big archives of the "big" testers.

    @RejZoR

    "There is no such thing as harmless trojan." I am not sure whether this refers to my post. If yes: I was talking about clients and malware that does not run on your computer or does not cause damage to it. Such malware may by dangerous to third parties but not to the user who needs to decide whether s/he wants to install a scanner that does or does not detect such malware. Example: A home user may WANT to play with a trojan client. Therefore, it is irrelevant for such user whether a scanner detects it or not. By contrast, a company may NOT WANT its employees to "play" with clients and remote-control victims. Therefore, it may be relevant to such company that also clients are detected. If a user has a windows machine he may not be interested in the detection of Linux malware. From an overall perspective, however, you may argue that a scanner needs to detect every kind of malware ... just in order to keep the world clean and tidy ... and to make sure that a windows user does not forward via email Linux malware to a Linux user. For the same reasons, you may consider an email scanner important (even if an AV or AT already features a resident on-access scanner).

    Regardless of how you see things, a tester should properly inform you about the types of malware contained in the test archive so that everybody can make up his/her own mind and decide whether the detection of a particular type of malware is important to him/her.

    @halcyon

    I agree. Andreas Marx is indeed a respected tester (although he has made some mistakes and his test archive is far from perfect -- I have seen some parts of it). I also talked to Andreas Clementi (on a confidential basis) and he provided me with some information re his test archive. IMHO, there is no reason to believe that Clementi's tests are 1000 times more accurate or less biased than those of Andreas Marx. Maybe a little bit. Or maybe not. In any case, both test archives & procedures are much better than those of many amateur testers who have simply downloaded an outdated virus archive from vxheavens (or a similar site) and determined a scanner's detection rate by looking at its log file ...
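
    For what it's worth, the "log file" method criticized above usually amounts to something like the sketch below: count the log lines that report a detection and divide by the number of samples. The function name, log format and "infected" keyword are assumptions here, not any real scanner's output format.

    # Naive detection-rate check of the kind criticized above.
    def naive_detection_rate(log_path, total_samples, keyword="infected"):
        """Count detection lines in a plain-text scan log, return a percentage."""
        with open(log_path, encoding="utf-8", errors="ignore") as log:
            hits = sum(1 for line in log if keyword in line.lower())
        return 100.0 * hits / total_samples

    # Such a number says nothing about duplicates, corrupted samples, or
    # whether the samples would even run on the target operating system.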
     
  8. Diver

    Diver Guest

    I think that there is a real problem in defining AV performance. A tester can take a collection of viruses and say quantitatively that X av program detected a higher percentage than Y AV program, but that is far from the whole story.

    KAV is a respected AV, but it can drive some users nuts, especially if the extended databases are used. All of a sudden mIRC 6.16 is "riskware".

    Testing for false alarms is very important. After all, the AV program is supposed to sort out the bad stuff from the good stuff.

    There are differences in how and at what stage the detection takes place. If you are using Outlook Express and your AV does not scan POP3 something may get past it.

    Then there are the enigmas. AVG is nothing special in the percentage rankings because it does not detect a lot of trojans. However, a lot of people are getting good protection out of it. Last night someone who volunteers at the HAL-PC help desk in Houston told me they are regularly using AVG 7 (free) to clean up PCs that became infected while running Norton and McAfee. The users stay on AVG and do not have problems again. I have been known to bash AVG in the past, but this is a real eye-opener.

    At any rate, these percentage of detection races may not translate into performance in the field for some reason.
     
  9. TAP

    TAP Registered Member

    Joined:
    Aug 17, 2004
    Posts:
    344
    In my opinion, real-world testing of AV performance is very, very difficult to conduct, especially for real-time protection; there are so many factors involved.

    It's not just a matter of collecting malware from VX sites or anywhere else, scanning the whole collection with each scanner and comparing the results; that can be misleading at best. For example, scanner X can detect a higher percentage than scanner Y, but this doesn't mean that X will provide better protection than Y in a real-world scenario.

    We have ITW viruses/worms, but there is also a lot of malware that is in the wild yet not listed in the WildList; that is a problem.
     
  10. meneer

    meneer Registered Member

    Joined:
    Nov 27, 2002
    Posts:
    1,132
    Location:
    The Netherlands
    Perhaps OT:
    It's kind of hard to argue about these matters when contributors are anonymous. If you could just specify your authority in the field, we might just get a better perspective.

    Sorry to bother you all... me being kind of anonymous too, but for my account name...
     
  11. Firefighter

    Firefighter Registered Member

    Joined:
    Oct 28, 2002
    Posts:
    1,670
    Location:
    Finland
    It looks like we now have a real pro here. If so, you may have an informed opinion: how large a percentage of these downloaders simply can't infect a WinXP Home system? 10 %, 30 % or even 80 %?

    TrojanDownloader.JS.Weis.b
    TrojanDownloader.Java.Foget
    TrojanDownloader.Java.Lamdez
    TrojanDownloader.Java.OpenConnection.d
    TrojanDownloader.Java.Psyme
    TrojanDownloader.NSIS.gen
    TrojanDownloader.VBS.Asank.a
    TrojanDownloader.VBS.Iwill.a
    TrojanDownloader.VBS.Iwill.c
    TrojanDownloader.VBS.Iwill.e
    TrojanDownloader.VBS.Iwill.k
    TrojanDownloader.VBS.Psyme.a
    TrojanDownloader.VBS.Psyme.s
    TrojanDownloader.VBS.Psyme.v
    TrojanDownloader.VBS.Psyme.w
    TrojanDownloader.VBS.Psyme.x
    TrojanDownloader.VBS.Psyme.y
    TrojanDownloader.VBS.Streevs
    TrojanDownloader.Win32.Agent.ad
    TrojanDownloader.Win32.Agent.bl
    TrojanDownloader.Win32.Agent.bp
    TrojanDownloader.Win32.Agent.p
    TrojanDownloader.Win32.Agent.r
    TrojanDownloader.Win32.Agent.t
    TrojanDownloader.Win32.Agent.v
    TrojanDownloader.Win32.Apher.a
    TrojanDownloader.Win32.Apher.gen
    TrojanDownloader.Win32.Apher.h
    TrojanDownloader.Win32.Apher.j
    TrojanDownloader.Win32.Aphex.020
    TrojanDownloader.Win32.Aphex.060
    TrojanDownloader.Win32.Aphex.10.a
    TrojanDownloader.Win32.Aphex.a
    TrojanDownloader.Win32.Asune.b
    TrojanDownloader.Win32.Atmader.10
    TrojanDownloader.Win32.Axload.c
    TrojanDownloader.Win32.BHO
    TrojanDownloader.Win32.Bboxet
    TrojanDownloader.Win32.Braidupdate
    TrojanDownloader.Win32.Briss.a
    TrojanDownloader.Win32.Crypt
    TrojanDownloader.Win32.Crypter
    TrojanDownloader.Win32.Delf.ab
    TrojanDownloader.Win32.Delf.af
    TrojanDownloader.Win32.Delf.ay
    TrojanDownloader.Win32.Delf.az
    TrojanDownloader.Win32.Delf.bk
    TrojanDownloader.Win32.Delf.bn
    TrojanDownloader.Win32.Delf.bp
    TrojanDownloader.Win32.Delf.bz
    TrojanDownloader.Win32.Delf.ch
    TrojanDownloader.Win32.Delf.cq
    TrojanDownloader.Win32.Delf.ds
    TrojanDownloader.Win32.Delf.dt
    TrojanDownloader.Win32.Delf.i
    TrojanDownloader.Win32.Delf.j
    TrojanDownloader.Win32.Delf.m
    TrojanDownloader.Win32.Delf.z
    TrojanDownloader.Win32.Devsog.73.b
    TrojanDownloader.Win32.Devsog.741
    TrojanDownloader.Win32.Dsweb.10
    TrojanDownloader.Win32.Dsweb.101
    TrojanDownloader.Win32.DualDl.10
    TrojanDownloader.Win32.Dyfuca.ap
    TrojanDownloader.Win32.Dyfuca.aw
    TrojanDownloader.Win32.Dyfuca.bb
    TrojanDownloader.Win32.Dyfuca.bm
    TrojanDownloader.Win32.Dyfuca.bt
    TrojanDownloader.Win32.Dyfuca.cn
    TrojanDownloader.Win32.Dyfuca.j
    TrojanDownloader.Win32.Dyfuca.u
    TrojanDownloader.Win32.Dyfuca.v
    TrojanDownloader.Win32.Dyfuca.x
    TrojanDownloader.Win32.Elfdown.10.a
    TrojanDownloader.Win32.Esepor.a
    TrojanDownloader.Win32.Esepor.c
    TrojanDownloader.Win32.Esepor.f
    TrojanDownloader.Win32.Esepor.h
    TrojanDownloader.Win32.Esepor.j
    TrojanDownloader.Win32.Esepor.q
    TrojanDownloader.Win32.Esepor.s
    TrojanDownloader.Win32.Esepor.u
    TrojanDownloader.Win32.Esepor.w
    TrojanDownloader.Win32.Fearless.12
    TrojanDownloader.Win32.Femad.h
    TrojanDownloader.Win32.Fidar.10
    TrojanDownloader.Win32.Friendown
    TrojanDownloader.Win32.G-Spot.10
    TrojanDownloader.Win32.G-Spot.15
    TrojanDownloader.Win32.Getfiles
    TrojanDownloader.Win32.GhostRA.b
    TrojanDownloader.Win32.Glukonat.a
    TrojanDownloader.Win32.Gotecom
    TrojanDownloader.Win32.Harnig.a
    TrojanDownloader.Win32.Herman
    TrojanDownloader.Win32.IED.11
    TrojanDownloader.Win32.IMCdown
    TrojanDownloader.Win32.INService.c
    TrojanDownloader.Win32.Injecter
    TrojanDownloader.Win32.Iowa.a
    TrojanDownloader.Win32.Iowa.f
    TrojanDownloader.Win32.IstBar.ag
    TrojanDownloader.Win32.IstBar.ay
    TrojanDownloader.Win32.IstBar.bo
    TrojanDownloader.Win32.IstBar.bu
    TrojanDownloader.Win32.IstBar.bx
    TrojanDownloader.Win32.IstBar.ci
    TrojanDownloader.Win32.IstBar.cl
    TrojanDownloader.Win32.IstBar.du
    TrojanDownloader.Win32.IstBar.dv
    TrojanDownloader.Win32.IstBar.dw
    TrojanDownloader.Win32.IstBar.dx
    TrojanDownloader.Win32.IstBar.dy
    TrojanDownloader.Win32.IstBar.eh
    TrojanDownloader.Win32.IstBar.fa
    TrojanDownloader.Win32.IstBar.gen
    TrojanDownloader.Win32.IstBar.q
    TrojanDownloader.Win32.IstBar.u
    TrojanDownloader.Win32.Kather.c
    TrojanDownloader.Win32.Kather.d
    TrojanDownloader.Win32.Keenval
    TrojanDownloader.Win32.Keenval.e
    TrojanDownloader.Win32.Kotan
    TrojanDownloader.Win32.Krepper.d
    TrojanDownloader.Win32.Leodon.a
    TrojanDownloader.Win32.Leodon.b
    TrojanDownloader.Win32.Livup.a
    TrojanDownloader.Win32.Lookme.b
    TrojanDownloader.Win32.Mendwar
    TrojanDownloader.Win32.Miled.a
    TrojanDownloader.Win32.Monurl.gen
    TrojanDownloader.Win32.Mosw
    TrojanDownloader.Win32.MultiWeb.a
    TrojanDownloader.Win32.Netcomp
    TrojanDownloader.Win32.OneClickNetSearch.f
    TrojanDownloader.Win32.Perfiler
    TrojanDownloader.Win32.Puram.09
    TrojanDownloader.Win32.PurityScan.b
    TrojanDownloader.Win32.PurityScan.d
    TrojanDownloader.Win32.PurityScan.e
    TrojanDownloader.Win32.QDown.a
    TrojanDownloader.Win32.QDown.b
    TrojanDownloader.Win32.QDown.d
    TrojanDownloader.Win32.QDown.h
    TrojanDownloader.Win32.RVP.c
    TrojanDownloader.Win32.Rameh.c
    TrojanDownloader.Win32.Realtens.g
    TrojanDownloader.Win32.Revop
    TrojanDownloader.Win32.Sandesa.11
    TrojanDownloader.Win32.Shutit.10
    TrojanDownloader.Win32.Small.aj
    TrojanDownloader.Win32.Small.bx
    TrojanDownloader.Win32.Small.cz
    TrojanDownloader.Win32.Small.df
    TrojanDownloader.Win32.Small.do
    TrojanDownloader.Win32.Small.en
    TrojanDownloader.Win32.Small.eo
    TrojanDownloader.Win32.Small.es
    TrojanDownloader.Win32.Small.ew
    TrojanDownloader.Win32.Small.f
    TrojanDownloader.Win32.Small.fe
    TrojanDownloader.Win32.Small.fo
    TrojanDownloader.Win32.Small.fq
    TrojanDownloader.Win32.Small.ga
    TrojanDownloader.Win32.Small.go
    TrojanDownloader.Win32.Small.hb
    TrojanDownloader.Win32.Small.hg
    TrojanDownloader.Win32.Small.hl
    TrojanDownloader.Win32.Small.hp
    TrojanDownloader.Win32.Small.hs
    TrojanDownloader.Win32.Small.ja
    TrojanDownloader.Win32.Small.jb
    TrojanDownloader.Win32.Small.jf
    TrojanDownloader.Win32.Small.jg
    TrojanDownloader.Win32.Small.jk
    TrojanDownloader.Win32.Small.jl
    TrojanDownloader.Win32.Small.jm
    TrojanDownloader.Win32.Small.jn
    TrojanDownloader.Win32.Small.ka
    TrojanDownloader.Win32.Small.ni
    TrojanDownloader.Win32.Small.ny
    TrojanDownloader.Win32.Small.q
    TrojanDownloader.Win32.Small.qd
    TrojanDownloader.Win32.Small.qv
    TrojanDownloader.Win32.Small.ug
    TrojanDownloader.Win32.Small.vf
    TrojanDownloader.Win32.Smokedown.a
    TrojanDownloader.Win32.Swizzor.bn
    TrojanDownloader.Win32.Swizzor.n
    TrojanDownloader.Win32.Swizzor.t
    TrojanDownloader.Win32.Swizzor.u
    TrojanDownloader.Win32.Swizzor.v
    TrojanDownloader.Win32.Theefdl.10
    TrojanDownloader.Win32.Tibser.a
    TrojanDownloader.Win32.Tiner.a
    TrojanDownloader.Win32.Tiner.c
    TrojanDownloader.Win32.Tiner.d
    TrojanDownloader.Win32.Tooncom.p
    TrojanDownloader.Win32.Troll
    TrojanDownloader.Win32.VB.aa
    TrojanDownloader.Win32.VB.ai
    TrojanDownloader.Win32.VB.aj
    TrojanDownloader.Win32.VB.bb
    TrojanDownloader.Win32.VB.be
    TrojanDownloader.Win32.VB.bj
    TrojanDownloader.Win32.VB.bo
    TrojanDownloader.Win32.VB.ca
    TrojanDownloader.Win32.VB.cl
    TrojanDownloader.Win32.VB.cn
    TrojanDownloader.Win32.VB.ct
    TrojanDownloader.Win32.VB.db
    TrojanDownloader.Win32.VB.dc
    TrojanDownloader.Win32.VB.eq
    TrojanDownloader.Win32.VB.ex
    TrojanDownloader.Win32.VB.k
    TrojanDownloader.Win32.VB.n
    TrojanDownloader.Win32.WebDL.02
    TrojanDownloader.Win32.WebDL.b
    TrojanDownloader.Win32.WebDL.d
    TrojanDownloader.Win32.WebDL.f
    TrojanDownloader.Win32.WebDown.10
    TrojanDownloader.Win32.Webaut.g
    TrojanDownloader.Win32.Whomp.10
    TrojanDownloader.Win32.WinShow.aa
    TrojanDownloader.Win32.WinShow.b
    TrojanDownloader.Win32.WinShow.l
    TrojanDownloader.Win32.WinShow.u
    TrojanDownloader.Win32.WinShow.w
    TrojanDownloader.Win32.Wintrim.af
    TrojanDownloader.Win32.Wintrim.ag
    TrojanDownloader.Win32.Wintrim.ah
    TrojanDownloader.Win32.Wintrim.ai
    TrojanDownloader.Win32.Wintrim.ao
    TrojanDownloader.Win32.Wintrim.aq
    TrojanDownloader.Win32.Wintrim.as
    TrojanDownloader.Win32.Wintrim.av
    TrojanDownloader.Win32.Wintrim.q
    TrojanDownloader.Win32.Wintrim.u
    TrojanDownloader.Win32.Wintrim.v
    TrojanDownloader.Win32.Wintrim.w
    TrojanDownloader.Win32.Wintrim.z
    TrojanDownloader.Win32.Zdown.10

    242 samples in total from VXHeavens; that's all they currently have whose names begin with "TrojanDownloader..."

    Best regards,
    Firefighter!
     
    Last edited: Jan 5, 2005
  12. Joerg

    Joerg Guest

    Here is a short summary of the c't test:

    - tested on Windows XP SP2
    - test set: ITW malware (WildList 08/2004); more than 60,000 trojans and backdoors
    - tested for false positives and performance (20,000 clean objects)

    - Some of the tested programs:
    Avast Home 4.1: Did not find all ITW samples. Not recommended.
    Avast Prof. 4.1: Same as the free Home edition.
    AVG 7 Free: Fast; average detection results.
    BitDefender 7.2 Free: No on-access scanner. Best detection rate among the freeware scanners.
    BitDefender 8 Prof.: One of the best detection rates; fast. Does not scan outgoing mail (mentioned as "disturbing").
    F-Secure: Good detection rate.
    Kaspersky Personal Pro 5: High detection rate. On-access scanner does not scan archives (rated badly!). Too few functions.
    McAfee: Replaces the Security Center (SP2) with its own center, which is full of advertisements. Requires ActiveX. Tries to clean objects on read-only media (like CD-R). Automatic cleaning is set as the default.
    NOD32: The company did not provide the program for testing (so it was bought in a shop). One of the fastest scanners, with one of the best heuristics. Easy to handle.
    Norton: Good detection rates. One of the slowest scanners; updates provided too late. Rescue discs unusable.

    - Conclusion: Not recommended: Avast (both versions) and ViRobot, as these scanners did not detect all ITW samples. McAfee and NAV provided updates too late -> not recommended either. Fast on-access scanners: AVG, NOD32, PC-Cillin. Good overall protection: BitDefender Prof., F-Secure.

    regards, Joerg
     
  13. RejZoR

    RejZoR Lurker

    Joined:
    May 31, 2004
    Posts:
    6,426
    Dyfuca, Small and IstBar are the most common ones on XP machines. I see them on almost every corner...

    And about not recommending avast!: who gives a F... if it didn't catch all ITW samples, when your ass gets hit by some "zoo" sample?
    God knows how many samples they used, and yet they won't recommend an AV based on two missed samples :roll:
    I also don't understand how they define ITW samples. And what are zoo samples (that's why I put the word zoo in quotes)? If a file can be obtained in some way (email, P2P, WWW, CD), that's ITW for me, and I don't care how experts rate that file.

    The funniest thing is that I only trust av-comparatives and Firefighter (+ results from Jotti).
    VB100% was OK at first, but their web test results turned out to be the biggest joke, one that makes many users think badly of some AVs. You can't rate things with just a Pass or Fail mark; there are way too many other important factors.
     
  14. Joerg

    Joerg Guest

    To clear things up: the conclusion was made by the testers, not by me.
     
  15. RejZoR

    RejZoR Lurker

    Joined:
    May 31, 2004
    Posts:
    6,426
    Don't worry, "bad" words were not meant to you, i was commenting the text you posted here ;)
     
  16. Ianb

    Ianb Registered Member

    Joined:
    Nov 26, 2004
    Posts:
    232
    Location:
    UK
    Couldn't agree with you more.
     
  17. Firefighter

    Firefighter Registered Member

    Joined:
    Oct 28, 2002
    Posts:
    1,670
    Location:
    Finland
    The list I posted above is a summary of all the TrojanDownloader samples currently on VXHeavens. Strangely, when I checked about 20 of them at random with VGrep, the oldest was about 2 years old and most of them were less than a year old. How can they be crap files then?

    VXHeavens has a lot of crap files too (mostly DOS files), but if you are patient enough, you can collect thousands of potential samples capable of infecting a WinXP Home system.

    The way they categorise their samples is very useful; that's what I have used in my tests, and it is much better than what I have seen in most other tests. Here you can see, for example, how good an AV is against exploits or worms only; I have not seen these categories in many other tests (Rokop and AV-Comparatives.org have worms too).

    Best regards,
    Firefighter!
     
    Last edited: Jan 5, 2005
  18. ,.-

    ,.- Guest

    @firefighter

    I did not know that you use samples from vxheavens. My comments re amateur testers did not relate to you since - at that time - I did not know anything about your test archive and/or test procedure.

    Moreover, I am not sure why you posted this part of the virus collection: I did not say that the entire collection consists of harmless files.

    I would be grateful, however, if you could answer a few questions:

    1. Have you tested any scanners that do NOT detect samples contained in this part of the archive?

    2. Are you sure that, for instance, TrojanDownloader.Win32.Theefdl.10 is dangerous malware ... or is it just a harmless editor that can be used to construct the actual, dangerous downloader? That's what I believe. (I have not analyzed, however, whether the editor itself is infected by a completely different trojan, worm or virus. In theory, that would be possible.)

    3. Do you think that it is of particular relevance to detect trojan downloaders that can do nothing other than download a real, working trojan (which may be detected by a scanner as soon as it is downloaded and written to the hard drive)? I believe that the detection of trojan downloaders is less important and, therefore, the detection rate for this section should not be overestimated. (Of course, this does not apply to trojan downloaders with an additional backdoor or AV-killer component, or to trojan downloaders that load the payload directly into the computer's RAM -- see webexe from Aphex.) By contrast, I consider the detection of worms and trojans of particular importance. In any case, a good tester should have an opinion on this issue.

    4. Last but not least: does it make sense to use samples from vx heavens for testing? Every AV/AT developer can easily download such samples and provide for detection. If an AV/AT developer does not add detection for such samples ... doesn't this indicate that something is wrong with respect to a particular sample?

    How about the following ... could you give me the names of a few samples that are not detected by a well reputed scanner? I will then analyze them in order to figure out whether it is malware or not.
     
  19. vlk

    vlk AV Expert

    Joined:
    Dec 26, 2002
    Posts:
    621
    avast of course DID detect all ItW samples in the test.
    What it didn't do was detect them on-access, because Andreas Marx (the lead tester) was not willing to set avast up to scan files on-copy (which was the way the on-access scanner test was done). :doubt:

    And avast 4.1 (unlike 4.5) was not scanning files on-copy by default...
     
  20. ,.-

    ,.- Guest

    This complaint sounds quite familiar to me. See here ( http://www.rokop-security.de/board/index.php?showtopic=3364&st=0 ) where the lead coder of Ewido complained about A. Marx's test procedure:

    __________________________________

    "Ich kann dir die Fakten sagen, den Rest kannst du dir selber zusammenreimen. Auf jeden Fall hab ich so etwas Unprofessionelles noch nie gesehen.

    * wir haben vor der Veröffentlichung nichts davon gewusst, nicht was getestet wurde und wieso wir so bewertet wurden

    * wir haben auch nach der Veröffentlichung nichts davon gewusst, konnten uns die Punkte in der Zeitschrift ansehen

    * wir haben keine einzige Begründung erhalten warum wir die entsprechenden Noten in den Kategorien erhalten haben

    * wir haben diese Begründung auch nicht erhalten als wir per Telefon und Email nachgefragt haben!

    * wegen dem Testset haben wir Andreas Marx gebeten uns die nicht erkannten Samples zuzuschicken -> "Da wir noch nicht lange genug auf dem Markt sind ist dies leider nicht möglich."

    * wir haben ihn gebeten uns einen Scanlog zu schicken -> auch den haben wir nicht erhalten

    * a² das nun auch nicht viel länger auf dem Markt ist, hat die fehlenden Samples bekommen !?

    ...

    Ein Konkurrent schickt uns die Samples, die uns ein professioneller, seriöser und unabhängiger Tester nicht schicken will....nochmals danke Andy.

    ...

    und jetzt kann ich ein bisschen Stellung beziehen zu dem Test und mich aufregen das in dem Testset Dinge sind die dort meines Erachtens nichts zu suchen haben (Nuker, Constructors, gezippte Trojaner, ...)

    aber viel mehr regt mich der ganze Ablauf auf ..."

    --------------------------------------------------

    @vlk

    Are you "just" an avast user or are you from the company? I am still wondering why many (smaller?) av developers shyly complain about the tests performed by av-test.org but do not do anything against it. If there are really not just one or two but several av developers who believe that the test procedure is flawed such developers (e.g., emsisoft, eset, ewido, avast) should go public and jointly ask for explanations/changes to the test procedure/better information etc.
     
  21. Firefighter

    Firefighter Registered Member

    Joined:
    Oct 28, 2002
    Posts:
    1,670
    Location:
    Finland
    You wrote before:

    > In any case, both test archives & procedures are much better than those of many amateur testers who have simply downloaded an OUTDATED virus archive from vxheavens (or a similar site) and determined a scanner's detection rate by looking at its log file ...

    I listed all the TrojanDownloaders currently on VXHeavens only to show you that they are anything but outdated.

    Secondly, as for how potent they actually are, maybe this will tell you a bit about that. Here are the detection rates against 263 TrojanDownloaders of mine. Of course I have used VXHeavens too, but I tried to collect from elsewhere as well, so that the sample collection isn't so skewed; I didn't succeed too well, though, because eScan still detected 100 %, which I hate.

    100.0 % -- eScan Free 4.7.6
     93.9 % -- Ewido 3.0
     85.2 % -- Norton 2005
     82.5 % -- DrWeb 4.32 b
     77.6 % -- NOD32 with AH
     75.7 % -- McAfee VSE 8.0i
     72.6 % -- Avast 4.5
     71.9 % -- BitDefender 7.2 Free
     70.0 % -- Command AV 4.92.7
     64.6 % -- AVG 7 Free
     60.8 % -- AntiVir PE

    After detection rates as high as those above, I'm the last one to say that TrojanDownloaders are harmless.

    PS For some strange reason, I feel better protected using the over-80 % detection rate proggies than the ones a bit over 60 %. That's human nature.
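
    For reference, the arithmetic behind those percentages is straightforward; the per-scanner detection counts below are back-calculated from the published percentages and the 263-sample total, not separately reported figures.

    # Back-calculating detection counts from the percentages above.
    total = 263
    for scanner, rate in [("eScan Free 4.7.6", 100.0),
                          ("Ewido 3.0", 93.9),
                          ("AntiVir PE", 60.8)]:
        detected = round(rate / 100 * total)   # e.g. 93.9 % of 263 is ~247
        print(f"{scanner}: ~{detected} of {total} samples ({rate:.1f} %)")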

    Best regards,
    Firefighter!
     
    Last edited: Jan 6, 2005
  22. RejZoR

    RejZoR Lurker

    Joined:
    May 31, 2004
    Posts:
    6,426
    Yes, vlk works for Alwil Software
     
  23. JerryM

    JerryM Registered Member

    Joined:
    Aug 31, 2003
    Posts:
    4,306
    halcyon,

    What is your primary AV, and why would you use BitDefender or other AV as a backup scanner?
    Don't AVs interfere with one another?

    Thanks,
    Jerry
     
  24. JerryM

    JerryM Registered Member

    Joined:
    Aug 31, 2003
    Posts:
    4,306
    I notice that Ewido is paired with several AVs. I assume it "gets along" with the AV and complements it, so that the total score is much better than the AV alone.

    Is Ewido better than LnS for instance, or is it just what you had when you tested?

    Jerry
     
  25. Firefighter

    Firefighter Registered Member

    Joined:
    Oct 28, 2002
    Posts:
    1,670
    Location:
    Finland
    Ewido is a sort of Anti-Trojan, or maybe better, Anti-Malware, which protects you very well against worms too. LnS is a Firewall.

    Best regards,
    Firefighter!
     

