Av-Comparatives February '08

Discussion in 'other anti-virus software' started by Abeltje, Feb 1, 2008.

Thread Status:
Not open for further replies.
  1. Bunkhouse Buck

    Bunkhouse Buck Registered Member

    Joined:
    May 29, 2007
    Posts:
    1,286
    Location:
    Las Vegas
    How much does TrustPort slow down your system, both in terms of file access and on-demand scanning? What is your experience with its speed compared to using Dr.Web by itself, for example?

    I have not tested TrustPort (one of the few I haven't) and would like your input before I test it.
     
  2. Mele20

    Mele20 Former Poster

    Joined:
    Apr 29, 2002
    Posts:
    2,495
    Location:
    Hilo, Hawaii
    I'm disappointed in these tests for the first time. F-Prot and Dr.Web should have been included. I don't care what criteria IBK uses; a lot of folks use these AVs, they are very good, and they should have been included. This is the first time they have been excluded. Why? IBK is including much more obscure AVs than these two, and I firmly believe they should have been in. Did they request not to be included? F-Prot is the ORIGINAL AV, and it is disgusting not to include it unless Frisk asked that it not be included. Was this the case?
     
  3. xandros

    xandros Registered Member

    Joined:
    Oct 30, 2006
    Posts:
    411
    Avira AntiVir: excellent, and it's very light on the computer.
    Kaspersky: very good, but it slows the computer.
    AVG: good, but it slows the computer too.
    Avast Pro: I haven't used it in a long time; I don't know if it still slows down browsing!!
    NOD32: very good, and it's light on the computer.
    Norton: good, but it slows the computer.
    BitDefender: good, but it makes the computer very slow; I will stay away from it.
     
  4. ren

    ren Registered Member

    Joined:
    Nov 1, 2006
    Posts:
    45
    Hello,
    yes, both.
     
  5. Bunkhouse Buck

    Bunkhouse Buck Registered Member

    Joined:
    May 29, 2007
    Posts:
    1,286
    Location:
    Las Vegas
    It has all been covered in depth before, including statements by the company officers; you will have to search the threads. They (Dr.Web and F-Prot) chose not to be included.
     
  6. aigle

    aigle Registered Member

    Joined:
    Dec 14, 2005
    Posts:
    11,164
    Location:
    UK / Pakistan
    I don't see any text here. Is it like that for everyone?

    I downloaded the pdf multiple times.
     

  7. Mele20

    Mele20 Former Poster

    Joined:
    Apr 29, 2002
    Posts:
    2,495
    Location:
    Hilo, Hawaii
    Thank you. I found the nine-page Dr.Web thread. I don't know how I missed that, as it is recent too. I didn't see a thread on F-Prot and AV-Comparatives, so I guess I'll have to look harder. I still think some of the additions this time are stupid and that the test has lost some of its worthiness. I would have expected the vendors and IBK to work out their differences. Is AV-Comparatives to become the new AV-Test.org? :(
     
  8. aigle

    aigle Registered Member

    Joined:
    Dec 14, 2005
    Posts:
    11,164
    Location:
    UK / Pakistan
    Never mind. I got it working now. All is OK now.

    Thanks
     
  9. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    Mele, Frisk's comments are in the Dr.Web thread as well.

    It was a 2-v-1 thread, and it was quite funny to see the anger rising :)
     
  10. Thankful

    Thankful Savings Monitor

    Joined:
    Feb 28, 2005
    Posts:
    6,564
    Location:
    New York City
    It's good to see that most vendors are now faster in adding missed samples.
     
  11. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    8,251
    Location:
    The land of no identity :D
    Disagreements between testers and vendors are inevitable; every tester has faced them at some time or another. That does not necessarily make a test unreliable. IBK still works just as hard as he used to, and his tests still have the same integrity (flawed or not). Whether some of the vendors still have theirs is unknown. After all, I least expected an outburst of the proportions I saw in that thread...
     
  12. shadek

    shadek Registered Member

    Joined:
    Feb 26, 2008
    Posts:
    2,538
    Location:
    Sweden
    Avira and Trustport - Wow!! Amazing. I'll have to try them sometime.
     
  13. Joe_Jones

    Joe_Jones Registered Member

    Joined:
    Aug 31, 2007
    Posts:
    41
    Bunkhouse Buck:
    The performance can be tuned in many ways.

    For example, let's say you select Dr.Web as your on-access engine
    and all engines (4 or 5) for on-demand; then there is no difference in slowdown compared with running Dr.Web alone.
    But with one major difference: you have the possibility to scan on-demand
    with ALL engines.

    You can select which engines you use for:
    - On-access
    - On-demand
    - Internet

    Here I have it running on the PC I am working on now, with 5 engines:
    Dr.Web, Ewido and Norman selected for on-access and Internet, and
    Dr.Web, Ewido, Norman, AVG and VirusBlokAda all selected for on-demand.
    I have scheduled the full scan for when I am not at the system,
    so I notice NO slowdown at all.

    The thing I have noticed is that the tests don't give you a good idea
    about spyware, adware, browser hijackers etc.
    I found major improvements there as well when I switched to TrustPort.

    With TrustPort you buy 4 or 5 engines, and you can decide at any moment which one (or ones) you want to use.
    Of course a full on-demand scan with ALL engines takes longer, but then it finds thousands more samples than any other AV. ;)
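The per-context engine selection described above can be sketched abstractly. This is a hypothetical illustration of the idea only, not TrustPort's actual internals or any vendor's real API; the engine names are just labels and `engine_flags` is a toy stand-in.

```python
# Hypothetical sketch of per-context engine selection, roughly as
# described for TrustPort above. Not a real product API.

CONTEXTS = {
    "on_access": ["Dr.Web", "Ewido", "Norman"],
    "on_demand": ["Dr.Web", "Ewido", "Norman", "AVG", "VirusBlokAda"],
    "internet":  ["Dr.Web", "Ewido", "Norman"],
}

def engine_flags(engine, path):
    # Stand-in for a real engine callout; here, a toy rule.
    return engine == "Dr.Web" and path.endswith(".exe")

def scan(path, context):
    """Run the file through every engine enabled for this context.
    The file is flagged if ANY engine flags it, so detection improves
    with more engines, but scan time grows with their number."""
    return [e for e in CONTEXTS[context] if engine_flags(e, path)]

print(scan("sample.exe", "on_access"))  # ['Dr.Web']
```

The design choice at issue in the following posts is exactly this OR-combination: each extra engine can only add detections, but every enabled engine must touch every file, which is where the speed debate comes from.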
     
  14. C.S.J

    C.S.J Massive Poster

    Joined:
    Oct 16, 2006
    Posts:
    5,029
    Location:
    this forum is biased!
    This is not true.

    It takes a lot more than throwing engines into an antivirus program to make it great and 'light'.

    TrustPort with only Dr.Web enabled will still be heavy.
     
  15. LoneWolf

    LoneWolf Registered Member

    Joined:
    Jan 2, 2006
    Posts:
    3,784
    Most everyone did quite nicely this time, but it is still only a test: a decent one, but a test at that.
    And it was an on-demand test. I wonder how they will all fare in the May proactive test?
     
  16. Bunkhouse Buck

    Bunkhouse Buck Registered Member

    Joined:
    May 29, 2007
    Posts:
    1,286
    Location:
    Las Vegas

    It is illogical for it to be as fast as Dr.Web by itself. My view is that if you could scan with four engines as fast as with any one of them individually, it would have been done years ago. Some firm would have amalgamated them and probably made a fortune in the marketplace.

    I may be willing to concede that detection might be a bit "better" with four or five, but to contend that it runs with no slowdown is hard to believe.
     
  17. DavidON

    DavidON Registered Member

    Joined:
    Mar 7, 2008
    Posts:
    19
    Location:
    North Island
    How do you know that? Do you work with him, or have you seen him working?
     
  18. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    8,251
    Location:
    The land of no identity :D
    No, I don't work with him, and I haven't seen him work either. But I could also ask why you should trust the vendors. After all, have you seen them going through the "millions of garbage files" in the test sets? It is easy to claim a test set is bad; it is another thing to provide evidence. Neither Frisk nor Dr.Web provided any, so one had to go solely by their word.

    On the other hand, AV-Comparatives *did* release information about the impact of corrupted files on the test results, showing very little deviation in scores. Note that Clementi worked with AV vendors to perform that analysis. The fact that AV-Comparatives took the effort means they score 1-0 over the vendors for now; if they had something to hide, the sample analysis and the resulting release of information would not have happened at all. It shows AV-Comparatives was willing to improve its test set, and it also showed that the differences were statistically insignificant (they didn't change the rating of any product). Does that mean their sample set is perfect? No, far from it. But it does show the determination to make a good test. :)

    And while I haven't personally been anywhere yet, I did learn a few things behind the scenes. The most important lesson was to not always take everything anyone says at face value. :)
     
    Last edited: Mar 11, 2008
  19. DavidON

    DavidON Registered Member

    Joined:
    Mar 7, 2008
    Posts:
    19
    Location:
    North Island

    Whoa, take it easy =) What are you talking about with Frisk and Dr.Web? What have I said about those products? Nothing; I have no comment. I just asked if you worked with the team, since you said they work very hard.
    I think you have the wrong person. Look at who posted about Frisk and Dr.Web, and then give your comments.
     
  20. Macstorm

    Macstorm Registered Member

    Joined:
    Mar 7, 2005
    Posts:
    2,642
    Location:
    Sneffels volcano
    I think there is an error in the total percentage for Avira shown in the report.

    Adding up the percentages gives 99.1% (the report shows 99.6%). :blink:

    Adding up the malware files individually produces the correct result, however (1,676,963).
     
  21. solcroft

    solcroft Registered Member

    Joined:
    Jun 1, 2006
    Posts:
    1,639
    The percentages for each individual malware type are rounded off; that's why you get 99.1% if you sum up the rounded-off numbers. If you calculate the number of malware samples detected divided by the total malware, you'll get your 99.6%.
     
  22. Macstorm

    Macstorm Registered Member

    Joined:
    Mar 7, 2005
    Posts:
    2,642
    Location:
    Sneffels volcano
    Thanks solcroft, but I still don't get why the math works out for all the other AVs (regarding percentages only) except for Avira.
     
  23. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    @Macstorm: uff, math is not your thing, eh? (just joking ;)) I get such calculation reports quite often.
    You cannot simply sum 99.8+100+97.4+99.8+99.7+99.5+97.5 and then divide by 7 (which brings you to 99.1). Note the different sample sizes of the subsets; the overall score is a weighted average, so the larger subsets count for more.
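The weighted-average point can be shown in a few lines of Python. The per-category percentages are the ones quoted in this thread; the subset sizes below are invented purely for illustration (only their total, 1,676,963, comes from this thread), so the real per-category counts are in the AV-Comparatives report itself.

```python
# Why summing the per-category percentages and dividing by 7 gives the
# wrong overall score: categories have very different sample counts,
# so the overall detection rate is a *weighted* average.
rates = [99.8, 100.0, 97.4, 99.8, 99.7, 99.5, 97.5]  # per-category %, from the thread
# Hypothetical subset sizes (made up; only the total matches the thread):
sizes = [400_000, 50_000, 20_000, 600_000, 300_000, 250_000, 56_963]

simple_mean = sum(rates) / len(rates)                     # ignores subset sizes
detected = sum(r / 100 * n for r, n in zip(rates, sizes)) # samples detected
weighted = 100 * detected / sum(sizes)                    # overall detection rate

print(f"simple mean:   {simple_mean:.1f}%")  # 99.1%, the misleading figure
print(f"weighted mean: {weighted:.1f}%")     # 99.6% with these made-up sizes
```

With sizes skewed toward the high-scoring categories, the weighted mean lands well above the naive average, which is exactly the 99.1% vs 99.6% discrepancy discussed above.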
     
  24. Macstorm

    Macstorm Registered Member

    Joined:
    Mar 7, 2005
    Posts:
    2,642
    Location:
    Sneffels volcano
    Just got it ;) Thank you.

    And yes, you're right. Honestly, I never was good at math :D
     
  25. Jadda

    Jadda Registered Member

    Joined:
    Jun 5, 2007
    Posts:
    429
    Norman went up from 90% to 94% since the last test, and from Standard to Advanced. Quite happy indeed.

    The rest of the results were as expected. Microsoft OneCare has really started to shape up.
     