Dennis Technology Labs: PC Total Protection Suites 2012

Discussion in 'other anti-virus software' started by malexous, Apr 10, 2012.

Thread Status:
Not open for further replies.
  1. malexous

    malexous Registered Member

    Joined:
    Jun 18, 2010
    Posts:
    828
    Location:
    Ireland
  2. qakbot

    qakbot Registered Member

    Joined:
    Aug 25, 2010
    Posts:
    380
    Sweet!! Dennis has some well thought out tests, approved by AMTSO. Nice to see ESET do well on this test. Another test where Webroot is near the bottom.
     
  3. trjam

    trjam Registered Member

    Joined:
    Aug 18, 2006
    Posts:
    9,057
    Location:
    North Carolina
    Congrats to Norton, Kaspersky and Eset.:thumb:

    WOW is all I can say about Webroot. Not good, not good at all.
     
  4. xxJackxx

    xxJackxx Registered Member

    Joined:
    Oct 23, 2008
    Posts:
    4,050
    Location:
    USA
    It is, especially after seeing so many other recent tests rating it poorly.
     
  5. TonyW

    TonyW Registered Member

    Joined:
    Oct 12, 2005
    Posts:
    2,634
    Location:
    UK
    My only criticism of the report is I wish they'd given the build numbers of the products tested. The tests were conducted between 16 January & 3 February; Webroot has come a long way since then as there have been numerous updates to the product.
     
  6. JerryM

    JerryM Registered Member

    Joined:
    Aug 31, 2003
    Posts:
    4,221
    What has happened to Avira? From what I have seen in the last couple of years it has gone from among the top 3 or 4 to the bottom of the barrel.
    Jerry
     
  7. lordraiden

    lordraiden Registered Member

    Joined:
    Jan 30, 2006
    Posts:
    3,075
    I wonder why in the Dennis Technology Labs tests Norton is always first :D
    "Tomorrow" Avira will act as the sponsor and will come first.
     
  8. malexous

    malexous Registered Member

    Joined:
    Jun 18, 2010
    Posts:
    828
    Location:
    Ireland
    Perhaps it is the same reason Norton is first/joint first in 3/4 (false positives prevented 4/4) AV-Comparatives Whole Product Dynamic Tests.
     
  9. Nebulus

    Nebulus Registered Member

    Joined:
    Jan 20, 2007
    Posts:
    1,582
    Location:
    European Union
    Actually, this is the only test that rates Avira so low, so I wouldn't come to any rushed conclusions :)
     
  10. Sherlock_Holmes

    Sherlock_Holmes Registered Member

    Joined:
    Mar 21, 2010
    Posts:
    1,448
    Location:
    Mumbai
    Yayy Kaspersky :D :p
     
  11. Ianb

    Ianb Registered Member

    Joined:
    Nov 26, 2004
    Posts:
    232
    Location:
    UK
    No test sponsored by any AV company can ever be trusted.
     
  12. steve1955

    steve1955 Registered Member

    Joined:
    Feb 7, 2004
    Posts:
    1,384
    Location:
    Sunny(in my dreams)Manchester,England
    Why not? Somebody has to pay for these tests; it may as well be an AV vendor. Besides, if they were not conducted fairly, don't you think all the other AV companies would make a big song and dance about the results? It seems the most in-depth test of any, to be fair, and if Dennis Labs didn't conduct it fairly, what would that do to their reputation in the industry?
     
  13. STV0726

    STV0726 Registered Member

    Joined:
    Jul 29, 2010
    Posts:
    900
    Alrighty, well, it has got the AMTSO sticker, so even though it is sponsored, which is perhaps not the ideal type of independent test, this should still be considered reputable and valid, I would think.

    Webroot SecureAnywhere did so darn well on AV-TEST and they tested a fairly old build too...

    ...Like I've always said, it seems the way the product handled threats wasn't fully baked so I'm giving them till mid-2012 latest and by then the testing bodies should be using a build past .150 for sure or higher and that should really tell us how they are doing.

    Being completely fair and unbiased tho, I have to slap WSA on the wrist for this. I love you Webroot, but this has got to stop if you are really serious about "revolutionizing the AV world". Others, especially those that think "cloud" is a marketing gimmick (believe me they exist), are already dismissing WSA as a fail. The time to put up or shut up is now......and they likely are. They have released more changes in weeks than most vendors do in like a year it seems.

    So there. I love WSA. I'm a total fanboy. But see, I am critical of them too when need be. I am not going to call any test BS that they do poorly in because that's silly and an idiotic way of thinking. However, if people start linking YouTesters and posting those "I decided to throw some viruses at Webroot" threads, then I'll be the first replier to call BS. It's all about valid, reputable, and responsible test results.
     
  14. Fly

    Fly Registered Member

    Joined:
    Nov 1, 2007
    Posts:
    2,069
    I don't get why Avira is doing so poorly.

    Perhaps a bias in regional samples, and cloud detection ?
     
  15. chriswiles

    chriswiles Registered Member

    Joined:
    Mar 27, 2010
    Posts:
    209
    That's like saying a website which has ads from a particular company will warp results/reviews etc. to meet the requirements of their advertisers.

    I work for various UK publishers (Google my name, if you like) and I can assure you that editorial teams have nothing to do with advertising, budgets or the sponsor.

    Indeed, more fool an advertiser or sponsor who attempts to influence editorial decisions. It's not only frowned upon (some of the smaller brands do try), but you'll find that the Editor will quickly make sure that money and advertising remain independent from the reviews.

    We had one scenario recently where an unnamed software company paid for adverts in one of our magazines. Their product received a 2-star review. They called the magazine and told them they weren't happy and wouldn't advertise going forward. Fair enough. That's their prerogative.

    However, the day you start warping editorial around paid-for advertising, is the day you lose all credibility.
     
  16. Scoobs72

    Scoobs72 Registered Member

    Joined:
    Jul 16, 2007
    Posts:
    1,108
    Location:
    Sofa (left side)
    I suspect the sponsorship has come after the test was done, i.e. Dennis did the test and then said "who wants to sponsor this?". It's "sponsored by" rather than "commissioned by", so nothing that's suspicious as far as I can see.
     
  17. PrevxHelp

    PrevxHelp Former Prevx Moderator

    Joined:
    Sep 14, 2008
    Posts:
    8,242
    Location:
    USA/UK
    I'll have to do some research internally to see what happened with this test but I don't think anyone was informed within Webroot of this test and as far as I know, no one supplied them with any information on how to test WSA. It's very possible there was some issue with testing WSA as it is indeed very different from the average product, and it's different in ways that aren't normally immediately obvious and could certainly affect specific types of test scores.

    That being said, I can't stand any test where we don't score at the top and I will definitely be working to see what happened here.
     
  18. STV0726

    STV0726 Registered Member

    Joined:
    Jul 29, 2010
    Posts:
    900
    Well, just to play devil's advocate (and quote Thankful [pretty sure it's him that employed this phrase]): if "other AVs are able to do well", why can't Webroot? Yes, it may be different, but can you really expect testers to cater their methods to each product?

    If all they are doing is throwing malware at it with default settings, for example, how do you make that any different? Webroot should catch that stuff no matter how it enters the system.

    The only thing I can possibly think of that might not be "fair" for WSA is if independent labs fail to take into consideration files placed in monitor mode. But other than that, isn't a miss...a miss?

    Again you know I'm with you all the way I love WSA and I want nothing more than to have it pass and lead the pack with flying colors but even tho I'm a raving fan I must maintain my skepticism and logic. That is of course, unless I misunderstood you.

    Regarding the "sponsored by v. commissioned by", I would tend to think that could be a true statement of concern. Merging the two concerns due to my own curiosity...Joe, was the Passmark Performance Tests for WSA sponsored by Webroot or commissioned by them?
     
    Last edited: Apr 10, 2012
  19. PrevxHelp

    PrevxHelp Former Prevx Moderator

    Joined:
    Sep 14, 2008
    Posts:
    8,242
    Location:
    USA/UK
    Well, I really don't know, as the other thread going on at the moment shows WSA performing rather well, even though it was also a sponsored test. Scoring 99% versus the poor showing in this test strikes me as odd as I imagine they both took place at a similar time, but honestly, I don't have any more information here.

    (The other thread is here: https://www.wilderssecurity.com/showthread.php?t=321900 if you haven't seen it)

    Regarding the Passmark tests, those were commissioned by Webroot but they do similar tests for other vendors and use the same methodology across all of the tests they perform. I personally think of it as more of a fee we pay to be able to use it in our marketing material as the tests will always have the same results no matter what vendor pays for them or when they're done (short of minor variations between builds, of course).
     
  20. qakbot

    qakbot Registered Member

    Joined:
    Aug 25, 2010
    Posts:
    380
    Definitely worth getting to the bottom of this.
     
  21. Techfox1976

    Techfox1976 Registered Member

    Joined:
    Jul 22, 2010
    Posts:
    749
    I personally don't trust the report at all. And yes, I will defend WSA specifically in this because I know the technology behind it quite well.

    Here's stuff that goes into my decision:

    The very first statement is "This report aims to compare the effectiveness of Symantec’s Norton 360 v6 product with the premium consumer products available from other well-known security companies." This immediately makes things suspicious. I'm sorry, but there's no way around that. Even science is being abused these days for profit.

    The numbers don't add up.
    Pages 10-12. Testing FPs against 40 "legitimate" programs. In summary, a security program gets a point if it allows the legitimate program to run, and loses between 0.05 and 5 points for warning about the program and/or blocking it.

    By simple logic, these are at least partially mutually exclusive. Either you allow it to run, or you warn about it, or you block it. I could see "Warn about it but allow it anyway" resulting in points for allowing but minus points for warning, but "Block" and "Allow" are fully mutually exclusive. So either it should GET a point for allowing, or lose partial or full points for blocking. According to the chart, Webroot blocked one and warned on one. Therefore, having had one BLOCKED means that it wasn't allowed, therefore it could not have gotten a point for allowing it. Out of 40 points, that means the MAXIMUM Webroot could get is 39, then reduce by the negative points for blocking and warning. Yet Webroot has a score of 39.4. What happened here? That can be explained away in Panda, which only warned on one, so could have started with 40 points for allowing them all, but -0.5 for warning about a "medium impact" FP.
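
    The mutual-exclusivity argument above can be sketched as a quick back-of-the-envelope check. This is a minimal illustration of the scoring ceiling being described, not the report's actual formula; the function and its name are assumptions for illustration only.

```python
# Hypothetical sketch of the false-positive scoring ceiling argued above.
# Assumption: +1 point per legitimate program allowed to run; a blocked
# program, by definition, was NOT allowed to run, so it cannot earn that
# point. Penalty weights (0.05 to 5 per warning/block) come on top of this.

def fp_score_ceiling(total_programs: int, blocked: int) -> int:
    """Upper bound on the FP score before any penalties are subtracted:
    each blocked legitimate program removes one 'allowed' point."""
    return total_programs - blocked

# Webroot's case per the chart: 40 legitimate programs, 1 blocked, 1 warned.
ceiling = fp_score_ceiling(40, blocked=1)
print(ceiling)  # 39 -- yet the report lists Webroot at 39.4, above the ceiling
```

    Even before subtracting the warning/blocking penalties, the reported 39.4 exceeds the 39-point maximum, which is the inconsistency being pointed out.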

    Specifically neutering the way Webroot's WSA product works:
    "In some cases a product might request a further scan to complete the removal. We consider secondary scans to be acceptable, but further scan requests would be ignored."

    What I call "Exceptionally Stupid User" activities. The rules of engagement are effectively "Pick the default, or pick the Topmost or Leftmost option." Hand-typed malware URLs (Yep, everybody does that). XP SP3 with no patches beyond that (That's not as common as people claim). Intentionally out-of-date and exploitable third party software. IE 7. IE7 has somewhere between 0.1 and 2.5% of the total browser use right now depending on where you look. XP falls between 17 and 28% of the Windows visitors, however the prevalence of IE9 indicates that people are getting their patches (since IE9 doesn't exist on XP unless the system gets updates). So "XP SP3 with no other patches and IE7" is a "Realistic and normal state" for users? Further analytical data munging shows that evidence of an unpatched XP SP3 system running IE7 is a 0.003% rate. FFS, the test is "realistic" for three in 100,000 people?!

    And finally, the actual tests' results...
    Visit a PHP page that uses a VB script in IE 7 and unpatched-beyond-SP3-XP to download and start an executable (already reducing the possibility of even encountering this infection vector to under 0.1%). WSA blocked and killed the executable, however the VB script was left in TIF. Failed. -2 points. Despite the fact that the VB Script couldn't do anything else anymore.

    Visit a page that loads a PDF exploit to download and execute a payload. The payload is blocked but the PDF exploit was not: Neutralized, half points. Despite the fact the PDF exploit can only act as a dropper when the web site is actively visited. In my book the actual threat was blocked, but they count it as half score.

    Did we mention the numbers don't add up? Test 21, WSA is labeled as "Complete Remediation", "Defended", and "Compromised" all on that one line, which presumably the score is automatically calculated from... ?!?!? Seriously?! Come on guys, pick one; you can't have all three.

    Oh, and of course if a VB downloader that fails to actually download anything is left on the system by WSA, it failed that test, despite the fact the downloader (a) will never operate again unless the web page is revisited, and (b) will only operate at all in 3 in 100,000 cases, aside from this test.

    Conclusion:
    They can't get their numbers right, have displayed a lack of knowledge on several of the security packages they tested, call a completely unrealistic situation "realistic", and use testing methods that are wholly in favor of certain operation methodology and penalize other operation.

    All in all, I think I agree with them on PRECISELY the following: If you are running an XP machine that is unpatched beyond SP3 and running an unpatched IE7 and old versions of Java and Adobe, then YES, you should absolutely use Norton. However when you sift through all the FUD, WSA is still perfectly fine for actually-intelligent people.
     
  22. STV0726

    STV0726 Registered Member

    Joined:
    Jul 29, 2010
    Posts:
    900
    You just won the Internet and Wilders User of the Year...not enough thumbs up for you!

    Wow. So yeah. Very good analysis and definitely in addition to some of their methodologies and numbers being disputed here, you also made a great point:

    Their "real world" test was more like an "absolute worst case scenario" test that may have been modeled around an environment Norton knew they could do well in. It's interesting, especially since Norton departed from participating in AV-Comparatives tests...I am always skeptical of Norton and this doesn't change my feelings.

    What's next...a test done on a fully patched Windows 98 SE computer to best simulate a "real world" scenario? :D

    I mean, yes, I get why it is important to be able to protect even when the OS is old/unpatched...BUT...that being said (I'm assuming most here know why that's crucial)...if that "situation" drawn up is so unpatched that it is no longer "real world"...what's the value in it? Maybe some, but definitely not nearly as much value as protecting what IS a "real world" scenario.

    From your brilliant analysis, I certainly must say I am now even more cautious of taking these test results and holding WSA at full fault.

    Some people may come back and say Webroot should have done well and their "uniqueness" is irrelevant. Is it? Remember folks, Norton had to SHARE PC Magazine Editors' Choice with Webroot SecureAnywhere. Don't get me wrong, please, I'm not trying to start a conspiracy theory or even begin to accuse anyone of anything, but...you must realize that ANY test which can fault Webroot (even if it faults no one else) is going to be appealing to Symantec Norton.

    So before I enter ad nauseam territory and continue rambling, I'll close with another thank you to Techfox and thanks to Joe for sharing what you know. Dennis Labs really needs to communicate more with vendors accused of missing samples!!!
     
  23. steve1955

    steve1955 Registered Member

    Joined:
    Feb 7, 2004
    Posts:
    1,384
    Location:
    Sunny(in my dreams)Manchester,England
    Testing on unpatched OSes etc. is actually very relevant in some ways, as a very large number of users do use PCs in that condition; lots of average users don't even know you have to update the PC in any way. The thing to counter that, though, is: would those users install an AV product?
     
  24. Techfox1976

    Techfox1976 Registered Member

    Joined:
    Jul 22, 2010
    Posts:
    749
    You're right. Chances are high that many people who have unpatched computers don't use AV either.

    But what is the definition of "very large number of users"? When I looked at web analysis from several locations of Browser + OS + Supplemental Information That Hints At Patch Status, combined the stats for an average, and even went so far as to err that "Any XP system with IE7 is Unpatched" when supplemental information is not available, I came up with 0.003% of systems meet the "Unpatched XP SP3 with IE7" condition.

    In my personal experience, more people have unpatched XP SP2 than unpatched XP SP3, strangely enough, since patching to SP3 tried to initiate automatic updates as a default, which results in less-savvy users being more likely to be patched beyond SP3 if they got to SP3 at all.

    It's also noteworthy that XP is in extended support and almost end of life.

    So, yes, Security Software is very important for people who are in the exact state as the test systems, but the number of people in that state is not that large. Even if we go by just metrics from one of my websites alone, and rely on IE7 as the lone metric, IE is at 26.50% of the total hits, and IE7 is at 5.56% of the total IE hits. That means literally just 1.47% of people even meeting the "IE7" precondition, of which an even smaller number will also have XP SP3 with no other patches, and then further smaller will have outdated Adobe Reader, Flash, and Java.
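
    The prevalence estimate above is simple multiplication of conditional shares. This sketch just reproduces that arithmetic; the percentages are the poster's own site metrics, used here purely for illustration.

```python
# Back-of-the-envelope estimate: what fraction of all visitors even meet
# the "IE7" precondition of the test environment? Figures are the poster's
# single-site metrics, not general market data.

ie_share = 0.2650    # IE as a share of all hits (26.50%)
ie7_of_ie = 0.0556   # IE7 as a share of IE hits (5.56%)

ie7_share = ie_share * ie7_of_ie  # joint share of all visitors
print(f"{ie7_share:.2%}")  # 1.47%
```

    Each further precondition (unpatched XP SP3, outdated Reader/Flash/Java) multiplies in another fraction below 1, which is why the final figure shrinks so quickly.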

    I still stand by saying "A test that is representative of at most 1-2% (or likely much lower) of the systems out there cannot claim to be representative of the 'Average User' or 'Average Computer Environment'." and also saying "Norton is great for the unaware corner cases who never patch, but Webroot (and others) is still better for power users and educated users who do."

    Plus, you really have to wonder about the material errors, such as an impossible score and tests marked as both Pass and Fail concurrently. You also have to be aware that they put arbitrary limitations on the functionality of the software. They decided outright not to permit more than one rescan, despite the fact that at least one of the packages uses multiple rescans to ensure cleanliness.

    It doesn't take paranoia or strong conspiracy theory to say that this looks like the tests were directly stacked against Webroot either. I mean, really, fail a test because a broken VB script that can affect under 2% of the general computer population -if it actually worked- was allowed to remain inactive in TIF? So those few people who are vulnerable to it should go get Norton, and the rest of people who want a better system can get something else. :)
     
  25. S.B.

    S.B. Registered Member

    Joined:
    Jan 20, 2003
    Posts:
    150
    Thanks for your analysis, Techfox1976. I agree that your analysis makes good objective sense, and undercuts the "real world" meaning of the testing results reported.

    I'd also add that I've worked closely with analytical testing personnel in various industries and have repeatedly seen the complexity of statistical aspects of analytical testing design. Statistically reliable testing results require input and analysis by qualified statistical experts. I have a very hard time believing that the minimal tests and samples reported in the subject test report can be considered statistically reliable.

    Many thanks. BTW I don't use WSA or have any personal interest in same. My only interest is in reliable internet security testing.

     