Why is PrevX not tested by AV-comparatives?

Discussion in 'Prevx Releases' started by Jeroen1000, Apr 13, 2010.

Thread Status:
Not open for further replies.
  1. Jeroen1000

    Jeroen1000 Registered Member

    Joined:
    Aug 18, 2008
    Posts:
    162
    I believe I read that this would happen in their 2010 tests, but there is no sign of it yet. I feel they do a good job of making the choices easier :).

    Perhaps Joe could shed some light here?
     
  2. PrevxHelp

    PrevxHelp Former Prevx Moderator

    Joined:
    Sep 14, 2008
    Posts:
    8,242
    Location:
    USA/UK
    We currently do not conceptually agree with the testing done by many of the testing firms that perform large on-demand tests. That being said, firms like WestCoastLabs, with their Platinum certification (which we have received, and we are frequently at the top of their list of vendors by performance), perform realtime testing 24/7 of antivirus solutions with new, zero-day threats, which we believe is far more valuable to us than a test where security products can score 99+%. Any antivirus product appearing to score 99% in a test of threats indicates that the threats used are not representative of real-world threats - none of the products score anywhere near 99% in the real world.

    Hope that helps clarify it a bit better :)
     
  3. Jeroen1000

    Jeroen1000 Registered Member

    Joined:
    Aug 18, 2008
    Posts:
    162
    While I only partially agree, I must admit I had never heard of West Coast Labs before PrevX got the certification a while back.

    Protection against zero-day malware is without doubt very important, as this is how I 'unmasked' NOD32 as the top dog: Avira did find the samples NOD32 missed, and this reflected what I had read in the reports on AV-Comparatives. So to me it seems the product with the best heuristics and fastest updating will more than likely score highest. Valuable things like scanning speed and false positives also come to light. I'm a bit more wary when Avira flags a file, whereas I would instantly delete it if NOD32 flagged it: I wouldn't have known Avira produced more FPs if it weren't for AV-Comparatives.

    I'll be off reading up on exactly how AV-Comparatives tests. All I wanted to convey is that many people respect their tests, and if PrevX were to score high, it might gain many customers :). I only rarely come into contact with a 'new' virus, but when I do I find it most amusing to see which of the scanners I have purchased finds it first (I play system admin for friends in my spare time; to most of them I recommended NOD32 in the past, or Avira at present).



    A couple of pitfalls for PrevX I can think of myself:

    • PrevX only looks for active threats and leaves dormant viruses alone
    • Virus definition updates for PrevX cannot be frozen, so it cannot take part in proactive testing.

    PS: I am a fan of PrevX and am in no way employed by Avira or Eset. I just want to find out how PrevX compares to them directly.
     
  4. PC__Gamer

    PC__Gamer Registered Member

    Joined:
    Dec 26, 2009
    Posts:
    526
    prevx doesn't only look for active viruses, the scanner finds all types of malware, whether dormant or active.

    and as Joe said, AVC can't be trusted as their tests are meaningless.

    sure, they are trying dynamic tests, but again, I don't trust them one bit.

    AVs don't have 99% detection over such a large number of files, not unless 'something is amiss'.
     
    Last edited: Apr 13, 2010
  5. dr pan k

    dr pan k Registered Member

    Joined:
    Nov 22, 2007
    Posts:
    204
    west coast labs are actually very famous in the security industry and they certify various types of solutions, not just the proactive or detection rates of programs..

    as for the dynamic test performed by AVC, well, it was on 100 samples and that says it all. i could perform that myself... AVC stated that they will do some dynamic tests with more nasties in the future, but until then..
     
  6. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    I agree with this perspective. However, the question still remains: why is Prevx averse to having its product compared with its competition through dynamic (not on-demand) testing, as done by AV-Comparatives or AV-Test, for example?

    Personally, I think that Prevx is missing a major marketing opportunity by failing to do so because I suspect that the product would do "well" (but how well, as compared to the competition, is unknown).

    To clarify, the “testing” done by West Coast Labs is a certification process, not a comparative assessment.

    You might be interested to learn that PrevxHelp is of the opinion (and I do agree) that a dynamic test on even 40 samples is worthwhile: “although their sample set may only be of 40 samples, it is still a valid test {by Dennis Technology Labs}” (see post #136 here). Obviously, a larger sample set is preferable -- and possible: AV-Test, in its dynamic test, used 600 samples (10 per day over a 60-day period).

    Also, consider that Prevx engaged Immunity to test its product using only 5 samples (see here).

    Therefore, an argument that Prevx can't be properly assessed in a comparative manner through dynamic testing because the sample set is too small simply isn't solid.
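    As a rough illustration of why a larger sample set is still preferable - the 90% detection rate below is a made-up figure, not a result from any of these tests - a binomial confidence interval around a measured detection rate shrinks as the sample count grows, so 40 samples carries real (if imprecise) information while 600 pins the rate down much more tightly:

        # Hypothetical illustration only: how sample size affects the precision
        # of a measured detection rate. The 90% "observed" rate is invented.
        import math

        def wilson_interval(successes, n, z=1.96):
            """95% Wilson score confidence interval for a binomial proportion."""
            p = successes / n
            denom = 1 + z**2 / n
            centre = (p + z**2 / (2 * n)) / denom
            half = (z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))) / denom
            return centre - half, centre + half

        for n in (40, 100, 600):            # sample sizes mentioned in this thread
            blocked = round(0.9 * n)        # assume 90% of samples were blocked
            low, high = wilson_interval(blocked, n)
            print(f"n={n:3d}: observed 90%, 95% CI approx {low:.1%} - {high:.1%}")

    With 40 samples the interval spans roughly 77% to 96%, while with 600 it narrows to roughly 87% to 92% - informative either way, but far more discriminating between products at the larger size.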
     
  7. PrevxHelp

    PrevxHelp Former Prevx Moderator

    Joined:
    Sep 14, 2008
    Posts:
    8,242
    Location:
    USA/UK
    We aren't, and the line you quote from me below shows this.

    The public version of the WestCoastLabs report is indeed just a certification process, but companies that subscribe to WestCoastLabs' realtime testing are well aware of the strengths of Prevx :) We currently do not see incremental value in also doing testing through AV-Comparatives; however, there are some other testing organizations that are looking promising which we are considering.



    This is an entirely different case - if you read the report, you will see that Immunity intentionally ignored Prevx's detections in order to test its ability to prevent the concepts behind the threats. It is of little value to block individual samples when dealing with complex banking trojans, but SafeOnline blocks entire infection vectors. Because of the massive shift to banking malware and information stealing, tests like the one done by Immunity are the most valuable way to assess a product's ability to block threats relevant to users, rather than an on-demand mass of samples that will never be seen again by a real user in the wild.
     
  8. dr pan k

    dr pan k Registered Member

    Joined:
    Nov 22, 2007
    Posts:
    204
    i was talking about AVC and as you can see they used 100 samples.
    http://www.av-comparatives.org/images/stories/test/dyn/dynamic2009.pdf

    the small test size is in fact acknowledged by AVC itself and does not have much statistical relevance. probably within this year we'll start seeing some "real" dynamic tests... personally i find dynamic tests with such a small pool of malware only a waste of time.

    as for west coast, they have what i believe to be the best testing procedure:

    Real Time Anti-Malware for HTTP

    For the HTTP attack vector, each solution is tested 24x7x365 against a feed of malware, in real time, which is transferred through or to it via the HTTP protocol. Samples are harvested in real time through WCL's global honeynet, which spans North and South America, Europe and Asia.

    http://www.westcoastlabs.com/checkmark/vendorList/?techGroupID=76
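    As a purely illustrative sketch of what a continuous realtime feed test like this could look like in principle - the feed and scanner below are stand-ins, not WCL's actual harness:

        # Illustrative only: a toy harness for realtime feed testing. The feed
        # and the scanner are hypothetical stand-ins, not WestCoastLabs' system.
        import time
        from datetime import datetime, timezone

        def fetch_new_samples():
            """Stand-in for a honeynet feed; would return freshly harvested samples/URLs."""
            return ["hxxp://example.invalid/sample1.exe"]

        def scan(sample):
            """Stand-in for the product under test; returns True if the sample is blocked."""
            return False

        results = []
        for _ in range(3):                      # in reality this loop runs 24x7x365
            for sample in fetch_new_samples():
                results.append({
                    "time": datetime.now(timezone.utc).isoformat(),
                    "sample": sample,
                    "blocked": scan(sample),
                })
            time.sleep(1)                       # poll interval, shortened for the sketch

        missed = sum(1 for r in results if not r["blocked"])
        print(f"{len(results)} samples seen, {missed} missed")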

    ps: i don't use prevx, but i'll consider it in the near future..
     
  9. PC__Gamer

    PC__Gamer Registered Member

    Joined:
    Dec 26, 2009
    Posts:
    526
    But if it's tested constantly at West Coast, where are the results compared to the others tested?

    I know we probably won't see these, but it would be good to know rather than just getting the certification.

    I'd like to see prevx tested at av-test.org in their dynamic test - with default settings if false alarms degrade the result, or maximum settings if they don't; personally I use something in between.
     
  10. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    Does this mean that the user community will soon see a comparative assessment of anti-malware products that includes Prevx? Can you share a rough estimate of when that might occur -- and, what organization may be doing the testing?

    That is a fair criticism.

    I agree that testing a “mass of samples that will never be seen again by a real user in the wild” in an on-demand manner has limited value. However, similar to the case with West Coast Labs, the Immunity investigation did not compare the capabilities of Prevx to those of the competition.

    In my opinion, the Immunity and West Coast Labs research affirmatively answers the question, “Is Prevx a good product?” The unanswered question, however, is the degree to which the protection provided by Prevx may be superior/inferior to that of the competition. Hopefully, the forthcoming test(s) in which Prevx participates will provide empirical guidance on this point.

    A superior result from such testing would not only assist the marketing efforts of Prevx, but -- perhaps more importantly -- it would drive the competition to be ever more innovative (and thereby “raise the bar” for the industry in total).

    Yes, I agree that would be valuable to do.
     
  11. TonyW

    TonyW Registered Member

    Joined:
    Oct 12, 2005
    Posts:
    2,741
    Location:
    UK
    Whilst I agree it's a certification process, they must have done some tests in order to issue the certificate. However, the actual results of that test appear to be closed to the general public.
     
  12. PrevxHelp

    PrevxHelp Former Prevx Moderator

    Joined:
    Sep 14, 2008
    Posts:
    8,242
    Location:
    USA/UK
    I'm unsure when we'll take part in a comparative assessment publicly, but as TonyW has alluded to, the test results from many other organizations are made available to their customers (large enterprises, governments, etc.) and we are not complaining about the results so far ;)

    I'm sorry that I can't go into further detail, primarily because of confidentiality agreements, but rest assured that the companies that need to know the strength of Prevx know it and it has proven itself many, many times in real-life scenarios.

    We have been seeing rapidly increasing cases of semi-large institutions (15,000 - 200,000 PCs internally) becoming massively infected while using some of the "major" vendors and contacting us in the heat of the emergency to fix their network, which we're able to do with poise and ease because of how Prevx is designed.

    It is cases like these, and the 24x7 testing that WestCoastLabs does, that really show the strength of a security solution. If some subset of users wants to rely on security software because it scores well in certain tests but then fails in real life, then we'll end up fighting a losing battle :)
     
  13. Longboard

    Longboard Registered Member

    Joined:
    Oct 2, 2004
    Posts:
    3,238
    Location:
    Sydney, Australia
    There we have it: the corporations are happy. ;)
    That's a good thing.

    ;)
    Go easy on the marketing speak poetry there Joe: "ease and poise" eh.
    Was it 15 or 20 thousand?
    I hope they coughed up for 15,000 licenses!! :eek:

    I'm sure that appropriate methodology is important.
    It's good that you and WestCoastLabs have an arrangement.
    When you say confidential don't you mean commercial as well?

    Some of us are the troublesome rump zit : always bitching about "visible independent testing" : to my mind commercial relationship always poisons the well so to speak: I'm sure it's not the same but Moodys, S&P and CDOs spring to mind. :ninja:

    Still, if it works: well and good.
    Some of us have been with PrevX since ( and prior to ) PrevX Home in 2005, when pcMag could not recommend it for installation.
    :eek: ( http://www.pcmag.com/article2/0,2817,1879986,00.asp ) old news now. :D
    PCMAG is a big booster now.
    I for one have been pleased and impressed with the continuous evolution of one of my primary protection tools: I always felt that the concept was good.
    (I have some quibbles with some implementations currently. :) )

    Unhappily, I cannot keep the enterprise afloat: especially as 3/5 of my licenses have been won for free :D

    I remember when PrevX was tested by AVC and some flaws were identified and patched: that was years ago and PrevX was different: who knows, it may happen again. :)
    >> cant find that review now <<

    We already know of several weak spots re zero-day and "non-malicious" files.

    Test design is important.
    I design and participate in double-blind randomised crossover trials as part of my professional life and well understand proper analysis and confounders; nonetheless, there are other statistically valid cohorts for analysis.

    Putting PrevX through its paces in a perhaps more 'fluid' environment should be welcomed, so as to demonstrate its capabilities; n'est-ce pas?

    I don't understand what that statement is or says, sorry, I must be a bit simple.
    So:
    Scoring well @Immunity & @ WCL is good, scoring well elsewhere is bad.??
    No lab test is any good??
    No sample malware is real unless picked up in real time?
    No one else can do testing?

    Why do you feel it's necessary to diss AVC as a test organization when really their credibility is quite good: isn't it?
    I don't mean to shoot the messenger.
    Thanks for indulging me.
    Regards.
     
    Last edited: Apr 15, 2010
  14. bellgamin

    bellgamin Registered Member

    Joined:
    Aug 1, 2002
    Posts:
    8,102
    Location:
    Hawaii
    If you can't raise the bridge, lower the water. :shifty:
     
  15. Jeroen1000

    Jeroen1000 Registered Member

    Joined:
    Aug 18, 2008
    Posts:
    162
    After some reading, I now know more or less why the AVC on-demand test isn't the holy grail. I appear to have fallen for those nice-looking numbers too. (I hope I get this part right) => The samples that AVC uses for its on-demand comparative are _not_ 100% verified to be malware? Call me confused, but how can they sniff out the FPs then? How can they even be sure it is malware, as one vendor may have classified it as such and the others may just have followed suit?

    I believe that the West Coast certification is a good thing, but it hardly gives me (personally then) any hints as to how different products compare.
    Why do only the big companies and governments (apart from the fact they make PrevX a ton more money) get to know the statistics and details?

    And sorry for opening a can of worms as I had no idea I was doing so.
     
  16. vojta

    vojta Registered Member

    Joined:
    Feb 26, 2010
    Posts:
    830
    Maybe because the test company's results are confidential? That's their business: selling the information that they generate with their tests.
     
  17. kasperking

    kasperking Registered Member

    Joined:
    Nov 21, 2008
    Posts:
    406
    well.......if a product claims to be.......

    "World's strongest, fastest, most powerful security solution"

    then it should be prepared to be independently tested..be it AVC/VB/Westcoast/Neil Rubenking and co. ...so as to justify its claim. Each test has a methodology...good or bad :p it has to be accepted, as each product has its strengths & weaknesses...i wish prevx would take the plunge too & prove itself....at least for the naysayers...get the grade A+ :D :D :D
     
  18. PrevxHelp

    PrevxHelp Former Prevx Moderator

    Joined:
    Sep 14, 2008
    Posts:
    8,242
    Location:
    USA/UK
    Not necessarily, although that is a valid point as well; but even assuming that AVC's collection is perfectly free of false positives, we still do not agree with the testing methodology. Upwards of two thirds of samples today live for less than 24 hours, the rest dying quickly thereafter - detection of samples that are 6 months to 15 years old is almost entirely useless. We could add detection for 100% of old samples if we really wanted to, but the only added value would be scoring better in AV tests, and we have enough to do staying ahead of malware authors creating new threats :)
     
  19. PrevxHelp

    PrevxHelp Former Prevx Moderator

    Joined:
    Sep 14, 2008
    Posts:
    8,242
    Location:
    USA/UK
    Actually not my words :D Those come directly from one of our recent clients :)


    Yes :) Although we do this frequently as part of our "emergency response service" so fixing a 20K computer network is not out of the ordinary in the daily goings-on at Prevx.


    We agree, but we have yet to find an organization which has sufficient methodology for fully testing Prevx. However, we have been speaking with the major testing organizations to try to work on this methodology, which will likely become more commonplace as more products move components of their protection into the cloud.

    No, not true - we just have yet to find a case where a lab has tested it properly. The most accurate way to test is with live infections coming from real infected websites, in realtime, on real PCs with active internet connections. It isn't terribly much to ask for, but it is understandably difficult to do in a controlled enough environment. Such tests also miss the benefits of the community-based approach - if the threat isn't immediately detected by Prevx for the first user, it will likely be found for the second user and somewhat later for the first user. Tests would have to be re-run after infections were allowed to become active to really emulate a live environment. PC__Gamer and other members here frequently run these tests on live PCs, which is why we're always a bit surprised that organizations have not chosen to run these tests when they really are the correct way to test.
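    As a purely hypothetical toy model of that community effect (not Prevx's actual detection logic): a cloud back-end that convicts a sample only after enough sightings will miss the very first encounter but catch later ones, which is why re-running the same test after the community has seen the threat can score differently:

        # Hypothetical toy model of community-based cloud detection (not Prevx's
        # real algorithm): each lookup adds a sighting; once enough evidence
        # accumulates, later lookups of the same sample return "malicious".
        from collections import defaultdict

        class ToyCloud:
            def __init__(self, threshold=2):
                self.sightings = defaultdict(int)   # sample hash -> times seen
                self.threshold = threshold          # evidence needed to convict

            def lookup(self, sample_hash):
                self.sightings[sample_hash] += 1
                if self.sightings[sample_hash] >= self.threshold:
                    return "malicious"
                return "unknown"

        cloud = ToyCloud()
        sample = "abc123"                    # hash of a brand-new threat
        print(cloud.lookup(sample))          # first user: "unknown" (missed)
        print(cloud.lookup(sample))          # second user: "malicious"
        print(cloud.lookup(sample))          # re-running the test now succeeds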

    Also, it is always worth noting and frequently forgotten that Prevx guarantees its malware cleanup. If a user's PC is infected and they buy Prevx, we will clean it. No other vendor says that (most charge $99+ per incident to attempt to clean it even if they were using the vendor's product before and the infection slipped through) and it shows a significant level of confidence in our products and just how well we know they perform :)

    [QUOTE]Why do you feel it's necessary to diss AVC as a test organization when really their credibility is quite good: isn't it?
    I don't mean to shoot the messenger.[/QUOTE]

    We do not intend to speak badly about AVC at all - they are a well known and credible testing organization, however, we disagree with the methodology at a conceptual level.
     
  20. Pleonasm

    Pleonasm Registered Member

    Joined:
    Apr 9, 2007
    Posts:
    1,201
    In contrast, what we don’t know is whether the other “major vendors” have had similar experiences with large institutions that have been using Prevx as their primary malware defense.

    I certainly echo this perspective and agree. Yet, AV-Comparatives and AV-Test actually do conduct their dynamic testing in exactly the manner that you propose -- correct?
     
  21. PC__Gamer

    PC__Gamer Registered Member

    Joined:
    Dec 26, 2009
    Posts:
    526
    cheers Joe, i do try ;)

    Prevx is a top performer indeed. i will try to do a bigger, better and more accurate one with more statistics, but time isn't a thing i have a lot of - I have a full-time job which includes pressure! grrr

    but i do what i can with the time i have. i think some people on here will appreciate them, as i do tend to show failures as well as successes - but if they don't, i'm not too bothered anyway :D
     
  22. Jeroen1000

    Jeroen1000 Registered Member

    Joined:
    Aug 18, 2008
    Posts:
    162
    Wait, I'll bite on this one here. This may very well be an excellent reason to pair PrevX with other AV-software. I will put it in another thread because I have a question concerning this. I'll quote you there:)
     
  23. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    AVC can also test according to a test design wished for by the vendor, in single-product tests. we would not test or force a tool like prevx into an on-demand scan. if prevx is not tested by AVC, that may have several reasons, e.g. Prevx not wanting to be tested, or AVC not agreeing to or not having time to test prevx. (we do more tests/reviews and more products internally than can be seen on the website; if they are not on the website it is because the tests are confidential).

    p.s.: as written in the report, samples used in the current AVC testset are mainly samples seen as relevant in the last 8 months - not 15 years. anyway, talking about on-demand is senseless; it's nearly as if you were talking about on-demand in a test of DefenseWall :p.

    btw, afaik, certification bodies usually allow (for additional fees) several attempts to pass a test (i do not know if this is the case also in this type of test, but I will try to find out). it would be good to know if, or how many, attempts were needed to get the certification (if it's allowed to be disclosed).
     
    Last edited: Apr 16, 2010
  24. fax

    fax Registered Member

    Joined:
    May 30, 2005
    Posts:
    3,898
    Location:
    localhost
    As far as I know, you can fail those tests and keep being tested until you get the certificate. It's just a matter of covering the costs of the multiple tests. At least this was the process (i.e. we don't actually know if PREVX failed during the tests) :)

    VB100, in this regard (only), is more transparent.... failures are public.
     
  25. IBK

    IBK AV Expert

    Joined:
    Dec 22, 2003
    Posts:
    1,886
    Location:
    Innsbruck (Austria)
    correct, this is how I remember it too.
     