Antispyware comparison from Malware Test-lab (07 Aug 06)

Discussion in 'other anti-malware software' started by phasechange, Aug 7, 2006.

Thread Status:
Not open for further replies.
  1. dallen

    dallen Registered Member

    Joined:
    May 11, 2003
    Posts:
    825
    Location:
    United States
    Manny Carvalho,
    I must not have explained my thinking very clearly because I'm not trying to say...
    "Well that test didn't rank the programs in an order I think is correct, so it must be wrong, or bad."

    Rather this...
    "A program that I know is a quality program is ranked below a program I know is almost worthless...something is not right about these result. I wonder if their methodology is sound."

    Another way to explain my point is to say that you and I do not need to see test results to know that KAV and NOD32 are quality AVs. We can argue over which is better all night long, for sure. However, if I cited test results that ranked KAV higher than NOD32, and you realized that those same results ranked NOD32 and KAV as only the 12th and 15th best AVs on the market, I think you would be justified in questioning the legitimacy of the test, not because it wasn't in your favor, but because something has to be wrong with any test that ranks two of the best AVs on the market outside the top 10. Even worse would be if an AV that you knew was poor was ranked above both NOD32 and KAV.

    In legal terms, I would say that there is a shifting burden. I may have the initial burden of proving a claim that your testing methods are flawed, but when your tests conclude something that is prima facie outrageous…my opinion is that the burden then shifts to you to prove that your methodologies are sound.
     
    Last edited: Aug 10, 2006
  2. Manny Carvalho

    Manny Carvalho Registered Member

    Joined:
    Jun 3, 2004
    Posts:
    270
    Darn, and I thought that I was going to make some money. :oops:

    You know, results are results even if they go against your thinking. These guys measured how well these apps cleaned spyware from a system. It was pretty simple. They infected a system, cleaned it and then looked at whether the system was taken back to the clean state.

    You can quibble over whether they used the right spyware, but for the 14 samples they did use, that's what they found. If they had used a different set then the results would be different. The tests are pretty basic and certainly don't form a sufficient basis to decide whether something is better than another overall. It's just one data point in the continuum.
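    A minimal sketch of that infect/clean/compare procedure (illustrative only, assuming a harness that just diffs file paths; a real test would also track registry keys and running processes, and the path below is a hypothetical example):

    from pathlib import Path

    def file_set(root: str) -> set:
        """All file paths currently present under `root`."""
        return {str(p) for p in Path(root).rglob("*") if p.is_file()}

    baseline = file_set(r"C:\test-system")       # snapshot of the known-clean system
    # ... infect the system, then run the anti-spyware product under test ...
    after_cleanup = file_set(r"C:\test-system")

    leftover = after_cleanup - baseline          # traces the product failed to remove
    collateral = baseline - after_cleanup        # legitimate files the product deleted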
     
  3. dallen

    dallen Registered Member

    Joined:
    May 11, 2003
    Posts:
    825
    Location:
    United States
    Very good points.

    Don't forget motivation though. I could pick one trace of a virus that KAV handles perfectly, but NOD32 does not. Based on that one data point, I can conclude that KAV is 100% effective and NOD32 is a completely worthless 0%...might as well not have an AV at all. Sample selection can determine outcome. I guess that goes back to what you said:
     
  4. JerryM

    JerryM Registered Member

    Joined:
    Aug 31, 2003
    Posts:
    4,306
    Although in some ways the results seem skewed, I would not argue with the top few. I never considered the AVs to be great at spyware detection. I know that AdAware always detects some traces that are not critical. They might even be deleted when FF is closed.

    I agree that if the items of spyware are selected the results can be whatever one wants. But, again, I do not consider KAV to be better at pure spyware, whatever that is, than the dedicated AS applications.

    Jerry
     
  5. SUPERAntiSpy

    SUPERAntiSpy Developer

    Joined:
    Mar 21, 2006
    Posts:
    1,088
    I am not one to put down any other competing product; most of the top players each have their merits. Based upon our own internal testing, and reviews such as those at CNET Download.com, we do find that SUPERAntiSpyware tends to detect and remove more of the "tough" or "nasty" threats than other products.

    http://www.download.com/SUPERAntiSpyware-Free-Edition/3640-8022_4-10564983.html?sb=2&v=0

    Many products focus on sheer "numbers" of items detected and detect .GIF files from known sites, .CSS files, and other items that are not necessarily "harmful" to one's computer. For a user, sometimes the "number" of detected items is the only way to quantify an application's effectiveness.

    SUPERAntiSpyware focuses on removing the nasty components, as an errant/leftover registry key or .GIF or .ICO file is not what is causing the computer to run slowly, or capturing passwords, etc.

    The same goes for cookies - some applications may detect more cookies than we do, but cookies pose no threat to your computer, and you can clean them all with a single mouse-click.

    We are dedicated to supporting and updating SUPERAntiSpyware, and we release daily definition updates. We will analyze any samples submitted to us and update our database accordingly to protect our users.

    Nick Skrepetos
    SUPERAntiSpyware.com
    http://www.superantispyware.com
     
  6. eburger68

    eburger68 Privacy Expert

    Joined:
    Mar 4, 2002
    Posts:
    244
    Manny:

    You wrote:

    The simple answer is: vendors exclude "low risk" apps because they don't think that said apps represent a significant enough threat to the user to justify detecting and removing them. It's a targeting decision, in other words -- which is to say, a policy decision.

    Let me expand on that. There is a whole mass of applications on the Net that prove controversial or contentious because there is disagreement among respected authorities as to whether or not said apps ought to be targeted, detected, and removed by anti-malware or anti-spyware apps. Different anti-malware vendors make different policy decisions on these apps -- an app that one vendor decides meets their criteria for detection is excluded by the next vendor because it doesn't rise to a sufficient "risk" level to merit detection and removal.

    Moreover, these "low risk" apps also can prove to be a headache for anti-malware vendors because customers and users have contradictory expectations wrt these kinds of applications. While one customer will scream at your tech support because your application failed to detect some toolbar that you regarded as too innocuous to be of concern, the minute you add detections for that app you'll get a whole new set of phone calls from: 1) panicked users who think you're telling them that there's a "virus" in a toolbar they voluntarily downloaded and that they really like; 2) outraged users who are furious that your app removed the toolbar they've come to like and depend on.

    See the problem? Two good examples of apps that I regard as "hair-pullers" from the very recent past:

    1) AOL's new free ActiveVirus Shield (which uses the KAV engine)

    See: http://www.dslreports.com/forum/remark,16660767

    On the one hand, it's an attractive app for many users -- KAV engine, hourly updates, completely free. On the other hand, AOL buries key mention of functionality that many users will find objectionable in the EULA, which they surely know few people will read.

    That's classic adware behavior -- sub-standard, misleading installation notice & disclosure practices. What's worse, unlike so many adware vendors, AOL is going to collect and exchange Personally Identifiable Information, not just semi-anonymous info about surfing habits and so forth.

    But is this all enough to compel an anti-spyware vendor to include the app in detections? Is it enough to include on the Rogue/Suspect Anti-Spyware page, which lists plenty of apps with similar characteristics? Many will say, "No, don't list the app because it's Kaspersky and AOL." But do AOL and Kaspersky get to play by different rules than lesser-known vendors?

    2) Windows Genuine Activation Notification Tool

    In somewhat similar fashion, MS initially pushed an app that many would regard as spyware (given the data it collected and transmitted) on users through deceptive, misleading installation practices that have been well-documented, all the while maintaining that users "consented" to the installation of this invasive software.

    Should anti-malware vendors target WGA Notification?

    Those are but two of the most recent "hair-pullers" -- there are plenty of other controversial apps out there, let me assure you. How about P2P apps? Password crackers/recovery tools? Apps that display ads, but only in the context of the application's own window (e.g., Eudora)? Are those worth detecting in all cases?

    Different vendors will come to different conclusions. Thus, it's not necessarily the soundest decision to include such apps in a test bed for anti-spyware testing, unless you're specifically committed to testing & documenting policy choices -- that is, decisions by vendors about what "threats" to target and why. Such testing differs in important ways from tests performed against apps that few would regard as "controversial" -- Haxdoor rootkits, backdoor-worms, etc.

    To return to the test itself, all too many of those apps in the test bed look to me to be the very kind of "low risk" variety that quite a number of vendors would have excluded from their threat definitions. Thus, this round of testing doesn't provide a good sense of how well the tested anti-spyware applications would actually have performed against live, kicking malware "in the wild" -- the kinds of threats that users could actually be expected to encounter and would be most concerned about.

    Sorry for the long-winded response, but there you have it.

    Best,

    Eric L. Howes
     
  7. Manny Carvalho

    Manny Carvalho Registered Member

    Joined:
    Jun 3, 2004
    Posts:
    270
    Thank you, Eric, that's an excellent explanation. It does make the point that testing with borderline apps could lead to results that are not optimal for the more "real-life" situation.

    I appreciate your thoughtful response.
     
  8. Durad

    Durad Registered Member

    Joined:
    Aug 13, 2005
    Posts:
    594
    Location:
    Canada
    If the author of this test was selecting which spyware to test, these results are possible.

    I just do not trust this test because it goes against my experience, and probably that of many others here...
     
  9. sai7sai

    sai7sai Registered Member

    Joined:
    May 3, 2006
    Posts:
    21
    Location:
    Taiwan
    All-in-one solutions are a trend, so antivirus vendors will integrate antispyware, antirootkit and so on into their products. They are not the best at it now, but you can expect them to improve (business is business).

    Usually people will comment on a new report, whether good or bad, and that is a good thing. But please think about the current comparison reports from testing organizations or popular magazines: do they publish detailed test methodologies and results?

    Cleanup testing is one method of testing antivirus or antispyware software. So far they have focused on it, and they also said the results depend on the spyware samples, which is why they need to keep testing antispyware software and publishing the reports.

    Finally, they are experts in the security area, especially antivirus, antispyware, antirootkit, antispam, antiphishing, content security, etc. Like all of you, you can continue to observe them, and you will see them contribute to this area. :D
     
  10. Chubb

    Chubb Registered Member

    Joined:
    Aug 9, 2005
    Posts:
    1,967
    Excellent explanation. It may be that spyware is somewhat different from viruses in the sense that the definition of a virus is clear and there are not many borderline cases, but borderline cases may exist with some spyware, like the WGA. The selection of borderline samples for testing may sometimes lead to strange results.
     
  11. JRCATES

    JRCATES Registered Member

    Joined:
    Apr 7, 2005
    Posts:
    1,205
    Location:
    USA
    OK, the latest test results from Malware-Test Lab have been released, and once again, Trend Micro Anti-Spyware tops the list and performed the best.

    http://www.malware-test.com/test_reports.html

    From reading Eric's posts....it seems that this outfit is a reliable testing operation. So my question is.....is it time to take Trend Micro Anti-Spyware a little more seriously, and look at this product as more of a serious contender? o_O

    I haven't tried it yet (TMAS), but if anyone has LATELY (because the results are markedly different from several months back), and have some feedback they can provide, that would be great. Perhaps they have addressed and corrected some of the earlier things that I heard about and didn't like (like the inability to "schedule scans", or the inability to "restore a quarantined item", etc.). Also, I'd really like to know how it performs against keyloggers, and see them incorporate a generic "keylogger guard" with this product like SpySweeper, Spyware Doctor and others have as well.

    But this is interesting, nonetheless....
     
  12. Chubb

    Chubb Registered Member

    Joined:
    Aug 9, 2005
    Posts:
    1,967
    Ewido came fourth this time: 62.16% :eek: :eek: :eek: :eek:
     
  13. JerryM

    JerryM Registered Member

    Joined:
    Aug 31, 2003
    Posts:
    4,306
    I wish I knew what to believe about these tests. Some say that they are not real world, and criticize the samples or numbers of samples, and the methodology.

    Are those criticisms valid, or just sour grapes? It is difficult for many of us to determine the effectiveness of anti-trojans, and anti-spyware applications. If one never gets infected he cannot necessarily assume that his AS is the primary cause, especially if he has safe surfing habits, and other good malware programs.

    So how do we "dummies" determine how effective the various anti-malware applications are? I like SAS, and also Spyware Terminator, but how do I know how effective they are? Certainly SAS has not performed well in the tests of this thread, but Nick, of SAS, disputes the validity of those tests.

    As opposed to AV Comparatives in the AV testing area, there doesn't seem to be a well/generally accepted test agency for other anti-malware applications.

    Best,
    Jerry
     
  14. eburger68

    eburger68 Privacy Expert

    Joined:
    Mar 4, 2002
    Posts:
    244
    JRCates:

    You wrote:

    This outfit's tests do have a number of things to recommend them:

    * the methodology of the testing is clearly disclosed

    * the test bed and test environment are transparently and thoroughly revealed and documented

    * test results are reported exhaustively

    * testing has been performed against a wide variety of adware, spyware, and malware, not just a narrow range of samples

    * testing is performed against "badware" in its actual environment, as opposed to simply running a scanner over a static collection of samples

    * detection and remediation are both tested, not just detection

    * files, Registry keys, and processes are tracked, not just files

    * it appears that some selection of items or traces (files, Reg keys) was performed to isolate the most critical components to test and track (as opposed to tracking every last garbage data file and Reg change)

    * testing is performed with only reputable anti-malware applications that users could actually be urged to consider purchasing

    On the other hand, though, the test beds selected by this group range from so-so to bizarre. See my earlier posts in this thread on the overuse of "low risk" apps, which might present a somewhat distorted picture of the apps tested.

    So, it's a bit of a mixed bag. If this outfit could be persuaded to become a bit more methodical and hard-headed in its selection of test bed threats, these tests would be far and away the best anti-spyware testing being done on the web.

    As to Trend Micro AntiSpyware, I have always regarded it (and its predecessor, SpySubtract) to be a quality anti-spyware app.

    JerryM:

    You might start by evaluating the tests themselves with the criteria that I list here:

    http://spywarewarrior.com/viewtopic.php?t=22210&start=0

    Some readers may not be in a position to evaluate any particular test against all the criteria listed in that SWW post, but if you can't assess a test against a particular criterion, that in itself is important information.

    Best,

    Eric L. Howes
     
  15. SUPERAntiSpy

    SUPERAntiSpy Developer

    Joined:
    Mar 21, 2006
    Posts:
    1,088
    I was asked by several forum members for my comments on the newly released Malware-Test.com tests. Here are some initial observations regarding the tests performed by Malware-Test.com.

    1. They listed the infections that they used and implied that they derived the list from the URLs to vendors’ “top threats” lists, which they displayed below the list of infections used. However, only one infection from ANY of the URLs was actually listed on their “current infection” list.

    2. They put out a call/request on their forums for infections and samples to be submitted. (Presumably vendors were permitted to submit samples.) It is interesting to note that several of the vendors ranked much higher on the list, which raises a question: I wonder if they actually submitted samples that were used in the test bed? We declined to submit samples in an attempt to keep the tests ethical and legitimate. Why they can’t simply go to crack, porn and other well known sites to install infections is troubling – this shows to me, at least, that they may not be on top of actual infections and the spyware “game” in general.

    3. The tests performed were NOT real-world tests. In the real world, infections are running and installed on the system and thus in their “native” habitat. The tests they performed, based on their reports and on the sample sets they uploaded to us after their last tests, featured samples that were not actually running/active, and many of the samples were simply installed in the folder “c:\virus”, which of course is NOT their native habitat. They claimed this was a “reporting mistake” last time, but the current report shows the same folder structure. They also renamed the files and folders of the infections – I am not sure why this was done – but all of the above is NOT what we see in the thousands of computers we diagnose weekly. All of this is contrary to the methodology they list in their report.

    4. The version of our software used in their testing, 3.2.1026, was released 6/20/2006 – over two months ago – yet they claim they check for updates before using and scanning with the products. Our current version is 3.2.1028 released in late July and features many enhancements regarding scanning and removal. Interestingly, they had listed our current Core and Trace database versions, but those were released on August 25, just before the tests were released. Thus, I am not sure how they could have downloaded the new definitions, yet ignored the updates to the product, and had enough time to run the tests, gather the results and post the findings all within a few hours of our update. This is troubling even with the time differentials. I have requested the IP they used so we can check our server logs to see if current definition updates were even downloaded and used.

    5. Malware-Test.com features Google ads all over their site – those ads are pushing software that they are not even testing, and in fact, they advertise sites known to sell “shady/rogue” anti-spyware software. If Malware-Test.com were truly interested in protecting users, they would be selective in choosing their advertisers. My concern is that the test site exists to draw traffic and earn money from users clicking the Google ads.

    If Malware-Test.com, or any 3rd party, wants to perform more accurate tests with real-world samples, they could do the following:

    1. Infect a clean system with one or more samples by visiting known infection sites, or installing software known for installing malware.

    2. Disconnect the system from the Internet so that no more samples can be downloaded and then take a snapshot of the system so that each anti-spyware software package can be tested against a real-world infected machine, and each application (package) can be tested against the same sample group.

    3. Install each package, scan the system, and see what is left over. Reset the system to the saved snapshot and repeat for each software package. There are many tools available for tracking system changes and taking snapshots, e.g., VMware, Ghost, etc.

    4. List the type of items (registry key, data file, or executable file) that were removed or missed. This is important because an anti-spyware application simply missing non-harmful registry keys and/or data files is not as critical as it missing critical and active files. Scoring should be weighted towards removing the critical components, not simply a global “%” (a sketch of this kind of weighting follows below). The fact that data files, errant registry keys, cookies and other non-critical items are not removed is not relevant, as these do not represent true infections and most of the time present no “threat” to the user's system.
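    A minimal sketch of the weighted scoring suggested in point 4 (illustrative only; the weights and the sample run below are hypothetical, not any vendor's or tester's actual formula):

    # Critical/active files count for much more than errant registry keys
    # or leftover data files.
    WEIGHTS = {"executable": 10, "registry_key": 2, "data_file": 1}

    def weighted_score(items):
        """items: list of (item_type, was_removed) pairs for one product's run."""
        total = sum(WEIGHTS[kind] for kind, _ in items)
        removed = sum(WEIGHTS[kind] for kind, was_removed in items if was_removed)
        return 100.0 * removed / total if total else 0.0

    # Hypothetical run: the active executable and one registry key were removed,
    # but a leftover data file was missed.
    run = [("executable", True), ("registry_key", True), ("data_file", False)]
    print(round(weighted_score(run), 1))   # 92.3, versus 66.7 for a flat item count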

    As an anti-spyware vendor, we have chosen to focus on real-world infections on real-world systems. We design our rules/definitions to be able to detect as many variants of the real-world spyware/malware/adware samples as we can. The testing of various anti-spyware applications needs to simulate a real-world environment. I hope that this provides forum members some food for thought.

    Nick Skrepetos
    SUPERAntiSpyware.com
    http://www.superantispyware.com
     
  16. JRCATES

    JRCATES Registered Member

    Joined:
    Apr 7, 2005
    Posts:
    1,205
    Location:
    USA
    Makes you wonder, doesn't it? I'm not really all that surprised that any vendor that didn't fare as well as hoped might or would "question" the authenticity of these tests.....but after all, it's exactly the same for all products involved. But that's why I respect Eric's opinion regarding them as well. And Eric, I do appreciate you taking the time to respond with the following:

     
  17. JerryM

    JerryM Registered Member

    Joined:
    Aug 31, 2003
    Posts:
    4,306
    Hi Eric,

    I appreciate the reply. However, there is no way I could evaluate the test as you suggest. I don't have the ability to do such an evaluation.

    I am going to have to depend upon good test organizations and the vendors. Others, such as yourself, will evaluate the methodology, and let those of us, who do not have either the time or ability, have the benefit of your experience and expertise.
    I and others do appreciate that.

    Regards,
    Jerry
     
  18. JerryM

    JerryM Registered Member

    Joined:
    Aug 31, 2003
    Posts:
    4,306
    Nick's comments make a lot of sense to me. I suspect that running tests as he has described would be more time-consuming, but I am not very sharp in this area. It does appear that the methodology used by the Malware Test-lab is not very useful in determining the effectiveness of a program for an average user in normal activities.

    It is a valid point, from an average user's viewpoint, that only tests that are truly real world are worth much. Also, I am not very interested in malware that has no impact on my system, such as the stuff that AdAware finds, and which will be removed in the general operation of the system.

    I am much more interested in the detection and removal of a rootkit than a trace left from some uninstalled program or site visited.

    So here we are. I still hope to see some real-world tests that evaluate the anti-spyware applications. I do not consider AdAware or Spybot even in the running, as they do not operate in real time to prevent infection by malware. I realize that the Tea Timer may do some of that, but I never use it.

    Best,
    Jerry
     
  19. btman

    btman Registered Member

    Joined:
    Feb 11, 2006
    Posts:
    576
    I tried trend micro anti-spyware... it seems decent.

    But I've never trusted Trend Micro as "THE BEST" after seeing its anti-virus on our computers at school.... it detected over 2000 viruses and the count kept going up... it wouldn't clean them either... and the computer worked fine too...
     
  20. eburger68

    eburger68 Privacy Expert

    Joined:
    Mar 4, 2002
    Posts:
    244
    Hi All:

    Nick raises some important issues with respect to this organization's testing. Let me gloss and respond to them:

    This would be consistent with my major complaint -- namely, that the test beds used for these tests aren't as carefully selected and evaluated as they could be and ought to be. The result is a somewhat haphazard collection of threats to be tested against, and the resulting picture can be somewhat skewed. The worst of these test beds, by far, was one of the early ones that used a random selection of apps from Spyware Warrior's Rogue/Suspect Anti-Spyware page -- the test bed consisted of apps that not even Suzi and I would necessarily recommend as valid targets for anti-spyware vendors.

    This is news to me. Sunbelt Software never received such a request, nor have we had any communications with this organization. Coincidentally, I happened to talk with a rep from another major vendor late last week. He reported that they actually had exchanged emails with Malware-Test.com, and came away with the same impression that I had of their test bed -- namely, that their selection of threats for the test bed was somewhat haphazard. Significantly, he did not report having received any request for samples.

    One other note: Malware-Test.com isn't the only organization to approach vendors for samples. Neil Rubenking, who conducts PC Magazine's anti-malware tests, does the same thing.

    I'm still not convinced that the methodology they report using (see Appendix B of the report) isn't in fact the methodology that was followed. The details of the traces they tracked (files, Registry) make it difficult for me to believe that they simply dropped samples on a drive and ran a scanner over them. It's highly unlikely that one would end up with the traces they report tracking (esp. the Registry keys) unless they actually had installed the threats. Nonetheless, that their reports have been marred by the same flaw twice in a row significantly damages their credibility and raises questions as to just what methodology was practiced for the tests.

    Can't say much to this other than that it highlights the importance of a thorough reporting of every detail pertaining to the tests so that problems like this can be spotted.

    This is a bit of a red herring. Why? Check the sites of two of the most respected anti-malware testing organizations out there:

    AV Comparatives
    http://www.av-comparatives.org/

    VirusBulletin
    http://www.virusbtn.com/vb100/index

    Notice that both carry Google AdWords ads for anti-malware apps, some of which appear on the Rogue/Suspect Anti-Spyware page. Would anyone argue that Virus Bulletin's widely respected anti-virus tests are to be considered suspect or even disregarded as a result?

    I didn't think so. Let me be clear: I'm just as unhappy with the use of Google AdWords advertising by rogue anti-spyware vendors as anyone else -- my own Rogue/Suspect Anti-Spyware page has a whole section devoted to this issue. But I can't regard these ads as sufficient reason to doubt the testing itself. The testing conducted by this organization can stand or fall on its own merits and the merits of its reporting.

    One final note: I do hope that readers are gaining an appreciation for the importance of thoroughness and transparency in the conduct and reporting of these kinds of tests. If Nick and I have been able to raise significant questions regarding these tests, it is because the organization gave us the data and wherewithal to do so. How many tests have you seen posted to the Web or published in magazines that contain nowhere near this amount of detail? All too many, yet those tests aren't necessarily any better or more credible for withholding the kind of data that we have for these tests. In fact, I've long suspected that other testers withhold this kind of data simply to deny potential critics the ability to subject the testing to the kind of scrutiny that Malware-Test.com's testing has been subjected to here. And that ought to make you wonder.

    Best,

    Eric L. Howes
     
  21. Ngwana

    Ngwana Registered Member

    Joined:
    Jul 5, 2006
    Posts:
    156
    Location:
    Glasgow, United Kingdom

    Good stuff, and how about testing in sync?

    1. COLLECTION POINT - Samples of malware are first collected by exposing a PC unprotected against spyware for at least 24 hours; the unprotected PC is used to visit known nasty sites during the exposure period. A second source of malware can be independent security researchers.
    2. TEST SAMPLE VERIFICATION - There has to be a reliable and transparent way to provide information as to what nasties are on the infected PC. The number of such malware nasties must exceed a sufficient (standard) number for the test results to be valid. Then exact copies of the infected hard disk can be sent to at least three different malware test labs (a sketch of one way to verify the copies follows this list).
    3. DIFFERENT TEST METHODS - The results from the three independent labs, each using a test method of their choice, are published for all to see.
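    A minimal sketch of the sample-verification step in point 2 (illustrative only; the directory name is a hypothetical example): publish a hash manifest of the test bed so each lab can confirm it is working from an identical copy.

    import hashlib
    from pathlib import Path

    def manifest(sample_dir: str) -> dict:
        """Return {relative path: SHA-256 digest} for every file in the test bed."""
        root = Path(sample_dir)
        return {
            str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
            for p in sorted(root.rglob("*")) if p.is_file()
        }

    # Each lab runs the same function on its copy of the infected image; if the
    # published manifest and the local one match, the test beds are identical.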

    Just wishful thinking, Ok I am going back to bed. :D
     
  22. Infinity

    Infinity Registered Member

    Joined:
    May 31, 2004
    Posts:
    2,651
    Written by Samplas (Admin @ malware-test.com) yesterday .. a reply to what a rep @ SuperAntispyware said about their testing methodology ...
    Personally, I cannot believe (after having tried it in the past) that Trend Micro AS can compete against the other AS programs mentioned in their tests ...

    of course it all depends on the malware tested ..

    like this: Malware detected: 14 detected and cleaned

    when scanned, it was 14 tracking cookies or similar ...

    when scanned for spyware like trojans or keyloggers / CWS or nasty hijackers ... I would like to see how TMAS would do with detection and, more importantly, cleaning (permanently) ...

    maybe Andreas would be so kind as to test TMAS on his collection of spyware/malware .. maybe we would get a better picture of this all ..

    best wishes,
     
  23. kdm31091

    kdm31091 Registered Member

    Joined:
    Jul 18, 2006
    Posts:
    365
    Anyone find Windows Defender's numbers higher than they thought? Maybe I'll use it again.
     
  24. emsisoft

    emsisoft Security Expert

    Joined:
    Mar 12, 2004
    Posts:
    328
    Location:
    Nelson, New Zealand
    That comparison test is crap. Why?

    Some of the tested programs use exactly the same scan engines, but the results differ by up to 10%. It's impossible to get such results.

    Therefore I could throw dice and get better results..
     
  25. JRCATES

    JRCATES Registered Member

    Joined:
    Apr 7, 2005
    Posts:
    1,205
    Location:
    USA
    So "scan engines" do NOT differ from "definition databases"?
     