Got a question, so here is my AV selection procedure

Discussion in 'other anti-virus software' started by Kees1958, Jul 24, 2010.

Thread Status:
Not open for further replies.
  1. Kees1958

    Kees1958 Registered Member

    Joined:
    Jul 8, 2006
    Posts:
    5,857
    I got a question about how I would select an AV, along with a request to post it here to see whether it stands up against Wilders members' opinions (maybe some AV specialists will drop in).

    So here you go, feel free to join in.


    I always evaluate AV products on the following criteria:

    1. How do they perform against in-the-wild viruses?

    Compare the VB100 track record of the last two years, for instance. Although each test uses only a limited sample, a two-year period will cover several virus families, so it gives a fair impression of in-the-wild protection capabilities. Write down the top 10.
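
    To make this concrete, here is a minimal sketch of such a tally, assuming you copy the pass/fail outcomes by hand from the published VB100 comparatives; the vendor names and results are placeholders, not real scores:
    Code:
    # Tally two years of hand-copied VB100 results into a ranking.
    # Vendors and pass/fail data below are placeholders, not real results.
    from collections import Counter

    vb100_results = {
        "VendorA": ["pass", "pass", "fail", "pass"],   # one entry per comparative
        "VendorB": ["pass", "pass", "pass", "pass"],
        "VendorC": ["fail", "pass", "pass", "fail"],
    }

    passes = Counter({v: r.count("pass") for v, r in vb100_results.items()})
    top10 = [vendor for vendor, _ in passes.most_common(10)]
    print(top10)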


    2. How good are their heuristics, and how fast do they respond to fingerprints missing from their blacklist?

    The RAP test covers this fairly well. AV-Comparatives also publishes retrospective and proactive tests. Just take the top ten of each test and merge them into a single top 10.
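
    One simple way to merge the per-test top tens into a single top 10 is by average rank (a Borda-style count); a minimal sketch, again with placeholder vendor names:
    Code:
    # Merge several per-test top-10 lists into one ranking by average rank.
    # Vendor names are placeholders; a vendor absent from a list gets rank 11.
    rap_top10   = ["VendorB", "VendorA", "VendorD"]   # ...up to 10 entries each
    retro_top10 = ["VendorA", "VendorC", "VendorB"]

    def avg_rank(vendor, lists, miss_rank=11):
        # rank 1 = best; absent vendors get the penalty rank
        ranks = [lst.index(vendor) + 1 if vendor in lst else miss_rank
                 for lst in lists]
        return sum(ranks) / len(ranks)

    vendors = set(rap_top10) | set(retro_top10)
    combined = sorted(vendors, key=lambda v: avg_rank(v, [rap_top10, retro_top10]))
    print(combined[:10])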

    3. Combine the above lists into a top 10 and check for deep disinfection capabilities.

    When you are owned by malware, is the AV able to clean it up? This is very hard to evaluate as a consumer. Avast, with its safe-boot capabilities and GMER rootkit knowledge, scores very high (of the freebies); Panda, with its need for an internet connection, scores the lowest. Note that Prevx, with its specialised rootkit and behavioural engine, compensates for the cloud weakness. Without a rollback tool like Comodo Time Machine I would not use Panda (no matter how well it scores on zoo-virus-dominated tests). I am not crusading against Panda, just saying that I only install Panda on a friend's PC when CTM also works on their PC.

    Be ruthless: when they do not have cleaning-on-steroids capability, forget them. The reality of life is that 100% is unachievable, so when **** happens, choose a companion which can be of use in post-infection situations. Establish a top 5.

    4. Trial the top three. Have a look at the AV-Comparatives on-demand results. AV-Comparatives contains a lot of old (zoo) viruses, so use their scores of the last year only to prioritise your trials: the highest-scoring contender in AV-Comparatives out of your top 3 gets trialed first. I am not saying AV-Comparatives is useless, but in my experience the contenders left at this point all scored higher than 94% there. Of course, when you have selected a candidate which scores only average in AV-Comparatives, something is strange: do some research on the internet and check other testing institutes to see whether this 'image' is consistent. Preferably choose the number four when this happens and skip the ambiguous one.
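
    A minimal sketch of that prioritisation, with placeholder names and scores (the 94% cut-off is just the rule of thumb from my experience above):
    Code:
    # Order the final top-3 by AV-Comparatives on-demand score, flag outliers.
    # Vendor names and scores are placeholders, not real test results.
    on_demand = {"VendorA": 97.2, "VendorB": 94.8, "VendorC": 91.0}

    trial_order = sorted(on_demand, key=on_demand.get, reverse=True)
    for vendor in trial_order:
        note = "  <- ambiguous, research further" if on_demand[vendor] < 94 else ""
        print(f"{vendor}: {on_demand[vendor]}%{note}")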

    5. Make your final choice
    a) On disk I/O
    Unless you have at least four fast Samsung disks (a single disk with 140+ MB/s throughput and a large RAM cache) running in RAID 0, or two SSDs running in RAID, disk I/O is the largest bottleneck on any new system. So the total amount of disk I/O the AV generates (in reads and in bytes) is a fair indicator of how lean the AV feels (see the sketch after this list).
    b) On CPU spikes
    A modern 4-core CPU can easily handle a steady CPU load of 6-8% from the AV. What makes your multi-core monster feel sluggish is a high-priority process spiking up. Just using the AV in practice is the best test for this (the sketch after this list also watches for spikes).
    c) Value for money
    What are the one-time licence costs, and what are the yearly renewal costs?
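
    For a) and b), here is a minimal sketch of how you could watch an AV process's disk I/O and CPU spikes yourself. It assumes Python with the psutil package is available; the process name and the 25% spike threshold are placeholders to substitute for your own AV and taste:
    Code:
    # Sample one AV process's CPU and cumulative disk I/O once per second.
    # "avscanner.exe" is a placeholder; substitute your AV's process name.
    import time
    import psutil

    TARGET = "avscanner.exe"
    proc = next((p for p in psutil.process_iter(["name"])
                 if (p.info["name"] or "").lower() == TARGET), None)
    if proc is None:
        raise SystemExit(f"{TARGET} not found")

    proc.cpu_percent(None)            # prime the per-process CPU counter
    for _ in range(60):               # watch for one minute
        time.sleep(1)
        cpu = proc.cpu_percent(None)  # CPU% since the previous call
        if cpu > 25:                  # arbitrary spike threshold
            print(f"CPU spike: {cpu:.0f}%")
    io = proc.io_counters()           # cumulative reads/writes of this process
    print(f"{io.read_count} reads, {io.read_bytes} bytes read, "
          f"{io.write_bytes} bytes written")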


    Regards Kees
     
    Last edited: Jul 24, 2010
  2. PJC

    PJC Very Frequent Poster

    Joined:
    Feb 17, 2010
    Posts:
    2,959
    Location:
    Internet
    So, Kees, what are your top three AVs (free and paid)?
     
  3. Kees1958

    Kees1958 Registered Member

    Joined:
    Jul 8, 2006
    Posts:
    5,857
    Considering the paid products, I really have not tried enough to give a fair top three (GData, Trustport and Trend Micro seem good, but I have no experience with them and did not do any testing; the same applies to Vipre and DrWeb, which seem interesting but do not have much testing info available). So, for what it is worth, in alphabetic order (:p): A2, Eset and Kaspersky are the ones I have told friends to try when they were looking for a paid product.

    For free (same alphabetic order): Avast, Avira (if you can stand the nag screen) and MSE.

    Ironically, the latest freebie install on a friend's PC was Comodo Time Machine, Prevx Facebook Free and Panda 1.1 Free. The reason is that she runs LUA (with a pop-up for admin credentials) and I advised her to invest the extra money in a good router/firewall plus a NAS (because she never thinks about backing up data). CTM only protects the program partition, and the Windows firewall kicks in two-way when she accesses the internet away from her home network.


    Regards
     
    Last edited: Jul 24, 2010
  4. 3GUSER

    3GUSER Registered Member

    Joined:
    Jan 10, 2010
    Posts:
    812
    In my opinion (you don't need to agree or disagree with it), your way of evaluating is completely wrong.

    First:
    In-the-wild malware. OK, have a look at the VB tests, but just for the information of whether they pass or not. First, there are not that many of these threats. Second, the samples used in the test are old. Third, the scan types are just on-demand and on-access. And finally, all that is needed to pass is 100% detection of old in-the-wild samples with no false positives. Well, that is an easy task. To sum it up, this is just a test; it proves nothing about your real-world experience.

    Second:
    Heuristics by themselves prove nothing either. You should evaluate product capabilities all in one. Nowadays, in July 2010, all (or most) products rely not only on signatures or heuristics; they rely on a combination of many, many technologies.

    Many tests (even reputable ones such as VB and AVC) test proactive protection using products with outdated updates. This is wrong, because nowadays you simply can't be without an internet connection for long. And even if you have no internet, you are less likely to get infected, because today's malware is web-based, internet-borne. To test proactive detection properly, up-to-date products must be tested against zero-hour malware samples.

    Third:
    (no comment but I don't fully agree)

    Fourth:
    AV-Comparatives' on-demand tests in particular don't evaluate all of a product's capabilities. These tests (done twice per year) can again give just a rough indication of whether the product is good or not (let's put it that way).


    Something important you miss:
    1. Don't pay too much attention to tests which are not real-world type. Decide which 2-3 products you want to test on your own.
    2. Pay attention to how the product installs and uninstalls.
    3. Test the product's support (act as a customer and as a non-customer, ask various questions and expect them to answer you; check how fast and how correctly you receive an answer; check what phone/email/forum/chat support they offer).
    4. Test the virus laboratory (how promptly they respond to you about a clean file and about a new malicious code).
    5. See how a given product behaves on your own computers.
     
    Last edited: Jul 24, 2010
  5. Cudni

    Cudni Global Moderator

    Joined:
    May 24, 2009
    Posts:
    6,963
    Location:
    Somethingshire
    It is a good set of criteria. Based on a somewhat similar approach, I came to a conclusion some time ago: without knowing the user and their particular needs or computer knowledge, i.e. for an average user, there is no difference between choosing Kaspersky, Norton, Eset, McAfee, Trend Micro, Avira, Avast, MSE or a few others. Simply because, on average, any of those will protect just fine. And they do.
     
  6. ZenPirate

    ZenPirate Registered Member

    Joined:
    Aug 21, 2005
    Posts:
    18
    Generally, I look at a few key things:

    - How good is the general protection rate over time?
    - How high is the false-positive count over time?
    - Does it "stay out of the way" (pop-ups, nag screens, etc.)?
    - Over the life of the product, how many times has a signature update misidentified critical system files as viruses and hosed computers?

    So far I've come to "trust" MSE (my current AV), Avast and ESET, and if Norton continues its trend it will also regain my trust.
     
  7. 1000db

    1000db Registered Member

    Joined:
    Jan 9, 2009
    Posts:
    718
    Location:
    Missouri
    1. Determine the browsing habits of the person/people who will be using the computer. I also consider the hardware specs: new or old; laptop, netbook or desktop; direct internet connection or router. If it's a laptop or netbook, do they normally use it at home or are they mobile a lot?

    2. Try to assess, as best I can, their technical knowledge (this is easy if it's a friend or family) and whether they are willing (or in many cases able) to pay for a product. Most people, I've found, just want not to be infected and don't really want to know the technical aspects of prevention or disinfection.

    3. For people with little computer knowledge, a direct internet connection, and multiple users that include kids, I would normally suggest Eset SS or Kaspersky IS for paid products, due to their consistent performance and reputable companies. For free products I would suggest Panda Cloud, MSE or Avast.

    However, a few times I have installed a couple of the products mentioned above (at different times, of course) and the machines still became infected, even with a decent software firewall (Online Armor or Private Firewall). So after cleaning the machine up, I'll mention the value of non-traditional security products like DefenseWall, Sandboxie and AppGuard. None of the people I have installed any of these products for have called me back for a clean-up. With a brief tutorial on how to properly use the supplemental product, these people remain infection free.

    Lately I have been convinced of the value of products such as Prevx, AppGuard and DefenseWall. I suggest using a traditional AV as well, simply for the level of trust regarding new files. In short, I consider the people, the use, the machine and the cost.
     
  8. Kees1958

    Kees1958 Registered Member

    Joined:
    Jul 8, 2006
    Posts:
    5,857
    Yep, it is a small sample and the criteria are ridiculously rigid. Have a look at http://www.wildlist.org/WildList/ and see which organisations and/or professionals have submitted samples. Also note the differences per month. See http://www.securecomputing.net.au/News/214420,vb100-test-blasts-windows-server-security-vendors.aspx and ask yourself why it is newsworthy in a security magazine that a quarter of the field did not pass. So to me this list does not appear to be 'old' or 'static' (if it were, why would not everybody get a pass, I ask). For me, a company that has built a reputation of passing the VB100 test surely earns credits.


    To me the RAP test tells something about heuristics and generic/family fingerprint capabilities. The same goes for AV-Comparatives' proactive and retrospective tests. For new threats, the higher the quality of the heuristics/generic fingerprints, the higher the chance your AV will recognise them. So to me it is very relevant. This is endorsed by the value some vendors give to the RAP test, for example http://www.sophos.com/blogs/sophoslabs/?p=9330 and http://sunbeltblog.blogspot.com/201...&utm_campaign=Feed: SunbeltBlog (Sunbelt BLOG)

    Well, I value the RAP and proactive/retrospective tests more highly. The on-demand test is just for prioritising, and acts as a sanity check (in case one of the candidates performs miserably on it).

    Ad 3.
    Is it reasonable to ask an AV vendor to respond to all your questions fast? Remember, a yearly licence fee may cover only 15 to 20 minutes of US/European hourly labour costs. This implies that once they have answered you three times, they will not make a profit on your licence!
    Ad 4.
    How many AV users are hit by an unknown new virus? Is this real-world practice or are you pulling my leg?
    Ad 5.
    That is my final point, so we agree on this one :D
     
  9. Kees1958

    Kees1958 Registered Member

    Joined:
    Jul 8, 2006
    Posts:
    5,857
    For the sake of discussion I would like to disagree, but what you are posting makes sense. ;) So: agreed.

    It is what Cudni said: without knowing the user/usage circumstances. :D Ah, found an argument for discussion's sake ;): telling them how THEY should evaluate and acquire information is a better approach than doing the user/usage analysis yourself. Simply put: when you do it for them, there is little knowledge transfer to the one you are helping. When you teach them the process, they end up with more security awareness. :p
     
  10. 1000db

    1000db Registered Member

    Joined:
    Jan 9, 2009
    Posts:
    718
    Location:
    Missouri
    I agree that they should learn, as knowledge can prevent many infections, and in the end they end up smarter users. In the words of Matthew Henry, "Those that would learn must first see their need to be taught." Many people I have worked with appreciate the explanations surrounding infection and prevention but do not put into practice what they learn. They simply want their PC to work for them (I think this is part of the reason Macs are so popular, but that's for a different discussion). I suppose they figure that that's what I'm for: to clean their PC when things go awry.

    Picking an AV product is really only part of the solution. The end user needs to understand the importance of OS updates, user rights and even browser choice. So I do agree that knowledge imparted to the end user(s) is very important, but sometimes there is a need to save users from themselves concerning the overall security strategy.

    For example, I have a friend who teaches auto mechanics, and he ends up fixing all of my vehicles. The most important thing he has taught me is how to recognise real problems and when problems are "out of my league." I trust him because he knows vehicles much better than I do. I don't want to learn to fix my own car; I just want to know how to determine the appropriate response when something breaks. Likewise, my friends (and my employer) trust me with their computers and their data protection, so I educate them according to their "need to be taught."

    Always a good discussion when you're involved Kees. :thumb:
     
  11. Techfox1976

    Techfox1976 Registered Member

    Joined:
    Jul 22, 2010
    Posts:
    749
    I'm going to see if I can go through this whole post without naming any names, especially since I should disclose that I currently work in the industry in a technical capacity and have worked for various AV companies in the past.

    In practice, nothing will be 100%. Everybody here has done a good job of realizing this, for the most part. In reality, they all do about the same in sum, just with different strengths and weaknesses: better heuristic detection at the cost of more FPs and/or higher system impact; aggressive cleaning capability at the cost of much greater potential to damage a system; lower system impact and footprint at the cost of catch capability.

    Therefore, looking at the end user is going to be the main point. What features are going to benefit them the most, what downsides are going to hurt them the least?
    Does the person keep their unholy trio religiously up to date?
    Do they want to know every time their security software does something in big bold obviousness, or do they want it to work silently in the background?
    Do they need excellent mitigation or is a small footprint going to serve them better?
    If (when?) they get infected, are they going to be fine taking a whack at it alone or with slow email responses, or will they want more options for assistance?
    Do they really appreciate talking to a support person in the US, or do they enjoy finding out what the weather is like in India?

    Everybody's needs are different. Every security vendor is a mix of so many aspects that nobody can really make One Method to Choose Them All. One person's process for choosing will result in catastrophe for another person.

    For example, my method involved a few simple steps.
    1: Do they have a free trial or some free way to test the product?
    2: What is their score on the vacuum test, weighted for approximate user base if necessary? (See the sketch after this list.)
    3: What is their detect rate on my stack of CDs and DVDs of obscure malware ranging back as far as 1997?
    4: On a sandbox machine, how do they do at mitigation of a "lightly aged" sample ("found this in use a week ago")?
    5: How well does it do against live drive-bys? In my case, I am religious with updates and patches, so I don't test against drive-bys with old unholy-trio versions.
    6: What downsides do they have? Nags? Alerts all the time? Slowness? Are these made up for by the benefits?
    7: If I call their support (Can I call their support?), what country am I speaking to? What options do I have if I am infected anyway? How well and quickly do those options work?
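
    For point 2, a minimal sketch of what weighting for user base could look like: dividing a raw detection-event count by the approximate number of users, so that big vendors are not penalised simply for being big. All names and numbers here are made-up placeholders, and the normalisation itself is just one assumption of how such a weighting might work:
    Code:
    # Weight raw "vacuum test" scores by approximate user base (lower is better).
    # All vendor names, scores and user counts are made-up placeholders.
    raw_scores = {"VendorA": 2400, "VendorB": 90, "VendorC": 1100}
    user_base  = {"VendorA": 120e6, "VendorB": 3e6, "VendorC": 40e6}

    weighted = {v: raw_scores[v] / (user_base[v] / 1e6)  # events per million users
                for v in raw_scores}
    for vendor, score in sorted(weighted.items(), key=lambda kv: kv[1]):
        print(f"{vendor}: {score:.1f} per million users")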

    I made use of one company's products for a number of years. In October 2008 I renewed their 2006 product, because I was not fond of the newer products' system impact and alerts. At the end of December 2008 the product alerted me that it was no longer supported, that I would not get a refund on my October renewal (because it was past 30 days), nor have the remaining time shifted over: now go buy the 2009 product. After some substantial hassle I convinced them of my side of this, and they shunted the 10 months over when I purchased the 2009 product.

    I installed the 2009 product and it did an initial scan. It proudly proclaimed that it had found and removed some malware for me! My computer was now safe! Wait... Malware? On my system? Oooooooooh... It found the samples folder on the E drive (Not an optical drive, mind you)... It found several of the samples that I had submitted to them back in 1999. It only took about nine years to detect them.

    I had them scan about a dozen of my sample discs. Detection for anything over 3 years old: 67%.

    One refund (after a LOT of hassle) later, I was looking for new security software.

    Found a nice little company. VERY fast, low-impact product. Cool! Detect rate on sample discs over 3 years old: 87%. Not optimal, but the speed of the software made up for this, since newer discs had over 99% detect rate.

    Found some live samples from phishing missives a week or two old. No immediate detect? Hmm... Well, the realtime protection on this package does act somewhat oddly. Okay, let me put this on my sandy laptop...

    Copy the malware. No detect. Right, I'll run it then.
    "ALARM! I see malware in that file! LOOK at that! It's a TDSS rootkit! It's running! It's installing itself! It's installed! Oh Noes! It's..." BSOD, reboot. Sandy laptop now carries a TDSS rootkit and network stack insertion. Security software is still running but doesn't see a thing. Wow... It SAW it, but didn't stop it. Touch of death.

    Reimage the laptop.

    Poke around at some more things. Find a company that has had good reviews and a surprisingly low score on the vacuum test (most "good" software has a score in the low thousands; this one scored under 100 at the time. Note that lower is better in the vacuum test). Researched some factors, found the common misconceptions that, when taken into account, made it look even better. Fine. Money-back policy? Works for me. Buy!

    The detect rate on old malware discs was surprisingly high: 98%. Impact was relatively low. It also did not detect the multi-part dropper that gave the other program hassle, but as soon as the TDSS installer fell out of it, *BLAM*, it was stopped in its tracks. The network stack injection was caught by heuristics.

    There were sometimes some relatively small alerts that would go away on their own after a few seconds, but I could still look at the history. Tech support was in the US. Overall it was an excellent balance for my needs, so it ended up being my package of choice. On a whim, I applied for employment there about six months later and have worked there since.

    It's hard to find somebody who supports a company because they like the product, rather than supporting the product because the company would fire them if they badmouthed it. Is the product perfect for everybody? Definitely not. But it's a great product for my needs and getting better as the company makes a few changes in direction, so I'm happy.



    Regardless, the main thing to consider is that usually, somebody else's method of selection will not meet your own needs. Pick and choose. Find your own specifications and look at what is important to you. Other people have crowed about certain products, but those products have deal killers for me. That doesn't make those products bad. It just means they are not for me. They are perfectly fine for other people who find that feature/downside set better for them.

    Look for new ideas on testing, but take it all with a grain of salt. Is it worth it to you to get the absolute fastest and smallest impact if the detection rate stinks or it doesn't actually block anything? Is it worth it to get the best detection if it's going to blow up your OS about twice a year? All a big balance.
     
  12. Kees1958

    Kees1958 Registered Member

    Joined:
    Jul 8, 2006
    Posts:
    5,857
    Let me guess: an exhaust producing 1000 decibels is one of the clear signals?

    Thanks - we agree; possibly also on open exhausts on motorbikes/cars :D
     
  13. dw426

    dw426 Registered Member

    Joined:
    Jan 3, 2007
    Posts:
    5,543
    My selection procedure is as follows:

    Step 1. Log in to Wilders.

    Step 2. Go to whichever forum talks about the kind of software I'm looking for.

    Step 3. Search for posts referring to possible vendors I have in mind.

    Step 4. Weed out the useless, BS posts containing fanboy praising and opinions on why one is better than another without providing one iota of data to back up the yapping.

    Step 5. Look for an agreement between at least 4 well respected members regarding the effectiveness and stability of a product.

    Step 6. Use said product.

    If the product goes all wonky on me:

    Step 8. Come back here and blame well respected members for said wonkiness...I'm kidding :D
     
  14. 1000db

    1000db Registered Member

    Joined:
    Jan 9, 2009
    Posts:
    718
    Location:
    Missouri
    Yes, a 1000 dB blast of exhaust is a clear sign of engine trouble! Not to mention the destruction of the universe.

    OK mods, I promise to stay on topic now.
     
  15. Cudni

    Cudni Global Moderator

    Joined:
    May 24, 2009
    Posts:
    6,963
    Location:
    Somethingshire
    Please, the noise ratio is way too high ;)
     
  16. doc77

    doc77 Registered Member

    Joined:
    Jun 10, 2010
    Posts:
    55
    I thought about making a thread similar to this, to try to advise what to look for when choosing an AV and what info about AVs we should share to make the choice easier. Here's my list of what I want to see in a complete AV review; the following are also the kinds of things I look for when reading about various AVs on this forum (some of which aren't discussed).

    1. Detection - I trust apps with a fairly large user base that back up the program's real-world use, and online tests like AV-Comparatives can back up these user claims. This info is easy to find and commonly discussed.

    2. Resource usage - This encompasses a lot of things which aren't often discussed in detail (see the sketch after this list):
    - does your system feel noticeably slower, less responsive, longer to boot, etc. after installing?
    - how many startup items does it add, and what is each startup item called?
    - how many services does the program install, and what is each service called? Can they be disabled/enabled easily, essentially tweaking the AV into an on-demand scanner?
    - memory usage of each process? Total memory usage? I/O usage?

    3. Features, reliability/stability - often these are discussed in detail.
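
    For the resource-usage points above, a minimal sketch of how you could collect the per-process memory and I/O numbers yourself, assuming Python with the psutil package; the "avg" name filter is a placeholder for whatever your AV's processes are called:
    Code:
    # Sum memory and show cumulative I/O for processes matching a name filter.
    # "avg" is a placeholder filter; substitute your AV's process names.
    import psutil

    FILTER = "avg"
    total_rss = 0
    for p in psutil.process_iter(["name", "memory_info"]):
        name = (p.info["name"] or "").lower()
        mi = p.info["memory_info"]
        if FILTER not in name or mi is None:
            continue
        total_rss += mi.rss
        try:
            io = p.io_counters()
            print(f"{name}: {mi.rss / 2**20:.1f} MiB RAM, "
                  f"{io.read_bytes / 2**20:.1f} MiB read, "
                  f"{io.write_bytes / 2**20:.1f} MiB written")
        except psutil.AccessDenied:
            print(f"{name}: {mi.rss / 2**20:.1f} MiB RAM (I/O needs admin rights)")
    print(f"total working set: {total_rss / 2**20:.1f} MiB")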
     
  17. Techfox1976

    Techfox1976 Registered Member

    Joined:
    Jul 22, 2010
    Posts:
    749
    I hate to say it this way, but trying to listen to the user base is not useful in most real-life cases. You normally get the two extremes shouting about it, and that's about it. So either you hear fannish drivel that is less than helpful in many situations, or you hear the people with a thorn in their finger who have bile to spew over it, and that is less than helpful as well.

    The people for whom Things Just Work Right™ (joking on the trademark, mind you) are the best indicator of a good system, but also the least likely to say anything about it. People complain when things break and crow when they are fanbois; satisfied customers just stay quiet.
     