Ok, I am not saying that these are not somewhat reliable, but they do not really reflect real-world use of some AV products. In the end, it's about personal preference. I look at AV tests just as a guide, but I base my choice on experience. The best way IMO is to ask around, watch some reviews, see what the test labs say, and most importantly to test it yourself and build your own opinion.
I just watch them for fun. They may have an impact on my overall view of a product, but that does not mean they will change my opinion on product A or B.
Yup, I 100% trust AV-Comparatives. NOTE: I trust them; I never said that they are 100% accurate in their results. But I do believe that these tests are the best thing you can go by when selecting an AV product.
Most tests are at least worth reading, though you must be well aware of their methodology and integrity before basing decisions on them.
I trust them, but I don't pay too much attention to them, as I don't really care to see if my AV of choice has the best detection rates. All I care about is that it offers good protection and is extremely light on CPU usage; I'm not too worried how the detection rates compare with other AVs from month to month.
The results of AV Comparatives are important in the long run, say over a year or two. I haven't changed my choice of AV for more than 5 years as its results have always been acceptable. Tests don't always reflect the real performance of an AV but give an indication of its detection capabilities.
I keep up on the results, but I take them with a grain of salt. If a product like WSA cannot even be tested properly, then why would I listen to what AV-Comparatives has to say about its ability to protect the end user? I do my own testing, along with getting opinions from co-workers who work in network security as I do, not to mention some people who frequent this forum whom I have much respect for.
I chose C. They are good for a browse to see how the vendors are faring, but I do not base any choice of product on them. 1. What malware is being tested here? 2. What are the realistic chances of us coming across this malware? If the end user's chosen security software is keeping the malware at bay, then that is a good enough test for anybody.
It doesn't matter if their results are 100% accurate. What matters is that each product is tested (some aren't, because their vendors wish to cherry-pick which tests they will take) using the same methods and data. This means the user can compare the detection rates and cleaning powers of products. The comparisons are valid even though YOU may get different detection rates on your own samples. It only matters that I can see product X is better than product Z. If users say "I don't care what they say, my mind is set"... well, they clearly use different methods for decision making.
To me they're just data, which I weigh less heavily than results from trustworthy organizations, unless proven false by reproduction.