Discussion in 'other anti-virus software' started by Esse, Jun 7, 2012.
Thanks. Interesting read. I agree with most of their comments, though I find the same faults with Comodo (yes, hard to use, but lighter than they say) and Panda that they did.
Thanks for the link,
but the results differ from the latest AV-Test.
Avast is great at cleaning in this review, while its repair ability was rated poor by AV-Test.
Check the "Protection" scores on the sites... they're quite different.
Look here:
These obvious differences between the results make it difficult for users to make the right choice.
They all claim they are independent,
so users don't know which one to trust.
Yep, I agree. That's why besides reading the reviews, I do my own testing.
I want to know more about the sample set used. A score of 3 for Panda's real-time protection seems off compared to other testing sites.
I am pretty disappointed with this test. It almost seems they had the ratings in mind beforehand and just gave a small note for each AV.
AVG has been a winner in recent tests, and I'd like to know if it's really as easy to disable its installation on an infested PC as they claim.
Though I agree, the ID-Theft component does take some RAM. After disabling it, my boot was faster and the system generally lighter.
I agree, and I am amused by the "Active" thing; they in fact test web filtering, and I wonder if they enabled Comodo DNS during the installation of CIS, or installed the Panda URL filter...
I'm surprised by Panda's result; it was the only one that removed a piece of malware where MBAM and HMP failed.
Also, they grade usability while testing a HIPS...
And honestly, who has 25,000 malware samples on a drive...
Trust only your own real-world testing; the rest is nothing.
Wow. I think the report on Panda was the smallest of all of them. I find that Panda usually does better in tests than some of the others. I'm not really sure if they were able to use an internet connection.
Also, what is Ultra Virus Killer? It's under the complimentary section. Has anyone ever heard of it?
This is pretty much the complete opposite of what recent AV-Test and AV-Comparatives tests reveal.
I wonder what testbed and source they used for their tests and if they allowed connectivity to cloud-scanning during the malware execution and on-demand tests. Their published methodology doesn't specify this but it definitely looks like they didn't.
Wow, I think Panda wasn't given a fair evaluation. Those tests are definitely wrong. I think Panda is extremely good at detecting threats.
It is an efficient manual malware removal and system repair tool.
Agree with your comments.
Seems to me that results from different tests are more and more conflicting; as a result, users are left confused about what to choose. AV-Comparatives remains the only test organization with some kind of credibility.
MSE is doing very well. I tested it in my VM and I run it on my main machine; in combination with MBAM and HitmanPro you can rely on MSE, and it's the lightest of them all.
I used to visit TomsHardware frequently a few years ago, but since they changed their website design I feel their reviews have declined too. (I mean the quality of the reviews in general; this is not specific to this review.)
Well, I'm so concerned about how badly Panda performed in this test... that I reinstalled it on my machine yesterday!
Test methodology: http://www.tomsguide.com/us/best-free-antivirus-2012,review-1767-2.html
However, it does not mention whether Panda had access to the internet during testing.
This is ridiculous. On-demand scan time is the least important performance indicator for a real-time AV solution. It might be the most important for an on-demand scanner such as MBAM or HitmanPro, but not for a real-time AV.
* Does not say whether cloud connectivity was allowed during these parts of the testing.
* Does not disclose how the testing of "threats of different time periods" was performed. Was this offline?
* Does not disclose whether testing of "other protective features of each program like sandboxing" also includes testing of URL filters, as in the case of Panda.
Scanning and Cleaning
* Does not disclose if products were allowed connectivity to cloud-based scanning.
* Does not disclose if default installation was used for all products.
* Does not disclose whether additional default tools, such as the URL filter in the case of Panda, were installed.
* Does not disclose source of testbed.
* Does not disclose how each sample of the testbed was verified to be working and malicious.
* Does not disclose if particular vendor(s) provided samples or URLs.
* Did not provide missed samples to vendors.
* Does not provide a contact to ask for missed samples or for verification of results.
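Just to illustrate what "disclosing the testbed" could mean in practice, here is a minimal, purely hypothetical sketch (not TomsHardware's actual procedure; the function and directory names are my own invention): hash every sample in the testbed so results are reproducible, and track which hashes a product missed so they could be shared with vendors on request.

```python
# Hypothetical sketch of basic testbed bookkeeping: a hash manifest of all
# samples, plus the set of samples a product failed to detect. This is NOT
# the review's methodology, just an example of the kind of disclosure
# a vendor would expect from a serious test.
import hashlib
import os


def inventory(testbed_dir):
    """Return {sha256_hex: path} for every file under the testbed directory."""
    manifest = {}
    for root, _dirs, files in os.walk(testbed_dir):
        for name in files:
            path = os.path.join(root, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            manifest[digest] = path
    return manifest


def missed_samples(manifest, detected_hashes):
    """Samples present in the testbed whose hashes a product did not detect."""
    return {h: p for h, p in manifest.items() if h not in detected_hashes}
```

With a manifest like this published alongside the review, a vendor could at least request the exact samples it missed and verify that each one is a real, working piece of malware.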
There's a lot more documentation over at www.amtso.org to check this published methodology against. But from a vendor's perspective, this test and methodology are certainly lacking a lot of basics. I simply do not believe that AVG got 3/5 and Panda got 2/5 on detection while Avast got 5/5... ridiculous, unless the methodology and source of malware are made clearer.
It is not allowed, and I don't want to start a debate here about Panda Cloud AV's capabilities. I invite everyone who thinks that Panda Cloud is such a great antivirus, and believes the AV-TEST results and the TomsHardware test are wrong, to go on YouTube and search for Panda Cloud Antivirus.
WOW, with such a powerful argument there is simply no possible counter-response.
I agree. I never run scans with my AV anymore; I use things like MBAM on demand. I don't care how long it takes for my AV to scan. I care more about disk I/O, CPU time, RAM, etc...
Perhaps TomsHardware test was done on the cheap by YouTube testers.