Discussion in 'other anti-virus software' started by SergM, Feb 8, 2010.
As of February 2010 the report is available only in Russian.
From Russia with Love
In the first message the link was incorrect. I have corrected it.
According to this test, none of the tested products could yet detect or remove Sinowal. If the results are to be trusted, this test emphasizes the importance of prevention.
any particular reason for using avast 4.8 instead of v5 ?
Very good result for Dr. Web
I really do not get it. How can someone take a bunch of new products, test them against a bunch of old products, and declare that abcd won and wxyz lost? It is quite beyond logic.
Comodo 3.9. The current version is actually 3.14.
How can a test be credible when you take the current technology of some products, pit it against last year's technology of other products, and still claim an unbiased result?
Isn't it kind of odd that the only two Russian companies got the two top spots in this Russian test?... just an observation
Kaspersky is always the winner in these Russian tests.
What are you looking at? This is the list in the report ...
The test involved antivirus products from 17 manufacturers, including:
1. Avast! Professional Edition 4.8.1368
2. AVG Anti-Virus & Anti-Spyware 188.8.131.52
3. Avira AntiVir PE Premium 220.127.116.11
4. BitDefender Antivirus 2010 (22.214.171.1245)
5. Comodo Antivirus 3.13.121240.574
6. Dr.Web Anti-Virus 5.00.10.11260
7. Eset NOD32 Antivirus 4.0.474.0
8. F-Secure Anti-Virus 2010 (10.00 build 246)
9. Kaspersky Anti-Virus 2010 (188.8.131.526 (ab))
10. McAfee VirusScan 2010 (13.15.113)
11. Microsoft Security Essentials 1.0.1611.0
12. Outpost Antivirus Pro 2009 (6.7.1 2983.450.0714)
13. Panda Antivirus 2010 (9.01.00)
14. Sophos Antivirus 9.0.0
15. Norton AntiVirus 2010 (220.127.116.11)
16. Trend Micro Antivirus plus Antispyware 2010 (17.50.1366)
17. VBA32 Antivirus 22.214.171.124
I accept that Avast 4.8 is not the latest version but Avast 5 has been out for how long now? As each vendor introduces their next year product at different times it is difficult for any tester to be always using the very latest build. However, unless the build is very ancient then it still represents a reasonably valid test.
Dr.Web came out on top alongside Kaspersky, and Dr.Web 5.0 is the current version, not the new version that is in development. By that logic you could count the current Dr.Web as one of your 'old' products too.
Either it's a conspiracy, as you could be hinting, or perhaps those two products just happen to be the best. I doubt that the test was sponsored by either of those companies.
I'm somewhat sceptical about these results. If I were to take a guess, I'd say these tests are based mainly on Russian malware, which is what the Russian AVs specialize in. To have a comprehensive test you need to make sure you're getting malware from all parts of the world.
IMO the only conclusion to draw is that the 'winners' in this type of test are Macrium, Acronis & Paragon.
With all due respect, I do not think you even know what you are talking about. You mentioned software release schedules, so what about them?
The test was conducted on 11/26/2009 and during that time most security vendors, to my knowledge, have already released their 2010 offerings. Here is the proof:
Kaspersky 2010 was released on the 24th of June 2009
BitDefender 2010 was released on the 17th of August 2009
Symantec released its Norton 2010 products on the 9th of September 2009
F-Secure released its 2010 products on the 3rd of September 2009
TrendMicro released its 2010 products around the 27th August 2009
Now, the test was conducted on the 26th of November 2009, when most security vendors, at least the ones whose products were tested, had already released their 2010 products. The fact is that the tester willfully included current products for some vendors and old products for others, while he could have used the new product for every vendor had he wished.
To me such a test is pure rubbish, since the tester was not fair to all products that were tested. A good example is comparing KIS 2010 with NIS 2009, when NIS 2010 includes a new technology by the name of "Quorum" and the improved "Sonar 2". The only explanation I can find is that the tester wanted Kaspersky to beat Norton, period. To me this is bias of the highest order.
Am I missing something? Because the test results page lists Norton 2010 as well as the 2010 versions of the other software you mention. It also lists the correct version of Comodo, so what do you mean?
Also, just because a product implements a new technology does not mean that said product will automatically be better. Both are great products, and there are always doubts associated with any test.
Congratulations, you've gone astray multiple times.
First you miss the test the topic points at.
It is this one:
Not this one:
Second you miss the testing period of the test you whine about.
It was performed from July 7 to October 22 2009, like you can read here:
Now compare again with your list...
A lot of fuss about nothing.
LOL I thought so.
You didn't observe properly - http://translate.googleusercontent....le.com&usg=ALkJrhjBgqAlQXgqpe6njXKZfzdOmhNjJw
... Neither Kaspersky nor the Doc got the first 2 spots in all the tests.
Probably correct for on-demand/dynamic tests... although this was a test of removal, so I presume all AVs detected it at first. If they detect it, they should be able to remove it, regardless of where the sample was from - although quality of removal procedure may vary depending on the prevalence of the threat.
Gotta love the good old cold war thinking of some people.
Tests from the east, mainly China or Russia = bad, unreliable, and requiring a "sceptical approach." The methodology is clearly explained, the malware names are exact and nothing is hidden. The reports from anti-malware.ru have always been well written.
So what's the problem? This test only evaluates the quality of the technology "under the hood." In these situations people usually start building defences and talk about prevention, but the question in my opinion is: how can some antivirus protect you against malware if it doesn't even have enough muscle/brute force to eradicate it from your system?
To clarify: How effective are those "weak" antiviruses in withstanding a real attack even if it's detected by a signature. Will it show an alert that a virus was detected and then "disappear" from tray because the malware simply removed the av?
The malware fixing forums are full of these "I got a virus alert but then my antivirus x, q, or z said it cannot be removed and then the av disappeared" -type of threads.
Removal capabilities, self-defence, rootkit detection and other "difficult" stuff only tell you how advanced the low-level technology implemented in the AV software is.
Nice test, and I'm 99% confident in it.
By that time Dr.Web and Kaspersky had already shown the ability to clean malware properly. Russians do know how to make security software.
I feel a little bad that BitDefender failed the test, but it's nothing surprising; cleaning was never a strong spot for it (even when it was working).
Somewhat surprised by Microsoft's result, but wait, it still has the old cat-eyed RAV code under the hood.
I also like the fact that Avira failed the test.
Why you hating playa?
Don't feel bad. In my start-of-the-year prediction thread I said Avira was going to take a tumble this year in its detection ability, and this is starting to prove me right.
Yeah, in terms of detection Avira has also let me down (its beta v10 had an unsatisfactory detection rate and it relies solely on its web guard).
I don't know who this directed to, but hopefully it wasn't me. I was only making a suggestion based on a possible assumption. I have no issue with trusting these results.
I feel badly, because I just gave my daughter Avira Premium. How about a prediction that will give the results I want?
I also am giving her a license for MBAM to run alongside.
Although I do not dismiss this test, the comments indicate why I only really trust AV-Comparatives -- and you.