http://passmark.com/benchmark-reports/ "The aim of this report is to compare the performance impact of Malwarebytes Anti-Malware when installing it in conjunction with another security product on the same system. Each test was performed with a single product installed and then repeated after installing Malwarebytes Anti-Malware in conjunction with that product." Endpoint security products were tested. It also shows results for Malwarebytes Anti-Malware for Business on its own.
Page 17 of 23: ------------------------ https://www.wilderssecurity.com/thre...hmarks-passmark-software.368733/#post-2456277
So which parts of the test have been falsified or are questionable, in your opinion, or is copy & paste plus two smilies the full extent of your thoughts?
Since they offer anti-malware that can run alongside other security solutions, it's nice to see they've decided to test its performance alongside different AV suites.
1) Malwarebytes funded the production of this report. 2) The list of products tested and the metrics included in the report were selected by Malwarebytes. ---------------------------- Same story as the other so-called "test" =
Seems like it. But of course system impact varies from system to system, so I consider performance reports a guide rather than a definite indication of performance. I've seen ESET products be heavy on some machines and light on others. I've rarely used the real-time protection in Malwarebytes, so I can't really comment on the performance of MBAM alongside antivirus software.
Used MBAM with CIS and now I use MBAM with ESET... sad to see that ESET performs kinda slow compared to Norton / Kaspersky in the tests listed on the website :/
Which metric(s) are you referring to? Could it be you're misinterpreting the results? The green bars show the result *without* MBAM. The pink/orange shows the added impact by MBAM.
Well, I feel fine with it on my system (I also run HitmanPro.Alert v3). But my current laptop, and also my previous laptop with CIS, have enough resources: Currently: Dell XPS 15 9530, 15.6" QHD touchscreen - i7-4702HQ 2.2-3.2 GHz (Haswell) - 16GB RAM - NVIDIA GT 750M 2GB - Samsung SSD SM841 512GB. Full specs & my review: http://forum.notebookreview.com/thr...haswell-2014-top-configuration-review.741648/ My previous laptop was a Malibal (Clevo) P170HM with similar specs (a stronger but older CPU, which in the end means the same CPU performance; same goes for the GPU). Full specs & my review: http://goo.gl/jCPhgo ESET feels faster to me than CIS. But it still seems to be slow compared to Kaspersky/Norton when looking at all the reviews.
You shouldn't look only at test results, especially when it comes to performance impact. You could try Kaspersky/Norton on your own system and see whether they are really quicker.
I really can't see the applicability of this test to the home user. The commercial version of MBAM was assessed alongside commercial versions of mainstream AV endpoint products. Extrapolating these results to the retail versions is like comparing apples to oranges. This report is more relevant to retail users: http://passmark.com/ftp/totalsecuritysuites-jan2015.pdf
Even more so when you factor in that enterprise versions often use a different architecture. Trend Worry-Free has a much slower dev cycle.
OK, this is not an independent test, of course. But does that mean the whole test is nonsense? No. With that in mind, we can look carefully at the details and results and see what they tell us.

So what do we get? Some meaningless performance metrics (installation time and size...) and a result that generally says "the more security products you install, the higher the performance impact". Wow.

The details show that some products impact certain actions more than others. All known. The reasons: programming quality, but often different types of implementation. Examples: stream web scanning takes longer than a simple blacklist check, so there the impact on web surfing is larger. Some products watch actions that others don't, so of course there is a performance impact in one case and none in the other. But none of these reasons are mentioned, only dumb diagrams.

The only thing that makes me smile is this observation: "These products recorded no change or an improvement in the test after installing Malwarebytes Anti-Malware". So MBAM is a booster for some software. What I really don't understand is how a company can pay for this type of testing instead of using the money to improve its products.
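The point about stream scanning versus blacklist checks can be sketched with a toy example. This is only an illustration, not how any real AV product is implemented: the hostnames, signatures, and page content below are all made up. The idea is that a blacklist check is a single set-membership test whose cost is independent of the page, while stream scanning has to inspect the downloaded content itself, so its cost grows with page size and signature count.

```python
# Toy illustration (hypothetical hosts/signatures) of why stream web
# scanning costs more than a URL blacklist lookup.

BLACKLIST = {"evil.example.com", "malware.example.net"}  # made-up hosts

def blacklist_check(host: str) -> bool:
    # One hash-set membership test: O(1) on average, regardless of
    # how large the page behind the URL is.
    return host in BLACKLIST

def stream_scan(content: bytes, signatures: list[bytes]) -> bool:
    # Must examine the downloaded bytes for every signature:
    # roughly O(len(content) * len(signatures)).
    return any(sig in content for sig in signatures)

if __name__ == "__main__":
    page = b"A" * 1_000_000  # pretend this is 1 MB of downloaded content
    print(blacklist_check("evil.example.com"))          # hits the blacklist
    print(stream_scan(page, [b"FAKE-SIG", b"DROPPER"])) # scans the whole page
```

The asymmetry is the whole point: adding one more megabyte of traffic makes the scanner do a megabyte more work, while the blacklist lookup stays constant, which is one plausible reason the web-surfing metric diverges so much between products in the report.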
Exactly. This test doesn't (and can't) rank products by their scores. I don't see what benefit Malwarebytes would gain by cheating. If product A is more compatible than product B, is that beneficial for them? And the actual results don't even allow such a simple "product A is more compatible" argument. I want to know where the need to cheat would be.
@AdvancedSetup Look at the comments above. Remember the "Is Malwarebytes still useful" thread; you nearly got my point there. Please arrange a proper detection/protection test of MBAM plus Defender against other AVs.