Comodo gets 100% in Protection nearly every time, but it usually can't get 6.0 points because of its lower detection rate (I don't know why they include detection in the protection score, but that's how it is).
Testing organizations should add another component to their tests: "Buggy Software" (OK -- Software Reliability). I see at least two "Top Products" that, from my experience, would be knocked down a few notches in the rankings.
I found the trick with Comodo is to use only the firewall. It's the AV component that sucks resources, and it isn't any good anyway. Just use the firewall with auto-sandbox, cloud lookup, etc., and it runs pretty light. I have been using it on a couple of machines for the past few weeks and find it is MUCH less noisy than it used to be. You do have to turn HIPS off or run HIPS in training mode for a week or so, though; otherwise that component is still verbose.
"Industry average" of "Protection against 0-day malware attacks, inclusive of web and e-mail threats (Real-World Testing)" is "100%" huh...
McAfee seems to be pretty decent in the performance category in both the AV-TEST and AV-Comparatives tests. That's a lot better than it used to be; Intel has made some improvements. It used to be a resource hog.
Looks like Intel and McAfee want to wash their hands of each other, so McAfee's future scores might be interesting. McAfee has reportedly already spun off from Intel as an independent venture. McAfee's founder has some sordid background, and that detracts further from any value of its product, at least in my eyes. https://techcrunch.com/2017/04/03/intel-security-is-mcafee-again/