The Death of Anti-Virus: conference paper

Discussion in 'other anti-malware software' started by SweX, Dec 20, 2013.

Thread Status:
Not open for further replies.
  1. SweX

    SweX Registered Member

    Joined:
    Apr 21, 2007
    Posts:
    6,429
  2. Hungry Man

    Hungry Man Registered Member

    Joined:
    May 11, 2011
    Posts:
    9,146
    It's not the death of AV. It's the death of reactionary technologies, the assumption that we can somehow protect users by looking at how they've been attacked after the fact.

    But just as the people who write AVs can't manage to get ahead of attackers, neither can anyone else currently (at least in the mainstream) - thus, they persist. Because no one else is going to pick up the slack.

    The paper's main contention is with the conflation of AV with 'signature based security' for lack of a better term. That's really their own damn fault - you're the ones who turned AV into a marketing term...

    Regardless, while AVs have certainly moved on from signatures, as they put it (no longer relying solely on them), there is very little in an AV that isn't based on MASSIVE collection of malware samples and the reversing of those samples.

    All of an AV's security (for the *most* part, the core of it) is based on reactions, and analysis of live malware - that's not good.

    Basically, massive companies like McAfee and Kaspersky have malware analysis labs. They collect *hundreds of thousands* of unique malware samples, and sift and analyze them with YARA rules, reverse engineering, and multiple stages of automated analysis.

    You know what you need for all of that to work?

    Samples.

    And therein lies the problem, and it will remain that way for some time.
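
    To make that concrete, here is a minimal sketch of the automated sifting stage, using the yara-python bindings (pip install yara-python). The rule file and sample directory names are hypothetical placeholders, not any vendor's actual pipeline:

        # Minimal YARA triage sketch: compile a rule file once, then scan a
        # directory of collected samples and report which rules each one matches.
        import os
        import yara

        rules = yara.compile(filepath="rules.yar")   # hypothetical rule file

        for name in os.listdir("samples"):           # hypothetical sample dir
            path = os.path.join("samples", name)
            matches = rules.match(path)              # list of matching rules
            if matches:
                print(name, "->", [m.rule for m in matches])

    Real lab pipelines chain stages like this with unpacking, sandboxing, and clustering, but the dependency is the same: no samples, no rules, no matches.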
     
  3. RejZoR

    RejZoR Lurker

    Joined:
    May 31, 2004
    Posts:
    6,426
    Not this crap again. They were saying antiviruses are dead like 10 years ago and yet, they are still here and still effective. In fact, I think they are currently by far the most effective. New cloud technologies, in connection with the existing ones, are really giving malware writers a proper challenge, and while anything can get bypassed, the time a bypass stays effective is greatly reduced. And by greatly I'm talking minutes here.

    After all, antiviruses have always been around to greatly reduce the risk of getting infected; no one ever stated they are 100% protection. Airbags, seatbelts, and inoculations against real diseases never provide 100% protection either, and the same applies to antivirus software. But some still fail to understand that.

    @Hungry Man
    McAfee and Symantec are huge as companies, but their labs are rather small. And avast!, for example, even though it has grown considerably, is still a small company, yet they also process thousands of samples automatically with their systems. Most of that goes to cloud processing and Evo-Gen detections. Doing it by hand is just not efficient anymore.
     
  4. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    Trying to analyze and create detection/removal mechanisms for a potentially infinite quantity of malicious samples is not only inefficient, it's an exercise in futility. It's no wonder that AVs are moving their detections to cloud servers; the databases are getting too big to keep on individual PCs, regardless of how they're compressed. Taking all of the code a user downloads, executes, or contacts via websites and comparing it to ever-growing databases of potentially unwanted code is inefficiency taken to the extreme. With AVs needing almost constant interaction with these servers and being almost totally dependent on them, those servers are for all practical purposes part of your attack surface, a part that you can't secure or control in any way. Your security package becomes almost totally dependent on the internet, whose very structure is insecure.

    As long as we have unskilled users running operating systems that permit most anything to run, the problem will continue to get worse.
     
  5. Hungry Man

    Hungry Man Registered Member

    Joined:
    May 11, 2011
    Posts:
    9,146
    @Rejzor,
    Not sure where you're getting that from. They have multiple research divisions, with specific teams dedicated to all sorts of things. They acquire other companies and absorb their teams as well - McAfee has absorbed a couple of malware analysis companies over the years.

    Doing it by hand would be terribly inefficient.
     
  6. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    I wish it were so. Unfortunately, there's too much money to be made with reactive solutions. If someone did manage to develop a completely proactive solution that was effective against known and future threats, they wouldn't be able to make a living from it; it would be a one-time sale. Most everyone who has tried has either sold out to a larger corporation, disappeared, or gone broke.
     
  7. zapjb

    zapjb Registered Member

    Joined:
    Nov 15, 2005
    Posts:
    5,556
    Location:
    USA still the best. But barely.
    I feel much safer now than 10 or 15 yrs ago.

    I'd like to take all the credit for that. Smarter, experienced, blah, blah, blah.

    But I don't think I can even claim most of the credit.

    Free or low-cost, widely available security solutions today are quantum leaps ahead of yesteryear's solutions.
     
  8. wat0114

    wat0114 Registered Member

    Joined:
    Aug 5, 2012
    Posts:
    4,065
    Location:
    Canada
    Default-deny (whitelisting) or MAC
    script control
    least user privileges
    backup, backup, backup

    That's where it's at (a toy sketch of the default-deny idea follows below). I suppose AV could be used for on-demand scans, but just obtain files from trusted sources and all is well.
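
    As an illustration of the default-deny idea (a toy sketch, not any product's actual method): launch a program only if its SHA-256 digest is on a locally kept whitelist. Real implementations like AppLocker or SRP enforce this in the kernel; a user-mode wrapper like this is trivially bypassable and only makes the policy concrete. The file names are hypothetical:

        # Default-deny sketch: refuse to run anything whose hash isn't whitelisted.
        import hashlib
        import subprocess
        import sys

        WHITELIST = "trusted_hashes.txt"  # hypothetical: one hex digest per line

        def sha256_of(path):
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
            return h.hexdigest()

        def load_whitelist(path):
            with open(path) as f:
                return {line.strip().lower() for line in f if line.strip()}

        target = sys.argv[1]
        if sha256_of(target) in load_whitelist(WHITELIST):
            subprocess.run([target] + sys.argv[2:])
        else:
            sys.exit("default-deny: %s is not whitelisted" % target)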
     
  9. guest

    guest Guest

    Or might increase it.
    hxxp://www.blackhat.com/presentations/bh-europe-08/Feng-Xue/Whitepaper/bh-eu-08-xue-WP.pdf

    Why? Because some people think that their AVs or any other security software will protect them from anything, even from their own mistakes. Why? Because security software vendors happily spread nonsense as a marketing trick. I prefer to blame the vendors/developers, TBH.

    EDIT: But anyway, AVs are not going to die. There are always people who use and buy them, which is fine if they know the limits. I agree that AVs are easy to use, but the most effective? Oh ho ho... the general facts say no.
     
    Last edited by a moderator: Dec 21, 2013
  10. moontan

    moontan Registered Member

    Joined:
    Sep 11, 2010
    Posts:
    3,931
    Location:
    Québec

    absolutely,
    the best defence is a good offence!

    there is no need to have a real-time AV running.
    it's wasted CPU cycles and system resources.
     
  11. trott3r

    trott3r Registered Member

    Joined:
    Jan 21, 2010
    Posts:
    1,283
    Location:
    UK
    Moontan:

    I notice you are running Dopus.
    Running v6 here, which is really old, but looking at the pricing, an upgrade is expensive.

    Have you seen it available elsewhere or is it still only available from GPsoft?
     
  12. spywar

    spywar Registered Member

    Joined:
    Oct 23, 2012
    Posts:
    583
    Location:
    Paris
    I don't know of any company that still does all the work by hand...
     
  13. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    Your computer is at its most vulnerable when installing new or updating existing software. Attacks on servers are much more common than ever before, and the internet itself can't be trusted to always take you where you want to go, thanks to many factors from DNS-altering malware to government tampering. IMO, the trust a user places in an update or file server needs to be conditional. Trust but verify. You can verify a file's integrity with hashes or armored signatures. You can use online sites to scan it for known malware. You can monitor the install process and its first run, recording what changes it makes to your system, what internet access it wants, etc. You can reduce the risk, but you can't eliminate it. This is the primary reason that I've made updating/installing a manually performed, admin-only task. There isn't much that an installed AV can do here that can't be done with an online scanner. If I recall correctly, wasn't the attack against Blue Frog (the spam-the-spammers operation) done by exploiting Norton Internet Security and using it to perform the attack?
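
    The hash part of "trust but verify" is easy to script. A minimal sketch: compare a downloaded installer against the digest the publisher advertises before running it (the file name and digest below are placeholders, not real values):

        # Verify a download's integrity against a published SHA-256 digest.
        import hashlib

        EXPECTED = "paste-the-publisher-digest-here"  # placeholder value

        def sha256_of(path):
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
            return h.hexdigest()

        digest = sha256_of("installer.exe")  # hypothetical download
        print("OK" if digest == EXPECTED else "MISMATCH - do not install")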

    IMO, any security package that relies on others' servers and constant interaction with the vendor is unnecessarily adding to your attack surface and can potentially create more vulnerabilities than it fixes. A good security package should stand on its own. AVs can serve a purpose. They can serve that purpose without being installed on the PC.
     
  14. RejZoR

    RejZoR Lurker

    Joined:
    May 31, 2004
    Posts:
    6,426
    This only works in theory. In the real world, all of this has been and still is highly user-unfriendly. Comodo has been pushing their default-deny for ages and, to be frank, it's still very problematic, because in the end people start excluding everything, which defeats the whole purpose. That's why most companies use blacklisting with mild forms of whitelisting, mostly to avoid false positives and to help their cloud sensors make better decisions.
     
  15. Hungry Man

    Hungry Man Registered Member

    Joined:
    May 11, 2011
    Posts:
    9,146
    Default deny (of executables) will never work as a core policy. If it were ever widespread it would get destroyed just as fast as AV does. It's not as stupid, because it's not as reactionary, but it makes poor assumptions about attackers.

    Least privilege is where things are going, slowly. No one has released a sandboxing solution that works as a standalone, yet.

    AV is important. Reaction and detection are important. But as a core policy they are very silly.
     
  16. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    Unless privilege escalation exploits can be eliminated, least privilege as a core defense will be destroyed just as quickly. If combined with default-deny, least privilege would be effective. The problem with default-deny is that it doesn't address code that exists and executes solely in memory. If that can be addressed, default-deny would be formidable. IMO, as long as malicious code can execute, sandboxes will be bypassed and privilege restrictions will be defeated. It's just a matter of finding the flaws in the code. Since perfect code doesn't exist, both methods are a continuation of the same arms race we've had for years: penetrate, patch, repeat. No matter what mitigation methods get introduced, it always comes back to that. That said, no matter what mitigations we come up with, social engineering will continue to be the big wrench thrown into the mechanism.

    edit
    Default-deny will not become widespread on equipment like desktops and laptops. Users won't accept it if they can't do with their machines as they please. Since default-deny will not become mainstream, the specific methods/applications used to implement it won't become standardized or widespread. Unless the user is directly targeted, there will always be easier victims. For those reasons, default-deny will be effective, even if not truly secure. It definitely won't be less secure than the present alternatives.
     
    Last edited: Dec 22, 2013
  17. wat0114

    wat0114 Registered Member

    Joined:
    Aug 5, 2012
    Posts:
    4,065
    Location:
    Canada
    I confess I do prefer the MAC approach that AppArmor in Linux uses over default-deny, but the latter is very strong, imo, if the user's setup is fairly static. I think script control should be used as well with either. I forgot to mention that no sensitive data should be kept on the main system either, unless in an encrypted location.
     
  18. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    The rapid update policies used by several apps have made default-deny harder to implement without trusting some form of digital signing. Any form of trust in that regard makes you vulnerable to compromises (or coercion) on their end.

    Regarding scripts, they shouldn't be treated any differently than applications or system files. Scripts that you use should be whitelisted. The rest should be blocked. On both my 98 and XP units, that task falls to an oldie but goodie, Script Sentry.

    Default-deny shouldn't be limited to applications or executables. With the right tools, it can be applied to internet access, scripting, inter-process activity, command line parameters, and, to a point, what applications can do in memory. Unfortunately, implementing default-deny to this extent is well beyond the average user's ability.
     
  19. wat0114

    wat0114 Registered Member

    Joined:
    Aug 5, 2012
    Posts:
    4,065
    Location:
    Canada
    Absolutely agreed! Except I don't bother with inter-process activity, command line parameters, etc...
     
  20. Gullible Jones

    Gullible Jones Registered Member

    Joined:
    May 16, 2013
    Posts:
    1,466
    @noone_particular: what you're describing is mandatory access control, applied to the entire system. This can be done, but as you indicate, it is rather labor intensive.

    Of the existing MAC frameworks on Linux, I know Tomoyo can do it (with difficulty), and I think GrSecurity can as well. AppArmor probably not, SELinux maybe. RSBAC I have no idea.

    Anyway this sort of thing is probably not worthwhile for most desktop users; better IMHO to restrict known vulnerable applications. Efficiency and stability are also important.

    Servers might be another story; though in view of current developments (i.e. many servers being VM sessions, or OpenVZ/VServer containers) I'm not really sure. Better to ask a long-time sysadmin, I'm relatively new in this field.
     
  21. justenough

    justenough Registered Member

    Joined:
    May 13, 2010
    Posts:
    1,549
    Interesting posts. Are there any software suggestions that fit with these ideas for a non-tech person like me?
     
  22. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    With executables like RUNDLL32.EXE, which perform normal activities but can also be used to load malicious DLLs, some form of control over their activities is necessary. The command interpreter is even worse. If an attacker can gain access to either of these, it's usually game over.
    I'm not familiar with mandatory access control or the extent of its abilities. Judging from what little I've read on it, it has a lot in common with what I've tried to implement, but with different tools. As for whether or not it's worth the effort, I guess that depends on who you regard as an adversary and what your risk really is. The way things are going lately, I'd rather take the risk too seriously than not seriously enough.
    That's been the problem all along. There is no realistic way to implement the stronger security measures without becoming something of a techie in the process. A user, for instance, can't implement default-deny properly without knowing what needs to be allowed, and can't know what needs to be allowed without knowing what each item is for or what it does. The industry has struggled with this question almost as long as computers have existed. For most of its existence, Windows did nothing to help the matter: it's an OS that allows most anything and makes it easy for unskilled users to play administrator, while requiring no skill or knowledge from that user.
     
  23. Gullible Jones

    Gullible Jones Registered Member

    Joined:
    May 16, 2013
    Posts:
    1,466
    @justenough: HTTP Switchboard looks rather like RequestPolicy... Probably not necessary for most people, I think. Not on Linux anyway. The Chrome sandbox on Linux is an empty chroot - between that and Yama process restrictions in current kernels, a kernel exploit is probably the only way out.

    For Windows, I'm not really sure. If you're on Windows Vista/7/8, I'd say use EMET in addition to an AV. With Chrome as your browser you're pretty well defended. Choose your userspace applications carefully, especially those that will open complex files or have web access - they, not your AV, are your first line of defense. The AV is there to give you a chance to react if things do go pear-shaped.

    If you're on Windows XP, and can't switch to Linux, I'd say look into possible deals on an upgrade to 7. IIRC there are cheaper upgrade plans for students. Not sure if they still apply though.

    For privacy I don't have any opinions really. Maybe use some form of ad blocking - local proxy, hosts file, etc. Encrypt stuff that you wouldn't want Google to look at. Stuff like that.

    In all cases, have a backup strategy.

    (For mine, I've ditched NoScript - it interferes too much with work. Current setup is just click-to-play, and a proxy for ad filtering.)

    Edit: oh, mandatory disclaimer - I'm far from being an expert on this stuff, so please take my statements with a grain of salt.
     
  24. wat0114

    wat0114 Registered Member

    Joined:
    Aug 5, 2012
    Posts:
    4,065
    Location:
    Canada
    I forgot to mention that I restrict DLLs via whitelisting as well in my AppLocker setup in Win 7.
     
  25. Gullible Jones

    Gullible Jones Registered Member

    Joined:
    May 16, 2013
    Posts:
    1,466
    Mandatory access control is an extension of discretionary access control. In DAC, a process launched by a given user inherits that user's privilege restrictions. In MAC, the process inherits the user's restrictions and whatever restrictions apply to it specifically, regardless of what the user tries to do with it... Thus, mandatory.

    Things can get more complicated than that, but that's the gist of it.

    Windows XP uses DAC; e.g. Firefox launches with your user's privileges. Slapping a HIPS on top of that is MAC; Firefox can now have additional restrictions (such as not being able to launch other processes).

    NB, DAC isn't necessarily insufficient - you can do a lot with it, with a little kludging. For instance, running Firefox in a separate user account on a Linux desktop.
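
    A minimal sketch of that separate-account kludge, assuming a pre-created unprivileged account named "sandboxuser" (hypothetical) and a launcher started as root:

        # Drop to an unprivileged uid/gid, then exec the browser; the child
        # inherits only that user's permissions (plain DAC, no MAC involved).
        import os
        import pwd

        user = pwd.getpwnam("sandboxuser")  # hypothetical account
        os.setgid(user.pw_gid)   # drop the group first, while still root
        os.setuid(user.pw_uid)   # then drop the user; irreversible
        os.execvp("firefox", ["firefox"])

    A MAC framework would instead pin restrictions to the Firefox binary itself, no matter which user runs it.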

    I see you're not a fan of mincing words. :)

    Unfortunately the tech industry has been cottoning onto this lately, and the results have been... less than desirable, IMO. See for instance Apple iOS and its walled garden.
     