The Death of Anti-Virus: conference paper

Discussion in 'other anti-malware software' started by SweX, Dec 20, 2013.

Thread Status:
Not open for further replies.
  1. Hungry Man

    Hungry Man Registered Member

    Joined:
    May 11, 2011
    Posts:
    9,146
    Cars and computers are really nothing alike. I dislike analogies, but the car/computer one is probably the one I find most troublesome. The problems with cars are completely different: computers do not float around randomly attacking each other. Your attackers on the road are *everyone*, with some worse than others; attackers on computers are dedicated, and there is a team of people 'invisibly' defending you at all times.

    A drunk driver is not a hacker. These are complex issues all on their own :p

    They're just nothing alike. The threats are not analogous.

    Personally, I think training is useless. Enterprises engage in it constantly, and then a pentester comes in and, of course, gets through with phishing. As one penetration tester put it, "clicking **** is the most fun thing you can do on a computer, expecting users to not click things is not realistic".

    Training is nice, and it should be available, but it should never be a requirement for safety, and it certainly should not be a requirement for usage.
     
  2. jmonge

    jmonge Registered Member

    Joined:
    Mar 20, 2008
    Posts:
    13,744
    Location:
    Canada
    Hungry Man:thumb:
     
  3. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    How can you say that when computers handle financial data and transactions?
    Regarding the car analogy, you're looking at the wrong aspect of it. You can't compare a car being driven by an operator to a computer being run by an inept administrator. The car's computer doesn't expose critical system settings to the driver. The driver can't change the ratio between front and rear brakes, alter the fuel/air mixture, or change ignition and cam timing settings. Access to these is only available to those who have the equipment and (hopefully) the knowledge to use it correctly. The PC, by comparison, lets the operator change almost anything.

    As for educating users, it's a lost cause. It hasn't worked to any real degree in the last 20 years, and things have only gotten more complicated since then.
     
  4. Hungry Man

    Hungry Man Registered Member

    Joined:
    May 11, 2011
    Posts:
    9,146
    Ah, I suppose I should differentiate between servers and desktops. Servers are about providing services. Desktops are about allowing users to access services and information.
     
  5. jmonge

    jmonge Registered Member

    Joined:
    Mar 20, 2008
    Posts:
    13,744
    Location:
    Canada
    noone_particular: very true, especially the privacy being invaded.
     
  6. Gullible Jones

    Gullible Jones Registered Member

    Joined:
    May 16, 2013
    Posts:
    1,466
    @HM: point taken. Yes, attacks against the OS may not be within a user's capacity to deal with.

    (BTW I've seen the presentation you're referring to. It was... interesting. Default passwords all over the place, ugh.)

    OTOH, phishing doesn't seem like a very good counterexample to me, since it is more or less within the user's control. How would one combat phishing on an OS level, beyond the usual spam filtering and such?
     
  7. Hungry Man

    Hungry Man Registered Member

    Joined:
    May 11, 2011
    Posts:
    9,146
    Phishing itself is already dealt with nicely by email providers. Detection of phishing isn't bad.

    Targeted phishing is more difficult for them to deal with, at which point you ask, "Well, what else can I deal with?" The answer should be whatever that phishing leads to. Is it socially engineered malware? You can deal with that. Is it an exploit? You can deal with that.

    That's not to say you should ignore phishing, but there are technical solutions to the chain of attack.
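
    Something like this toy heuristic shows what dealing with the link itself can look like. A minimal sketch in Python; the checks are illustrative assumptions, not how any actual mail provider's pipeline works:

    Code:
    # Toy phishing-link heuristics: a sketch only, not a real provider's pipeline.
    # Input is assumed to be a (visible_text, href) pair parsed out of an email.
    from urllib.parse import urlparse
    import ipaddress

    def suspicious_link(visible_text: str, href: str) -> bool:
        host = urlparse(href).hostname or ""
        # IP-literal hosts are a classic phishing tell.
        try:
            ipaddress.ip_address(host)
            return True
        except ValueError:
            pass
        # Punycode hosts can hide lookalike Unicode domains.
        if host.startswith("xn--") or ".xn--" in host:
            return True
        # Link text that names one domain while the href goes to another.
        shown = urlparse(visible_text).hostname if "://" in visible_text else None
        return bool(shown and shown != host)

    print(suspicious_link("https://example.com", "http://203.0.113.5/login"))  # True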
     
  8. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    I'm not sure that one can make such a distinction. On Windows, many of its services function like servers. The current versions have more ports open than ever, and they're much harder to actually close. Some of them are servers that weren't intended for use beyond the LAN. Attacks on other internet devices, such as routers, can expose these services. There's no way I'll believe that the services that open those ports are immune to being exploited.
     
  9. Hungry Man

    Hungry Man Registered Member

    Joined:
    May 11, 2011
    Posts:
    9,146
    But the goals of the devices are separate. The priority of a server is to provide some service; the priority of a desktop is to access those services.
     
  10. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    On servers, the reason that a port is open is clear: it's to receive incoming requests for the pages and files the server provides. On Windows PCs, the purpose of the open ports isn't so clear. Regardless of the service behind them, ports are opened to receive some form of unsolicited inbound traffic. I'd like to see a detailed list of all of the open ports on Win7 and 8 that describes what type of traffic each is supposed to receive and what the purpose of that traffic would be. Most desktop/laptop units have no reason to accept any type of unsolicited inbound traffic, yet it's nearly impossible to close some of them, close as in not listening, not just blocked by a firewall.

    Much of current Windows functions as a server for reasons that aren't clear but are enabled by default. On most home (and many commercial) networks, a new UPnP exploit could easily expose most of these services to the internet. Besides Windows being a total babysitter/snitch, this is my main issue with it from Vista onwards. They've made it more difficult for code to exploit the stack and memory but have also increased the exposed attack surface. Many of the services responsible for these open ports have system-level permissions. Exploit any one of them and the PC is owned. IMO, this is a designed backdoor.
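
    For anyone who wants that list for their own box, a quick audit is easy to script. A minimal sketch, assuming the third-party psutil package and an elevated prompt (needed on Windows to see connections owned by SYSTEM services):

    Code:
    # Enumerate listening TCP ports and the processes behind them.
    import psutil

    for conn in psutil.net_connections(kind="tcp"):
        if conn.status != psutil.CONN_LISTEN:
            continue
        try:
            name = psutil.Process(conn.pid).name() if conn.pid else "?"
        except psutil.NoSuchProcess:
            name = "?"
        print(f"{conn.laddr.ip}:{conn.laddr.port:<6} pid={conn.pid} {name}")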
     
  11. Hungry Man

    Hungry Man Registered Member

    Joined:
    May 11, 2011
    Posts:
    9,146
    Those are Windows-specific issues, though. I'm talking about the purpose of a computer on its own, of a desktop, which is to allow access to information. If Microsoft or someone else has muddled that, that's a separate issue.

    Whether those are simply poor design choices or intentional backdoors does not change the purpose of a desktop computer.

    I also wouldn't call those services similar to a server. They are enabled only for the LAN, as far as I know.
     
  12. luciddream

    luciddream Registered Member

    Joined:
    Mar 22, 2007
    Posts:
    2,545
    Yep. This is why I consider hardening & limiting attack surface (I consider them in the same boat) as the place to start. Well, second to end-user know-how/discretion, which should always be #1 IMO. In my case I have no Java, PDF, .NET Framework, Flash, or IE on my box. And NoScript denying scripts globally, allowing them only temporarily as needed, sandboxed. And pretty much all would-be vulnerabilities associated with XP stripped out. Insecure as the OS may be under normal circumstances... what are the odds any of that affects me? Even finding one vector to exploit would be difficult, let alone multiple. It also makes EMET far less useful in my case.

    This and isolation are my primary goals. I would view a real-time AV as increasing my possible attack surface and adding a vector to worry about. Why do that when I can scan new/changed files within an isolated/sandboxed environment before moving them to my system anyway? In the changing landscape of security, real-time AVs are going the way of the dinosaur.
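
    The scan-before-move workflow is simple enough to automate. A minimal sketch; the folder names and blocklist file are hypothetical placeholders, and a real setup would scan with an on-demand engine rather than a bare hash list:

    Code:
    # Hash files that land in a quarantine folder and only promote the ones
    # whose SHA-256 is not on a local blocklist.
    import hashlib
    import shutil
    from pathlib import Path

    QUARANTINE = Path("quarantine")   # hypothetical drop folder
    CLEAN = Path("clean")             # hypothetical destination
    BLOCKLIST = {line.strip() for line in open("bad_hashes.txt")}

    def sha256(path: Path) -> str:
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    for f in QUARANTINE.iterdir():
        if f.is_file() and sha256(f) not in BLOCKLIST:
            shutil.move(str(f), str(CLEAN / f.name))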
     
  13. Hungry Man

    Hungry Man Registered Member

    Joined:
    May 11, 2011
    Posts:
    9,146
    Not sure how much one can really strip out of XP. In the end you're still running a system with enough libraries to do font/image parsing (incredibly complicated and vulnerable) and a full network stack for Wi-Fi, etc.

    While you can disable services, there is so much left over between drivers (often vulnerable) for Wi-Fi chips and GPUs, your browser, and your kernel.

    Certainly stripping it out is better than leaving it in, though.
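
    If you do go down the disable-services road, auditing what's set to auto-start is a good first step. A minimal sketch, assuming the third-party psutil package on a modern Windows box (current psutil no longer supports XP) and an elevated prompt:

    Code:
    # List services configured to start automatically.
    import psutil

    for svc in psutil.win_service_iter():
        if svc.start_type() == "automatic":
            print(f"{svc.name():30} {svc.status():10} {svc.display_name()}")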
     
  14. luciddream

    luciddream Registered Member

    Joined:
    Mar 22, 2007
    Posts:
    2,545
    No Wi-Fi; hard-wired to an SPI router. The browser is sandboxed, and persistence is a lost cause with Shadow Defender & images. And the isolated/boxed environment would stop any leakage before I could reboot.

    So far nothing has been able to get the better of this supposedly weak kernel in my 10 years of being an XP user. So I must be doing something right... or this OS must be more secure than it's given credit for. Probably a bit of both.

    And I love it for its privacy and anonymity as well, which the OSes since are lacking in, IMO. I also trust it more, flat out; namely, that there are no backdoors in it, among other things. And I don't like the idea of svchost.exe perpetually leaking through my firewall doing God knows what.

    If they'd continue support on it indefinitely, I'd use it forever. But as much of a fanboy as I am, even I see the writing on the wall once support ends.
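
    On the svchost point: the stock XP firewall can't filter outbound traffic at all, but on Vista and later the built-in firewall can, and the rule is scriptable. A minimal sketch; blocking svchost wholesale will break legitimate services (DNS client, Windows Update), so treat it as an experiment, not a recommendation:

    Code:
    # Add an outbound block rule for svchost.exe via the built-in firewall
    # (Vista+ only). Run from an elevated prompt.
    import os
    import subprocess

    prog = os.path.expandvars(r"%SystemRoot%\System32\svchost.exe")
    subprocess.run([
        "netsh", "advfirewall", "firewall", "add", "rule",
        "name=Block svchost outbound", "dir=out", "action=block",
        f"program={prog}",
    ], check=True)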
     
  15. Hungry Man

    Hungry Man Registered Member

    Joined:
    May 11, 2011
    Posts:
    9,146
    Ethernet drivers, then.

    Yes, the problem is that privilege escalation on XP is not hard.

    The Adobe reader sandbox exploit, which attacked the XP kernel, would have succeeded fine with only a minor amount of tweaking.

    That said, you will probably continue to get lucky, as attackers don't care that much.
     