Food for thought: safe browsing and blocking scripts

Discussion in 'other anti-malware software' started by Windows_Security, Feb 10, 2015.

  1. Rmus

    Rmus Exploit Analyst

    Joined:
    Mar 16, 2005
    Posts:
    4,020
    Location:
    California
    Hello again, kees!

    I've always been a fan of Default-Deny security procedures.
    I don't know what you mean. Which type of White List? If I install a new program, it has to be added to my White List of executables, else it will not be able to run!
    Well, most of this about script blocking is over my head. I have understood very little of what you folks are discussing in this thread -- it all seems so complicated! If I had to spend that much time configuring script blocking and tweaking and this and that, I wouldn't have time to enjoy computing!

    I posted to your thread only in response to questions by Gullible Jones and wat0114 about the running of the exploits mentioned earlier. That is about all that I can speak to!

    ----
    rich
     
    Last edited: Feb 12, 2015
  2. 142395

    142395 Guest

    Though I basically don't oppose what Kees and his friends said, and I admit that security via script blocking is more of a statistical matter, I want to give my thoughts on a few points.
    Firstly, most domain names are self-explanatory, so I rarely need to test with an "allow one, allow another" approach, and the same will be true for others after several months of experience. (BTW, sorry for nitpicking, but the correct math is C(7, 2) = 7!/((7 - 2)! * 2!) = (7 * 6)/(2 * 1) = 21, where "!" means factorial.)
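    To make the counting concrete, here is a rough Python sketch (the domain names below are made up purely for illustration):

    ```python
    # Testing 7 unknown third-party domains two at a time means
    # C(7, 2) = 21 page reloads, one per "allow these two" attempt.
    from itertools import combinations
    from math import comb

    domains = ["cdn1.example", "cdn2.example", "stats.example", "ads.example",
               "widgets.example", "fonts.example", "api.example"]

    print(comb(len(domains), 2))                  # 21
    print(len(list(combinations(domains, 2))))    # 21, one entry per pair to test
    ```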

    Then, what if I have no clue about the domains? This is why I stressed that "exceptions or whitelisting should not be based on making a website work, but on trust." If a site doesn't work even after allowing the first-party script (www.xyz.com), its subdomains and seemingly related domains (e.g. xyz.net, xyzimages.com), and other trustworthy, well-known domains, then I will most likely give the site up. Such a website is most likely worthless to me, just like some websites that try to connect via TCP port 81, which my firewall blocks. If I reached the site via a Google search, I will go to the Google cache instead. But I admit that in the rare cases where I still need the site, I temporarily allow other scripts and accept some risk.

    But I don't fear RCE or malware much; except for targeted attacks, coping with common RCE/malware is not that hard. Even for a 0-day, which is quite rare, as long as it is a drive-by download, anti-exe is enough. As for fileless malware, all known fileless malware can still be blocked by a strictly configured HIPS or a sandbox, though theoretically damage can still be done unless the HIPS also covers in-memory activity. So when you have layered security, RCE is not that important. But other threats, especially XSS, CSRF, click-jacking, and DNS rebinding, can only be blocked by addons such as NS, RP, Policeman, uMatrix, Kiss Privacy, etc. AND a filtering proxy like Proxomitron, Privoxy, etc. (I know some AVs actually detect some of these attacks, though), besides best practice: do not do general browsing in the same browser while you're logged in somewhere, do not use auto-login, do not permanently save cookies, set a strong password for your router, and use a browser that has countermeasures for those attacks. I don't permanently allow any site I need to log in to in NoScript, which prevents most XSS even when I have wrongly trusted a malicious domain. I also combine RP on Firefox (I'll switch to Policeman, as it finally added redirect protection) and Kiss Privacy on Chrome, though uMatrix itself has XHR protection. I use NS on a full-domain basis but RP on a 2nd-level-domain basis, which is my compromise between security and usability.
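    As a rough illustration of that full-domain vs. 2nd-level-domain compromise, a small Python sketch (not any addon's real code; real tools consult the Public Suffix List, the naive two-label slice below is only for the sketch):

    ```python
    # Full-host whitelisting (NoScript-style here) is stricter but less
    # convenient; base-domain whitelisting (RP-style here) is looser but
    # more usable.
    def second_level_domain(host: str) -> str:
        return ".".join(host.split(".")[-2:])   # naive, illustration only

    full_domain_whitelist = {"www.xyz.com"}     # full-domain basis
    base_domain_whitelist = {"xyz.com"}         # 2nd-level-domain basis

    host = "static.xyz.com"
    print(host in full_domain_whitelist)                       # False: stricter
    print(second_level_domain(host) in base_domain_whitelist)  # True: more usable
    ```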

    As MisterB said, most websites work without scripts, but I think it depends on your internet usage. More than 90% of my internet use is for text information or documents. I don't play online games, don't use social networks, and don't watch many videos.
     
    Last edited by a moderator: Feb 12, 2015
  3. 142395

    142395 Guest

    I haven't experienced noticeable conflicts from using them together. It seems there's an order of priority by which some addons take precedence over others.
     
    Last edited by a moderator: Feb 12, 2015
  4. 142395

    142395 Guest

    That's true, and you can even replicate Request Policy via ABE, but to be honest that would be too much pain even for those who understand ABE rules. ABE doesn't have any usable UI, so you always have to add or edit rule sets manually. So far I have only made a rule for compatibility with a filtering program, and I won't go further than that.
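    For the curious, here is a sketch of the kind of boundary such a rule expresses, written as Python rather than ABE's own syntax (the domains are hypothetical):

    ```python
    # A destination accepts requests only from listed origins; everything
    # else is denied -- the "Site X / Accept from X / Deny" idea in spirit.
    BOUNDARIES = {
        # destination -> origins allowed to send requests to it
        "mybank.example": {"mybank.example"},
    }

    def request_allowed(origin: str, destination: str) -> bool:
        accepted = BOUNDARIES.get(destination)
        if accepted is None:
            return True        # no boundary defined for this destination
        return origin in accepted

    print(request_allowed("mybank.example", "mybank.example"))  # True
    print(request_allowed("evil.example", "mybank.example"))    # False: cross-site request refused
    ```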
     
  5. Windows_Security

     Thanks Rich,

    I meant that when you install your programs, you set the whitelist. The whitelist is maintained, but the strength of whitelisting is that once you have the programs needed to do the things you want to do WITH the computer, the list should be kept as static as possible, with only functional additions. So when you frequently do things TO your computer (like installing a lot of new software), those frequent changes undermine the idea of whitelisting (and one should consider other defenses).

    Regards Kees
     
  6. wat0114

    wat0114 Registered Member

    Joined:
    Aug 5, 2012
    Posts:
    4,065
    Location:
    Canada
    Now that you put it this way, kees, you make a very compelling argument. Script control is not going to make for perfect security at all, especially given the realistic scenario you present. I think it illustrates the importance of the layered approach, as noone alluded to earlier, in that if one does inadvertently allow a malicious script, their other security measures in place should stop its progress somewhere along the line (anti-exec, sandbox). At least, however, many scripts are blocked by default, especially the advertisement types, and aren't these latter types the ones usually compromised? So it's not perfect, maybe even far from it, but there is some measure of security, especially with a prudent, careful user, and of course there's the added benefit of faster loading and cleaner web pages.
     
  7. Compu KTed

    Compu KTed Registered Member

    Joined:
    Dec 18, 2013
    Posts:
    1,412
    For strictness I use the full-address level of classification in RP. If I whitelist a domain in NoScript, allowing JavaScript, I still have Request Policy to restrict third-party sites unless I allow those cross-site requests. Both extensions start with empty whitelists, and I do have some sites blacklisted in NoScript. It requires work, but default-deny is the policy I prefer.
    For more protection the browser is sandboxed (Sandboxie) and everything is run in a restricted account. The browser connects to the Internet through a proxy, which filters out the unwanted garbage. Also available are system-wide virtualization, file protection, SRP, and anti-exe.
    Why would I need to worry much about scripting? Besides, nothing is done on a permanent basis: one can temporarily allow, then revoke permissions, and finally delete the sandbox at session end. If I were to play the lottery, I'd have a better chance of winning it than getting hit by a zero-day exploit that would succeed.
     
  8. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    The web content itself will often dictate the sequence. Links contained in javascript won't be seen by Request Policy until that javascript is processed. Whether or not those links stop at Request Policy depends on the browser and the intentions of the developer.
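    A minimal Python sketch of that point, using hypothetical markup: a scan of the static HTML sees the third-party links written into the page, but a request that a script builds at run time isn't visible until the script actually executes:

    ```python
    # Collect third-party hosts that appear in static src/href attributes.
    # The URL the inline script constructs never appears as markup, so a
    # purely static pass cannot see it.
    from html.parser import HTMLParser
    from urllib.parse import urlparse

    class StaticLinkScanner(HTMLParser):
        def __init__(self, first_party: str):
            super().__init__()
            self.first_party = first_party
            self.third_party_hosts = set()

        def handle_starttag(self, tag, attrs):
            for name, value in attrs:
                if name in ("src", "href") and value and value.startswith("http"):
                    host = urlparse(value).hostname or ""
                    # naive suffix check, good enough for the sketch
                    if host and not host.endswith(self.first_party):
                        self.third_party_hosts.add(host)

    page = """
    <img src="https://s0.wp.com/button.png">
    <script>
      // exists only once the script runs; invisible to a static scan
      var img = new Image();
      img.src = "https://pixel.wp.com/beacon.gif";
    </script>
    """

    scanner = StaticLinkScanner("mortoray.com")
    scanner.feed(page)
    print(scanner.third_party_hosts)   # {'s0.wp.com'} -- pixel.wp.com not seen here
    ```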
     
  9. wat0114

    wat0114 Registered Member

    Joined:
    Aug 5, 2012
    Posts:
    4,065
    Location:
    Canada
    This morning I visited the following page for the very first time:

    -http://mortoray.com/2014/09/15/blocking-javascript-seems-like-a-good-idea/

    I didn't have to do a thing for the page to render what I needed to see. uMatrix, with its current settings and rules, blocked what's seen in the attached. That's a fair chunk of content from a site that doesn't have much active content in the first place. From my limited point of view, this is not only reducing clutter, it's also providing at least some measure of security, because chances are pretty good that some of the blocked sites could be, and probably have been, targeted before, and may be targeted again.
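    For anyone wondering what that kind of default-deny matrix amounts to, here's a rough Python sketch (not uMatrix's actual rule engine; the hostnames are just examples from the page above):

    ```python
    # Decisions keyed on (source, destination, request type); the default is
    # "block everything", and narrower rules override broader ones.
    RULES = {
        ("*", "*", "*"): "block",                        # global default-deny
        ("*", "*", "css"): "allow",
        ("*", "*", "image"): "allow",
        ("mortoray.com", "mortoray.com", "*"): "allow",  # first party allowed
    }

    def decide(source: str, dest: str, rtype: str) -> str:
        # most specific key first, then fall back to broader ones
        for key in ((source, dest, rtype), (source, dest, "*"),
                    ("*", dest, rtype), ("*", dest, "*"),
                    ("*", "*", rtype), ("*", "*", "*")):
            if key in RULES:
                return RULES[key]
        return "block"

    print(decide("mortoray.com", "mortoray.com", "script"))  # allow (first party)
    print(decide("mortoray.com", "pixel.wp.com", "script"))  # block (third party)
    print(decide("mortoray.com", "s0.wp.com", "image"))      # allow (images allowed)
    ```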
     

    Attached Files:

  10. MisterB

    MisterB Registered Member

    Joined:
    May 31, 2013
    Posts:
    1,267
    Location:
    Southern Rocky Mountains USA
    The article is a good summary of the advantages of JavaScript blocking. From my month-long experiment with it totally disabled in Opera, I will add one more thing: JavaScript uses a lot of CPU cycles, network bandwidth, and memory, and your computer will be a lot faster with it blocked. Passive HTML and CSS formatting are only processed when you click on a page and load it; JavaScript can keep executing as long as the page is loaded in the browser, using up resources until the page is closed. The CPU and memory use are pretty significant: 25-40% more at times with JavaScript enabled.
     
  11. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    @wat0114
    I had similar results here. The content that mattered was available without having to make any allowances, even though there was plenty one could allow: 44 scripts, one noscript element, and 3 iframes. Seven of the scripts were to 3rd-party websites. This doesn't include any additional material that the 3rd-party sites would have added.
    contents.png
    Options.png
    Between Proxomitron and RP: with the content filtered, Request Policy shows that the site itself added 3 links without using scripts, iframes, etc. With Proxomitron's filters bypassed, RP showed 7 links.
    options2.png
     
  12. Compu KTed

    Compu KTed Registered Member

    Joined:
    Dec 18, 2013
    Posts:
    1,412
    -http://mortoray.com/2014/09/15/blocking-javascript-seems-like-a-good-idea/
    I'm seeing 12 blocked destinations from Request Policy

    widgets.wp.com
    pixel.wp.com
    s2.wp.com
    mortoray.files.wordpress.com (http & https-2)
    1.gravatar.com
    i1.wp.com
    2.gravatar.com
    0.gravatar.com
    s1.wp.com
    s0.wp.com
    s2.wp.com

    NoScript: mark as untrusted (same site)

    c.amazon-adsystem.com
    mortoray.com
    partner.googleadservices.com
    s.skimresources.com
    r-login.wordpress.com
    stats.wp.com
     
  13. wat0114

    wat0114 Registered Member

    Joined:
    Aug 5, 2012
    Posts:
    4,065
    Location:
    Canada
    Yeah, those types of sites where no or few allowances are required make it easy on the user. It's the sites that have tons of content, where so much is required just to play a video, that drive me nuts. In many cases I end up ditching them altogether and move on to something else that contains what I need without all the fuss of allowing tons of 3rd-party crap. The way I see it, if a content provider can't build a website without requiring cascades of 3rd-party scripts, then it's not worthy of viewership.
     
  14. bo elam

    bo elam Registered Member

    Joined:
    Jun 15, 2010
    Posts:
    6,146
    Location:
    Nicaragua
    Hi Wat, in my opinion, that is the kind of site that makes it worthwhile to use programs like NoScript. For the ones like you describe that I bookmark, once I whitelist the two or three domains required to get the content I am looking for and untrust most of the rest, the next time I visit the site the work is already done and the site works fine. It is extremely rare for me to whitelist something new when I visit a site again. Sometimes I might find a new tracker, which I usually add to my blacklist with a couple of clicks. But that's it.

    Bo
     
  15. wat0114

    wat0114 Registered Member

    Joined:
    Aug 5, 2012
    Posts:
    4,065
    Location:
    Canada
    I find it's the sites that require far more than just 3-4 scripts to get them working, especially for a video, such as a few news and sports sites that need several dubious 3rd-party scripts to get things working right, that cause the trouble. I remember recently having this type of issue with nfl dot com. I finally got things sorted, but I was close to giving up on it.
     
  16. bo elam

    bo elam Registered Member

    Joined:
    Jun 15, 2010
    Posts:
    6,146
    Location:
    Nicaragua
    I love the NFL but I don't bookmark nfl.com. Wat, a site like that is very easy to figure out. To watch videos there, all I had to do was allow conviva.com and nfl.com, nothing else. I saw three other domains there; two were already in my untrusted list and I blacklisted the other one. You need to practice. ;)

    Bo
     
  17. Rasheed187

    Rasheed187 Registered Member

    Joined:
    Jul 10, 2004
    Posts:
    17,561
    Location:
    The Netherlands
    Wow, you guys are really into this subject. I personally use script blockers not for security but for speed and a little bit of extra privacy. And I only block third-party scripts; this way about 80% of all sites keep working correctly. For the other 20% I need to do some whitelisting. And when I don't feel like fine-tuning, I use only Ghostery, without a dedicated script blocker.
     
  18. wat0114

    wat0114 Registered Member

    Joined:
    Aug 5, 2012
    Posts:
    4,065
    Location:
    Canada
    Oh, I get lots of practice, no worries there :D It was some post-Super Bowl video; it was unusual that it was so difficult to play without allowing so much. Normally the site works with no issues.

    You might be gaining more security than you think. The nice thing about script blockers, especially the popular ones, uMatrix in particular, is that one can set them up for anything from light to aggressive blocking, depending on the user's wants and needs and their willingness to spend time fine-tuning them. I guess for your needs you could set it up somewhere in the light-to-moderate range, whereas in my case it's fairly aggressive. I've spent a lot of time fine-tuning and whitelisting content, but as time marches on, the time spent managing it has decreased significantly.
     
  19. Rasheed187

    Rasheed187 Registered Member

    Joined:
    Jul 10, 2004
    Posts:
    17,561
    Location:
    The Netherlands
    When it comes to security I'd rather rely on sandboxing and anti-exploit, but if script blocking helps, I'm all for it. And if speed were no problem for me, I'd use only Ghostery, but I discovered that ScriptKeeper (NoScript for Opera) blocks scripts faster, and because it blocks more, websites load faster. Of course the drawback is that it might break some sites, while Ghostery generally does not. Not a big deal, because nowadays I mostly browse my favorite and other "known" sites. If I need to do some "general browsing", I switch to "allow all scripts" and let Ghostery do the blocking.
     
  20. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    How script blocking is handled should be a function of your overall security policy. Ultimately, your overall security depends on 3 factors:
    1, Control over what is allowed to run on your system, how these processes can interact with each other and your hardware.
    2, Control over the traffic in and out of your system.
    3, Control over the content of the traffic that is allowed.
    Your core security policy should dictate your choice of tools and how they'll be configured. That core policy is selected to best enforce your needs and to best match your skills and the amount of time you're able to devote to it. It is pointless to try to implement a default-deny policy over web content if the rest of your system isn't built to the same policy.
    On a system using a default-permit policy, applications and their traffic are allowed unless they're identified as malicious or undesired. Your control over content should follow the same policy. The core policy of Ghostery blocks the known undesirables, making it a good match. In such a package, Sandboxie is a good option for dealing with the unknowns and undetected.

    On a system using a default-deny policy, only whitelisted applications can run. Internet access is also whitelisted. On such systems, Request Policy, NoScript, and Proxomitron can be used to whitelist content and connections.
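    As a minimal sketch of the difference (the lists are hypothetical; real products do far more, but the distinction shows up in how each treats an unknown item):

    ```python
    BLACKLIST = {"known-bad.example"}   # default-permit: block only the known bad
    WHITELIST = {"trusted.example"}     # default-deny: allow only the known good

    def default_permit(host: str) -> bool:
        return host not in BLACKLIST

    def default_deny(host: str) -> bool:
        return host in WHITELIST

    for host in ("trusted.example", "unknown.example", "known-bad.example"):
        print(host, default_permit(host), default_deny(host))
    # unknown.example is where the policies differ: allowed under
    # default-permit, refused under default-deny.
    ```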

    On default-deny systems, the user has 2 basic options.
    1, They can build their own whitelists.
    2, They can use applications that rely on vendor whitelists.
    Each is tailored to a different set of user needs. Your choice of applications or methods for content and connection control should match the rest of your package. Request Policy, for example, comes with several whitelists; you can choose one of them or build your own. Proxomitron offers an almost unlimited number of options for control, varying from very light to blocking most everything. Most of the script-blocking extensions have options that govern the amount of blocking, filtering, and user interaction that's required. Choose the option that best matches your core policy. For users who want to manage every process and application on their PC, as well as how they're allowed to interact, classic HIPS are good options. Request Policy and Proxomitron can give the user the same degree of control over the allowed traffic and content. Like a classic HIPS, this package will require a lot of user interaction and a fair amount of knowledge about the content they want to control. Both can and will break sites until you allow what is needed.

    Your needs, abilities, and time should dictate your security policy. Your security policy should determine your choice of tools and how they'll be configured. There are no "best" applications or tools. There are best matches to your needs. The more that your choices stay within your core security, the better your overall experience will be. Be realistic about your abilities and true to your needs.
     
  21. Sampei Nihira

    Sampei Nihira Registered Member

    Joined:
    Apr 7, 2013
    Posts:
    3,365
    Location:
    Italy
    Hi Rmus.
    Which version of Opera browser?
    TH.
     
  22. Rmus

    Rmus Exploit Analyst

    Joined:
    Mar 16, 2005
    Posts:
    4,020
    Location:
    California
    Hello Sampei Nihira,

    12.10.

    I downloaded 12.14 last year but haven't gotten around to installing it...

    ----
    rich
     
  23. Rasheed187

    Rasheed187 Registered Member

    Joined:
    Jul 10, 2004
    Posts:
    17,561
    Location:
    The Netherlands
  24. Rmus

    Rmus Exploit Analyst

    Joined:
    Mar 16, 2005
    Posts:
    4,020
    Location:
    California
    Thanks for the recommendations, Rasheed187, but I'll pass on them because I don't care for the idea of constantly tweaking and configuring and fiddling with gadgets. That would waste time better spent on the internet.

    As far as pages that are "heavy to load," well, in these days of fast connections, that is not a problem. I just checked my connection:

    speedtest2.jpg

    regards,

    ----
    rich
     
    Last edited: Feb 16, 2015
  25. bo elam

    bo elam Registered Member

    Joined:
    Jun 15, 2010
    Posts:
    6,146
    Location:
    Nicaragua
    Hi Rich, I don't know about others using programs like NoScript, but I spend maybe a few seconds a week, perhaps to add a few domains to my untrusted list. But that's it. Sometimes I go weeks without adding anything new to the whitelist. The time I spend maintaining NoScript is almost none.

    Bo
     
    Last edited: Feb 16, 2015