HTTP Switchboard for Chrome/Chromium

Discussion in 'other software & services' started by apathy, Nov 25, 2013.

  1. FleischmannTV

    FleischmannTV Registered Member

    Joined:
    Apr 7, 2013
    Posts:
    1,093
    Location:
    Germany
    I have observed that globally allowed rules are now carried over into site-specific scopes. Is this observation correct?
     
  2. gorhill

    gorhill Guest

    This has always been the case: for convenience purposes, global rules (only the relevant ones) are copied to a newly created scope. This was especially convenient when scopes were first introduced, at a time when users had most of their rules in the global scope. But I think it is still convenient. It happens only at scope creation time, and a simple click on the eraser brings the matrix back to the default factory rules.
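
    Roughly, the idea at scope-creation time is something like this (a sketch with invented names, and the "relevant" test left abstract on purpose -- not the actual code):

    Code:
    // Sketch of the behaviour described above, not HTTPSB's actual code.
    function createScope(scopeKey, globalRules, isRelevantTo) {
        // Copying happens once, at scope-creation time only.
        return {
            key: scopeKey,
            rules: globalRules.filter(function(rule) {
                return isRelevantTo(rule, scopeKey);
            })
        };
    }

    function eraseScope(scope, factoryRules) {
        // The eraser button: back to the default factory rules.
        scope.rules = factoryRules.slice();
        return scope;
    }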
     
  3. luxi

    luxi Registered Member

    Joined:
    Aug 31, 2013
    Posts:
    74
    I mean it doesn't happen on the front page; it only redirects on a search page (for example, search for "test", then on that page a second search for "test 2" will redirect instead of serving results).
     
    Last edited: Apr 27, 2014
  4. gorhill

    gorhill Guest

    I confirm the behavior. No clue why. It's a server thing, somehow it seems to behave differently when it sees the agent as Firefox...

    Edit: I tried with Firefox itself, and the problem doesn't occur. So does the startpage.com server "know" it's a fake user agent string? Or is the server response different for Firefox, and that response trips up Chromium?
     
    Last edited by a moderator: Apr 27, 2014
  5. Jarmo P

    Jarmo P Registered Member

    Joined:
    Aug 27, 2005
    Posts:
    1,207
    The latest update unticked my setting "Auto-create temporary site-level scope". Is there a reason to have it unchecked?
    I also noticed that you can auto-create temporary domain-level scopes. Is that better than site-level temporary scopes?
     
  6. luxi

    luxi Registered Member

    Joined:
    Aug 31, 2013
    Posts:
    74
  7. dogbite

    dogbite Registered Member

    Joined:
    Dec 13, 2012
    Posts:
    1,290
    Location:
    EU
    Last edited: Apr 28, 2014
  8. luxi

    luxi Registered Member

    Joined:
    Aug 31, 2013
    Posts:
    74
    The real issue is with the particular user-agent string that ships with HTTPSB; however, I can confirm it works just fine with the spoofed user-agent active when using GET instead of POST (configured in Startpage's search settings).
     
  9. gorhill

    gorhill Guest

    It should have kept your setting; I put code in there to ensure backward compatibility. Looks like it didn't work. Argh. It annoys me when I fail to respect user settings. However, I still fail to see why the backward-compatibility code would not work.

    Edit: Never mind, found the mistake. The backward-compatibility code is correct; the problem is that I don't read the old setting in the first place, so the code to translate the old setting into the new one cannot do its job.
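
    For the record, the shim amounts to something like this (made-up setting names, not the actual code):

    Code:
    // Generic sketch of a backward-compatibility shim, with made-up setting
    // names -- not the actual HTTPSB code.
    chrome.storage.local.get(['autoCreateSiteScope', 'autoCreateScopeLevel'], function(items) {
        // The old setting has to be read first, otherwise the translation
        // below never gets a chance to run.
        if (items.autoCreateSiteScope !== undefined && items.autoCreateScopeLevel === undefined) {
            chrome.storage.local.set({
                autoCreateScopeLevel: items.autoCreateSiteScope ? 'site' : 'off'
            });
        }
    });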

    Regarding domain- versus site-level scope: https://github.com/gorhill/httpswitchboard/wiki/Change-log#0900
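
    In short (a hypothetical illustration, not taken from the change log): a site-level scope is keyed on the exact hostname you visit, while a domain-level scope is keyed on the registrable domain, so it also covers sibling subdomains:

    Code:
    // Illustration only -- the helper and hostnames are hypothetical.
    function scopeKeyFor(hostname, level) {
        if (level === 'site') {
            return hostname;                             // e.g. 'www.example.com'
        }
        // 'domain': naive registrable-domain extraction, enough for a sketch
        return hostname.split('.').slice(-2).join('.');  // e.g. 'example.com'
    }

    scopeKeyFor('www.example.com', 'site');      // 'www.example.com'
    scopeKeyFor('forum.example.com', 'domain');  // 'example.com' -- same scope as www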
     
    Last edited by a moderator: Apr 28, 2014
  10. apathy

    apathy Registered Member

    Joined:
    Dec 10, 2004
    Posts:
    461
    Location:
    9th Circle of Hell(Florida)
    Gorhill: I'm curious about the auto-create temporary 'domain-level' scopes option. Is javascript the only thing enabled? I'm curious because I don't see any changes on the matrix itself.
     
  11. gorhill

    gorhill Guest

    o_O

    Javascript has nothing to do with the auto-creation of a temporary scope. Can you elaborate?

    Edit: Look, I do not have any particular rules set for http://arstechnica.com/. I enable "Auto-create temporary domain-level scope", and when I visit http://arstechnica.com/, a temporary domain-level scope is created:

    [attached screenshot: to-forum.png]
     
    Last edited by a moderator: Apr 29, 2014
  12. apathy

    apathy Registered Member

    Joined:
    Dec 10, 2004
    Posts:
    461
    Location:
    9th Circle of Hell(Florida)
    Doh, I'm totally off kilter. What does the creation of the temporary scope do? Is it just for allowing temporary changes?
     
  13. gorhill

    gorhill Guest

    Yes, whatever rules are in a scope apply only to that scope. So if I were to whitelist `all` above, it would apply only to pages on http://arstechnica.com/, instead of everywhere. Scoping really enhances security: whatever rule is created in a scope applies only to that one scope, and the scope is activated according to the web site you visit. Other unrelated web pages currently opened are not affected by the rule changes in one particular scope. "Scoping" can be described as "sandboxing rules" to one particular web site. If you do not save the rules, and "Auto-delete temporary scopes" is enabled, they will be flushed from memory after a while, so auto-creation of scopes can be made into a use-and-forget feature.
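
    To illustrate how the active scope is picked, a simplified sketch (data structures and names invented for the example, not the actual code):

    Code:
    // Simplified sketch, not the actual code: rules come from the scope of
    // the page you are on, falling back to the global scope if none matches.
    function rulesetForPage(pageHostname, scopes, globalRuleset) {
        // Prefer the most specific scope: exact site first, then parent domains.
        var parts = pageHostname.split('.');
        for (var i = 0; i < parts.length - 1; i++) {
            var candidate = parts.slice(i).join('.');
            if (scopes.hasOwnProperty(candidate)) {
                return scopes[candidate];
            }
        }
        return globalRuleset;
    }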
     
  14. apathy

    apathy Registered Member

    Joined:
    Dec 10, 2004
    Posts:
    461
    Location:
    9th Circle of Hell(Florida)
    Oh my god, that's kick ass!! I see the light now. So I don't have to ruin my settings for other hosts now. Wow, I want to start from scratch now. It might be beneficial to add that to a FAQ somewhere, because I am sure more people like me were lost on those scopes. Thanks gorhill, 1.0 is going to be superb.
     
  15. gorhill

    gorhill Guest

    Yes, starting from scratch is a good move. I almost shipped the latest version with the feature enabled out of the box, but I changed my mind at the last minute: I figured I need more documentation out there, and I need to think hard about whether any complications could arise. Given that NoScript works in the global scope out of the box, I decided for now to keep HTTPSB this way, so as not to confuse new users coming from NoScript.

    If ever you want to block some hostnames in all scopes, you can use custom rules in the Ubiquitous rules tab.
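
    For example, something along these lines in the custom blacklist field, one entry per line (the hostnames are just placeholders):

    Code:
    tracker.example.com
    ads.example.net
    beacon.example.org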
     
  16. apathy

    apathy Registered Member

    Joined:
    Dec 10, 2004
    Posts:
    461
    Location:
    9th Circle of Hell(Florida)
    I thought I was keeping up with all your good work, and yet I missed a killer feature even though I visit your GitHub page every morning.
     
  17. gorhill

    gorhill Guest

    It's kind of discouraging to see that the myth that it is not possible to reliably block javascript execution in Chromium is still going strong, regardless of the amount of effort put into trying to dispel it.

    So, as a definitive reference on the topic if you ever come up against the myth, here: https://github.com/gorhill/httpswit...execution-reliably-in-Chromium-based-browsers

    I do see it popping up here and there all over the net, so hopefully this will help put it to rest eventually.
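
    For reference, the browser-side hook that makes per-site blocking possible is Chromium's content-settings API; a minimal sketch (the pattern is illustrative, not necessarily the extension's exact code):

    Code:
    // Minimal sketch: requires the "contentSettings" permission in manifest.json.
    chrome.contentSettings.javascript.set({
        primaryPattern: 'http://*.example.com/*',
        setting: 'block'
    }, function() {
        // No inline or external script will run on matching pages from now on.
        console.log('javascript blocked for *.example.com');
    });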
     
    Last edited by a moderator: Apr 30, 2014
  18. guest

    guest Guest

    Am I correct to say that blacklisting only works when the strict blocking option is enabled? Also, do the custom block rules support wildcards for hostnames?
     
  19. gorhill

    gorhill Guest

    Blacklisting works independently of strict blocking. Strict blocking's purpose is to dictate what happens in case of ambiguity. For example, whitelisting "anything from example.com" while blacklisting "frame from anywhere": in such a case, the cell "frame from example.com" is ambiguous.

    Strict blocking says that no part of the cell may be blacklisted for the cell to toggle into allow mode. That's a lot of words; it's better explained with pictures: https://github.com/gorhill/httpswitchboard/wiki/"Strict-blocking"-illustrated
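
    In rough code, the decision is something like this (a simplification, not the actual evaluation code):

    Code:
    // Simplified sketch, not the actual evaluation code. A cell is ambiguous
    // when its type column and its hostname row disagree.
    function resolveCell(typeAllowed, hostnameAllowed, strictBlocking) {
        if (typeAllowed === hostnameAllowed) {
            return typeAllowed;                   // no ambiguity
        }
        // Ambiguous cell: with strict blocking, a blacklisted part is enough
        // to block; without it, the whitelisted part is enough to allow.
        return strictBlocking ? false : true;
    }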

    Wildcards for hostnames are supported if you use ABP filter syntax. For example, `||example.*^` should work (as a block or allow rule).
     
  20. tlu

    tlu Guest

    Raymond, thanks a lot for the improvements in the latest versions. And yes, a version 1.0 is definitely justified :)

    What do you think about my suggestion #3 here for a version 1.x?

    BTW: May I suggest that you tag this wiki article as out-of-date?
     
  21. guest

    guest Guest

    Pardon me for being lost here, but how does that differ from greylisting? I understand that strict blocking will prevent blacklisted requests from being allowed even if the whole domain was whitelisted. But if the strict blocking option is disabled, then wouldn't blacklisting be pointless, since greylisting already denies non-whitelisted requests by default? I don't see how blacklisting and greylisting differ when strict blocking is disabled.
     
  22. gorhill

    gorhill Guest

    Blacklisting wouldn't be pointless; it would still accomplish what it already does with strict blocking on: block requests of whitelisted types (css/img out of the box) for blacklisted hostnames. If you look at the image on the project page, you can see that even though the `css` and `img` request types are whitelisted, the css and img cells for `survey.112.2o7.net` and `facebook-web-clients.appspot.com` are blocked (reddish). This is true whether strict blocking is on or off.

    The sole purpose of strict blocking is to dictate to the matrix how to interpret the ambiguity that arises for a type/hostname cell when the type itself is whitelisted and the hostname itself is blacklisted. I originally introduced strict blocking because of the malware that contaminated php.net: it became clear to me that even if you trust a particular web site, you may still not trust frames (for example) on that web site.

    Edit: I realize that maybe when you say "blacklisting" you really have in mind "blacklisting request types" (plugin, frame, etc.), in which case I can see how blacklisting request types appears pointless with strict blocking disabled. However, blacklisting request types is not pointless even with strict blocking disabled: to see why, whitelist the "all" cell.
     
    Last edited by a moderator: May 10, 2014
  23. wat0114

    wat0114 Registered Member

    Joined:
    Aug 5, 2012
    Posts:
    4,066
    Location:
    Canada
    Last night I discovered a problem with the following website: -www.tsn.ca. It does not like the HTTPSB extension. The webpage initially loads the graphics but then goes all white! The only fix I could find was to disable the HTTPSB extension; disabling matrix filtering won't even fix it. Here's the recipe I've been using for the site:

    Code:
    *.tsn.ca
        whitelist
            * 2mdn.net
            * 9c9media.com
            * ctv.ca
            * ctvdigital.net
            * krxd.net
            * newrelic.com
            * r.twimg.com
            * s0.2mdn.net
            * tsn.ca
            * twimg.com
            image *
            object *
            other *
            script 9c9media.com
            script ctv.ca
            script ctvdigital.net
            script tsn.ca
            script twimg.com
            script www.tsn.ca
            stylesheet *
            sub_frame cauth.9c9media.com
            sub_frame esi2.ctv.ca
            xmlhttprequest *
        blacklist
            sub_frame *
            * *
        graylist
            * js-agent.newrelic.com
    I'm using the Chrome browser on a Linux platform (based on Ubuntu 12.04 LTS).

    EDIT

    It looks like I found the problem: when I clear the checkbox for "Spoof User-Agent string by randomly picking a new one below every 15 minutes", the website loads normally.

    These log entries might have to do with the problem:

    Code:
    http://axislogger.appspot.com/LogMessage?callback=jQuery17205598731637001038_1399734130767&LogLevel=Error&Message=Akamai+timeout+after+15sec+http%3A%2F%2Fidp.securetve.com%2Frest%2F1.0%2Furn%3Abellmedia%3Acom%3Asp%3Atsn%3Aprod%3A1%2Finit%2F&ApplicationName=axisAuthApi
    script<a>http://axislogger.appspot.com/LogMessage?callback=jQuery17205598731637001038_1399734130766&LogLevel=Error&Message=Akamai+response+time&ApplicationName=axisAuthApi
     
    Last edited: May 10, 2014
  24. gorhill

    gorhill Guest

    Looking into this. Looks like there is a player in there somewhere which fails miserably if it cannot figure out which browser is being used. I have to investigate how they get the browser id, and why it is failing.

    Edit: Ok, I see. In order to fix issue #252, I had to create a fake `window.navigator` object. This was really tedious, as it appeared that my fake navigator object was trashed by the browser at some point while the page was loading. To prevent this, I freeze the fake navigator object, which means that nobody can change it. `tsn.ca` tries to add a property to the object, which fails, and this results in the failure of the whole page.
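
    The failure can be reproduced in a few lines (a minimal repro, not the extension's code; the property name is made up):

    Code:
    'use strict';
    var fakeNavigator = { userAgent: 'Mozilla/5.0 (spoofed)' };
    Object.freeze(fakeNavigator);           // nobody can alter it afterwards

    try {
        fakeNavigator.vendorFlag = true;    // what the page script attempts (made-up name)
    } catch (e) {
        // In strict mode this throws a TypeError; in sloppy mode the write
        // fails silently -- either way the page's expectation is broken.
        console.log(e.message);
    }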

    I will enter an issue for this and investigate what can be done.
     
    Last edited by a moderator: May 10, 2014
  25. wat0114

    wat0114 Registered Member

    Joined:
    Aug 5, 2012
    Posts:
    4,066
    Location:
    Canada
    Thanks Raymond. "tsn.ca" has given me grief before: it won't render Flash in Firefox on Linux platforms. I guess it's because the Flash version for Linux is older and the website doesn't like it, but of course that's a different, and probably unrelated, problem.
     