Websites and other methods to decide if a given domain should be (white/black)listed

Discussion in 'privacy technology' started by MrBrian, Jan 18, 2014.

Thread Status:
Not open for further replies.
  1. MrBrian

    MrBrian Registered Member

    Joined:
    Feb 24, 2008
    Posts:
    6,032
    Location:
    USA
    Websites and other methods to decide if a given domain should be whitelisted or blacklisted in browser extensions such as NoScript and RequestPolicy:
    1. http://website.informer.com/ (links to Web of Trust domain review)
    2. Search for domain in Google, Bing, etc.
    3. Web of Trust
    4. NoScript users: middle-click on domain in list for more info on it
    5. Privacy extension users: if a domain appears frequently in the "baseline" column of the "HTTP Requests" tab of the spreadsheet at https://www.wilderssecurity.com/showpost.php?p=2329463&postcount=28, consider blacklisting it

    Any others?
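     For anyone who keeps local lists, here's a rough sketch of how the first check could be scripted (untested; the file names and the fallback lookup URL are just placeholders):
     Code:
     # Check a domain against locally maintained lists before falling back
     # to a manual lookup on a reputation site. File names are placeholders.
     import urllib.parse

     def load_list(path):
         """Read one domain per line, ignoring blanks and # comments."""
         try:
             with open(path) as f:
                 return {line.strip().lower() for line in f
                         if line.strip() and not line.lstrip().startswith("#")}
         except FileNotFoundError:
             return set()

     WHITELIST = load_list("whitelist.txt")
     BLACKLIST = load_list("blacklist.txt")

     def classify(domain):
         d = domain.lower()
         if d in WHITELIST:
             return "whitelisted"
         if d in BLACKLIST:
             return "blacklisted"
         # Unknown: print a reputation-lookup URL to inspect manually.
         print("Check:", "http://website.informer.com/" + urllib.parse.quote(d))
         return "unknown"

     print(classify("example.com"))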
     
  2. MrBrian

    MrBrian Registered Member

    Joined:
    Feb 24, 2008
    Posts:
    6,032
    Location:
    USA
  3. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    With Request Policy, it's often not clear which of the blocked sites/servers are needed for a page to display or work properly. Some of the items to keep blocking will be obvious, like known ad servers. Some of the necessary sites will be obvious as well. For example:
    You visit arstechnica.com
    Request Policy blocks connections to arstechnica.net, which hosts the page layout and image files.
    Reloading the site will often narrow down the choices for you. Connection requests that disappear or are replaced with new ones are usually ads. Often you can hover over blocked content and see where it was supposed to originate; this often applies to user-posted images and files.

    More often than not, the connections that are necessary for the site to work properly will be at or near the top of the list blocked by Request Policy. Quite often, when you allow a connection to another server with necessary content, more links to unnecessary sites will be added.

    If no criteria are available on which to base your decision, start at the top of the list and use the allow-temporarily option. If the temporarily allowed connection doesn't fix the problem, block it again and clear the cache and cookies.
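     If you want a preview before allowing anything, a rough script like this (untested; it only sees static src/href references in the HTML, not script-generated requests) can list the third-party hosts a page refers to:
     Code:
     # List third-party hosts referenced in a page's HTML, as a rough
     # preview of what Request Policy will prompt about.
     import urllib.request
     from html.parser import HTMLParser
     from urllib.parse import urlparse, urljoin

     class HostCollector(HTMLParser):
         def __init__(self, base):
             super().__init__()
             self.base = base
             self.hosts = set()

         def handle_starttag(self, tag, attrs):
             for name, value in attrs:
                 if name in ("src", "href") and value:
                     host = urlparse(urljoin(self.base, value)).hostname
                     if host:
                         self.hosts.add(host)

     url = "http://arstechnica.com/"
     page_host = urlparse(url).hostname
     html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
     collector = HostCollector(url)
     collector.feed(html)
     for host in sorted(collector.hosts):
         if not host.endswith(page_host):  # crude third-party test
             print(host)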
     
  4. MrBrian

    MrBrian Registered Member

    Joined:
    Feb 24, 2008
    Posts:
    6,032
    Location:
    USA
    @noone_particular: Thanks for your input. I think my list is perhaps more useful for blacklisting rather than whitelisting. (RequestPolicy v1.x can be used in "default-allow" mode.)
     
  5. MrBrian

    MrBrian Registered Member

    Joined:
    Feb 24, 2008
    Posts:
    6,032
    Location:
    USA
    7. If using Ghostery, see what Ghostery is blocking. You can also enable the Ghostery Advanced option "Reveal tracker source URL lists by default (in the findings panel)" to see which domains have blocked items.
     
  6. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    I haven't looked at the 1.X versions. The default-allow mode is something I'd never use.

    I question just how much benefit there is to blocking connections to a site or server after you've already allowed them by default. If the site was going to drop trackers or something equally undesired, it's already been allowed to do so. It's a lot like the internet/trusted/restricted zones that IE6 used. It was pointless to put a site in the restricted zone after connecting to it in the more permissive "internet zone". If the site was malicious, the damage was already done.
     
  7. MrBrian

    MrBrian Registered Member

    Joined:
    Feb 24, 2008
    Posts:
    6,032
    Location:
    USA
    No question that from a security/privacy standpoint, default-deny is better. I'm just not sure I'd have much hair left after trying that with RequestPolicy for a while, though. I never had that reaction to NoScript.

    So then the question becomes: what's the next best thing I can do? 1. Use a blacklist-based scanner like Ghostery. 2. Supplement that with default-allow RequestPolicy. When browsing a given website for the first time, either clear the entire browser history (cache, cookies, etc.), or make sure you haven't browsed anything sensitive since you last cleared the browser history. Whether using RequestPolicy in default-allow mode adds enough benefit to justify the cost (including time spent configuring), I haven't decided yet.
     
  8. TheWindBringeth

    TheWindBringeth Registered Member

    Joined:
    Feb 29, 2012
    Posts:
    2,171
    Fortunately, one can use different default policies for different things. For example...

    I think many would find a default deny, selectively allow approach tolerable for Referer and cookies... even JavaScript and Flash... if they push past the initial surge of work to get things the way they want for the sites they frequent. After that, adjustments will be spaced out and far less painful. On the other hand, I think many would strongly prefer a default allow, selectively deny approach for images. Images are often a type of content one is explicitly looking for, and they help reveal whether a site is even worth spending time on.

    I think most people visit a relatively small number of sites that absolutely require special attention. Financial sites, shopping sites, web mail sites, social networking sites, forums, other sites where an account and personal information is involved, sites where activity would be too revealing in other ways, etc. It is important to use strong policies for these types of sites and a default deny, selectively allow approach for requests would help meet that objective. On the other hand, when researching something or free-ranging for pleasure one can end up visiting a very large number of "less important" sites. Ones that they haven't visited before and may not visit again or any time soon. For sites that fall into this latter category (sites in general), a default allow, selectively deny approach may be reasonable and it would certainly be much more tolerable.
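     As a sketch of the idea (the names below are made up for illustration, not any extension's actual rule format):
     Code:
     # Per-content-type defaults, overridden by stricter per-site policies
     # for sensitive sites. Hypothetical names, for illustration only.
     DEFAULTS = {
         "cookie":  "deny",   # default deny, selectively allow
         "referer": "deny",
         "script":  "deny",
         "flash":   "deny",
         "image":   "allow",  # default allow, selectively deny
     }

     # Sensitive sites get default-deny for every request type.
     STRICT_SITES = {"mybank.example", "webmail.example"}

     def decision(site, content_type):
         if site in STRICT_SITES:
             return "deny"
         return DEFAULTS.get(content_type, "allow")

     print(decision("randomblog.example", "image"))  # allow
     print(decision("mybank.example", "image"))      # deny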
     
  9. MrBrian

    MrBrian Registered Member

    Joined:
    Feb 24, 2008
    Posts:
    6,032
    Location:
    USA
    @TheWindBringeth: I agree 100%. I wish RequestPolicy had a setting for using default-deny on bookmarked sites and default-allow on other sites. I mentioned a technique for implementing default-deny on certain sites while using default-allow in general in the RequestPolicy 1.x thread; however, a bug prevents it from working well in some cases because there are extraneous "allow redirect" confirmations.
     
  10. TheWindBringeth

    TheWindBringeth Registered Member

    Joined:
    Feb 29, 2012
    Posts:
    2,171
    I haven't looked at, let alone used, RequestPolicy for a long time. If you mean https://github.com/RequestPolicy/requestpolicy/issues/345, I just read that. Totally shooting from the hip here, but the first description sounds like a situation where RequestPolicy is detecting what it needs to detect but is not reporting the redirects and related block options in the user interface. Can you find where rules are stored and manually create your own that would work in such a situation?
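     Purely hypothetically (I don't know where or in what format RequestPolicy actually stores its rules; the path and field names below are invented for illustration), the manual approach would look something like:
     Code:
     # Hypothetical sketch: append an origin->destination allow rule to a
     # JSON rule file by hand. Path and field names are invented; check
     # RequestPolicy's real storage format before trying anything like this.
     import json
     import os

     RULES_PATH = os.path.expanduser(
         "~/.mozilla/firefox/PROFILE/requestpolicy-rules.json")

     def add_allow_rule(origin_host, dest_host):
         try:
             with open(RULES_PATH) as f:
                 rules = json.load(f)
         except (FileNotFoundError, ValueError):
             rules = {}
         rules.setdefault("allow", []).append(
             {"origin": origin_host, "dest": dest_host})
         with open(RULES_PATH, "w") as f:
             json.dump(rules, f, indent=2)

     add_allow_rule("arstechnica.com", "arstechnica.net")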
     
  11. MrBrian

    MrBrian Registered Member

    Joined:
    Feb 24, 2008
    Posts:
    6,032
    Location:
    USA
    @TheWindBringeth: Yes, I believe it's due to that same issue. I haven't been able to figure out a way around it yet, but I haven't explored it thoroughly either. What happens is that the UI asks for permission to redirect, so the user has to confirm for the redirect to happen, i.e. extra button clicks are needed. In the case of the default-deny-for-some-websites method that I proposed, the number of extra redirection clicks can be so high that it's not practical in some cases.
     
  12. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    IMO, it's not much different than setting up a classic HIPS. There's a lot to do at first, but once the core configuration is done, you have much less to do and much more time between changes. When I added Request Policy, I deselected all of the lists and started with nothing. For almost every site I visited afterwards, I had to set permissions. Once a site was done, no more changes were needed there. Now the only time I set permissions is on sites I haven't been to since installing it. How inconvenient it is will depend greatly on your browsing habits. While I'm liable to browse anywhere at different times, most of the time I visit the same sites.

    I have to wonder how an extension like Request Policy interacts with NoScript or Ghostery. With extensions that have some overlap in function, which gets "first shot" at the content? How is that determined? Do they effectively work "in series" like electrical components or can one bypass or defeat the other? I've often wondered if a browser can be exploited in a way that would cause it to bypass extensions like NoScript. These are some of the reasons I've opted to filter content before it reaches the browser with Proxomitron instead of filtering it with the browser.
    The most redirect prompts I've ever seen on one page is 4, and that was a browser test site. Do you have a link handy that shows that behavior?
     
  13. MrBrian

    MrBrian Registered Member

    Joined:
    Feb 24, 2008
    Posts:
    6,032
    Location:
    USA
    In my limited testing of Ghostery+Adblock Plus+RequestPolicy with Network Monitor, if any extension blocks an element, then it's blocked. I haven't seen an exception so far.
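     In other words, the extensions appear to compose like a logical AND: a request goes through only if every extension allows it. Roughly (the functions below are stand-ins, not the extensions' real logic):
     Code:
     # Stand-in decision functions for several blocking extensions.
     def ghostery_allows(url):
         return "tracker" not in url

     def abp_allows(url):
         return "/ads/" not in url

     def requestpolicy_allows(url):
         return not url.startswith("http://thirdparty.example/")

     BLOCKERS = [ghostery_allows, abp_allows, requestpolicy_allows]

     def request_allowed(url):
         # A request is loaded only if no extension vetoes it.
         return all(allows(url) for allows in BLOCKERS)

     print(request_allowed("http://cdn.example/lib.js"))          # True
     print(request_allowed("http://cdn.example/ads/banner.png"))  # False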

    There is a known problem with unwanted request leakage when using HTTPS Everywhere with RequestPolicy. I started a separate thread recently about that issue. Ghostery and Adblock Plus might also have the same leakage issue when used with HTTPS Everywhere.

    The above comment of mine applies only when using RequestPolicy in default-allow mode, and trying to make a given site default-deny by adding appropriate rules; see post #5 in the RequestPolicy 1.x thread for more details.
     
  14. dogbite

    dogbite Registered Member

    Joined:
    Dec 13, 2012
    Posts:
    1,290
    Location:
    EU
    With HTTP Switchboard, the risk of leaks is really limited, since you can control exactly what goes out.
     
  15. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    I disabled HTTPS Everywhere a while ago. Haven't removed it yet. IMO, HTTPS is broken and needs to be replaced by a system that doesn't rely on certificate authorities. I'm also waiting to see these questions regarding the SSL Observatory answered in depth.

    Regarding Request Policy 1.X versions, if the default-allow options aren't used, how does it perform compared to the earlier version?
     
  16. J_L

    J_L Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    8,738
    You're missing big ones like VirusTotal and URLVoid. Instead of using all those services, why not use just one that aggregates results from multiple sources?
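     For example, VirusTotal's public v2 API can be queried directly (a free API key is required, and free keys are rate-limited to a few requests per minute; the key below is a placeholder):
     Code:
     # Query VirusTotal's public v2 URL-report endpoint for a given URL.
     import json
     import urllib.parse
     import urllib.request

     API_KEY = "YOUR_API_KEY"  # placeholder

     def vt_url_report(url_to_check):
         params = urllib.parse.urlencode({"apikey": API_KEY,
                                          "resource": url_to_check})
         req = "https://www.virustotal.com/vtapi/v2/url/report?" + params
         with urllib.request.urlopen(req) as resp:
             return json.loads(resp.read().decode("utf-8"))

     report = vt_url_report("http://example.com/")
     print(report.get("positives"), "of", report.get("total"),
           "engines flagged it")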
     
  17. MrBrian

    MrBrian Registered Member

    Joined:
    Feb 24, 2008
    Posts:
    6,032
    Location:
    USA
    If you're using default-deny, I think I would still go with 1.x; back up your existing ruleset first, just in case. You might like the 1.x ability to make multiple changes before a given website refreshes.
     
  18. MrBrian

    MrBrian Registered Member

    Joined:
    Feb 24, 2008
    Posts:
    6,032
    Location:
    USA
    Thanks :). For privacy-related issues, though, I'm more interested in whether a given domain has a privacy problem rather than a malware problem.
     
  19. gorhill

    gorhill Guest


    For a site named "privacychoice.org", I really thought their list of trackers could be downloaded from their website for non-commercial use.
     
  20. TheWindBringeth

    TheWindBringeth Registered Member

    Joined:
    Feb 29, 2012
    Posts:
    2,171
    From http://blog.privacychoice.org/2013/05/21/privacychoice-avg/:
    Domain Name: PRIVACYCHOICE.ORG
    Created On: 29-Apr-2008 21:45:59 UTC
    Last Updated On: 18-Jul-2013 16:07:27 UTC
    Registrant Organization: AVG Exploit Prevention Labs

    FWIW: http://web.archive.org/web/20120530095836/http://www.privacychoice.org/companies/all
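     The same check can be scripted (assuming a standard whois client is installed):
     Code:
     # Shell out to the whois client and print a few registrant fields.
     import subprocess

     def whois_summary(domain):
         out = subprocess.run(["whois", domain], capture_output=True,
                              text=True).stdout
         for line in out.splitlines():
             if line.lower().startswith(("registrant", "created",
                                         "creation", "updated",
                                         "last updated")):
                 print(line.strip())

     whois_summary("privacychoice.org")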
     
    Last edited: Jan 20, 2014
  21. gorhill

    gorhill Guest

  22. TheWindBringeth

    TheWindBringeth Registered Member

    Joined:
    Feb 29, 2012
    Posts:
    2,171