One day (soon?) we can uninstall HTTPS Everywhere

Discussion in 'privacy technology' started by Stefan Froberg, Apr 11, 2018.

  1. Stefan Froberg

    Stefan Froberg Registered Member

    Joined:
    Jul 30, 2014
    Posts:
    747
    According to a Wired article from last year, around 50% of the web is now using HTTPS:

    Code:
    https://www.wired.com/2017/01/half-web-now-encrypted-makes-everyone-safer/
    
    Google also sees more and more encrypted traffic:
    Code:
    https://transparencyreport.google.com/https/overview?hl=en
    
    So what does that mean? It means that one day we can say a sad goodbye to our favorite HTTPS Everywhere extension. And the sooner the better. Why? Because as the HTTPS Everywhere rule list keeps growing (more and more HTTPS-enabled sites need to be added to it), so does the work of keeping it updated and the overhead the browser itself takes on.

    The list is currently about 90 MB in size and has 22,152 entries, with each entry containing one or more (sub)domain rules.
    Code:
    https://gitweb.torproject.org/https-everywhere.git/tree/src/chrome/content/rules?h=2018.4.3
    
    So several HTTPS domains are likely missing from that list, and as new domains pop up the list needs to be manually kept up to date.

    In other words, it probably won't scale in the long run, and your browser will get slower and slower while eating more and more memory...

    Could there be a better approach while we wait to reach that 100% HTTPS status?

    EDIT: Miscalculation; the correct size and entry count are 90 MB and 22,152.
     
    Last edited: Apr 11, 2018
  2. Minimalist

    Minimalist Registered Member

    Joined:
    Jan 6, 2014
    Posts:
    14,883
    Location:
    Slovenia, EU
    Maybe an addon that would replace all HTTP requests with HTTPS and change back to HTTP if HTTPS doesn't work? It would have to handle all the errors that can occur when incorrectly switching to HTTPS, and it would probably slow down browsing on HTTP-only sites.
    Of course IDK how this could be implemented, it's just an idea.
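    In broad strokes, the first half of that (the HTTP-to-HTTPS rewrite) could be a single WebExtension listener. A minimal sketch, assuming a Manifest V2 background script with the webRequest permission; hypothetical and untested, and it ignores the hard fall-back-to-HTTP part:
    Code:
    // Hypothetical WebExtension (Manifest V2) background script:
    // rewrite every plain-HTTP request to HTTPS before it leaves the browser.
    // Falling back to HTTP when HTTPS fails is the hard part and is not handled here.
    chrome.webRequest.onBeforeRequest.addListener(
      (details) => {
        const url = new URL(details.url);
        url.protocol = "https:";              // upgrade the scheme
        return { redirectUrl: url.toString() };
      },
      { urls: ["http://*/*"] },               // only plain-HTTP requests
      ["blocking"]                            // required to return redirectUrl
    );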
     
  3. Azure Phoenix

    Azure Phoenix Registered Member

    Joined:
    Nov 22, 2014
    Posts:
    1,559
    Look up the SmartHTTPS addon.

    Though the EFF stated there might be issues with that approach:
    https://www.eff.org/https-everywher...d-only-fall-back-to-http-if-it-isnt-available
     
  4. Minimalist

    Minimalist Registered Member

    Joined:
    Jan 6, 2014
    Posts:
    14,883
    Location:
    Slovenia, EU
    Thnx. At the moment I don't use an addon to switch to HTTPS.
     
  5. Stefan Froberg

    Stefan Froberg Registered Member

    Joined:
    Jul 30, 2014
    Posts:
    747
    Thanks @Minimalist and @Azure Phoenix

    I was wondering the same thing, why browsers could not just default to HTTPS, and I did not know that
    there already is a Firefox extension for that.

    The HTTPS Everywhere FAQ says:

    "There is no guarantee that sites are going to give the same response via HTTPS that they give via HTTP.Also, it's not possible to test for HTTPS in real time without introducing security vulnerabilities (What should the extension do if the HTTPS connection attempt fails? Falling back to insecure HTTP isn't safe). "

    Let's say some browser automatically defaults to HTTPS.
    So if I type www.example.com, it will automatically try https://www.example.com first.

    If the site has HTTPS enabled, the cert is not self-signed, and possibly some other security parameters pass (whether it is using TLSv1.0, TLSv1.1 or TLSv1.2, which cipher is being used, etc.), then everything is dandy and the browser loads the page.

    If any of the above does not pass then it's a cold stop. No HTTP page or anything, because I am
    only interested in the HTTPS version of the content, *but* maybe give the user an option to see an unencrypted version of the content?
    Just like browsers warn about and block connections to sites using a self-signed cert, but still let users see them if they really want to, by adding an exception to a list?

    That would solve the following problems:
    - Downgrade attacks. There is nothing to downgrade to automatically.
    If HTTPS does not pass then it's a cold stop, and it's up to the user whether viewing the content over an unencrypted
    channel is worth the risk.
    - No need to keep an ever-growing list, or maybe only a much, much smaller list (for those cases where a site's HTTPS support is so wonky that it needs redirecting rules).

    And if I typed http://www.example.com instead, the browser would again give the same warning about an unencrypted connection but would let me continue (or cancel) with a click of an OK button (or Cancel), and maybe a small checkbox below that could be ticked to add the site to a whitelist of exceptions.

    Basically, just an extension of the self-signed cert warning page that browsers already have.
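
    To make the idea a bit more concrete, here is a sketch of the "default to HTTPS, cold stop on failure, user-managed exceptions" behaviour. Again a hypothetical, untested Manifest V2 background script; the warning.html page and the whitelist are made-up names just for illustration:
    Code:
    // Hypothetical sketch: upgrade to HTTPS by default, never fall back silently,
    // and let the user opt in to plain HTTP per host.
    const whitelist = new Set<string>();   // hosts the user has explicitly allowed over plain HTTP
    
    // Upgrade every top-level HTTP request to HTTPS unless the user added an exception.
    chrome.webRequest.onBeforeRequest.addListener(
      (details) => {
        const url = new URL(details.url);
        if (whitelist.has(url.hostname)) return {};   // user accepted the risk
        url.protocol = "https:";
        return { redirectUrl: url.toString() };
      },
      { urls: ["http://*/*"], types: ["main_frame"] },
      ["blocking"]
    );
    
    // If the HTTPS load fails, do NOT silently fall back to HTTP. Send the tab to a
    // local warning page where the user can add an exception for that host instead.
    // (A real addon would also track which requests it actually upgraded.)
    chrome.webRequest.onErrorOccurred.addListener(
      (details) => {
        const warning = chrome.runtime.getURL(
          "warning.html?blocked=" + encodeURIComponent(details.url)
        );
        chrome.tabs.update(details.tabId, { url: warning });
      },
      { urls: ["https://*/*"], types: ["main_frame"] }
    );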

    That would cover the browser end.
    And as for HTTPS site owners, as the HTTPS Everywhere FAQ mentions:

    1. Set up a redirect from HTTP to HTTPS on your site.
    2. Add the Strict-Transport-Security (HSTS) header on your site.
    3. Add your site to the HSTS Preload list.

    So any site that is serious about the security of its users should do those things (a rough sketch of the first two steps is at the end of this post).
    They should also check their site against the Qualys SSL Server Test page:
    https://www.ssllabs.com/ssltest/
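
    And the promised sketch: a hypothetical Node.js illustration of steps 1 and 2, not something from the FAQ. The privkey.pem and fullchain.pem file names are placeholders:
    Code:
    // Hypothetical Node.js sketch of FAQ steps 1 and 2.
    import * as fs from "fs";
    import * as http from "http";
    import * as https from "https";
    
    // Step 1: redirect every plain-HTTP request to the HTTPS version of the same URL.
    http.createServer((req, res) => {
      res.writeHead(301, { Location: `https://${req.headers.host}${req.url}` });
      res.end();
    }).listen(80);
    
    // Step 2: serve the real site over TLS and send the HSTS header.
    // Only add "preload" if the domain is also submitted to the HSTS preload list (step 3).
    https.createServer(
      { key: fs.readFileSync("privkey.pem"), cert: fs.readFileSync("fullchain.pem") },
      (req, res) => {
        res.setHeader("Strict-Transport-Security", "max-age=31536000; includeSubDomains; preload");
        res.end("site content served over HTTPS\n");
      }
    ).listen(443);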
     
  6. TheWindBringeth

    TheWindBringeth Registered Member

    Joined:
    Feb 29, 2012
    Posts:
    2,171
    It is easy for a browser to safely detect encryption/cert problems. It is only *sometimes* easy for a browser to safely detect that HTTPS can't be forced. A page forced to HTTPS may load without any errors at all, but be different (in some negative and potentially non-obvious way) than the HTTP version which the site operator expects users to access. The browser can't automatically and safely and reliably perform a comparison of HTTP and HTTPS versions to inform the user when forcing HTTPS breaks something. Most users can't be relied upon to make such calls for themselves either.

    Having said that, some do hard force HTTPS across the board or for whole domains (even TLDs) and that works out OK for them. Partly because they know what to look for, they are technical and put in the time to build up their rules, etc. Partly because most of the sites *they* use now fully support HTTPS and the work involved isn't too burdensome. Thanks to the increased expectation and use of HTTPS, that will get easier. Eventually, the non-forceable cases should be so few and insignificant that broad forcing can be rolled out to even ordinary users. I don't think we are at that point yet. For now, the majority of users still need human-curated include/exclude rules (or at least exclude rules).
     
  7. Stefan Froberg

    Stefan Froberg Registered Member

    Joined:
    Jul 30, 2014
    Posts:
    747
    You mean a comparison of the actual content between the HTTP-delivered and the HTTPS-delivered channel? I have an idea but don't know how practical it would be...

    Then again, maybe it won't be needed, because more and more of the traffic will be available as HTTP/2.
    According to the Wikipedia article it's about 25% now, and I also see more and more HTTP/2 connections in my own server logs.
    https://en.wikipedia.org/wiki/HTTP/2

    Although encryption is not mandatory for HTTP/2 according to the specs, all the major browsers have stated that they won't support unencrypted HTTP/2 connections.
    So maybe that's the way to go when trying to get the whole net encrypted: just gradually start forcing sites to use HTTP/2 (not really any different from when browsers started forcing sites to upgrade their poor certs).
     
  8. TheWindBringeth

    TheWindBringeth Registered Member

    Joined:
    Feb 29, 2012
    Posts:
    2,171
    Yes. If both HTTP and HTTPS are available, how do you detect that the HTTPS server is serving what the user wants and it is correct in all important respects? The HTTPS server may host an entirely different site than the HTTP server (maybe what the user wants, maybe what the user doesn't want). Forcing HTTPS can trigger mixed-content blocking that breaks some things. If you force HTTPS for everything (top-level and subresource loads) you may run into an important subresource not being available via HTTPS. Failures to load may be caused by other factors, such as use of an ad blocker. Maybe the HTTPS issue is a presentation problem (significant? insignificant?). Maybe the HTTPS issue only becomes apparent when a certain feature is exercised. Etc.

    A comparison between HTTP and HTTPS content/behavior can be very helpful. But in a context where forcing HTTPS is the objective, HTTP loads are bad. The worst form would be passing a full URL, cookies, or allowing Javascript and access to other client side storage or fingerprintable data. So you don't want to expose a real user and their info to that. Maybe we could perform a sanitized HTTP load? Oh, but by stripping cookies or making other such changes we may change the content that is served to us and prevent our comparison. The comparison can also be thrown off by the different protocol and/or host substrings in absolute URLs, or even something as negligible as "Congratulations! You are accessing this site via HTTPS as you should be!" text or image in the HTTPS version.

    It is an interesting problem. If you think you have a good idea, prototype it and/or run it past the folks working on a related project :)

    I wonder if the HTTPS Everywhere guys are utilizing a scanner, analysis of common crawl, and/or some other means to at least pre-screen for HTTPS forceability.

    Edit:
    Had a moment to find some privacy-related HTTP/2 links I had stashed. Something to think about:

    https://tools.ietf.org/html/rfc7540#section-10.8
    https://www.torproject.org/projects/torbrowser/design/#identifier-linkability (11. SPDY and HTTP/2)
    https://www.akamai.com/us/en/multimedia/documents/white-paper/passive-fingerprinting-of-http2-clients-white-paper.pdf
     
    Last edited by a moderator: Apr 13, 2018
  9. Stefan Froberg

    Stefan Froberg Registered Member

    Joined:
    Jul 30, 2014
    Posts:
    747
    Well, I immediately thought of Christian Haschek's work and how he checked for bad proxies (i.e. proxies that modified HTML by injecting ads, scripts, etc.).

    Basically, his checker script takes a checksum of both the HTTPS-delivered content (that is, if the proxy even allows HTTPS in the first place) and the HTTP-delivered content and simply compares them to see if anything has been tampered with.

    https://blog.haschek.at/post/fd854
    https://github.com/chrisiaut/proxycheck_script
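
    Something along these lines, a hypothetical Node.js sketch of just the checksum comparison, not Haschek's actual script. Dynamic pages, cookies and so on will of course make the hashes differ even without tampering:
    Code:
    // Hypothetical sketch: hash the body served over HTTP and over HTTPS and compare.
    import { createHash } from "crypto";
    
    async function bodyHash(url: string): Promise<string> {
      const res = await fetch(url, { redirect: "manual" });   // take the raw answer, don't follow redirects
      const body = Buffer.from(await res.arrayBuffer());
      return createHash("sha256").update(body).digest("hex");
    }
    
    async function compare(host: string, path = "/"): Promise<void> {
      const [plain, secure] = await Promise.all([
        bodyHash(`http://${host}${path}`),
        bodyHash(`https://${host}${path}`),
      ]);
      console.log(plain === secure
        ? `${host}: HTTP and HTTPS bodies are identical`
        : `${host}: bodies differ (tampering, or just dynamic content)`);
    }
    
    compare("www.example.com").catch(console.error);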

    But as you noted, it's not just HTML but also cookies, scripts and other stuff that has to be interacted with... so maybe it won't work in this case.

    Or at the very least, there should maybe be some way to take those checksums of the HTTP-delivered HTML content without any traceability (a helper server? one that keeps a list of known-good content checksums? similar to Steve Gibson's SSL cert fingerprint service, but for HTML pages? trust issues?)...

    They had at least something called the "SSL Observatory", but the latest version seems to be missing it?


    Damn :(
     