CDN tracking by sites like CloudFront, Akamai, Limelight?

Discussion in 'privacy technology' started by gaiko, Aug 7, 2014.

Thread Status:
Not open for further replies.
  1. gaiko

    gaiko Registered Member

    Joined:
    Jan 14, 2014
    Posts:
    9
    Location:
    moldova
    I don't know too much about CDNs, but it seems like these networks host a huge chunk of the web, and if one is surfing from one "site" to the next which all just happen to be hosted by CloudFront or Akamai, that sounds like a tracking goldmine. Hence my main questions:

    1) How big an issue are these networks in terms of vacuuming up data? (Like so many sites probably do anyway; these guys just *seem* ideally positioned.)

    2) Assuming they are collecting info, how would one go about protecting oneself from this? (Hopefully in a way that doesn't involve "don't surf the net anymore".)
     
  2. mirimir

    mirimir Registered Member

    Joined:
    Oct 1, 2011
    Posts:
    6,030
    If they can, you must assume that they are, no?
     
  3. gaiko

    gaiko Registered Member

    Joined:
    Jan 14, 2014
    Posts:
    9
    Location:
    moldova
    Touché. Though I had kinda assumed they were, I still like the "confirmation of the crowds" to affirm that it's not all just in my head ;)

    So, the next question: what are some ways that people browsing CDN-hosted sites can minimize the info collected about them?
     
  4. MisterB

    MisterB Registered Member

    Joined:
    May 31, 2013
    Posts:
    1,103
    Location:
    Southern Rocky Mountains USA
    Ad blockers, script blockers, VPNs, domain blacklists, and deleting cookies each session, among others.
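    The domain-blacklist idea above boils down to a simple check on each outgoing request's hostname. Here's a minimal sketch in Python of the decision a hosts-file or extension-based blocker makes; the blocklist entries are made-up placeholders, and real lists contain thousands of tracker/CDN domains:

```python
from urllib.parse import urlparse

# Hypothetical blocklist entries for illustration only.
BLOCKLIST = {"tracker.example.net", "ads.example.org"}

def allowed(url: str) -> bool:
    """Return False if the URL's host is a blocklisted domain or a subdomain of one."""
    host = (urlparse(url).hostname or "").lower()
    return not any(host == d or host.endswith("." + d) for d in BLOCKLIST)

print(allowed("https://ads.example.org/pixel.gif"))  # False: exact match
print(allowed("https://cdn.ads.example.org/t.js"))   # False: subdomain
print(allowed("https://news.example.com/article"))   # True: not listed
```

    The subdomain check matters in practice: trackers routinely serve from rotating subdomains, so matching only exact hostnames would miss most of them.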
     
  5. gaiko

    gaiko Registered Member

    Joined:
    Jan 14, 2014
    Posts:
    9
    Location:
    moldova
    Hmmmm, I have most of those things going (plus some), but script blocking and/or control of cross-site requests is what I was really wondering about. Those two things seem to be what really "counts," and incidentally they seem to be the biggest "holes" in terms of info leakage (I am no guru; feel free to enlighten me if I am wrong here).

    Thing is, while I can turn adblocking and VPNs on/off and delete/block cookies and history, few other things seem to cripple usage of sites more than blocking and controlling cross-site requests. It seems like these requests *must* be enabled, but must they? I mean, instead of being forced to use some cloud service, is there a way for my browser to auto-redirect to the original source? (Assuming many individual sources are less able to track than a few "super cloud sources"; besides, I am not *that* concerned about speed.)
     
  6. mirimir

    mirimir Registered Member

    Joined:
    Oct 1, 2011
    Posts:
    6,030
    For the most part, I hinder the collection of information in typical ways: using private browsing, having the browser delete third party cookies at shutdown, forging referer, blocking all ads, blocking nonessential scripts, and so on. But mostly I focus on minimizing the value of collected information, by fragmenting my online activity among multiple personas and semi-anonymous sessions.
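    For reference, several of those browser-side measures (third-party cookie handling, session-only cookies, suppressing the referer, always-private browsing) can be set in Firefox via about:config or a user.js file. This is a sketch; the specific values are my choices, not necessarily mirimir's exact settings, and actually *forging* (rather than just suppressing) the referer typically requires an extension:

```js
// user.js sketch (Firefox) -- adjust to taste
user_pref("network.cookie.cookieBehavior", 1);        // refuse third-party cookies
user_pref("network.cookie.lifetimePolicy", 2);        // keep cookies only for the session
user_pref("network.http.sendRefererHeader", 0);       // never send the Referer header
user_pref("browser.privatebrowsing.autostart", true); // always start in private browsing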

    For core personas like mirimir, where potential risk from compromise is low, I use persona-specific VMs. They share host machines, but not with anything that involves my true name. I use other machines, on a separate LAN, for all true-name stuff. Each persona-specific VM reaches the Internet through a unique pattern of nested VPN chains. Overall, I use branching chains, which typically use the same initial VPN.

    I use semi-anonymous sessions for high-risk activity, generally using Tor, sometimes using Whonix, sometimes Tails, and sometimes custom Tor gateway VMs and workspace VMs. At times, when it seems appropriate, I segregate those VMs on separate host machines, on a separate LAN. All of my host machines use dm-crypt/LUKS full-disk encryption.

    Much of that is covered in my guides, which are published by iVPN.
     
  7. TheWindBringeth

    TheWindBringeth Registered Member

    Joined:
    Feb 29, 2012
    Posts:
    2,088
    Let's say a company serves www.example.com from its own dedicated systems, but uses a third-party CDN to serve heavy content from cdn.example.com. Theoretically, that heavy content might also be available via requests to www.example.com. Theoretically, you might be able to DNS-map cdn.example.com to the same IP address as www.example.com and fetch the heavy content that way.
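    That DNS-mapping idea is essentially a hosts-file override, which can also be sketched in code. Here's a minimal Python illustration that intercepts name resolution in-process; cdn.example.com and 203.0.113.10 are placeholders (in practice you'd substitute the origin's actual address, e.g. by first resolving www.example.com):

```python
import socket

# Placeholder mapping: send lookups for the CDN hostname to the origin's
# IP instead. 203.0.113.10 is a documentation address, not a real server.
OVERRIDES = {"cdn.example.com": "203.0.113.10"}

_real_getaddrinfo = socket.getaddrinfo

def _overriding_getaddrinfo(host, *args, **kwargs):
    # Swap in the mapped address for overridden names; pass everything
    # else through to the normal resolver.
    return _real_getaddrinfo(OVERRIDES.get(host, host), *args, **kwargs)

socket.getaddrinfo = _overriding_getaddrinfo

# Any client in this process now resolves cdn.example.com to the origin IP:
addr = socket.getaddrinfo("cdn.example.com", 443)[0][4][0]
print(addr)  # 203.0.113.10
```

    An /etc/hosts entry achieves the same thing system-wide. Note the caveats below still apply, and over HTTPS this can additionally fail certificate validation if the origin's certificate doesn't cover the CDN hostname.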

    Then again, perhaps such approaches would work but only by accident and you'd end up retrieving stale if not partially broken content. Perhaps there is some form of load balancing/DOS protection going on that would make it work sometimes and sometimes not. Perhaps the heavy content only exists on the third-party CDN and some internal servers the company uses for development.

    IOW, it would seem an ugly and unreliable path to pursue. Plus, a huge percentage of sites are entirely third-party hosted, which raises some of the same concerns, and there is nothing that can be done about that, except to press the operators to use their own infrastructure rather than third-party infrastructure when the nature of the site/service warrants it.
     
  8. gaiko

    gaiko Registered Member

    Joined:
    Jan 14, 2014
    Posts:
    9
    Location:
    moldova
    @mirimir: Kudos for your discipline. For regular browsing I do a bit of what you do, though not nearly as much. It does kind of answer my question, I guess: really getting around these clouds is not an option. Unfortunate, but I guess not a surprise these days. Amazing how broken/crippled the browsing experience is if one is even partially shielded from all the cruft on the net nowadays. Ah well, thanks for the input!
     
  9. gaiko

    gaiko Registered Member

    Joined:
    Jan 14, 2014
    Posts:
    9
    Location:
    moldova
    Well, that answers that. Thanks for the explanation; unfortunate that it wouldn't work, but oh well.
     
  10. MrBrian

    MrBrian Registered Member

    Joined:
    Feb 24, 2008
    Posts:
    6,032
    Location:
    USA
    From http://www.net-security.org/secworld.php?id=17297:
     
  11. mirimir

    mirimir Registered Member

    Joined:
    Oct 1, 2011
    Posts:
    6,030
    It might be fun to see how much content differs among different users.

    Do we know what sites create user-specific sub-sub-domains "to accelerate and optimise content delivery and enable individual visitor tracking"?
     