Discussion in 'privacy technology' started by ronjor, Sep 21, 2010.
Well.......maybe it's not widely used because M$ closes the front gate while leaving a backdoor open....
And maybe it's not used because BitLocker requires a TPM chip. While they're not uncommon, not everyone has one (I don't) and not a lot of people trust them. Besides, TPM has security flaws.
Of course it is not widely used.
First, as mentioned before, you need a TPM module (although there are ways to make BitLocker work without one).
Second, BitLocker doesn't come with all versions of Windows Vista or Windows 7.
And there you have a margin within a margin that explains the small percentage of users.
It's a shame when security features are specific to the version of an OS; it'd be nice to see this available across the board. Microsoft published an adaptation of some analysis of mine regarding BitLocker, some years ago, and it's unfortunate that it [BitLocker] hasn't amassed a stronger fan base by capitalizing on Windows' market share.
I've not seen a better cryptographic design philosophy in any other disk encryption application. If nothing else, TrueCrypt could take a few design cues from it; after all, with all of the momentum TrueCrypt has -- by all means, take advantage of other good design elements in order to maintain it.
Open-source the code or it doesn't matter. Why should we trust this closed source nonsense? Sure, maybe M$ let you sign an NDA or something, but who's to say you're trustworthy (speaking hypothetically of course)?
Why should we trust this open source nonsense? It's not like you or I have or ever will read the source code, it's just something we blindly trust. Sure, other people may claim to have and say everything's peachy, but who's to say they're trustworthy?
For what it's worth, this is my word -- I have neither been involved with Microsoft nor did I actually see any of BitLocker's source code. Therefore, I cannot vouch for the integrity of their internal security practices and source code.
After getting lost in the verbosity of the open-source versus closed-source debate, I've been able to succinctly capture the essence of it all; that is, the open-source model has the potential to be more secure, but neither model -- open or closed -- is inherently secure. It's not the number of eyes that matters; it's the quality of eyes. Of course, context greatly shapes public perception, and Microsoft has a history of public distrust; it's unfortunate, really, since they have quite a few geniuses on staff.
My point is that there's cryptographic muscle behind BitLocker -- in particular, Niels Ferguson -- and I tend to trust his motivations; he made sensible design decisions that could serve TrueCrypt well. That's what I meant when I said they [TrueCrypt] could take a few design cues. These design decisions are regarding BitLocker's use of CBC+Elephant. What can be seen is that BitLocker's design team understands the importance of authentication in addition to encryption, and they took a gamble with designing their own cryptographic function to address it in such a way that it's secure (i.e., security reduces to that of AES-CBC) and performs well (i.e., after all, Windows is a consumer product).
I wouldn't expect this from TrueCrypt, simply because there are cryptographers behind BitLocker, and my feeling is that TrueCrypt is a group of developers that are good at following trends instead of setting them; in the case of cryptography, this is a really good thing. XTS isn't a bad choice, by any means; it's a standard and nobody will ever blame you for using a standard. I don't see TrueCrypt going the MacGyver route, like BitLocker, but they might consider looking into wide-block modes, like EME, as opposed to narrow-block modes, like XTS; the former has coarser granularity, which makes it much harder to manipulate data in a semantically useful way, as opposed to the latter's finer granularity. Of course, there can be significant performance hits when migrating from narrow to wide blocks, but for an application that offers cascades of multiple block ciphers, I doubt that's a deal breaker; in this case, you'd actually be getting an increase in cryptographic security.
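To make the narrow- versus wide-block distinction concrete, here's a rough Python toy of my own devising -- not XTS, not EME, and certainly not secure; an HMAC-based Feistel network just stands in for a block cipher. Flip one ciphertext bit and decrypt: in the narrow-block mode the damage stays inside one 32-byte block, while in the wide-block mode it scrambles essentially the whole 512-byte sector.

```python
import hashlib
import hmac

def _round(key: bytes, rnd: int, data: bytes, out_len: int) -> bytes:
    """HMAC-SHA256 as a Feistel round function, stretched to out_len bytes."""
    stream, ctr = b"", 0
    while len(stream) < out_len:
        stream += hmac.new(key, bytes([rnd, ctr]) + data, hashlib.sha256).digest()
        ctr += 1
    return stream[:out_len]

def feistel_encrypt(key: bytes, block: bytes, rounds: int = 4) -> bytes:
    half = len(block) // 2
    L, R = block[:half], block[half:]
    for i in range(rounds):
        L, R = R, bytes(a ^ b for a, b in zip(L, _round(key, i, R, half)))
    return L + R

def feistel_decrypt(key: bytes, block: bytes, rounds: int = 4) -> bytes:
    half = len(block) // 2
    L, R = block[:half], block[half:]
    for i in reversed(range(rounds)):
        L, R = bytes(a ^ b for a, b in zip(R, _round(key, i, L, half))), L
    return L + R

SECTOR, NARROW = 512, 32  # toy sizes; real XTS works on 16-byte AES blocks

def narrow_encrypt(key, sector):
    # Each 32-byte block encrypted independently (per-block tweaks omitted).
    return b"".join(feistel_encrypt(key, sector[i:i + NARROW])
                    for i in range(0, SECTOR, NARROW))

def narrow_decrypt(key, sector):
    return b"".join(feistel_decrypt(key, sector[i:i + NARROW])
                    for i in range(0, SECTOR, NARROW))

key = b"demo key"
pt = bytes(range(256)) * 2  # one 512-byte "sector"

def damage(ct: bytes, decrypt) -> int:
    """Flip one ciphertext bit, decrypt, count plaintext bytes that changed."""
    ct = bytearray(ct)
    ct[100] ^= 0x01
    return sum(a != b for a, b in zip(pt, decrypt(key, bytes(ct))))

narrow_diff = damage(narrow_encrypt(key, pt), narrow_decrypt)
wide_diff = damage(feistel_encrypt(key, pt), feistel_decrypt)
print(narrow_diff, wide_diff)  # damage confined to one block vs. nearly the whole sector
```

That confinement is exactly what makes narrow-block modes easier to tamper with in a semantically useful way: an attacker can randomize one chosen block while leaving the rest of the sector intact.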
No, but I wish I did. Despite numerous attempts at communicating with them, I've never received a response. My perception of who they might be, from a cryptographic competence standpoint, is based solely on the design decisions they've made and how the software has evolved.
Although not a user, I root for them as well. The fan base -- some of it, at least -- can, at times, get a wee bit defensive when it comes to criticism; then again, critique is a good thing. When the moment comes that you're not getting any, that often means nobody cares.
TrueCrypt is the PGP of this era, in terms of popularity, and they're in the magnificent position to do a lot more, if they'd find some way to communicate. They need feedback and they need to interact with those who provide it. Security is a constant game of critique.
Well, I know they have Niels Ferguson working for them, and he is apparently one of the foremost names in cryptologic circles. I have heard him say that he would quit in protest if MS ever asked him or anyone else to implement backdoors. I have no reason to think he's lying.
I would agree with this, especially in an esoteric field like cryptography. However, any vendor or developer that keeps their code open is, de facto, more trustworthy to me. They are essentially saying "Look we have nothing to hide. We can't say we are more secure than the closed-sourced vendors, but at least we are open for review." This says a lot to me.
I am not an expert, but it seems more sensible to me to follow established trends in this field than it is to go the "MacGyver route." Of course, what Niels and company did might be perfectly secure, but as Schneier says, it's never a good idea, no matter how brilliant you are, to invent your own cryptographic primitives.
By the way, Justin, what's your opinion on dm-crypt/LUKS? It is often overlooked in these discussions since it is for Linux.
Oh, and, I watched the "Cryptographer's Panel" at the 2009 RSA conference and Brian Snow mentioned your name, Justin, as a co-author of a paper he found interesting related to new progress in the field. Actually, I found that whole panel discussion quite fascinating. It was interesting to see how testy Snow got when the others started asking him about NSA's factoring capabilities. He said something like "I've already done this song and dance before and won't do it again." It seemed to me like none of the other guys (Rivest, Shamir, Hellman, and Diffie) really cares for him. Shamir and Diffie especially, as they kept talking about NSA after Snow had made it clear he wanted to move on to another topic.
It makes a world of difference to have someone on board with such integrity when it comes to cryptography; they, as a company continuously working towards an improved security posture, needed that.
Indeed, the internal aspects of an open-source project are more transparent; this certainly makes it easier to establish a trust relationship. There's a tricky balance between competitive advantage and transparency.
Keep in mind that Niels co-developed Twofish alongside Bruce, so he's certainly capable; in the case of BitLocker's dedicated diffuser algorithm, Elephant, he designed it such that its security reduces to that of AES in CBC mode. In other words, if the "Elephant" in "AES-CBC+Elephant" doesn't work, then you're left with just the "AES-CBC" part -- nothing less.
As you'll read in their paper [PDF] outlining the design rationale behind BitLocker, he's well aware of the risk, but he engineers it so that if it fails, it fails gracefully, without reducing security below what you'd expect had they not taken that risk at all.
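The "graceful failure" argument is easy to see once you notice that the diffuser is just an unkeyed, invertible transform: if it contributes nothing, you've merely composed the cipher with a public permutation, which can't make the cipher's job any easier. Here's a toy diffuser of my own in Python -- emphatically NOT Elephant, just two chained-addition passes -- that is trivially invertible, yet a one-byte tweak to the input changes essentially every output byte:

```python
def diffuse(data: bytes) -> bytes:
    """Toy unkeyed, invertible diffuser (NOT Elephant): two chained-addition passes."""
    b, n = bytearray(data), len(data)
    for i in range(1, n):              # forward: each byte absorbs its left neighbour
        b[i] = (b[i] + b[i - 1]) % 256
    for i in range(n - 2, -1, -1):     # backward: each byte absorbs its right neighbour
        b[i] = (b[i] + b[i + 1]) % 256
    return bytes(b)

def undiffuse(data: bytes) -> bytes:
    b, n = bytearray(data), len(data)
    for i in range(n - 1):             # invert the backward pass, left to right
        b[i] = (b[i] - b[i + 1]) % 256
    for i in range(n - 1, 0, -1):      # invert the forward pass, right to left
        b[i] = (b[i] - b[i - 1]) % 256
    return bytes(b)

sector = bytes(range(256)) * 2
assert undiffuse(diffuse(sector)) == sector  # invertible, so no information is lost

tweaked = bytearray(sector)
tweaked[300] ^= 0x01                   # change a single input byte...
diff = sum(a != b for a, b in zip(diffuse(sector), diffuse(bytes(tweaked))))
print(diff)                            # ...and essentially every output byte changes
```

Since diffuse() needs no key and is invertible, anyone can apply or remove it; it adds no secrecy on its own, which is precisely why "AES-CBC + diffuser" can't be weaker than AES-CBC alone.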
But it's definitely a given that it doesn't pay to be different in cryptography. Following trends (i.e., standards) is a good thing.
It has been a while since I've looked at either; if you don't mind, I'll get back to you on this one.
What a panel, it was. The tension definitely added to the intrigue. It's too easy to pick on the NSA in our line of work. There comes a point when you expect very little in the way of answers, given the classified nature of most everything they do involving cryptography; still, some want to poke anyway. In regards to his mention of our work -- it was an honor, really, to get the thumbs up from someone who worked within the highest of standards, cryptographically-speaking, which is what we aim for with our work.
The work he's referring to is that of "Green Cryptography," which I co-developed along with the co-designer of the AES, Vincent Rijmen; essentially, it's about recycling the AES within a framework for authenticated encryption that preserves confidentiality and integrity under the strongest notions we have (IND-CCA2 /\ INT-CTXT), with an emphasis on minimizing implementation complexity while maximizing cryptographic security. With information assurance as the sanctified goal, having an open dialogue with someone of Mr. Snow's caliber is indispensable.
Green cryptography reflects on the real-world notion that cryptographic failure almost always occurs at the implementation level and exploits the gap of understanding between cryptographers, developers, and users. We're working on another project now, dubbed Mackerel, which not only looks at the relationship between cryptographic implementation and its stewards -- cryptographers, developers, and users -- but also cryptography's relationship with communication, and how the two diverge, much to security's dismay.
Mackerel will be introduced early next year, but much of it is still in its infancy; however, segments of it, such as AES-based authenticated encryption in the encrypt-then-authenticate generic composition, are quite ready for standardization, and are currently being drafted as standardization proposals.
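For the curious, the encrypt-then-authenticate generic composition is simple to sketch. This is an illustrative Python toy, not our draft: the "cipher" is HMAC-SHA256 run in counter mode as a keystream (a stand-in for AES-CTR), the MAC is HMAC-SHA256 over the nonce and ciphertext, and the tag is verified before any decryption happens:

```python
import hashlib
import hmac
import os
import struct

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy stream cipher: HMAC-SHA256 in counter mode (a stand-in for AES-CTR)."""
    out, ctr = b"", 0
    while len(out) < length:
        out += hmac.new(key, nonce + struct.pack(">Q", ctr), hashlib.sha256).digest()
        ctr += 1
    return out[:length]

def seal(enc_key: bytes, mac_key: bytes, plaintext: bytes) -> bytes:
    """Encrypt-then-authenticate: the MAC is computed over the ciphertext."""
    nonce = os.urandom(16)
    ct = bytes(p ^ k for p, k in zip(plaintext, _keystream(enc_key, nonce, len(plaintext))))
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def open_sealed(enc_key: bytes, mac_key: bytes, blob: bytes) -> bytes:
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    expect = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expect):   # verify BEFORE touching the plaintext
        raise ValueError("authentication failed")
    return bytes(c ^ k for c, k in zip(ct, _keystream(enc_key, nonce, len(ct))))

blob = seal(b"enc key", b"mac key", b"attack at dawn")
assert open_sealed(b"enc key", b"mac key", blob) == b"attack at dawn"

tampered = bytearray(blob)
tampered[20] ^= 0x01                           # flip one ciphertext bit
try:
    open_sealed(b"enc key", b"mac key", bytes(tampered))
except ValueError:
    print("tampering detected")
```

Two independent keys, a tag over the ciphertext, a constant-time comparison, and verification before decryption -- those are exactly the parts developers routinely get wrong when left to improvise, and exactly what a standardized composition takes off their plate.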
I find encryption software:
- to not fit my needs
- a general PITA.
We -- the cryptographic world -- focus on this because of the subtleties of implementing cryptography that aren't obvious to developers. Many expected encryption to provide integrity, which it clearly doesn't, which is why a vast majority of implementations I come across fail to implement a mechanism for preserving integrity. (I've interviewed 40+ "vendors," mostly small and a mix of non-profit open-source and commercial closed-source.) It's not the developers' fault that they misunderstood the fundamentals of cryptography; then again, we shouldn't expect them to understand cryptography in order to use it. Ensuring that cryptography is deployed properly is just as much of a concern to cryptographers as ensuring that it's designed properly.
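A two-minute demonstration of why encryption alone doesn't give you integrity: with any stream-style cipher, XOR-ing the ciphertext with a chosen difference XORs the plaintext with that same difference. The toy below (HMAC-SHA256 as a keystream; illustrative only, not any real product's scheme) lets an "attacker" rewrite a message without knowing the key, and nothing downstream can tell:

```python
import hashlib
import hmac
import struct

def keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    """Toy keystream: HMAC-SHA256 in counter mode (illustrative only)."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hmac.new(key, nonce + struct.pack(">Q", ctr), hashlib.sha256).digest()
        ctr += 1
    return out[:n]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

key, nonce = b"demo key", b"demo nonce"
msg = b"PAY ALICE $0100"
ct = xor(msg, keystream(key, nonce, len(msg)))

# The attacker knows the message format but NOT the key. XOR-ing bytes 4..8
# of the ciphertext with ("ALICE" XOR "EVE  ") redirects the payment, and
# decryption succeeds without complaint -- there's no integrity check to fail.
delta = xor(b"ALICE", b"EVE  ")
forged = ct[:4] + xor(ct[4:9], delta) + ct[9:]

print(xor(forged, keystream(key, nonce, len(forged))))  # b'PAY EVE   $0100'
```

This is precisely the failure a MAC over the ciphertext catches, which is why encryption without authentication is rarely what anyone actually wants.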
Real-world cryptography differs greatly from theoretical cryptography, and so do the players, so as prudent cryptographic engineering would have it, cryptographers must champion best-practice methodologies that take cryptographic decisions out of the hands of developers and place the reins in the hands of a cryptographic framework that makes all of these important decisions for them. By standardizing modes that combine encryption and authentication, which NIST has already done, as well as compositions that specify them together, as separate functions, which we're working towards proposing for standardization, we're simply making it easier for developers to get what they expect, and get it right, without being asked to play cryptographer.
On a side note, if you haven't already, I recommend reading Brian Snow's paper, titled, We Need Assurance!, from a 2005 address.
That is exactly my problem!
I ordered from Dell a W7 64-bit notebook with Ultimate on it so I could use the full-drive encryption, and Dell (made in China) didn't include the TPM chip and told me it is impossible to install it (BS).
They say none of their new products have a TPM chip.
Does ANY PC vendor sell TPM chip on their products? I really want to know.
BitLocker can also be used without a TPM. To use BitLocker on a computer without a TPM, you must change the default behavior of the BitLocker setup wizard by using Group Policy, or configure BitLocker by using a script. When BitLocker is used without a TPM, the required encryption keys are stored on a USB flash drive that must be presented to unlock the data stored on a volume.
Yes I recall reading something like that in W7 help.
What I need is the step-by-step detail of the tasks to actually do the work.
What changes in Group Policy? Will this work with 3rd-party firewalls?
I recall you need a certificate or some such validation, but I can find nothing (yet) on how/where to get that or how that works with the USB method.
But my main issue is I want the vendor names that do have a TPM chip!
Toshiba, Dell, Lenovo, HP, etc.
I'm surprised Dell is on your list as it is Dell who sold me the notebook I'm on!
Do you have a link to them that shows Dell can sell the TPM chip?
Here is a link to a Dell laptop that includes TPM: http://www.dell.com/us/en/business/notebooks/latitude-e6510/pd.aspx?refid=latitude-e6510&s=bsd&cs=04.
You tend to find it included on laptops with other security features like fingerprint readers, etc. You don't find it on netbooks or most mainstream consumer laptops.
Some desktop motherboard manufacturers like MSI include an interface on their motherboards for an optional TPM module which they sell separately.
It is interesting. I just completed a chat with Dell, and yes, the E6510 and other Latitudes do have the TPM chip.
BUT THEY CAN'T SHIP ONE OUTSIDE THE USA.
I'm just north of that!
The thing ~Phrase removed~ about BitLocker is that you can't use it with a simple password entry at boot.
Having to have the USB key with you all the time makes it practically pointless.
I don't have that much experience with using dm-crypt/LUKS, but I do appreciate many of the design ideas; with that in mind, I'll make an effort to look into it more.