Truecrypt versus built-in SSD encryption

Discussion in 'privacy technology' started by T-RHex, Apr 7, 2013.

Thread Status:
Not open for further replies.
  1. T-RHex

    T-RHex Registered Member

    Joined:
    Jun 10, 2009
    Posts:
    97
    Hello,

    I've been using TrueCrypt to protect client-specific data; the primary reason is protection from theft, not deniability, so I haven't been using hidden partitions, just simple partition encryption.

    I'm now looking at getting an SSD (Plextor M5Pro) with built-in AES 256-bit encryption. What are the pros/cons of using one versus the other? I would guess using hardware encryption would have a performance edge given that it's built-in. Any downsides or other issues I should be aware of?

    Thanks,
    tr
     
  2. Taliscicero

    Taliscicero Registered Member

    Joined:
    Feb 7, 2008
    Posts:
    1,439
    TrueCrypt is open source, so you know there is no backdoor. Your SSD manufacturer probably has its own closed-source implementation. I would bet TrueCrypt's AES encryption performs about the same as your SSD's hardware encryption; if your CPU has hardware AES support, TrueCrypt benefits from it too.
     
  3. T-RHex

    T-RHex Registered Member

    Joined:
    Jun 10, 2009
    Posts:
    97
    Thanks for the input, Taliscicero. I've been happy with TrueCrypt's performance, but I didn't know if the new hardware would be an opportunity to go with something simpler (i.e., not third-party software) but just as secure. I also presume (but haven't had time to find out) that the hardware encryption is drive-based whereas TrueCrypt can be partition-based (i.e., I'll have 3 partitions, so with TrueCrypt that's 3 passwords to enter, whereas if drive-based it would only be one). Too much to learn, too little time.
     
  4. LockBox

    LockBox Registered Member

    Joined:
    Nov 20, 2004
    Posts:
    2,275
    Location:
    Here, There and Everywhere
    Well, a drive encrypted only with TrueCrypt can (and will) be imaged, and an adversary can attack the volume millions of times with that image. A drive with hardware-based encryption doesn't give an attacker that option, as the key is processed not through software but on the hardware itself, by its own crypto processor. Hardware-based encryption also allows for a set number of tries, after which the device securely wipes the data. Again, this can only be done effectively with hardware-based encryption. Hey, feel free to use both! Using both would certainly be some heavy-duty encryption.
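    To make the "millions of tries" point concrete: each guess against an imaged volume has to pay for the header key derivation first. A minimal sketch in Python, using PBKDF2 from the standard library (TrueCrypt's actual KDF uses RIPEMD-160, SHA-512 or Whirlpool; the hash and iteration count here are illustrative stand-ins):

```python
import hashlib

# Illustrative brute-force cost model, not TrueCrypt's exact KDF.
SALT = b"\x00" * 64     # TrueCrypt stores a random 64-byte salt in the header
ITERATIONS = 2000       # TrueCrypt-era iteration counts were in this range

def derive_header_key(password: bytes) -> bytes:
    # Every password guess an attacker makes against a disk image must
    # pay for this full derivation before it can even be tested.
    return hashlib.pbkdf2_hmac("sha256", password, SALT, ITERATIONS, dklen=64)

guesses = [b"123456", b"hunter2", b"correct horse battery staple"]
keys = [derive_header_key(g) for g in guesses]
assert len(set(keys)) == len(keys)   # distinct guesses, distinct keys
```

    The iteration count is the only knob slowing an offline attack down, which is why a strong passphrase matters so much more for a software volume that can be freely imaged.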
     
  5. anniew

    anniew Registered Member

    Joined:
    Mar 15, 2013
    Posts:
    92
    Excellent point...did not know that.

    But to Taliscicero's point, there may be a backdoor means of access. Though we suspect this is possible, is it probable or likely? Perhaps as a government requirement?
     
  6. S.B.

    S.B. Registered Member

    Joined:
    Jan 20, 2003
    Posts:
    150
    As to the likelihood of a backdoor in SSD encryption: if a backdoor exists, any evidence of it will almost certainly be very, very well hidden. Nevertheless, it is likely that one or more officials of some government (US or foreign) did exert pressure for a backdoor in SSD encryption, given that Microsoft has acknowledged government pressure for a backdoor in BitLocker.


    As a matter of logic, it's relatively easy to conclude that any such backdoor would become useless should knowledge of its existence reach the level of "likely". Thus such a backdoor would be used only in extremely rare cases -- and unless you personally represent a very high-level national security threat, chances are very good that any possible backdoor will never be used on you.

    Other paths of logic would further suggest that if a foolproof decryption backdoor is possible in connection with the SSD, then the same SSD drive could also be equipped with a foolproof mechanism to capture other decryption keys, e.g., from software such as TrueCrypt. Hence using a different encryption approach might not provide substantially improved encryption safety.

    All of which is to say: (i) no one remotely likely to speak would know whether the SSD encryption has a backdoor, but (ii) unless your data constitutes an extremely dangerous national security threat, it's unlikely that any SSD encryption backdoor poses a realistic threat to you.

    So check the features of the SSD encryption vs. TrueCrypt. If one is better for you than the other (performance, ease of use, etc.), then that's probably a good basis for your choice.

    __
     
    Last edited: May 6, 2013
  7. TheWindBringeth

    TheWindBringeth Registered Member

    Joined:
    Feb 29, 2012
    Posts:
    2,089
    Where encryption/decryption is done within a peripheral device, it would be easy for firmware and/or hardware designers to build in a backdoor, ranging from "unlikely to be discovered" to "extremely unlikely to be discovered". How many independent reviewers study firmware? How many dig into the hardware design looking for programmable/custom devices? How many open up chips to inspect their circuitry? Companies/employees that specialize in advanced data recovery and servicing, agency labs, etc. would seem the most likely parties to discover such a thing. Even if they did discover it, though, would they disclose it publicly? Recovery/servicing companies want a friendly relationship with manufacturers and could profit from offering decryption services to selective customers. Various agencies would probably want to keep it quiet and exploit it for their own purposes.

    There is a difference between using a backdoor and (publicly) disclosing that you've used one. In various scenarios simply accessing the data would be enough: the device could be returned as though the effort had failed, potentially misleading more people into believing that there is no backdoor, and the information could be exploited in ways which would not (clearly) reveal to the victim that the device had actually been decrypted.

    In order for such a device to go after thorough host based encryption/decryption (where only encrypted data is written to the device) I think it would have to do something like:

    1) Contain a built-in cracking engine that would work against the encrypted data it stores. This would be hidden from the outside world, but it also seems very challenging and likely to fail if the host tool and keys are strong.
    2) Be able to exploit the host<->peripheral interface in a way which allows it to grab the host tool's keys from memory. I'm not sure how viable this is for various interfaces, but this is something that would by nature be observable from the outside.
    3) Be able to put malware on the host or lure the user into installing it. A rigged driver and/or extra software for example. This approach would also be observable from the outside.

    There are creative ways to try to minimize the chances of #2 and #3 being caught, but particularly in the case of #3 there would be various antimalware companies, researchers, users, etc. on the lookout. I'm not sure it is reasonable to assume it is just as easy for an SSD to grab host-based encryption passwords/keys as it is to have a built-in backdoor for its own internal encryption.

    One of the things that would concern me is the potential for such a device to be stolen and make its way into the hands of someone who knows someone who could get the decryption done. Particularly if there is the possibility that a thief will have some idea of what they have, for example in cases where there is evidence that it comes from an individual or company that is likely to have things of value. There must be some networks of criminals who specialize in stolen storage devices, and I'd be surprised if there aren't some shady employees who sell information to and/or moonlight for such people.
     
  8. S.B.

    S.B. Registered Member

    Joined:
    Jan 20, 2003
    Posts:
    150
    The legal and practical liabilities associated with a backdoor would be carefully considered by any company contemplating intentionally implementing this sort of thing, and should be considered here. In a nutshell, a company that knowingly and intentionally sabotaged a product it was selling to the consumer would risk billions, possibly trillions, of dollars in damages, along with possible criminal sanctions (for promoting fraud, trade secret theft, etc.).

    Such considerations would almost definitely preclude a scenario in which more than just a few officers/employees would know of the deception, and/or in which the backdoor could be used for purposes other than protection of national security interests. Should a more general backdoor be implemented and become known to other company management, and/or used by other employees, a company would have no practical choice except to issue an immediate recall for all of the defective SSDs.

    Again, the potential for existence and use of an encryption backdoor posing a danger to any normal consumer/purchaser can realistically be taken to be in the range of extremely unlikely to virtually impossible.

    EDIT - Check the fine print. It is within the realm of possibility that a company could protect itself by informing the consumer/purchaser that the company will cooperate with governmental authorities to assist in decrypting data in the event of criminal charges/court orders, or the like. So read any product fine print carefully.
    __
     
    Last edited: May 7, 2013
  9. anniew

    anniew Registered Member

    Joined:
    Mar 15, 2013
    Posts:
    92
    Great points! This got me thinking...

    If such a thing existed, as you mentioned, there would never be acknowledgement that it was used.

    I would think that the existence of a backdoor in a commonly distributed technology would be open to organized hackers to want to exploit.

    Therein lies some game theory. One issue obviously being liability; the other, the risk that an adversary has the capability to reverse-engineer or otherwise hack their way to discovering the existence of a backdoor, and then to exploit it.

    Just today a LEA/DoD agency announced specifically that China is a major active source of hacking attacks on US government and business. Many of these devices are made in China. So, there is well financed capability.

    I would think the correct decision is to not intentionally build a backdoor, as the consequence of exposure of the content to adversaries is greater than the value of exercising it for baddies here, even for having "first crack" opportunity for espionage on the adversaries.

    It is like creating an atomic bomb that only you have the key to detonate, and then giving (or selling) everyone several of those bombs to possess. Eventually, someone will figure it out. Then where would we be?
     
  10. S.B.

    S.B. Registered Member

    Joined:
    Jan 20, 2003
    Posts:
    150
    On further thought, it seems to me the more realistic issue here is the possibility of a selectively creatable backdoor. In particular, addressable firmware could provide a possibility to modify specific encrypted SSDs pursuant to a Court Order, National Security Order, or the like. This would conform to current practices of installing keyloggers selectively on individual computers pursuant to valid Court Orders. A firmware modification should be less detectable than a physical keylogger or a software keylogger.

    Similarly, selective modification to encryption software could provide selectively creatable backdoors, although small package software with verifiable hash keys, such as TrueCrypt, should be more or less immune to invisible injection of any such backdoor.

    Of course, given that conventional computer hardware typically includes addressable firmware in multiple locations, there is also the possibility of firmware-based keyloggers that could already be in use, and could be exceptionally hard to detect.

    Likewise, Secure Boot probably doesn't provide any realistic defense against various firmware modifications particularly when implemented outside of the OS, or even against manufacturer-sanctioned selective modification of the firmware-like efi code constituting the basis for Secure Boot.

    All of which is to say that an across the board backdoor in hardware encrypted SSD drives is not only unlikely, but generally unnecessary...

    __
     
    Last edited: May 7, 2013
  11. LockBox

    LockBox Registered Member

    Joined:
    Nov 20, 2004
    Posts:
    2,275
    Location:
    Here, There and Everywhere
    Guys, that's not how it works. There is no firmware that could decrypt hardware-based encryption. These drives are used by the biggest companies in the world to hide their secrets; governments use them. The liability of either (a) putting in a back door or (b) creating some way to decrypt the information remotely is simply an easy way for a company to completely lose its entire business. If you really think that's less secure than the free TrueCrypt (which is not free of issues itself, btw), that's just crazy. And I'm a fan of TrueCrypt! But I'll secure my data with hardware encryption, or both, before software alone.
     
  12. anniew

    anniew Registered Member

    Joined:
    Mar 15, 2013
    Posts:
    92
    Not really disputing this... admittedly, I don't know enough to do more than speculate about the possibilities.
    True. The unsaid question was whether the manufacturer would be compelled to insert a backdoor. I think the danger of that being abused by political adversaries far outweighs any "good" that could come of it.
    Agreed.
     
  13. TheWindBringeth

    TheWindBringeth Registered Member

    Joined:
    Feb 29, 2012
    Posts:
    2,089
    Although the discussion has zeroed in on "knowingly and purposefully installing a backdoor for backdoor purposes", the more general concern is that the device itself has some... any... kind of (feature) vulnerability that could be used by an unauthorized party to carry out decryption of its contents. Let's not forget that.

    WRT liability, it seems to me that numerous factors would have to be taken into consideration. What is the nature of the vulnerability? Is it perfectly obvious that someone went out of their way to create it? Can the guilty party actually be determined? Could a manufacturer claim it was an inadvertent mistake? Does it relate to a weakness that resulted from a decision that the manufacturer could claim was reasonable on "ease of use" or some other grounds? Does it relate to factory diagnostic capabilities that have legitimate purposes? What are the applicable liability laws and warranty/liability terms that were agreed to? Even if one could prove it was purposeful backdooring for backdooring's sake, identify the perp, and show the penalties would be high, government-granted immunity or other interference could theoretically come into play and provide a shield of protection which assures no significant damage will be done to them.

    @anniew: Indeed it would not seem wise to unleash a weapon that could be turned around and used against you or those you care about. On the other hand, sometimes people take risks or just have too much confidence in their ability to hide or control something. Some might think the greater risk is NOT backdooring such devices. Theoretically speaking, backdoored devices could be selectively deployed so as to try to keep them away from friendlies. They could also be made identifiable to friendlies without also granting the friendlies the ability to decrypt them.

    @S.B: I think it good to understand other ways in which someone might approach the problem. However, I'm not sure the existence of other sometimes useful approaches would negate the desire to backdoor storage devices themselves. There would not always be an ability to install a keylogger or otherwise rig the host. Example: a courier carrying a bare storage device, a raid which captures equipment but the targets are gone and not coming back, etc.

    @LockBox: The term "hardware-based encryption" is too vague to base conclusions on, I think. Particularly if we are talking in broader terms so as to encompass the wide range of storage devices that are advertised as having hardware-based encryption. It would be reasonable to assume that there is hardware involved and it is doing the heavy lifting, but that doesn't mean there isn't a programmable controller involved. FWIW, it appears to me that the Plextor M5Pro SSD is built around a Marvell 88SS9187 (a dual-core device with integrated SATA interface, AES engine, etc.) running custom Plextor firmware. The internals of that chip and the firmware capabilities would be central to the question of how secure stored data is against unauthorized decryption. Is there any means to access the media encryption key? Is there any means to predict what it will be when (re)generated? Is any password/authentication-related information stored internally such that the media encryption key could be decrypted and the data accessed in a normal manner? Via not only released firmware but also custom builds, a JTAG interface, whatever.

    A self-encrypting drive doesn't have to be perfect in order to be popular. Encryption straight out of the box without having to install extra software and wait for it to encrypt everything is mighty appealing. The ability to delete/change a media encryption key and instantly render previously stored data inaccessible is mighty appealing. A solution that is strong enough to foil most people, particularly when combined with physical protections, will often be considered good enough. IOW, I suspect such devices would sell well even if every manufacturer built recovery capabilities into the drive and came right out and said that.

    As for what level of protection the OP and his/her clients would want, I don't know. Frankly, I liked the "use both" idea.
     
  14. S.B.

    S.B. Registered Member

    Joined:
    Jan 20, 2003
    Posts:
    150
    Lots of intelligent and knowledgeable comments here.

    On hardware encryption in general, there are clearly benefits to be had; see, for example, this AnandTech article and this press release by the makers of WinMagic. The question in my mind is whether the benefits would be noticeable in my everyday life.

    Although hardware-based encryption might benefit corporate entities, it may not have the same benefits for individuals. One study monetized the cost of employees entering a password every day over the course of a year. But the daily time cost of entering a password notwithstanding, I'll stick with my password-only solution, thank you very much. There's nothing to lose or be stolen; everything is in my head. Decrypting my important data isn't hardware-dependent, or dependent on the availability of a USB key or a backup of TPM data.

    Something else may work better for you. No problem.

    As to SSD and other computer firmware -- I believe there are some risks and dangers there. To my way of thinking it's a bit naive to assume that highly skilled individuals and organizations (governments and criminals included) can't, or won't under any circumstances, adapt existing and/or new firmware to aid their efforts in capturing code and decrypting data. I'll also restate my view that the risk of normal computer users being targeted by any such attack is essentially nil.

    Best regards to all.

    __
     
    Last edited: May 8, 2013
  15. dantz

    dantz Registered Member

    Joined:
    Jan 19, 2007
    Posts:
    994
    Location:
    Hawaii
    The fact that a program is open source doesn't mean that it has no backdoors, it merely means that you are able to view (and perhaps modify) the code. Backdoors can be incredibly subtle. They even have backdoor hiding and finding contests at Defcon, with prizes going to those hackers whose backdoors are the most difficult to detect. All code is fully viewable, of course.

    Plus, there are many other locations (aside from a program's actual source code) where a backdoor or its equivalent might exist. How about the compilers that were used to assemble it? Or the OS that it runs on? Or the hardware that the OS runs on? And so on.

    Anyway, backdoors aren't the biggest risk; it's the bugs and design flaws that you really have to worry about. Although it's logically impossible to prove that a particular encryption product (whether hardware or software based) is reliable, secure, effective and thus 'trustworthy', sometimes you can prove that it isn't. The following links show some relatively recent examples of both hardware and software-based encryption products by major providers that went horribly wrong:

    Hardware (flash drives):
    http://www.zdnet.com/blog/hardware/...on-sandisk-and-verbatim-usb-flash-drives/6655

    Software:
    A package maintainer removed a couple of seemingly insignificant lines of code from a Debian software release, thus seriously crippling the all-important seeding process for the PRNG. This created a huge OpenSSL security flaw which affected many important servers worldwide and put a great deal of encrypted data at risk:
    http://www.debian.org/security/2008/dsa-1571

    And installing the patch wasn't enough. Notice this quote:
    "It is strongly recommended that all cryptographic key material which has been generated by OpenSSL versions starting with 0.9.8c-1 on Debian systems is recreated from scratch. Furthermore, all DSA keys ever used on affected Debian systems for signing or authentication purposes should be considered compromised"

    More discussion here:
    http://www.schneier.com/blog/archives/2008/05/random_number_b.html
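    The failure mode is easy to model: the removed lines gutted the entropy sources, leaving (essentially) only the process ID as the seed, so the whole keyspace collapsed to roughly 32k possibilities. A toy Python model (not OpenSSL's real code path) of why a tiny seed space is fatal:

```python
import random

# Toy model of DSA-1571: if the PRNG seed is only the process ID,
# there are at most ~32k distinct "keys" an affected system can generate.
MAX_PID = 32768  # default pid_max on Linux at the time

def toy_keygen(pid: int, nbytes: int = 16) -> bytes:
    rng = random.Random(pid)             # seeded by the pid alone
    return bytes(rng.randrange(256) for _ in range(nbytes))

# An attacker simply enumerates every possible seed:
all_keys = {toy_keygen(pid) for pid in range(MAX_PID)}
assert len(all_keys) <= MAX_PID
assert toy_keygen(4242) in all_keys      # every victim key is in the set
```

    This is why the advisory insists on regenerating keys: the keys themselves weren't "cracked", they were simply drawn from a pool small enough to enumerate, and precomputed lists of all weak keys circulated within days.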

    The bottom line is, screwups abound and you can't trust any encryption solutions 100%, even if they come from some of the top providers.

    But to answer the OP's original question: given the choice I would suggest using the drive's built-in hardware encryption, if it can be shown to be both reliable and secure (and good luck confirming that).

    I would tend to steer away from software encryption products such as TrueCrypt unless you are willing to be extremely careful with your encrypted data. I say this because a great many TrueCrypt users end up losing access to their encrypted data due to various end-user screwups. TrueCrypt's encryption scheme is delicate by design, and this makes it highly vulnerable. One wrong move and it's gone! And the reality is, human beings have been shown to be hopelessly inept on almost every level, and pretty much everyone screws up at one time or another. In fact, I'm amazed that we can even keep our passenger jets flying safely. I think we "fly by luck" more often than we might realize.

    Anyway, whichever route you choose, if you care about your data then the most important thing you can do is to keep backups.
     
  16. caspian

    caspian Registered Member

    Joined:
    Jun 17, 2007
    Posts:
    2,301
    Location:
    Oz
    That's interesting. But I guess you'd have to trust the product with the hardware encryption.
     
  17. caspian

    caspian Registered Member

    Joined:
    Jun 17, 2007
    Posts:
    2,301
    Location:
    Oz
    Hardware encryption like an external hard drive or USB stick? Do you have any recommendations?
     
  18. PaulyDefran

    PaulyDefran Registered Member

    Joined:
    Dec 1, 2011
    Posts:
    1,163
    dantz, thanks for the flash drive link, I forgot that one. I was looking for the FreeAgent/GoFlex/Passport link where that hardware encryption was busted too. I couldn't find it though...but I remember Scott Moulton talking about it.

    And as always CryptoAG! :D

    I trust software more, but you can use both on the same drive. Of course, if the hardware is purposely tampered with, it could be looking for software stuff too... Trust sucks :D

    PD
     
  19. dantz

    dantz Registered Member

    Joined:
    Jan 19, 2007
    Posts:
    994
    Location:
    Hawaii
    I would feel comfortable trusting the majority of reputable, well-known encryption products to protect low-risk, low-level data such as personal files, non-critical client data, etc., but I sure as s*** wouldn't expect any of them to withstand major scrutiny. No, not even TrueCrypt. I realize that a lot of users think otherwise, but in my opinion they are misguided. My advice to them would be to get rid of all illegal, dangerous or incriminating data ASAP. Just get rid of it! Life is better without it anyway.
     
  20. Enigm

    Enigm Registered Member

    Joined:
    Dec 11, 2008
    Posts:
    188
    re: "...reportedly cracked the AES 256-bit hardware-based encryption used on flash drives manufactured by Kingston, SanDisk and Verbatim."

    I do not believe they 'cracked AES 256'; they found either the static key or the ATA 'master password'! The user-provided password is NOT used as the encryption key; it only locks access to the storage area, like this:
    http://en.wikipedia.org/wiki/Parallel_ATA#HDD_passwords_and_security
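    The lock-versus-encrypt distinction can be sketched like this (a toy Python model; the names and schemes are mine, not taken from any real drive firmware):

```python
import hashlib, os

# Hypothetical names; no real firmware is being quoted here.
DEVICE_KEY = os.urandom(32)  # static media key generated by the device itself

def ata_unlock(stored_hash: bytes, password: bytes) -> bool:
    # ATA-password style: the password only gates access; it never
    # enters the data-encryption path, so bypassing this check (or
    # reading the flash directly) exposes the DEVICE_KEY-encrypted data.
    return hashlib.sha256(password).digest() == stored_hash

def derive_data_key(password: bytes, salt: bytes) -> bytes:
    # What you'd want instead: the user password actually derives (or
    # wraps) the data key, so without the password there is no key.
    return hashlib.pbkdf2_hmac("sha256", password, salt, 100_000)

stored = hashlib.sha256(b"user password").digest()
assert ata_unlock(stored, b"user password")
assert not ata_unlock(stored, b"guess")
```

    In the first scheme the data key exists on the device whether or not a password was ever set, which is exactly the property Enigm describes on the U3 drives.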

    I am basing my opinion mostly on 2 factors:
    1: My experience with the mass-production tools used to 'program' the controllers of flash drives. They all have a box where you can enter the encryption key if the controller supports encryption.
    2: SanDisk's U3 drives ALL supported hardware encryption, and on some production runs it was enabled by default, even if the user did not password-protect the device.

    I have this information from a trustworthy data-recovery company. They remove the NAND chips from defective units, place them in this 'magic' reader they have, and extract all the data from the NAND (provided it isn't the NAND that is toasted). They then noticed that on certain versions of the U3 drives there was nothing but 'random garbage' in the NAND, no matter whether the user had set a password or not. (Some time after I learned this, SanDisk started to sell 'Secure' flash drives, at a premium price! But that's another story...)

    But still an interesting read:
    http://janusseal.com/prod/as/pdf/SySS_Cracks_Kingston_USB_Flash_Drive.pdf

    I fear that many SSDs are doing exactly the same thing...
     
    Last edited: May 9, 2013
  21. PaulyDefran

    PaulyDefran Registered Member

    Joined:
    Dec 1, 2011
    Posts:
    1,163
    The problem is, that all over the world today, what is considered illegal, dangerous, or incriminating...isn't necessarily known to the people. A lot of it is determined in secret. Look at Thomas Drake.

    PD
     
  22. anniew

    anniew Registered Member

    Joined:
    Mar 15, 2013
    Posts:
    92
    Thanks for your reply.
    Good links, thanks.

    Between TheWindBringeth and S.B., I was hit with this thought:

    What about turning our assumptions on their head...maybe people would like the convenience of a backdoor as an "affordable" means to recover a drive where the password is forgotten.

    Not to make light of it, but with our aging population, there are many people where this may become an increasing real problem.
    Right. Like my point above, human error may drive some people to seek out the convenience that a backdoor process would offer. The encapsulation within hardware to hide and automate some of the processes would certainly make it more "fail-safe". However, as LockBox mentioned, and Caspian highlighted, there may be some downsides (e.g. the ability to back up and decrypt from other devices, and auto-wipe on repeated password failure).

    Good links, btw.
    Is this just a plug for the company, or do they offer a USB solution?...I couldn't find one.

    EDIT ON:
    Thanks to chronomatic, I understand your reference PD!
    EDIT OFF:
    Thanks for the simplified explanation.
    Agree...Thomas Drake is a holy cr*p example! :eek:

    To reinforce... sometimes it might simply be a peon who doesn't have anyone specific in mind for a target; for example, the FBI Tampa field office doing a "favor" for a Florida socialite and Petraeus. One may be guilty of something or not, but one can get blindsided should the PTB decide to cast a wide net to "find" targets.

    I wonder what the Boston Police did with the "information" they "found" when they did the sweep? Do we accept that they never "remembered" anything...
    http://www.slate.com/blogs/quora/20...ou_have_refused_to_let_police_enter_your.html

    One could imagine a "data sweep" occurring under similar "emergencies", so dantz' warning is well heeded for anyone knowingly involved in questionable activity.

    But, yes, we might not even know what could put us in the "wrong place" or on the "wrong side"...not sure what one does about that.
     
    Last edited: May 13, 2013
  23. chronomatic

    chronomatic Registered Member

    Joined:
    Apr 9, 2009
    Posts:
    1,343
    Not really. Go do a Google search on "CryptoAG." They were a Swiss crypto company back in the 80's and 90's (they are actually still around). They sold crypto hardware to governments and enterprises. Guess what? It turns out that the NSA had a secret deal with them the entire time and had a backdoor in all the hardware.

    CryptoAG sold hardware to Iran and other mid-east countries, so it is pretty obvious the NSA had a vested interest. It turns out that this backdoored hardware gave the NSA a treasure trove of intelligence including intelligence on who did various bombings. Indeed this is how the Iranians found out their hardware was tapped -- they noticed that the press was reporting who the suspects were like a day after various attacks (because certain officials in Washington had told the press that they had intercepted communications).

    As a result of this, the next time a CryptoAG employee visited Iran he was kidnapped and held hostage. Iran demanded several million from the company, which it eventually paid.

    My point is, even though the government probably has no interest in Joe Sixpack, I would NOT be so sure that such hardware is not backdoored. We do have a precedent for it already. The damages and liabilities CryptoAG would face if ever caught didn't seem to stop them from entering into an alliance with various intelligence agencies. After they were found out, it essentially destroyed their business (even though they still exist today).
     
  24. T-RHex

    T-RHex Registered Member

    Joined:
    Jun 10, 2009
    Posts:
    97
    Very interesting discussion, thanks all for your input. It's taken a much different path than I had even considered when I posed the topic in the first place; my concerns were with performance and convenience and I hadn't even considered the implications of design.

    My overriding concern (the reason I even use Truecrypt) is to protect personal and client information (including source code) in the event of theft. I like to think the chance of theft is non-existent but unfortunately -- like Jack Sparrow -- it's merely improbable and not impossible. I have nothing to hide from the authorities, so they're welcome to it. However, like many have brought up, an "official" backdoor could eventually be compromised (and more than could, it eventually would), and if someone were to steal encrypted data they might think it's worth more because something valuable is being hidden.

    Yes, that's one of the problems I'm starting to see: how to trust that the manufacturer has actually implemented the encryption properly (backdoors notwithstanding). After reading:

    http://www.plextoramericas.com/index.php/forum/27-ssd/7534-m5p-encryption

    and

    http://vxlabs.com/2012/12/22/ssds-with-usable-built-in-hardware-based-full-disk-encryption/

    I'm not so sure I want to trust any consumer-level hardware-based implementation (as opposed to industrial-level, which I would be more inclined to trust but would cost a lot more).

    I'm also a bit leery of hardware-based encryption going awry after a firmware upgrade (if necessary; I don't like to upgrade firmware when there are no issues) -- what is the probability of buggy firmware breaking encryption, or, worse, corrupting the encryption key(s) and losing all data?

    dantz, what do you mean there? Why is Truecrypt "delicate"? I've used it for about five years now and have lost only one partition (due, of course, to my forgetting the password). Other than forgetting passwords, what would be a wrong move?

    I often read something like "Truecrypt is trustworthy because it's open source" ... but unless we download the code, read it thoroughly, and build the .exes ourselves, how do we know the binaries come from the same publicly-accessible source, free from modification? For the sake of convenience, I just download the binaries and so trust that they are built from the same publicly-accessible code base.
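    At minimum you can check that the file you downloaded is the file the project published. This doesn't prove the binary was built from the public source -- only reproducible builds could do that -- but it does rule out a tampered mirror or a corrupted download, assuming the project publishes a checksum over a trustworthy channel. A minimal sketch (function names are my own, not any official tool):

```python
import hashlib
import hmac

def sha256_of_file(path, chunk_size=1 << 20):
    """Stream the file in chunks so a large installer needn't fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_published_digest(path, published_hex):
    """Compare the local file's SHA-256 against a published hex digest."""
    # compare_digest is overkill for a public checksum, but it's harmless.
    return hmac.compare_digest(sha256_of_file(path), published_hex.lower())
```

    In TrueCrypt's case the project actually published PGP signatures rather than bare checksums, which additionally ties the file to the developers' signing key; the trust question then shifts to how you obtained that key.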

    And of course, the backups are encrypted too, right? :blink:

    I basically have all of my data on an external drive with Truecrypt encrypted partitions and backup my data to another external drive with Truecrypt encrypted partitions. I use the same passwords on both drives -- is there anything exploitable in doing that? My thought on using the SSD encryption was then it would be one password for the entire drive instead of one per partition. (Then again, I kind of like keeping partitions offline when I don't need them, reducing their exposure -- at least momentarily -- to runtime malware).

    Thanks, all; this has been most illuminating.
     
  25. dantz

    dantz Registered Member

    Joined:
    Jan 19, 2007
    Posts:
    994
    Location:
    Hawaii
    One of TrueCrypt's major design goals is to serve the needs of those users whose primary requirements are secrecy, deniability and deception. In order to meet those goals the TrueCrypt team has deliberately avoided leaving identifiable signatures anywhere in the data, among other things.

    That's all fine if you're trying to deny the presence of your encrypted data altogether ("It's just random data, officer, I swear! I wiped the disk and left it random. Doesn't everybody?"), or if you wish to lie about the existence of your hidden volume ("No sir, I don't have a hidden volume in there, and you can't prove that I do. It's undetectable! You don't believe me? Here, let me show you.")

    Perhaps you can detect a certain amount of skepticism in my writing style. ;) I understand fairly well how TrueCrypt works, and in my opinion most users don't really understand what they're doing or how to properly (if you can call it that) make use of the so-called deniability features. They would be better off relying on the encryption alone and leaving the sneakiness to others. Or, to put it another way, there are a lot of unsophisticated users out there who still feel that "the dog ate my homework" would be a believable excuse.

    Anyway, the downside to all this sneakiness is that accidents resulting in data loss are much more likely to happen (since even Windows is fooled and is unable to recognize or protect most of TrueCrypt's encrypted data), and accidents are much harder to recover from when they do occur, since there are no markers showing where the data starts or stops. A great many robustness and recoverability features that could have been added to the program are deliberately missing in order to meet the above-mentioned goals of secrecy and deniability.
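    To make the "no markers" point concrete: a plaintext filesystem announces itself (an NTFS partition, for instance, carries a recognizable OEM signature near the start of its boot sector), while a well-encrypted volume is statistically indistinguishable from random bytes from the first sector to the last, which is exactly why recovery tools have nothing to latch onto. An illustrative sketch (helper names are mine):

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits of entropy per byte; uniformly random data approaches 8.0."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def looks_like_ntfs(boot_sector: bytes) -> bool:
    """A plaintext NTFS boot sector carries 'NTFS    ' at offset 3;
    encrypted volume data offers no such landmark anywhere."""
    return boot_sector[3:11] == b"NTFS    "
```

    A recovery tool scanning a damaged disk can search for signatures like that NTFS one to re-locate lost partitions; against a TrueCrypt volume every sector scores near 8.0 bits/byte and matches nothing, so there is nothing to find.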

    So you've only had one big screwup so far and haven't lost any vital data? It sounds like you've been lucky up until now, but spend some time browsing the TrueCrypt forums and you'll see how many users have gotten themselves into serious and frequently non-recoverable situations through self-inflicted goofups they didn't even know to avoid. I'd rather not try to list them all; just take my word for it that they are extremely plentiful, and users are finding (creating?) new ones all the time.

    The program won't stop you, it won't warn you (everything is supposed to be a big secret, remember?), it'll just sit there and let you totally screw up and destroy your data, and afterwards it will barely lift a finger trying to help you get your data back again. That's why I strongly recommend backing up all TC-encrypted data.

    And yes, of course the backups can be encrypted. Just make sure they are stored independently from your computer and do not remain connected.
     