Discussion in 'privacy technology' started by TheMozart, Apr 28, 2012.
Trying it now... report back later.
I even have a 32 MB USB stick that still works; I still use it to save Office files for college.
I've only had one HDD fail on me, and it was a 5 GB HDD back in 2001.
At the moment my current 40 GB still works like a champ, BUT it's starting to have some issues (after 9 years of use).
Every now and then when I take that HDD out and it gets moved around, it starts to fail; it seems the internal platters are loose, but if you shake it a few times carefully it starts working again... go figure.
lol... just shake it... oh, you crack me up...
Hope you've got an image of that drive somewhere...
Another option is TrueCrypt within Dropbox etc etc...
You never indicated budget, do you want a paid service or a free slow service?
Other than that, yes, you can use zip software to encrypt, in which case most use AES; a password of 15+ characters would be needed to secure the data. If I were in your shoes I would use a TrueCrypt container to store my files, as you would have more options in terms of encryption and could adjust accordingly.
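As a rough back-of-the-envelope illustration of why 15+ characters is a sensible floor (my own arithmetic, not from the post above): 15 characters drawn from the ~95 printable ASCII characters gives close to 100 bits of entropy, far beyond any brute-force effort.

```python
import math

charset = 95   # printable ASCII characters
length = 15    # minimum length suggested above

# Each character contributes log2(95) ≈ 6.57 bits of entropy
# if chosen uniformly at random.
entropy_bits = length * math.log2(charset)
print(round(entropy_bits, 1))  # → 98.5
```

Of course this assumes a genuinely random password; a 15-character dictionary phrase has far less entropy than this estimate suggests.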
Call me an old fart, but I tend not to migrate my information to "cloud" services, as it is still a grey area in terms of applicable law, with cloud server farms spread over a wide range of countries. I practice redundancy and diversity, but I own all my external drives and off-site storage locations.
Perhaps you should explain yourself better.
Here's a link to the TrueCrypt forums with a better explanation from a long-time member, pepak.
Is cascading about making it harder to break the encryption? Simply because data is encrypted several times?
That is the assumption, but no real research has been done. Cascading could conceivably be weakening the encryption, too. Hopefully cascades don't weaken it; whether they strengthen it hardly matters, because each algorithm alone is already unbreakable by brute force, so it makes no difference if the cascade is "twice as unbreakable".
Why do cascades use 2 or 3 different algorithms? Why not an option like AES-AES?
Because the main motivation to use the cascade is, "If an algorithm shows a fatal flaw that makes it breakable quickly, we have one or two more algorithms in the cascade as a fallback".
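The fallback structure can be sketched in a few lines. This is a deliberately toy example (a keystream built from repeated SHA-256 hashing, which is NOT a real cipher and offers no security); it only illustrates how two independent layers with independent keys compose and decompose, so that if layer 1's algorithm were ever broken, layer 2 still stands.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    # Toy keystream from chained SHA-256 digests. Illustration only,
    # NOT cryptographically sound as a cipher construction.
    out, block = b"", key
    while len(out) < n:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:n]

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR with the keystream; applying it twice with the same key decrypts.
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

plaintext = b"secret data"

# Cascade: two independent "algorithms" with independent keys.
c1 = xor_cipher(plaintext, b"key-for-algorithm-1")
c2 = xor_cipher(c1, b"key-for-algorithm-2")

# Decryption peels the layers off in the opposite order.
d1 = xor_cipher(c2, b"key-for-algorithm-2")
assert xor_cipher(d1, b"key-for-algorithm-1") == plaintext
```

The point of the structure is the independence of the layers: an attacker who completely breaks the outer algorithm recovers only `c1`, which is still ciphertext under the other key.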
The idea is that using cascade encryption can potentially lead to significant weaknesses in the encryption due to bad implementation methods (the actual programming code that is responsible for carrying out the encryption algorithm). As Justin Troutman and others pointed out in recent threads, the majority of encryption methods are broken not due to a weakness in the math, but due to a weakness in the implementation.
You might want to take a look at this thread https://www.wilderssecurity.com/showthread.php?t=321583 , paying particular attention to Justin Troutman's posts. Take a look at this thread as well https://www.wilderssecurity.com/showthread.php?t=321749 .
But I have decided not to use cloud services or online backup services, because it takes forever to upload files; it's so slow that it's not practical to upload all the data.
You could try Cloudfogger (Thanks to Radeon101):
Password: a sequence of 10 "easy to memorize" characters, with at least one digit, at least one lowercase letter, at least one uppercase letter, and at least one "special character" such as ß Ç Æ ¿ õ ...
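A composition policy like that can be enforced programmatically. A minimal sketch using Python's `secrets` module (the `SPECIALS` set and function name are mine; note that a truly random password enforces the character-class rules but won't be "easy to memorize"):

```python
import secrets
import string

SPECIALS = "ßÇÆ¿õ!@#$%"  # sample "special characters"; pick your own set

def make_password(length: int = 10) -> str:
    """Generate a random password containing at least one digit, one
    lowercase letter, one uppercase letter, and one special character
    (regenerate until all four classes are present)."""
    alphabet = string.ascii_letters + string.digits + SPECIALS
    while True:
        pw = "".join(secrets.choice(alphabet) for _ in range(length))
        if (any(c.isdigit() for c in pw)
                and any(c.islower() for c in pw)
                and any(c.isupper() for c in pw)
                and any(c in SPECIALS for c in pw)):
            return pw

print(make_password())
```

`secrets` is preferred over `random` here because it draws from the OS's cryptographically secure randomness source.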
OR... ʇɐɥʇ ʎɹʇ uɐɔ noʎ ǝqʎɐɯ
That majorgeeks SSL certificate is being shown as invalid by Chrome.
You know, sometimes solutions might look stup... oh, never mind.
I got all the files backed up to my main HDD and soon to my new external HDD also.
SpiderOak works great. I back up TrueCrypt containers there in case they get corrupted or the drive gets wiped out, lost, or damaged.
On the topic of cascades: I remember reading on these forums (can't find it now) a post linking to a whitepaper that concluded there is a ~1-2% increase in security from using cascades, and no likely weakness in using them (assuming that since the developer could correctly implement each cipher separately, they can probably manage the cascade too). That said, since reading up and realizing how good AES really is, all my new drives are AES-256 only. Cascades just add slowdown and not much real security, considering AES is already uncrackable in practice.
I do "double-encrypt" some files by using more than one program (7-Zip and ccrypt/AxCrypt). This shouldn't cause issues, as you are using two independent programs and basically doing this:
Plaintext ---> Program 1 ----> Cipher text ---> Program 2 ---> Cipher text
The ciphertext from Program 1 becomes the plaintext input to Program 2; the second ciphertext is just random-looking (encrypted) data, same as the first.
This is useful if you don't trust one program/method fully. For example, I don't trust file-level encryption as much as block-level (TrueCrypt) encryption, mainly because of plausible deniability: a file encrypted with AxCrypt/7-Zip can be deduced to be a text file, zip file, etc. from its size and other characteristics (the file name isn't encrypted), while TrueCrypt produces a completely pseudo-random file from which no information can be gleaned (in practice).
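To make the metadata point concrete, here is a toy sketch (sizes invented by me, and `os.urandom` stands in for real ciphertext): per-file encryption typically produces output whose length tracks the input, so an observer can guess what kind of file it is, while a container's size is fixed when it is created and reveals nothing.

```python
import os

doc = os.urandom(48_200)                        # stand-in for a ~48 KB document
file_level_ciphertext = os.urandom(len(doc))    # per-file encryption: output size tracks input
container = os.urandom(10 * 1024 * 1024)        # 10 MB container, size chosen up front

# The per-file ciphertext leaks the original size (and often the name);
# the container's size says nothing about what, or how much, is inside.
print(len(file_level_ciphertext) == len(doc))   # True
print(len(container))                           # 10485760 regardless of contents
```

This is why a container full of pseudo-random data is harder to characterize from the outside than a directory of individually encrypted files.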
I think Wilders Security's certificate itself could be seen as "invalid", as it is self-signed (if I'm not mistaken).
For me, an "invalid" certificate means nothing terrible in itself, particularly concerning a well known site. There is nevertheless concerns leading to some examination when Certificate Patrol pops up, saying to me that something is wrong in the new certificate of the well known site.
By the way, notice you would not have worried about majorgeeks's certificate at all if I had written its http URL.
Yeah, I am not trying to scare everyone, just making an observation. There are many legit reasons why a cert may show up as untrusted, but there are also malicious reasons too.
Actually, I think SSL, as it is currently implemented, is pretty worthless in general. You should go watch some of Moxie Marlinspike's videos on youtube (at DEFCON, BlackHat, etc.) where he outlines the problems of SSL and provides some great alternatives.
Basically, the problem is that one CA can sign and certify 25% of the entire Internet's websites. If that CA gets breached, then anyone with a fake cert can intercept 25% of the world's "secure" web traffic. I guarantee you the U.S. government has been doing this for a long time. All they have to do is tell any CA, "Hey, give us the private key, or sign us our own cert." Most companies will agree without asking questions, and if they refuse, the government just issues them an NSL; then they have to comply and cannot tell anyone about it. But the government is not the only possible culprit: the guy who hacked Comodo, for instance, was very unsophisticated and still managed to intercept tons of traffic in Iran.
SSL is broken, and broken badly. You should *never* rely on it for any real security.
There actually has been some research in this area (i.e., code-based, game-playing proofs), which basically tells us that a cascade of three is the sweet spot; interestingly, it also tells us that two doesn't do much of anything, and four doesn't do much beyond that of three. As far as I know, that's where we're at.
Also, I'm not so much worried about mathematical interactions between cascaded primitives that could make the composite less secure. I could build a few primitives that would exhibit this, but it would be contrived and unrealistic. With modern block ciphers, I can't see this happening. Cryptographically, I don't have a problem with cascades.
Lastly, and to clarify my stance, when you introduce more code to an implementation, the attack space becomes larger, and you have more analytical ground to cover. The likelihood of a fluke in the implementation becomes greater. If I were going to burden an implementation with the extra code of an elaborate cascade construction, I'd want to know that it's an essential design decision. But the fact of the matter is that it's not; designs won't fall apart because you used a single primitive. But they will, and have, fallen apart because of the complexity of cramming in too much.
Of course, while more code doesn't automagically cause problems, it does up the risk. To conclude, I've seen several implementations fail because of multiple primitives where a single primitive would have sufficed, but I've never seen an implementation fail because only a single primitive was used. I go with what experience tells me. It's imperative that you consider what really happens, and what happens often, more so than what could happen, and what rarely, or never, happens.
Serpent's co-designer, Ross Anderson, echoes this. Read Ross's thoughts on this on page 94, in Chapter 5 of his book, Security Engineering: A Guide To Building Dependable Distributed Systems. Quoting Ross:
I want folks to be concerned, but about the right stuff, which is why I keep trying to drive this message home.
Great post Justin, thanks.