Decryption challenge

Discussion in 'privacy technology' started by berndroellgen, Sep 13, 2011.

Thread Status:
Not open for further replies.
  1. Nebulus

    Nebulus Registered Member

    Joined:
    Jan 20, 2007
    Posts:
    1,613
    Location:
    European Union
    Yes, it would be good, but unfortunately it's not possible.
     
  2. x942

    x942 Guest

    You completely misunderstood the post. The file is not encrypted with a 200-character key; it's protected with a 6-character key, just like yours is. The 200-character string is the proof that you actually broke in. If you crack it and post the string, then we know you actually cracked it.

    Either way, get someone reputable to attempt to crack your cipher.

    Also, you keep saying it's open source. So you're saying I can write my own program that uses your algorithm?

    I'm not trying to be rude or offensive. I just can't trust something with little to no evidence that it actually works. Just posting a challenge doesn't mean anything. If it had as much attention as AES does and it was still uncrackable, then I would believe you, but thus far it does not.
     
  3. berndroellgen

    berndroellgen Registered Member

    Joined:
    Nov 5, 2010
    Posts:
    59
    Yep: a 6-character key (ASCII, or the full 8 bits per character?) and a 200-character message (pure noise).

    By increasing the number of rounds you effectively create a closed-source cipher, and even if I knew how you had modified AES, I'd have more than a tough time identifying the key.

    It may even be impossible to identify the key, because anybody who wants to break your construction has no way to tell a correct key from a wrong one.
    In other words: keys can only be recovered by brute force if the plaintext is noticeably different from randomness. This is usually the case, as web pages have headers, MP3 files have headers, ASCII text has very low entropy, etc.
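    That distinguishability argument can be illustrated with a toy sketch. Here a repeating-key XOR stands in for the real cipher, and the "mostly printable ASCII" heuristic is just one assumed way to recognize structured plaintext; both are illustrative choices, not anyone's actual algorithm:

    ```python
    import string

    def xor_cipher(data: bytes, key: bytes) -> bytes:
        # Toy stand-in for a real cipher: repeating-key XOR.
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    def looks_like_text(data: bytes) -> bool:
        # Heuristic: structured plaintext (e.g. an HTTP header) is mostly printable ASCII.
        printable = set(string.printable.encode())
        return sum(b in printable for b in data) / len(data) > 0.95

    plaintext = b"GET / HTTP/1.1\r\nHost: example.com\r\n"  # low entropy: a known header
    secret_key = b"K"                                        # 1-byte key space for the demo
    ciphertext = xor_cipher(plaintext, secret_key)

    # Brute force: a wrong key yields noise, the right key yields recognizable text.
    candidates = [bytes([k]) for k in range(256)]
    hits = [k for k in candidates if looks_like_text(xor_cipher(ciphertext, k))]
    # If the plaintext were itself pure random noise, no candidate decryption would
    # look any more "correct" than another, and the attacker could not confirm a hit.
    ```

    The same logic is why the 200-character challenge message being "pure noise" matters: it removes the very structure a brute-force search needs to recognize success.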

    As for my algorithm: yes, you can write your own program(s) with it. It's pure C++, which should compile on any standard compiler.

    It will never be able to get attention comparable to AES, at least not in the next 10 years or so. With time, some of the ideas will be adopted by others who also write encryption algorithms. Some of the underlying theory might make sense to them. Governments, however, will never promote encryption algorithms that feature a long key setup time and that require lots of resources in terms of chip space, as their primary goal is to keep the fragile balance between "good enough for civil use" and "possibility to access encrypted data when it's necessary". DES became weak after 20 years, and AES already shows signs of weakness (http://www.schneier.com/blog/archives/2009/07/another_new_aes.html).

    In other words: every 20 years or so we'll see a new "standard" cipher, which will of course be heavily promoted by certain certification bodies that all depend on their respective governments.

    In the link above, Bruce S. writes: "At this point, I suggest AES-128 at 16 rounds, AES-192 at 20 rounds, and AES-256 at 28 rounds. Or maybe even more; we don't want to be revising the standard again and again."

    Well, why not increase the number of rounds in the first place? Hmm... obviously I'm 100% right!!!
    But the very same Bruce would certainly have said, before 2009, "anybody who increases the number of rounds must be totally stupid". That man tends to use very strong words, and he's obviously very successful with the way he does things. There's the saying "One barking dog sets the street barking".

    Real progress is made behind closed doors. That's totally OK. New ideas become the object of ridicule. That works fantastically well in this field and costs almost nothing. That's also OK.

    But sometimes even civil applications require a non-standard approach to meet a technical requirement. In the field of cipher design there's simply not much out there.

    Let's imagine the U.S. embassy in Moscow using DES in 1991: stored messages would today be protected by NOTHING, yet the content might in some cases still be compromising today! It is almost 100% certain that those people did NOT use the "official standard" at the time, and it is just as certain that they are using something far more advanced today. Anything else would be highly unprofessional.
     
  4. Serapis

    Serapis Registered Member

    Joined:
    Nov 15, 2009
    Posts:
    241
    DES was developed by the NSA behind closed doors and was not subject to much analysis prior to its release. While it's debated whether the NSA thought it was faulty to begin with (the "key size" argument), its development was not an open discussion.

    AES, however, was developed from dozens of openly submitted algorithms that were analysed by any party that wanted to take a look at them. Rijndael was selected because it seemed to be the best at the time. But there was no conspiracy to push Rijndael as the algorithm because it was known to have flaws. As good as they are, it's hard to assume that the NSA's cryptographers are so good that they would find things the rest of the world's crypto experts would miss.

    The Schneier paper you just linked to shows the good results of transparent debate on the breakability of AES. I would hope that your work gets the same coverage and exposure by the relevant people. You would have to do it yourself, however, by discussing it directly with the expert circles, since crypto, after all, is your profession. The burden of proof in this case lies on you, not on the reader. Merely saying that the criteria were rigged in order to create a weak design doesn't make sense, since its design is not a black box. The world of academia aims to discover and publish new knowledge in a neutral manner and does not adhere to hidden agendas.

    Your pmc encryption is provided as open source, but what is important is that your specific implementation or program be provided as open source too. This is essentially what defines a strong crypto program. In that case, though, I don't get why anyone would pay for the program when they could just compile it from source.

    A cipher that requires more time to generate longer keys isn't necessarily better; it could point to an inefficient design. If there is enough latency in the encryption process, it won't be 'transparent' anymore, in the sense that the user will feel a performance hit. That's why the NIST competition was looking for a fast algorithm that could work on small chips. At the time there weren't powerful smartphones like today's, and encryption was needed on a wide range of devices that varied greatly in computing power. Scalability was a very important factor taken into account.
     
  5. berndroellgen

    berndroellgen Registered Member

    Joined:
    Nov 5, 2010
    Posts:
    59
    It's always fascinating to see how opinions differ. In this case, two totally opposite opinions both sound conclusive, depending on the individual point of view.

    That 32-bit codebreaking challenge could prove to be an important milestone. People's perceptions differ greatly; culture, life experience, age and profession all play a role. It's exciting to see how a number of (in my opinion) good ideas are already spreading.
    There's obviously very little discussion about alternative ciphers. The reasons for this are easy to understand. The future replacement for AES will, however, inevitably be an "alternative cipher" when looked at from today's perspective... and there will be a replacement for that in 5, 10, 20 or 30 years or more, assuming there is technical progress in the future.
     