Consider a block cipher algorithm with the following properties:

- The input/output block length is 64 bits and the key size is 56 bits.
- Given a key K, the key scheduling requires 2 microseconds.
- After the key scheduling produces all subkeys, encrypting or decrypting a single 64-bit block requires 0.5 microseconds.

Questions:

- Compute the total time (in microseconds) required to encrypt 1 megabyte of data.
- Given two known values C and M such that C = E_K(M) under an unknown key K, how many years (at most) are required to crack the cipher on a single computer running at 3 GHz?

Notes: 1 microsecond is 10^-6 seconds; 1 megabyte is 2^20 bytes.
My answer: a 64-bit block requires 0.5 microseconds (2^-1), so one bit requires 2^-1 / 2^6 = 2^-7 microseconds.
Now each bit costs 2^-7 microseconds; for 1 megabyte of data:
1 megabyte = 2^23 bits, so the encryption takes 2^23 * 2^-7 = 2^16 microseconds, plus 2 microseconds for the key scheduling, giving 2^16 + 2 microseconds.
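To double-check that arithmetic, here is a small sketch (my own addition, not part of the exercise) that assumes the whole megabyte is encrypted under a single key, so the key schedule is paid only once:

```python
# Sanity check of the encryption-time calculation (assumes one key schedule
# for the whole message, as stated in the question).
KEY_SCHEDULE_US = 2.0      # key scheduling time, microseconds
BLOCK_TIME_US = 0.5        # time per 64-bit block, microseconds

data_bits = (2**20) * 8            # 1 megabyte = 2^20 bytes = 2^23 bits
blocks = data_bits // 64           # 2^17 = 131072 blocks
total_us = KEY_SCHEDULE_US + blocks * BLOCK_TIME_US
print(total_us)                    # 65538.0 microseconds = 2^16 + 2
```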
Now for the second part: with the key unknown, an exhaustive search needs on average 2^56 / 2 = 2^55 attempts.
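As a side note (again my own sketch, not from the question), it may help to make the average-case versus worst-case counts explicit, since the question's "at most" wording points at the full key space:

```python
# Number of key trials for an exhaustive search over a 56-bit key space.
key_bits = 56
trials_worst_case = 2**key_bits        # "at most": every key is tried
trials_on_average = 2**(key_bits - 1)  # expected: half the key space
print(trials_worst_case, trials_on_average)
```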
I'm wondering if my answers are correct? Could anybody please advise?
Thanks - Alaa
I guess you are implementing the $\mathsf{DES}$ algorithm. In that case, there is no provable security guarantee. The security of all symmetric-key algorithms rests on heuristic arguments and on the fact that no successful attack on them has been found so far. Otherwise, they are viewed in the ideal-cipher model, which is equivalent to the random-oracle model and hence prone to the same issues.
So for the second question, I assume an ideal cipher, i.e., one admitting no attack better than exhaustive key search. You are then performing a brute-force key search using the known pair $(M, C)$, so in the worst case the number of attempts you need is $2^{56}$, the size of the key space for a 56-bit key.
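To turn that attempt count into years on the 3 GHz machine, one has to assume how much work a single key trial costs; the exercise does not say, so the sketch below simply assumes one trial per clock cycle (a deliberately optimistic, hypothetical rate, not something stated in the question):

```python
# Worst-case brute-force time on a 3 GHz machine, ASSUMING one key trial per
# clock cycle (an assumption -- the exercise does not state the trial cost).
trials = 2**56                     # worst case: try every 56-bit key
rate_per_second = 3e9              # 3 GHz -> 3e9 trials per second (assumed)
seconds_per_year = 365 * 24 * 3600

years = trials / rate_per_second / seconds_per_year
print(round(years, 2))             # about 0.76 years under this assumption
```

If each trial instead costs the full key schedule plus one block encryption from the question (2.5 microseconds), the figure grows by several orders of magnitude, so stating the assumed per-trial cost is the important part of the answer.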