Complexity of inversion by exhaustive search attack


Consider the following algorithm for inverting a certain function $F$, given a witness $W = F(K)$, so that we can recover the key. It simply checks all possible values and yields those that map to the witness. Here $N$ is the size of the domain of $F$ and $M$ is the size of its image.

   Input: an image w
1: shuffle the candidate keys k_1, ..., k_N with a random permutation
2: for i = 1 to N do
3:  if F(k_i) = w then
4:   yield k_i and stop
5:  end if
6: end for
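
As a concrete sketch, the search can be simulated in Python; the function `F` below (a random lookup table on a small domain) and the sizes are hypothetical stand-ins, not part of the question:

```python
import random

def invert_count(F, keys, w):
    """The search above, returning (key, number of F-evaluations)."""
    keys = list(keys)
    random.shuffle(keys)                      # step 1: random permutation
    for count, k in enumerate(keys, start=1): # steps 2-6: scan candidates
        if F(k) == w:
            return k, count                   # "yield k and stop"
    return None, len(keys)

# Empirical check (hypothetical sizes): the mean cost is about M for N >> M.
random.seed(0)
N, M = 2000, 10
costs = []
for _ in range(3000):
    table = [random.randrange(M) for _ in range(N)]  # a random F
    F = lambda k: table[k]
    w = F(random.randrange(N))                       # witness of a random key
    _, c = invert_count(F, range(N), w)
    costs.append(c)
print(sum(costs) / len(costs))   # close to M = 10
```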

While proving that the expected complexity is about $M$ for $N \gg M$, I have trouble understanding the following claimed equality:

$\sum_{i=0}^{N-1} i P[\text{complexity} = i] = \sum_{i = 0}^{N-1} P[\text{complexity} > i]$

Why is this the case? I understand that $P[\text{complexity} > i] = \left(1-\frac{1}{M}\right)^i$, and although it is not stated, I think that $P[\text{complexity} = i]$ should be $\left(1-\frac{1}{M}\right)^i\frac{1}{M}$.
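
Numerically, with hypothetical sizes satisfying $N \gg M$, the stated tail probabilities indeed sum to roughly $M$, matching the claimed expected complexity:

```python
# Hypothetical sizes with N >> M (not taken from the question).
M, N = 20, 1000
q = 1 - 1/M

# sum_{i=0}^{N-1} P[complexity > i] = sum q^i, a geometric series ~ M.
expected = sum(q**i for i in range(N))
print(expected)   # very close to M = 20
```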


BEST ANSWER

If $X$ is a random variable that assumes nonnegative integer values, then:

$$\begin{align*} \sum \limits_{i = 0}^{N - 1} P(X > i) &= \sum \limits_{i = 0}^{N - 1} \sum \limits_{j = 0}^\infty I\{j > i\}P(X = j) = \sum \limits_{j = 0}^\infty \sum \limits_{i = 0}^{N - 1} I\{j > i\} P(X = j) \\ &= \sum \limits_{j = 0}^\infty P(X = j)\sum \limits_{i = 0}^{ \min(N, j) - 1} 1 = \sum \limits_{j = 0}^\infty \min(N, j)P(X = j) \end{align*}$$

This means that your equation holds iff the complexity is at most $N$.
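
The identity above can be checked in exact arithmetic; the pmf below is a hypothetical example, not taken from the question:

```python
from fractions import Fraction

# Hypothetical pmf of a nonnegative integer random variable X on {0,...,6}.
pmf = {0: Fraction(1, 8), 1: Fraction(1, 4), 2: Fraction(1, 8),
       3: Fraction(1, 8), 4: Fraction(1, 8), 5: Fraction(1, 8), 6: Fraction(1, 8)}
assert sum(pmf.values()) == 1

def tail_sum(n):
    """sum_{i=0}^{n-1} P(X > i)"""
    return sum(sum(p for j, p in pmf.items() if j > i) for i in range(n))

# The answer's identity: the tail sum equals E[min(N, X)] for any N.
N = 4
assert tail_sum(N) == sum(min(N, j) * p for j, p in pmf.items())

# When X <= N surely, both sides reduce to E[X]:
N = 7
assert tail_sum(N) == sum(j * p for j, p in pmf.items())
print(tail_sum(7))   # E[X] = 11/4
```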

ANSWER

These are actually two ways of writing the mean of a random variable. Given your discrete random variable $X$ taking values in $\mathbb{N}$, write $\mathbb{P}$ for its probability law (and $\mathbb{E}$ for the associated expectation). By definition, you can write:

$$\mathbb{E}[X]=\sum_{i\geqslant 0 } i\times \mathbb{P}(X=i) $$

Furthermore, for any integrable nonnegative random variable, you can write $X=\int_{0}^{+\infty } 1_{X>t}\, dt$. Plugging this equality into the mean, you obtain:

$$\mathbb{E}[X]=\mathbb{E}[\int_{0}^{+\infty } 1_{X>t} dt]$$

From there you can swap the integral and the mean (by Fubini) and use the equality $\mathbb{E}[1_{X>t}]=\mathbb{P}[X>t]$ to obtain: $$\mathbb{E}\left[\int_{0}^{+\infty } 1_{X>t}\, dt\right] =\int_{0}^{+\infty } \mathbb{E}[1_{X>t}]\, dt =\int_{0}^{+\infty } \mathbb{P}[X>t]\, dt$$

Then, since your random variable $X$ takes values in $\mathbb{N}$, for all $t\in[k;k+1[$ the equality $\mathbb{P}[X>t]=\mathbb{P}[X>k]$ holds, and you obtain: \begin{align} \int_{0}^{+\infty } \mathbb{P}[X>t]\, dt &=\sum_{k\geqslant 0} \int_{k}^{k+1}\mathbb{P}[X>t]\, dt\\ &=\sum_{k\geqslant 0} \mathbb{P}[X>k] \int_{k}^{k+1} 1\, dt\\ &=\sum_{k\geqslant 0} \mathbb{P}[X>k] =\sum_{j\geqslant 1} \mathbb{P}[X\geqslant j] \end{align}

This is the desired result. Another way to see it, which might be more intuitive, is by reorganizing the terms in the definition of the mean (each step peels one unit off every remaining term): \begin{align} \mathbb{E}[X]&=\sum_{i\geqslant 1 } i\,\mathbb{P}(X=i) \\ &=\sum_{j\geqslant 1} \mathbb{P}(X=j) + \sum_{i\geqslant 1} (i-1)\, \mathbb{P}(X=i) \\ &=\mathbb{P}(X\geqslant 1) + \sum_{i\geqslant 2} (i-1)\, \mathbb{P}(X=i) \\ &=\mathbb{P}(X\geqslant 1) + \sum_{j\geqslant 2} \mathbb{P}(X=j) + \sum_{i\geqslant 2} (i-2)\, \mathbb{P}(X=i) \\ &=\mathbb{P}(X\geqslant 1) + \mathbb{P}(X\geqslant 2) + \sum_{i\geqslant 3} (i-2)\, \mathbb{P}(X=i)\\ &= \dots\\ &=\sum_{j\geqslant 1} \mathbb{P}(X\geqslant j)=\sum_{j\geqslant 0} \mathbb{P}(X>j) \end{align}
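
As a sanity check of the final identity $\mathbb{E}[X]=\sum_{j\geqslant 1}\mathbb{P}(X\geqslant j)$, here is a small exact computation with a hypothetical example (a fair six-sided die), not taken from the answer:

```python
from fractions import Fraction

# X uniform on {1,...,6}: a fair die.
pmf = {i: Fraction(1, 6) for i in range(1, 7)}

mean  = sum(i * p for i, p in pmf.items())             # E[X]
tails = sum(sum(p for i, p in pmf.items() if i >= j)   # sum_{j>=1} P(X >= j)
            for j in range(1, 7))
assert mean == tails
print(mean)   # 7/2
```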