Rate calculation for sparse coding (significance coding)


How can I establish the approximate equality below?

Let $M \ll N$ be the number of significant coefficients; the value of $M$ itself is coded with $\log_2 N$ bits. There are $\binom{N}{M}$ different sets of $M$ coefficients chosen among $N$, so coding the approximation support of the $M$ coded coefficients without any other prior information requires a number of bits:

$$R_0 = \log_2 N + \log_2 \binom{N}{M} \approx M\left(1 + \log_2\frac{N}{M}\right)$$
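As a quick sanity check on how tight this approximation is (a sketch not from the original post; the function names are my own), one can compare the exact rate against the right-hand side for a few sparse regimes $M \ll N$:

```python
from math import comb, log2

# Exact rate: log2(N) bits for the value of M, plus log2 C(N, M)
# bits to index one of the C(N, M) possible supports.
def exact_rate(N: int, M: int) -> float:
    return log2(N) + log2(comb(N, M))

# The approximation claimed on the right-hand side.
def approx_rate(N: int, M: int) -> float:
    return M * (1 + log2(N / M))

for N, M in [(1024, 16), (65536, 64), (2**20, 128)]:
    r0, r1 = exact_rate(N, M), approx_rate(N, M)
    print(f"N={N}, M={M}: exact={r0:.1f} bits, "
          f"approx={r1:.1f} bits, ratio={r0 / r1:.3f}")
```

The ratio stays close to 1 in the regime $1 \ll M \ll N$, which is the regime the formula is intended for.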


Fix $M$. Take $f(N) = \log_2(N) + \log_2 \binom{N}{M}$ and $g(N) = M\left(1+\log_2\frac{N}{M}\right)$. Clearly,

$$ \lim_{N \to +\infty} f(N) = \lim_{N \to +\infty} g(N) = +\infty $$

Apply L'Hopital's rule:

$$ \lim_{N \to +\infty} \frac{f'(N)}{g'(N)} = \frac{M+1}{M} \approx 1 $$
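This limit can be verified numerically (a sketch of my own, not part of the answer): for fixed $M$, the ratio $f(N)/g(N)$ should indeed approach $(M+1)/M$ as $N$ grows.

```python
from math import comb, log2

# f and g as defined above, with the derivative-free ratio f/g
# checked directly for a fixed small M and growing N.
def f(N: int, M: int) -> float:
    return log2(N) + log2(comb(N, M))

def g(N: int, M: int) -> float:
    return M * (1 + log2(N / M))

M = 5
for N in [10**3, 10**6, 10**9]:
    print(f"N={N}: f/g = {f(N, M) / g(N, M):.4f}  "
          f"(target {(M + 1) / M:.4f})")
```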

This doesn't really give any insight as to where the approximation on the right arises though. Perhaps it could be better established through some combinatorial argument?
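One possible combinatorial route (a sketch under the standard Stirling-type bounds on binomial coefficients, not part of the original answer): from
$$\left(\frac{N}{M}\right)^M \le \binom{N}{M} \le \left(\frac{eN}{M}\right)^M,$$
taking base-2 logarithms gives
$$M\log_2\frac{N}{M} \;\le\; \log_2\binom{N}{M} \;\le\; M\left(\log_2 e + \log_2\frac{N}{M}\right).$$
Since $\log_2 e \approx 1.44$, and the extra $\log_2 N$ term in $R_0$ is of lower order than $M\log_2\frac{N}{M}$ when $1 \ll M \ll N$, both bounds agree with $M\left(1 + \log_2\frac{N}{M}\right)$ up to a constant factor on the $O(M)$ term, which may be where the "$1$" in the approximation comes from.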