Let $r+1$ be the number of digits (bits) needed to represent the positive integer $n$ in binary as in:
For all $n \ge 1$ there exists a unique sequence $c_0, c_1, \dotsc, c_r$ such that $r$ is a non-negative integer, $c_k \in \{0, 1\}$ for all $k = 0, 1, \dotsc, r$, $c_r = 1$, and $$n = c_r \cdot 2^r + c_{r-1} \cdot 2^{r-1} + \dotsb + c_1 \cdot 2^1 + c_0 \cdot 2^0.$$
Prove that $r = [\log_2(n)]$.
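To make the claim concrete, here is a small worked example of my own (not part of the problem statement): for $n = 13$ we have $$13 = 1 \cdot 2^3 + 1 \cdot 2^2 + 0 \cdot 2^1 + 1 \cdot 2^0,$$ so $r = 3$; and indeed $\log_2 13 \approx 3.70$, so $[\log_2 13] = 3 = r$.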
I am trying to prove this without using induction.
Since $c_r = 1$, we have $n = 1 \cdot 2^r + c_{r-1} \cdot 2^{r-1} + \dotsb + c_0 \cdot 2^0$. So $n \ge 2^r$ and $\log_2 n \ge \log_2 2^r = r$. Since $r$ is an integer, $[\log_2 n] \ge r$.
On the other hand, since each $c_k \le 1$, $$n \le 2^r + 2^{r-1} + \dotsb + 2^0 = \frac{2^0(1 - 2^{r+1})}{1 - 2} = 2^{r+1} - 1 < 2^{r+1}.$$ Therefore $\log_2 n < \log_2 2^{r+1} = r + 1$. Combining the two bounds, $$\log_2 n - 1 < r \le [\log_2 n],$$ and since $[\log_2 n] \le \log_2 n$, the integer $r$ lies in the interval $(\log_2 n - 1,\, \log_2 n]$, which contains exactly one integer, namely $[\log_2 n]$. So $r = [\log_2 n]$.
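As an independent numerical check of the identity (a sketch in Python; the helper name `binary_digits_r` is my own, not from the problem), note that `int.bit_length()` returns exactly $r + 1$, the number of binary digits:

```python
import math

def binary_digits_r(n: int) -> int:
    """Return r, where r + 1 is the number of binary digits of n >= 1."""
    assert n >= 1
    return n.bit_length() - 1

# Verify r = [log2(n)] and 2^r <= n < 2^(r+1) over a range of n.
for n in range(1, 10_000):
    r = binary_digits_r(n)
    assert 2**r <= n < 2**(r + 1)          # n has exactly r + 1 binary digits
    assert r == math.floor(math.log2(n))   # the identity being proved
```

(The range is kept small so that floating-point `math.log2` is reliable; the proof itself is of course exact.)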