For example, take $n = 2$.
The numbers from $1$ to $2^2$ are $1, 2, 3, 4$.
i.e. $1, 10, 11, 100$ in binary.
So the result is $1$: only one of them, namely $3 = 11_2$, contains the substring $11$.
For $n = 3$, the numbers $3, 6, 7$ (binary $11, 110, 111$) contain $11$, so the result is $3$.
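The small cases above can be reproduced with a brute-force sketch in Python (the function name `count_with_11` is my own):

```python
def count_with_11(n):
    # Count integers in [1, 2**n] whose binary representation
    # contains "11" as a substring.
    return sum('11' in bin(k) for k in range(1, 2**n + 1))

print(count_with_11(2))  # 1
print(count_with_11(3))  # 3
```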
Let $E_n$ be the set of integers $k$ with $0 \le k < 2^n$ whose binary representation does not contain $11$. (Counting $0$ to $2^n - 1$ rather than $1$ to $2^n$ changes nothing: the two ranges differ only in that one contains $0$ and the other $2^n = 100\ldots0$, and both of those avoid $11$.) If $k$ is such a number, then either $k$ ends in $0$, in which case $k = 2m$ for some $m \in E_{n-1}$; or $k$ ends in $1$, in which case the preceding bit must be $0$ (otherwise $11$ would appear), so $k$ ends in $01$ and $k = 4m + 1$ for some $m \in E_{n-2}$.
The above actually describes a one-to-one mapping between $E_n$ and $E_{n-1} \sqcup E_{n-2}$, where $\sqcup$ denotes a disjoint union. From this we get $|E_n| = |E_{n-1}| + |E_{n-2}|$. Noting $|E_2| = 3$ and $|E_1| = 2$, we finally have $|E_n| = Fib(n+2)$, where $Fib$ is the Fibonacci sequence.
Back to your original question: the answer is now $2^n - |E_n| = 2^n - Fib(n+2)$.
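The closed form can be cross-checked against brute force for small $n$; a minimal Python sketch (helper names `fib`, `count_closed_form`, and `count_brute` are mine):

```python
def fib(n):
    # Standard Fibonacci with fib(1) = fib(2) = 1.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def count_closed_form(n):
    # 2^n minus the count of "11"-free numbers, per the derivation.
    return 2**n - fib(n + 2)

def count_brute(n):
    # Direct count over [1, 2^n].
    return sum('11' in bin(k) for k in range(1, 2**n + 1))

# The two agree on every small case.
assert all(count_closed_form(n) == count_brute(n) for n in range(1, 15))
```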