Let's say $p_k$ is the $k^{\text{th}}$ prime number, and suppose $p_1 \cdot p_2 \cdots p_{k-2}\cdot p_{k-1} < 2^{n}$ and $p_1 \cdot p_2 \cdots p_{k-1}\cdot p_{k} > 2^n$.
How does one obtain a good estimate of $k$ in terms of $n$?
That the threshold you happen to be interested in is $2^n$ doesn't open up any special approaches, as far as I'm aware, so let's more generally estimate the index $k$ such that $$p_1\cdot p_2\cdot \ldots \cdot p_{k-1} < x \leqslant p_1\cdot p_2 \cdot \ldots \cdot p_k \tag{1}$$ for "large enough" $x$. For small $x$ the asymptotic expressions for functions related to the distribution of primes aren't very useful yet; besides, the numbers can then be determined exactly with rather little effort.
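For small thresholds that exact determination is a few lines of code. Here is a minimal Python sketch (the name `primorial_index` is my own label, not standard terminology):

```python
def primes():
    """Yield 2, 3, 5, ... by trial division (plenty fast for small thresholds)."""
    found = []
    n = 2
    while True:
        if all(n % p for p in found if p * p <= n):
            found.append(n)
            yield n
        n += 1

def primorial_index(x):
    """Smallest k with p_1 * p_2 * ... * p_k >= x, i.e. the k defined by (1)."""
    product, k = 1, 0
    for p in primes():
        if product >= x:
            return k
        product *= p
        k += 1

print(primorial_index(2**64))  # the product of the first 16 primes first exceeds 2**64
```

Since the product grows faster than exponentially, the loop runs only a handful of iterations even for fairly large $x$.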
The first rule of thumb when dealing with products (of positive numbers) is "take logarithms". On the one hand, that transforms the product into a sum, and we have more experience dealing with sums; on the other, it reduces the size of the numbers we deal with. Thus, taking logarithms transforms $(1)$ into $$\sum_{m = 1}^{k-1} \log p_m < \log x \leqslant \sum_{m = 1}^{k} \log p_m\,. \tag{2}$$ The sum of the logarithms of the first primes gives the first Chebyshev function, $$\vartheta(y) := \sum_{p \leqslant y} \log p\,,$$ and we can write $(2)$ as $$\vartheta(p_{k-1}) < \log x \leqslant \vartheta(p_k)\,.$$ To estimate $k$ it is therefore useful to have good estimates for $\vartheta(y)$.

The prime number theorem without error bounds is equivalent to $\vartheta(y) \sim y$, i.e. $$\lim_{y \to \infty} \frac{\vartheta(y)}{y} = 1\,.\tag{3}$$ Plugging this into $(2)$, which we take as implicitly defining $k$, we obtain $$p_k \sim \log x$$ and since $k = \pi(p_k)$ that is $$k \sim \pi(\log x) \sim \frac{\log x}{\log \log x}\,.$$ For $x = 2^n$ we have $\log x = n\log 2$ and $\log \log x = \log n + \log \log 2$, thus $$k \sim \frac{n\log 2}{\log n + \log \log 2} \sim \frac{n\log 2}{\log n}\,.$$

We have, however, more precise information about $\vartheta$ than $(3)$. In 1899, de la Vallée Poussin proved $$\vartheta(y) = y + O\bigl(ye^{-c\sqrt{\log y}}\bigr) \tag{4}$$ for some positive constant $c$. We now have sharper bounds, but they are of a similar shape. If the Riemann hypothesis is true, then $$\lvert\vartheta(y) - y\rvert < \frac{1}{8\pi}\sqrt{y}\,(\log y)^2$$ holds for $y > y_0$ (I forget the exact value of $y_0$; it is fairly small, though). But a proof of the Riemann hypothesis is not yet in sight.
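The key step above, $\vartheta(y) \sim y$, is easy to observe numerically. A small sketch (my own helper, using a plain sieve of Eratosthenes):

```python
import math

def theta(y):
    """First Chebyshev function: sum of log p over primes p <= y,
    computed with a simple sieve of Eratosthenes."""
    sieve = bytearray([1]) * (y + 1)
    sieve[0] = sieve[1] = 0
    for i in range(2, math.isqrt(y) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(range(i * i, y + 1, i)))
    return sum(math.log(i) for i in range(2, y + 1) if sieve[i])

for y in (10**3, 10**4, 10**5, 10**6):
    print(y, theta(y) / y)  # the ratio creeps toward 1 as y grows
```

The convergence is slow, which is exactly why the error terms in $(4)$ matter when one wants more than the leading order.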
Back to what we do know, namely $(4)$. That, in connection with $(2)$, gives us $$p_k = \log x \cdot \bigl(1 + O\bigl(e^{-c\sqrt{\log \log x}}\bigr)\bigr)$$ and — with an error bound for $\pi(y)$ of the same type as $(4)$ — $$k = \pi(p_k) = \operatorname{Li}(\log x) + O\bigl(\log x \cdot e^{-c\sqrt{\log \log x}}\bigr)$$ where $$\operatorname{Li}(y) = \int_2^y \frac{dt}{\log t}$$ is the offset logarithmic integral. Integration by parts yields $$\operatorname{Li}(y) = \frac{y}{\log y}\cdot \sum_{r = 0}^{m} \frac{r!}{(\log y)^r} + O\biggl(\frac{y}{(\log y)^{m+2}}\biggr)$$ for every $m$, and hence $$k = \frac{\log x}{\log \log x} \cdot \sum_{r = 0}^{m} \frac{r!}{(\log \log x)^r} + O\biggl(\frac{\log x}{(\log \log x)^{m+2}}\biggr)\,.\tag{5}$$
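One can check the expansion of $\operatorname{Li}$ numerically. A sketch (the helpers `li_offset` and `li_series` are my own names; the midpoint rule is crude but sufficient here):

```python
import math

def li_offset(y, steps=200_000):
    """Offset logarithmic integral Li(y) = integral from 2 to y of dt/log t,
    approximated with the midpoint rule."""
    h = (y - 2) / steps
    return h * sum(1 / math.log(2 + (i + 0.5) * h) for i in range(steps))

def li_series(y, m):
    """Truncated expansion (y/log y) * sum_{r=0}^{m} r!/(log y)^r."""
    L = math.log(y)
    return y / L * sum(math.factorial(r) / L**r for r in range(m + 1))

y = 10**6 * math.log(2)  # this is log x for x = 2**n with n = 10**6
print(li_offset(y), li_series(y, 3))  # agree to well under a percent
```

Note that the expansion is asymptotic, not convergent: for fixed $y$ the terms $r!/(\log y)^r$ eventually grow, so $m$ must stay modest relative to $\log y$.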