Let $S_n = \sum_{i=1}^n X_i$, where the $X_i$ are i.i.d. with $P(X_i = 1) = p > 1/2$ and $P(X_i = -1) = 1-p$, be a biased random walk. Large deviations tell us that $p_0 = P(S_n \leq 0) \leq n^a(2 \sqrt{p(1-p)})^n$. We are curious what the exponent $a$ is.
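For context, the exponential rate itself follows from a standard Chernoff bound (included here only for completeness): for any $\lambda > 0$,
$$ P(S_n \leq 0) \leq E\left[e^{-\lambda S_n}\right] = \left( p e^{-\lambda} + (1-p) e^{\lambda} \right)^n, $$
and optimizing over $\lambda$ (the minimum is attained at $e^{2\lambda} = \frac{p}{1-p}$) gives $P(S_n \leq 0) \leq (2\sqrt{p(1-p)})^n$. So $a = 0$ always works, and the question is how much the polynomial factor can be improved.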
Ignoring parity issues, one can obtain the lower bound $$p_0 \geq P(S_n = 0) = \binom{n}{n/2} p^{n/2}(1-p)^{n/2} \approx n^{-1/2} (2 \sqrt{p(1-p)})^n.$$ However, when we take the sum $p_0 = \sum_{k=-n}^{0} P(S_n = k)$, this might increase the $-1/2$ exponent. Since $\sum_{k=1}^{n} k^{-1/2} \approx n^{1/2}$, I am inclined to guess that $p_0 \approx n^{1/2} (2 \sqrt{p(1-p)})^n$. However, there also seems to be some additional penalty for going far below $0$, so maybe $-1/2$ is sharp.
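The two competing guesses can be probed numerically. Below is a small throwaway script (the function names and the choice $p = 0.7$ are mine, purely for illustration) that computes $p_0$ exactly from the binomial distribution and compares it against both candidate scalings $n^{-1/2} r^n$ and $n^{1/2} r^n$, where $r = 2\sqrt{p(1-p)}$:

```python
from math import comb, sqrt

def tail_prob(n, p):
    """Exact p_0 = P(S_n <= 0): with k = #{+1 steps} ~ Bin(n, p)
    we have S_n = 2k - n, so S_n <= 0 iff k <= n/2."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n // 2 + 1))

p = 0.7                      # arbitrary choice of bias p > 1/2
r = 2 * sqrt(p * (1 - p))
for n in (100, 200, 400):    # even n, to sidestep the parity issue
    p0 = tail_prob(n, p)
    # candidate scalings: the first ratio stabilizes, the second decays
    print(n, p0 / (n**-0.5 * r**n), p0 / (n**0.5 * r**n))
```

Running this, the ratio against $n^{-1/2} r^n$ appears to settle near a constant while the ratio against $n^{1/2} r^n$ keeps shrinking, which already hints that $-1/2$ is the right exponent.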
Here is a partial answer.
A heuristic computation. Let $D(x||p) = x \log\left(\frac{x}{p}\right) + (1-x)\log\left(\frac{1-x}{1-p}\right)$ denote the Kullback-Leibler divergence of $\operatorname{Ber}(x)$ w.r.t. $\operatorname{Ber}(p)$ and write $r = 2\sqrt{p(1-p)}$ for simplicity. Then
$$ \frac{\sqrt{2\pi n}}{r^n} \binom{n}{k}p^k(1-p)^{n-k} \approx \frac{1}{\left(\frac{k}{n}\left(1 - \frac{k}{n}\right) \right)^{1/2}} e^{-n \left( D(\frac{k}{n}||p) - D(\frac{1}{2}||p) \right)} $$
tells us that we can expect
\begin{align*} \mathbf{P}[S_n \leq 0] &\approx \frac{r^n}{\sqrt{2\pi n}} \sum_{k=1}^{\frac{n}{2}} \frac{1}{\left( \frac{k}{n}\left(1 - \frac{k}{n}\right) \right)^{1/2}} e^{-n \left( D(\frac{k}{n}||p) - D(\frac{1}{2}||p) \right)} \\ &\approx \frac{r^n}{\sqrt{2\pi n}} \sum_{j\in\{\frac{n}{2}-k:1\leq k\leq\frac{n}{2}\}} 2e^{jD'(\frac{1}{2}||p)}, \qquad (j = \tfrac{n}{2}-k) \\ &\approx \frac{r^n}{\sqrt{2\pi n}} \cdot\frac{2p}{2p-1} \left( \frac{1-p}{p} \right)^{\frac{1}{2}(n \text{ mod } 2)}. \end{align*}
This matches numerical computations as well, and I believe that it can be justified with some effort. (Obtaining the correct order only up to constants would be easier.) I will update my answer if I come up with a rigorous justification.
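For a concrete version of that numerical check (a throwaway script of mine; the choice $p = 0.6$ is arbitrary), one can compare $\sqrt{2\pi n}\,\mathbf{P}[S_n \leq 0]/r^n$ against the predicted constant $\frac{2p}{2p-1}\left(\frac{1-p}{p}\right)^{\frac{1}{2}(n \bmod 2)}$ for both parities of $n$:

```python
from math import comb, sqrt, pi

def tail_prob(n, p):
    # exact P(S_n <= 0) = P(Bin(n, p) <= n/2)
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n // 2 + 1))

p = 0.6                                # arbitrary bias p > 1/2
r = 2 * sqrt(p * (1 - p))
for n in (301, 302, 601, 602):         # both parities of n
    lhs = sqrt(2 * pi * n) * tail_prob(n, p) / r**n
    rhs = 2 * p / (2 * p - 1) * ((1 - p) / p) ** (0.5 * (n % 2))
    print(n, lhs / rhs)                # ratio slowly approaches 1
```

The $O(1/n)$ discrepancy comes from the second-order term in the expansion of $D$ around $\frac{1}{2}$, which the heuristic drops.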
A more rigorous computation. We have the following result: writing $N_n$ for the number of $+1$ steps, so that $N_n \sim \operatorname{Bin}(n, p)$, $S_n = 2N_n - n$, and $\mathbf{P}[S_n \leq 0] = \mathbf{P}[N_n \leq n/2]$, we have $$\mathbf{P}[S_n \leq 0] = \Theta\left( \frac{r^n}{\sqrt{n}} \right), \qquad r = 2\sqrt{p(1-p)},$$ i.e., the exponent $-\frac{1}{2}$ is sharp.
Proof. By a quantitative version of Stirling's approximation, $n! = \Theta( n^{n+\frac{1}{2}}e^{-n} )$ for $n \geq 1$, where $\Theta$ is the Big-Theta notation. Thus for $1 \leq k \leq \frac{n}{2}$,
\begin{align*} \binom{n}{k} p^k (1-p)^{n-k} &= \Theta \Bigg( \sqrt{\frac{n}{k(n-k)}} \cdot \frac{n^n}{k^k (n-k)^{n-k}} p^k(1-p)^{n-k} \Bigg) \\ &= \Theta \Bigg( \sqrt{\frac{n}{k(n-k)}} e^{-nD(\frac{k}{n}||p)} \Bigg), \end{align*}
where $D(x||p) := x \log\left(\frac{x}{p}\right) + (1-x)\log\left(\frac{1-x}{1-p}\right)$. Plugging this into the summation describing $\mathbf{P}[N_n \leq n/2]$, we obtain
\begin{align*} \mathbf{P}[N_n \leq n/2] &= (1-p)^n + \sum_{k=1}^{\lfloor n/2 \rfloor} \binom{n}{k} p^k (1-p)^{n-k} \\ &= (1-p)^n + \Theta \Bigg( \frac{r^n}{\sqrt{n}} \underbrace{ \sum_{k=1}^{\lfloor n/2 \rfloor} \frac{1}{\left(\frac{k}{n}\left(1 - \frac{k}{n}\right) \right)^{1/2}} e^{n \left( D(\frac{1}{2}||p) - D(\frac{k}{n}||p) \right)} }_{=(*)} \Bigg) \end{align*}
In the last line, we utilized the identity $r = e^{-D(\frac{1}{2}||p)}$. Let $I_n = \{ \frac{n}{2}-k : 1 \leq k \leq \frac{n}{2} \}$ and rewrite the above sum $(*)$ as
\begin{align*} (*) &= \sum_{j \in I_n} \frac{1}{\left(\frac{1}{4} - \frac{j^2}{n^2} \right)^{1/2}} e^{n \left( D(\frac{1}{2}||p) - D(\frac{1}{2}-\frac{j}{n}||p) \right)} \end{align*}
Now notice that $x \mapsto D(x||p)$ is convex and strictly decreasing on $(0, \frac{1}{2}]$. From this, it follows that
$$ n \left( D(\tfrac{1}{2}||p) - D(\tfrac{1}{2}-\tfrac{j}{n}||p) \right) \leq j \frac{\partial D}{\partial x}\left(\tfrac{1}{2}||p\right) = j \log\left( \frac{1-p}{p} \right). $$
Since $0 < \frac{1-p}{p} < 1$, the terms of $(*)$ are dominated by a convergent geometric series in $j$ (the prefactor $(\frac{1}{4} - \frac{j^2}{n^2})^{-1/2}$ is at most of order $\sqrt{n}$, attained near $k = 1$, and is absorbed by the exponential decay), while the term with $j$ closest to $0$ is bounded below by a constant. Hence $ (*) = \Theta\left( \sum_{j=0}^{\infty} \left( \frac{1-p}{p} \right)^j \right) = \Theta(1) $ as $n\to\infty$, and the claim follows.
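For what it's worth, the boundedness of $(*)$ is easy to confirm numerically as well (a quick sketch of mine; the choice $p = 0.7$ is arbitrary):

```python
from math import exp, log, sqrt

def D(x, p):
    # Kullback-Leibler divergence of Ber(x) w.r.t. Ber(p)
    return x * log(x / p) + (1 - x) * log((1 - x) / (1 - p))

p = 0.7                        # arbitrary bias p > 1/2
for n in (100, 200, 400, 800):
    star = sum(
        exp(n * (D(0.5, p) - D(k / n, p))) / sqrt(k / n * (1 - k / n))
        for k in range(1, n // 2 + 1)
    )
    print(n, star)             # stays bounded, around 2p/(2p-1) for even n
```

The printed values stabilize near the constant $\frac{2p}{2p-1}$ predicted by the heuristic, consistent with $(*) = \Theta(1)$.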