I think that the following inequality holds for all $x > 0$ and all $\nu$ above some constant that is somewhere around 0.2: $$ K_\nu(2 x) \le \frac{2^{2 - 2 \nu}}{\Gamma(\nu)} x^\nu K_\nu^2(x) $$
Here's a plot of $\frac{2^{2 - 2 \nu}}{\Gamma(\nu)} x^\nu K_\nu^2(x) - K_\nu(2 x)$ for different $\nu$s:
And the minimum of these curves over 10,000 points $0 < x \le 10$ while varying $\nu$:
How can this inequality be proved? Ideally the proof would establish the lower bound on $\nu$ above which it holds for all $x$, but a proof just for $\nu \ge \frac12$ or $\nu > 1$ would also be helpful.
I've tried manipulating the integral representation $K_\nu(x) = \int_0^\infty \exp(- x \cosh t) \cosh(\nu t) \,\mathrm{d}t$, but didn't get anywhere.
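For reference, here is a quick numerical check mirroring the plots above (not a proof, of course; it just evaluates the gap on a grid using scipy's `kv` for $K_\nu$ and `gamma` for $\Gamma$):

```python
# Numerical sanity check (not a proof): evaluate
#   2^(2-2*nu)/Gamma(nu) * x^nu * K_nu(x)^2 - K_nu(2x)
# on a grid of x > 0 for several nu >= 1/2 and confirm it stays nonnegative.
import numpy as np
from scipy.special import kv, gamma

x = np.linspace(1e-3, 10, 10_000)
for nu in [0.5, 1.0, 1.5, 2.5, 5.0]:
    diff = 2.0 ** (2 - 2 * nu) / gamma(nu) * x ** nu * kv(nu, x) ** 2 - kv(nu, 2 * x)
    assert diff.min() > -1e-12, (nu, diff.min())
```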


I ended up taking a different approach that proves it only for half-integer $\nu$, which covers 90% of the cases I care about. I think it should be possible to extend the argument to all $\nu > \tfrac12$, but I haven't gotten that to work yet.
Define the Matérn kernel by $$k_\nu(x) = \frac{1}{\Gamma(\nu) 2^{\nu - 1}} \left( \sqrt{2 \nu} x \right)^\nu K_\nu\left( \sqrt{2 \nu} x \right).$$
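In code, this definition is a thin wrapper around scipy's `kv` (a direct transcription, not taken from any library):

```python
# Direct transcription of the Matern kernel definition above.
import numpy as np
from scipy.special import kv, gamma

def matern(x, nu):
    """k_nu(x) = (sqrt(2 nu) x)^nu K_nu(sqrt(2 nu) x) / (Gamma(nu) 2^(nu - 1))."""
    s = np.sqrt(2 * nu) * x
    return s ** nu * kv(nu, s) / (gamma(nu) * 2 ** (nu - 1))

# Sanity check: for nu = 1/2 the kernel reduces to exp(-x).
assert abs(matern(1.3, 0.5) - np.exp(-1.3)) < 1e-12
```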
The desired inequality is equivalent to $$2 k_\nu(x)^2 \ge k_\nu(2 x),$$ which is actually what I originally wanted to show.
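To spell out the equivalence: substituting $y = \sqrt{2 \nu} x$, the claim $2 k_\nu(x)^2 \ge k_\nu(2 x)$ becomes $$ \frac{2}{\Gamma(\nu)^2 \, 2^{2 \nu - 2}} y^{2 \nu} K_\nu(y)^2 \ge \frac{(2 y)^\nu}{\Gamma(\nu) \, 2^{\nu - 1}} K_\nu(2 y), $$ and dividing both sides by $\frac{2^\nu y^\nu}{\Gamma(\nu) \, 2^{\nu - 1}}$ gives $$ \frac{2^{2 - 2 \nu}}{\Gamma(\nu)} y^\nu K_\nu(y)^2 \ge K_\nu(2 y), $$ which is the inequality at the top.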
When $p$ is a nonnegative integer, we have $$ k_{p+1/2}(x) = \exp\left( - \sqrt{2 p + 1} x \right) \sum_{i=0}^p \underbrace{\frac{p! \, (2 p - i)!}{(2 p)! \, i! \, (p - i)!} \left( 2 \sqrt{2 p + 1} \right)^{i}}_{a_i} \; x^i .$$ This is essentially equation (4.16) of Rasmussen and Williams (2006), Gaussian Processes for Machine Learning; they say it can be derived from Abramowitz and Stegun's (10.2.15).

Then, with the convention $a_i := 0$ for $i > p$, \begin{align} 2 k_{p + 1/2}(x)^2 - k_{p + 1/2}(2 x) &= 2 \exp\left(-\sqrt{2p+1} x\right)^2 \sum_{i=0}^p \sum_{j=0}^p a_i a_j x^{i+j} - \exp\left(-2 \sqrt{2p+1} x \right) \sum_{i=0}^p a_i (2 x)^i \\&= \exp\left(-2\sqrt{2p+1} x\right) \left( 2 \sum_{k=0}^{2p} \sum_{i=0}^k a_i a_{k-i} x^k - \sum_{k=0}^p 2^k a_k x^k \right) \\&\ge \exp\left(-2\sqrt{2p+1} x\right) \sum_{k=0}^p \left( 2 \sum_{i=0}^k \frac{a_i a_{k-i}}{a_k} - 2^k \right) a_k x^k ,\end{align} where the last step drops the nonnegative terms with $p < k \le 2p$.

We have that \begin{align} \sum_{i=0}^k \frac{a_i a_{k-i}}{a_k} &= \sum_{i=0}^k \frac{k!}{i! \, (k-i)!} \frac{p!}{(p-i)!} \frac{(p-k)!}{(p-k+i)!} \frac{(2p - k + i)!}{(2p-k)!} \frac{(2p-i)!}{(2p)!} \\&= \sum_{i=0}^k \binom{k}{i} \prod_{j=1}^i \frac{p - i + j}{p - k + j} \frac{2p - k + j}{2p - i + j} \\&\ge \sum_{i=0}^k \binom{k}{i} = 2^k \end{align} because, since $p - i + j \ge p - k + j$, each factor in the product can be written as $$ \frac{p - i + j}{p - k + j} \cdot \frac{p + (p - k + j)}{p + (p - i + j)} \ge 1 .$$ Hence each bracketed coefficient above is at least $2^k > 0$, so $2 k_{p + 1/2}(x)^2 > k_{p + 1/2}(2 x)$ for all $x > 0$.
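Both ingredients of this argument are easy to sanity-check in code: the closed form for $k_{p+1/2}$ against the Bessel-function definition (numerically, via scipy's `kv`), and the combinatorial inequality $\sum_{i=0}^k a_i a_{k-i} / a_k \ge 2^k$ exactly, since the $(2\sqrt{2p+1})^i$ factors cancel and leave rationals. Neither check is a proof, but they guard against algebra slips:

```python
# Sanity checks for the half-integer argument above (not a proof).
import numpy as np
from scipy.special import kv, gamma
from fractions import Fraction
from math import factorial

def a(p, i):
    # Coefficient a_i from the closed form of k_{p+1/2}.
    return (factorial(p) * factorial(2 * p - i)
            / (factorial(2 * p) * factorial(i) * factorial(p - i))
            * (2 * np.sqrt(2 * p + 1)) ** i)

def matern(x, nu):
    # k_nu(x) via the Bessel-function definition.
    s = np.sqrt(2 * nu) * x
    return s ** nu * kv(nu, s) / (gamma(nu) * 2 ** (nu - 1))

# Check 1: the closed form matches the Bessel-function definition.
for p in range(4):
    for x in [0.3, 1.0, 2.7]:
        closed = np.exp(-np.sqrt(2 * p + 1) * x) * sum(a(p, i) * x ** i for i in range(p + 1))
        assert abs(closed - matern(x, p + 0.5)) < 1e-10

# Check 2: sum_i a_i a_{k-i} / a_k >= 2^k, in exact rational arithmetic.
def b(p, i):
    # a_i with the (2 sqrt(2p+1))^i factor stripped (it cancels in the ratio).
    return Fraction(factorial(p) * factorial(2 * p - i),
                    factorial(2 * p) * factorial(i) * factorial(p - i))

for p in range(1, 25):
    for k in range(p + 1):
        s = sum(b(p, i) * b(p, k - i) for i in range(k + 1)) / b(p, k)
        assert s >= 2 ** k, (p, k, s)
```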