Chebyshev's inequality gives an upper bound on $P(|X - \mu| \geq k\sigma)$, but I was wondering if there is a way to find a lower bound for this probability or, equivalently, an upper bound on $P(|X - \mu| < k\sigma)$.
What about with added assumptions? For example, is there any way to get a lower bound for the probability that a $\textrm{Binomial}(n, p)$ variable takes a value more than a constant $K$ away from the mean?
To answer the first question: no, there isn't. Consider a random variable $X$ taking values $\pm n$ each with probability $\frac{1}{2n^2}$ and $0$ otherwise. This has $\mathbb E(X)=0$ and $\mathbb E(X^2)=1$, but for fixed $k$ the probability $\mathbb P(|X|>k)=\frac 1{n^2}$ can be made arbitrarily small by choosing $n>k$ large enough. So no lower bound depending only on the mean and variance is possible.
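A quick numerical check of this counterexample (my own sketch, not part of the original argument), using exact rational arithmetic:

```python
from fractions import Fraction

def moments(n, k=5):
    """Mean, second moment, and P(|X| > k) for the counterexample:
    X = +/-n with probability 1/(2n^2) each, 0 otherwise."""
    p = Fraction(1, 2 * n**2)
    support = [(-n, p), (0, 1 - 2 * p), (n, p)]
    mean = sum(x * pr for x, pr in support)
    second = sum(x**2 * pr for x, pr in support)
    tail = sum(pr for x, pr in support if abs(x) > k)
    return mean, second, tail

for n in (10, 100, 1000):
    mean, second, tail = moments(n)
    # mean and variance stay fixed at 0 and 1 ...
    assert mean == 0 and second == 1
    # ... while the tail probability 1/n^2 shrinks to 0
    print(n, float(tail))
```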
However, in this example the third absolute moment $\mathbb E(|X|^3)=n$ goes to infinity (note $\mathbb E(X^3)=0$ by symmetry). It turns out this is a necessary feature of such an example: you can get a lower bound for the tail probability in terms of the mean and the higher moments. See this paper by Rohatgi and Székely. This gives something nontrivial for e.g. binomial distributions.
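To make this concrete, here is a sketch of a *related* (not the Rohatgi–Székely) lower bound using the fourth central moment: applying the Paley–Zygmund inequality to $Z=(X-\mu)^2$ gives $P(|X-\mu| > \sqrt{\theta}\,\sigma) \geq (1-\theta)^2\sigma^4/\mu_4$ for $0 \le \theta < 1$, and for $\textrm{Binomial}(n,p)$ the central fourth moment has the closed form $\mu_4 = npq\,(1+3(n-2)pq)$:

```python
def binomial_tail_lower_bound(n, p, K):
    """Paley-Zygmund-style lower bound for P(|X - mu| > K),
    X ~ Binomial(n, p). Returns 0.0 when the bound is vacuous (K >= sigma)."""
    q = 1 - p
    var = n * p * q                          # sigma^2 = npq
    mu4 = n * p * q * (1 + 3 * (n - 2) * p * q)  # central 4th moment
    theta = K**2 / var                       # write K = sqrt(theta) * sigma
    if theta >= 1:
        return 0.0
    return (1 - theta)**2 * var**2 / mu4

# For Binomial(100, 0.5) and K = 3 this gives a nontrivial bound of
# about 0.137 on the probability of landing more than 3 from the mean.
print(binomial_tail_lower_bound(100, 0.5, 3))
```

This is weaker than what the paper's third-moment approach can give, but it illustrates that higher-moment information does yield genuine lower bounds on tail probabilities.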