In a statistics class I took in college, I remember learning about a theorem stating that $50\%$ of the population must lie within $1\sigma$ of $\mu$, $75\%$ within $2\sigma$, and so on, regardless of the distribution. The distribution can be tighter, but this much is a guarantee.
People don't seem to be familiar with this, I'm not sure I remembered the numbers right, and I'm not sure this even exists. Is this a thing?
Yes, part of it is true, and it comes from Chebyshev's inequality, which states that $$P(|X-\mu|\le n\sigma)\ge 1-\frac1{n^2}.$$ This gives the mentioned $0.75$ for $n=2$, but the $0.5$ actually appears at $n=\sqrt2\approx1.41$, not at $n=1$. For $n=1$ the bound is $1-1/1^2=0$, so the inequality says nothing.
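A quick numerical sketch of the bound (the function name is my own, just for illustration):

```python
import math

def chebyshev_lower_bound(n):
    """Chebyshev's lower bound on P(|X - mu| <= n*sigma), valid for n > 0."""
    return max(0.0, 1 - 1 / n**2)

print(chebyshev_lower_bound(2))             # 0.75
print(chebyshev_lower_bound(math.sqrt(2)))  # ~0.5
print(chebyshev_lower_bound(1))             # 0.0 (no information)
```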
This is valid for any distribution, provided it has a well-defined finite mean and a finite variance (equivalently, standard deviation). For a specific distribution the exact probabilities can be computed, and they are necessarily at least as large as these bounds (often much larger, for the most common distributions).
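For example, for a normal distribution the exact probability $P(|X-\mu|\le n\sigma)=\operatorname{erf}(n/\sqrt2)$ can be compared against the Chebyshev bound; a small sketch (helper names are my own):

```python
import math

def normal_within(n):
    """Exact P(|X - mu| <= n*sigma) for a normal distribution."""
    return math.erf(n / math.sqrt(2))

def chebyshev_bound(n):
    """Chebyshev's lower bound, zero when it gives no information."""
    return max(0.0, 1 - 1 / n**2)

# The normal distribution beats the distribution-free bound comfortably:
for n in (1, 2, 3):
    print(n, round(normal_within(n), 4), round(chebyshev_bound(n), 4))
```

For $n=2$ this prints roughly $0.9545$ against the guaranteed $0.75$, illustrating how much tighter a concrete distribution can be.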