I was looking at the following definition of Chebyshev's inequality
$$P(|X - E(X)| \geq r) \leq \frac{Var(X)}{r^2}$$
which includes the expected value and variance of $X$, and then I discovered there's another equivalent Chebyshev's inequality, which involves the standard deviation $\sigma$
$$P(|X - E(X)| \geq r\cdot \sigma) \leq \frac{1}{r^2}$$
but I don't understand why these formulas are equivalent.
Could you please explain why this is the case?
Note that I already know what the standard deviation is.
Let's replace $Var(X)$ with $\sigma^2$ in the first equation to give $$P(|X - E(X)| \geq r) \leq \frac{\sigma^2}{r^2}.$$
Now suppose $k= \dfrac{r}{\sigma}$, i.e. $r = k \sigma$, and substitute to give $$P(|X - E(X)| \geq k \sigma) \leq \frac{\sigma^2}{k^2\sigma^2} = \frac{1}{k^2}$$ which is your second equation using $k$ instead of $r$.
You can think of the $r$ in the first equation as having the same units as $X$ and $\sigma$, and the $r$ or $k$ in the second as being a unitless scalar multiple of the standard deviation, but ultimately they say the same thing.
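As a quick sanity check (my own addition, not part of the derivation above), here is a small numerical sketch: for an arbitrary example distribution, both forms of the bound evaluate to exactly the same number once you set $k = r/\sigma$, and the empirical probability stays below it.

```python
import numpy as np

# Sketch: verify numerically that the two forms of Chebyshev's inequality
# give the same bound. The exponential distribution is an arbitrary choice.
rng = np.random.default_rng(0)
X = rng.exponential(scale=2.0, size=1_000_000)

mu = X.mean()
var = X.var()
sigma = np.sqrt(var)

r = 3.0            # threshold in the same units as X
k = r / sigma      # the same threshold, measured in standard deviations

# Empirical P(|X - E(X)| >= r)
p = np.mean(np.abs(X - mu) >= r)

bound1 = var / r**2   # first form:  Var(X) / r^2
bound2 = 1 / k**2     # second form: 1 / k^2, with r = k * sigma

print(p, bound1, bound2)  # bound1 and bound2 agree; p is below both
```

Since `bound2 = 1 / (r/sigma)**2 = sigma**2 / r**2 = var / r**2`, the two bounds coincide up to floating-point rounding, which is exactly the substitution $r = k\sigma$ in the answer.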