When taking limits, for example as $n \rightarrow \infty$, it seems obvious that $f(n)$ and $f(n + 1/2)$ approach the same limit: in both cases you are evaluating $f$ at larger and larger arguments, so the result should be the same.
But what is the formal justification of this rule?
EDIT:
Also, there are situations where letting $n \rightarrow \infty$ is treated as identical to letting $k \rightarrow 0^+$ with $n = 1/k$. The limit of $f(n)$ is then the same as the limit of $f(1/k)$, but again, how does one show this formally?
$$ \lim\limits_{n\rightarrow\infty}x_n=x \iff (\forall \,\varepsilon>0 \,\exists N\in\mathbb{N} : n>N \implies|x_n-x|<\varepsilon)$$
From this you can see that if $n>N$, then $n+\frac{1}{2}>N$ too, so the same $N$ works. The same idea applies to $2n$, or to other functions of $n$ under suitable conditions.
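To make the argument concrete, here is a small numerical sketch. The function $f(x)=1/x$ is my own illustrative choice, not from the question: given $\varepsilon$, the witness $N=\lceil 1/\varepsilon\rceil$ that works for $f(n)$ also works for $f(n+\frac{1}{2})$, precisely because $n>N$ implies $n+\frac{1}{2}>N$.

```python
import math

def f(x):
    # illustrative example (not from the question): f(x) = 1/x, limit 0
    return 1 / x

eps = 1e-3
# N chosen so that n > N implies |f(n)| < eps
N = math.ceil(1 / eps)

# the same N witnesses the limit along n and along n + 1/2
assert all(abs(f(n)) < eps for n in range(N + 1, N + 1000))
assert all(abs(f(n + 0.5)) < eps for n in range(N + 1, N + 1000))
```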
EDIT: obviously, since $N$ is arbitrary, things like $\frac{n}{2}$ work too, because $2N$ is just as good a constant. And to answer your edit: it is enough to prove that $\lim\limits_{n\rightarrow\infty}\frac{1}{n}=0$ and $\lim\limits_{k\rightarrow0^+}\frac{1}{k}=\infty$.
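As a quick sanity check of the substitution $n = 1/k$, here is a sketch using a hypothetical example function $f(x)=\frac{2x+1}{x+3}$ of my own choosing (its limit at infinity is $2$). Since $f(n)$ and $f(1/k)$ with $k = 1/n$ evaluate $f$ at the same points, the values agree exactly.

```python
def f(x):
    # hypothetical example function (not from the question);
    # f(x) -> 2 as x -> infinity
    return (2 * x + 1) / (x + 3)

# powers of two, so that k = 1/n and 1/k round-trip exactly in floats
for n in [2**4, 2**10, 2**20]:
    k = 1 / n
    print(n, f(n), f(1 / k))  # the last two columns are identical
```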
EDIT2: This answer is ultimately not correct. See the comments for Clement C.'s counterexample.
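The counterexample itself is in the comments; one standard example of the same phenomenon (my own illustration, not necessarily the one Clement C. gave) is $f(x)=\cos(2\pi x)$: here $f(n)=1$ for every integer $n$ while $f(n+\frac{1}{2})=-1$, so the two limits along these sequences both exist but differ, and the intuition from the question fails for oscillating $f$.

```python
import math

def f(x):
    # oscillating function: equals 1 at every integer,
    # -1 at every half-integer, so the limits along n and n + 1/2 differ
    return math.cos(2 * math.pi * x)

for n in [1, 10, 100]:
    print(f(n), f(n + 0.5))  # approximately 1.0 and -1.0 respectively
```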