I have been studying coding theory recently. There is a notion of an "asymptotic bound" for linear codes, such as the "Asymptotic Singleton Bound". It gives an upper bound on the rate as the relative distance approaches a parameter $\delta$ and the length of the code tends to infinity, where the rate is defined as $k/n$ and the relative distance as $d/n$. (See the book "Fundamentals of Error-Correcting Codes" by W. Cary Huffman and V. Pless.)
However, I don't see the point of this notion, e.g., what is it used for? If $n$ tends to infinity and the relative distance approaches $\delta$, does that mean the distance $d$ also grows proportionally with $n$? For example, with $\delta=\frac{1}{2}$, if $n \rightarrow 2n$, does $d \rightarrow 2d$?
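To make the scaling question concrete, here is a small sketch (my own illustration, not from the book): for a code meeting the Singleton bound with equality ($d = n - k + 1$, an MDS code), fixing $\delta = d/n$ forces $d$ to grow linearly with $n$ while the rate $k/n$ tends to $1 - \delta$.

```python
# For a code meeting the Singleton bound with equality (d = n - k + 1),
# fix the relative distance delta = d/n and let n grow:
# d = delta*n grows linearly, and the rate k/n tends to 1 - delta.

def singleton_rate(n, delta):
    d = int(delta * n)   # distance grows linearly with n
    k = n - d + 1        # Singleton bound with equality (MDS code)
    return d, k / n

for n in (10, 100, 1000, 10000):
    d, rate = singleton_rate(n, 0.5)
    print(n, d, rate)
# Doubling n doubles d, and the rate approaches 1 - 0.5 = 0.5.
```

So yes, under this reading, $n \rightarrow 2n$ does give $d \rightarrow 2d$, and the asymptotic bound records the limiting rate.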
Errors usually hit each bit/symbol independently. So if each bit is in error with probability $p$ less than the relative distance $\delta$, the errors can be detected; if $p$ is less than $\delta/2$, they can be corrected.
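A quick Monte Carlo sketch of the $p < \delta/2$ correction rule, using the $n$-fold repetition code as an assumed example (its minimum distance is $d = n$, so $\delta = 1$, and majority decoding corrects any pattern of fewer than $n/2$ errors):

```python
# Repetition code: d = n, so delta = d/n = 1, and majority decoding
# succeeds whenever fewer than n/2 symbols are flipped -- matching the
# "correctable when p < delta/2" rule of thumb.
import random

def majority_decode_ok(n, p, trials=1000, seed=1):
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        flips = sum(1 for _ in range(n) if rng.random() < p)
        if flips < n / 2:  # a majority of symbols is still correct
            ok += 1
    return ok / trials    # fraction of trials decoded correctly

print(majority_decode_ok(101, 0.3))  # p < delta/2 = 0.5: almost always succeeds
print(majority_decode_ok(101, 0.7))  # p > delta/2: almost always fails
```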
If $p>\delta$ then no reliable transmission is possible. Letting $n$ get large allows the sample error statistics to behave according to the law of large numbers: the number of actual errors is, with high likelihood, near $pn$.
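The law-of-large-numbers concentration can be checked directly with a small simulation (parameters are my own choice): flip each of $n$ symbols independently with probability $p$ and watch the empirical error fraction approach $p$ as $n$ grows.

```python
# Flip each of n symbols independently with probability p and
# observe that the error fraction concentrates near p as n grows.
import random

def error_fraction(n, p, seed=0):
    rng = random.Random(seed)
    errors = sum(1 for _ in range(n) if rng.random() < p)
    return errors / n

p = 0.1
for n in (100, 10_000, 1_000_000):
    print(n, error_fraction(n, p))
# The empirical fraction gets closer to p = 0.1 as n increases.
```

This is why the bounds are stated asymptotically: only for large $n$ is the actual error count reliably close to $pn$, so the comparison of $p$ with $\delta$ becomes meaningful.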