Suppose I have been given two codewords $x$ and $y$ such that $x = x_1 x_2 \dots x_m$ and $y = y_1 y_2 \dots y_m$, where the $x_i$'s and $y_i$'s are binary digits. Then we define the distance between $x$ and $y$ as $$\delta (x,y) = |\{ j : x_j \neq y_j \}|,$$
i.e. we compare $x$ and $y$ term by term and count how many digits differ at corresponding positions. That count is the distance between $x$ and $y$.
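The definition above is straightforward to sketch in code. Here is a minimal illustration (the function name and example strings are my own, not from any standard library):

```python
def hamming_distance(x, y):
    """Count the positions j where x_j != y_j (the distance delta(x, y))."""
    if len(x) != len(y):
        raise ValueError("codewords must have equal length")
    return sum(1 for xj, yj in zip(x, y) if xj != yj)

# Compare "10110" and "11100" position by position:
# they differ at positions 2 and 4 (1-indexed), so the distance is 2.
print(hamming_distance("10110", "11100"))  # 2
```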
I am unable to understand the concept of this distance. From the above I can see how the distance between two codewords is calculated, but not what that distance actually means.
Normally, this distance measure is used to calculate the minimum distance between any two codewords of one and the same code. It is a measure of how well errors in transmitted codewords can be detected or corrected: a code with minimum distance $d$ can detect up to $d-1$ bit errors and correct up to $\lfloor (d-1)/2 \rfloor$ of them.
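To make this concrete, here is a sketch that computes the minimum distance of a small toy code (the code itself is just an illustrative example I chose, not one from the question):

```python
from itertools import combinations

def hamming_distance(x, y):
    """Number of positions where x and y differ."""
    return sum(1 for xj, yj in zip(x, y) if xj != yj)

def minimum_distance(code):
    """Smallest distance over all pairs of distinct codewords in the code."""
    return min(hamming_distance(u, v) for u, v in combinations(code, 2))

# Toy code: the four 3-bit strings with an even number of 1s.
code = ["000", "011", "101", "110"]
print(minimum_distance(code))  # 2
```

Since every pair of codewords here differs in at least 2 positions, flipping a single bit of any codeword can never produce another codeword, so any single-bit error is detectable.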