Let $R = \{(a,b), (b,c), (c,d)\}$
How can I figure out why $R^{2} = \{(a,c), (b,d)\}$? Is there a mathematical proof (or formula) to determine this for a larger set of relations? Is it always the first member of the first ordered pair with the second member of the second ordered pair, followed by the first member of the second ordered pair with the second member of the third ordered pair, and so on?
And as for $R^{3} = \{(a,d)\}$, is it always the first member of the first ordered pair with the second member of the third ordered pair, and so on?
Your understanding of the process is accurate. The pair $(x,y)$ is in $R^2$ if there is a $z$ such that $(x,z)$ and $(z,y)$ are both in $R$. So you search for all cases where the second "coordinate" of one ordered pair equals the first coordinate of another. For example, if $(a,d)$ is in the relation $R$, and $(d,w)$ is in the relation $R$, then $(a,w)$ is in $R^2$.
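That search can be written out directly. Here is a small sketch in Python, using the set $R$ from the question:

```python
# (x, y) is in R^2 iff there is some z with (x, z) in R and (z, y) in R.
R = {("a", "b"), ("b", "c"), ("c", "d")}

# Match the second coordinate of one pair against the first coordinate of another.
R2 = {(x, y) for (x, z) in R for (w, y) in R if z == w}

print(sorted(R2))  # [('a', 'c'), ('b', 'd')]
```

The comprehension is exactly the definition: it tries every pair of pairs and keeps $(x,y)$ whenever the middle elements match.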
There is another way to do it which looks more "mathematical" but amounts to the same thing. However, doing it the way I will describe is not pleasant for humans. (Computers love it.)
In our situation, we have $4$ objects, namely $a$, $b$, $c$ and $d$. Instead, let's call them $a_1$, $a_2$, $a_3$, and $a_4$. Make a $4\times 4$ matrix as follows.
In the position where the $i$-th row and $j$-th column meet, put a $1$ if $R(a_i,a_j)$ and a $0$ otherwise. This is called the adjacency matrix of the relation $R$ (the language comes from the theory of directed graphs).
Square the adjacency matrix. If in the squared matrix, there is a $0$ where the $i$-th row and the $j$-th column meet, then the relation $R^2$ does not hold for $(a_i,a_j)$. If where the $i$-th row and $j$-th column meet, there is a number $\ne 0$, then the relation $R^2$ holds for $(a_i,a_j)$. The number measures the number of $w$ such that $(a_i,w)$ and $(w,a_j)$ are both in $R$, or more geometrically the number of $2$-step "paths" from $a_i$ to $a_j$.
Remark: The same idea works for relations on any finite set, and for $R^n$ (just take the $n$-th power of the adjacency matrix, easier said than done).
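For instance, cubing the same matrix recovers $R^3$; a sketch:

```python
def matmul(A, B):
    """Multiply two square matrices of the same size."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

M = [[0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1], [0, 0, 0, 0]]
M3 = matmul(matmul(M, M), M)

print(M3[0][3])  # 1: the single 3-step path a -> b -> c -> d
```

The only nonzero entry of $M^3$ is in row $1$, column $4$, which is exactly $R^3 = \{(a,d)\}$.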
More generally, suppose that you have two relations $R$ and $S$ on a set. We can define $RS$ in a way analogous to the definition of $R^2$. And we find $RS$ by multiplying the adjacency matrices of $R$ and $S$, and going through the same process as we did for $R^2$.
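A sketch of that composition, where $(x,y) \in RS$ iff there is a $z$ with $(x,z) \in R$ and $(z,y) \in S$. The relation $S$ below is a made-up example on the same set, not anything from the question:

```python
R = {("a", "b"), ("b", "c"), ("c", "d")}
S = {("b", "d"), ("c", "a")}  # hypothetical second relation

# Compose: first follow R, then follow S.
RS = {(x, y) for (x, z) in R for (w, y) in S if z == w}

print(sorted(RS))  # [('a', 'd'), ('b', 'a')]
```

Multiplying the adjacency matrix of $R$ by that of $S$ (in that order) and marking the nonzero entries gives the same result.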
The adjacency matrix can be a powerful tool, because we can use matrix-theoretic ideas, such as eigenvalues, to obtain useful information about $R$.