Earlier today I came across a topic known as permutation matrices. Now, when performing elementary row operations, if I wanted to switch two rows $R_i\leftrightarrow R_j$ of a matrix $A$, I would swap the $i^{th}$ and $j^{th}$ rows of the identity matrix to obtain a matrix I denote $T_{i,j}$. I would then multiply $A$ on the left by $T_{i,j}$, and continue with these transformations until I got $A$ into a suitable form. This process takes a long time. However, with permutation matrices, I found out that you don't have to do that for row switching! Below is an illustration of what I mean:
$\textbf{Question:}$ How do all these transformations, multiplied on the left of $A$, combine into this one matrix? Or does anyone know the name of this theorem, so I can look into it further?

Break the mapping shown by the red arrows in the question into two cycles. Recall the pattern for multiplying cycles right to left to get a product of transpositions (for example, $(14235)=(14)(42)(23)(35)$). For now, we will consider the two cycles to be disjoint; in the next paragraph we will consider the cycles to be the same, and in the last paragraph we will consider them to share one element. There may be many such transpositions, but we only have to look at two of them to see what is going on. Consider $P_{(i,j)}(P_{(k,l)}A)$. First note that $P_{(k,l)}A$ becomes $A_{k\leftrightarrow l}=B$, so $k\rightarrow l$ and $l\rightarrow k$. After the second multiplication, $P_{(i,j)}B$ becomes $B_{i\leftrightarrow j}=C$, so $i\rightarrow j$ and $j\rightarrow i$, while $k\rightarrow l$ and $l\rightarrow k$ still hold, since the cycles are disjoint. All rows of $A$ other than $i, j, k, l$ are unaffected in passing to $C$. Now compare this with the permutations themselves, written in two-line Cauchy notation: let $\sigma_1=(i\,j)$ and $\sigma_2=(k\,l)$, and consider $\sigma_1(\sigma_2(\alpha))$. If $\alpha\neq i, j, k, l$, then $\sigma_1(\sigma_2(\alpha))=\alpha$, which is precisely what happens in passing to matrix $C$: those rows do not move. If $\alpha=i$, then row $i$ goes to row $j$ in $C$, and likewise $\sigma_1(\sigma_2(i))=j$ (the cycles are disjoint). The cases $\alpha=j$, $\alpha=k$, and $\alpha=l$ are analogous. Multiplying by another transposition matrix on the left, this pattern continues recursively, since the product so far plays the role that $A$ played at the beginning.
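As a sanity check on the disjoint case, here is a small NumPy sketch (the helper name `transposition_matrix` is mine, not from the question) verifying that the single product of two disjoint swap matrices performs both row swaps at once, leaves every other row alone, and is itself a permutation matrix:

```python
import numpy as np

def transposition_matrix(n, i, j):
    """Identity matrix with rows i and j swapped (0-indexed)."""
    P = np.eye(n, dtype=int)
    P[[i, j]] = P[[j, i]]
    return P

n = 5
A = np.arange(n * n).reshape(n, n)

# Two disjoint transpositions: (0 1) and (2 3).
P_ij = transposition_matrix(n, 0, 1)
P_kl = transposition_matrix(n, 2, 3)

# The single product matrix...
P = P_ij @ P_kl

# ...swaps rows 0,1 and rows 2,3 of A simultaneously,
# and leaves row 4 (an alpha not in {i, j, k, l}) alone.
C = P @ A
assert np.array_equal(C, A[[1, 0, 3, 2, 4]])

# P is itself a permutation matrix: exactly one 1 per row and column.
assert (P.sum(axis=0) == 1).all() and (P.sum(axis=1) == 1).all()
```

The assertions pass because matrix multiplication is associative: $P_{(i,j)}(P_{(k,l)}A) = (P_{(i,j)}P_{(k,l)})A$, so the chain of swap matrices can always be collapsed into one matrix first.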
If instead the two cycles are the same, nothing happens to the matrix, so $C=A$; similarly, nothing happens under the composed cycles (i.e. $\sigma_1(\sigma_2(\alpha))=\alpha$ for all $\alpha$). The only remaining case is for the cycles to share one element, so simply set $i=k$. Then $B$ has $i\rightarrow l$ and $l\rightarrow i$. Then $C$ has $i\rightarrow j$ (in other words, $l$ went to $j$ altogether) and $j\rightarrow i$ (so $j$ went to $i$, and $i$ went to $l$ altogether). No other rows are switched in this process. For the cycles, we therefore want to show that $l$ goes to $j$, $j$ goes to $i$, and $i$ goes to $l$.
Define $\beta$ to be the product of the two cycles $(i,j)$ and $(k,l)$ with $i=k$, applied right to left. First, $l$ went to $k=i$ and then to $j$, so $\beta(l)=j$ (this shows $l$ went to $j$ altogether). Next, $i$ went to $l$ and then stayed put, so $\beta(i)=l$ (this shows $i$ went to $l$ altogether). Lastly, $j$ stayed put and then went to $i$, so $\beta(j)=i$ (this shows $j$ went to $i$ altogether). This matches the row movements described above exactly, and the pattern continues recursively, since the product so far acts just as $A$ did at the beginning.
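Finally, the shared-element case can be checked the same way (again, `transposition_matrix` is just an illustrative helper, not from the question): with $i=k$, the product $P_{(i,j)}P_{(i,l)}$ acts on rows as the 3-cycle $\beta$ sending $i\to l$, $l\to j$, $j\to i$.

```python
import numpy as np

def transposition_matrix(n, i, j):
    """Identity matrix with rows i and j swapped (0-indexed)."""
    P = np.eye(n, dtype=int)
    P[[i, j]] = P[[j, i]]
    return P

n = 4
i, j, l = 0, 1, 2  # shared element: k = i
A = np.arange(n * n).reshape(n, n)

# Product (i j)(i l), applied right to left as in the text.
P = transposition_matrix(n, i, j) @ transposition_matrix(n, i, l)
C = P @ A

# beta sends i -> l, l -> j, j -> i; row r of A ends up as
# row beta(r) of C, and row 3 (untouched by either cycle) stays put.
beta = {i: l, l: j, j: i, 3: 3}
for r in range(n):
    assert np.array_equal(C[beta[r]], A[r])
```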