Definition: To each $\sigma\in S_n$ we may associate an $n\times n$ permutation matrix $P(\sigma)$ given by
$$P_{ij}(\sigma)=\delta_{i,\sigma(j)}.$$
Claim: $\forall\sigma,\tau\in S_n$, $P(\sigma\tau)=P(\sigma)P(\tau)$.
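The definition can be made concrete with a short sketch (assuming NumPy; `perm_matrix` is a hypothetical helper name, and the permutation acts on $\{0,\dots,n-1\}$ rather than $\{1,\dots,n\}$): column $j$ of $P(\sigma)$ has a single 1, in row $\sigma(j)$.

```python
import numpy as np

def perm_matrix(sigma):
    """Build P(sigma) with P[i, j] = 1 iff i == sigma[j] (0-based indices).

    `sigma` is a sequence where sigma[j] is the image of j."""
    n = len(sigma)
    P = np.zeros((n, n), dtype=int)
    for j in range(n):
        P[sigma[j], j] = 1  # entry (i, j) is delta_{i, sigma(j)}
    return P

# sigma maps 0 -> 1, 1 -> 2, 2 -> 0 (the 3-cycle, written 0-based)
print(perm_matrix([1, 2, 0]))
# [[0 0 1]
#  [1 0 0]
#  [0 1 0]]
```

Applying this matrix to the standard basis vector $e_j$ returns $e_{\sigma(j)}$, which is the usual way permutation matrices act on vectors.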
Proof:
$$[P(\sigma)P(\tau)]_{ik}=\sum_{j=1}^nP_{ij}(\sigma)P_{jk}(\tau)=\sum_{j=1}^n\delta_{i,\sigma(j)}\delta_{j,\tau(k)}=\delta_{i,\sigma(\tau(k))}=P_{ik}(\sigma\tau)$$
I don't understand the step
$$\sum_{j=1}^n\delta_{i,\sigma(j)}\delta_{j,\tau(k)}=\delta_{i,\sigma(\tau(k))}.$$
By the summation convention we can drop the summation sign, but how does
$$\delta_{i,\sigma(j)}\delta_{j,\tau(k)}=\delta_{i,\sigma(\tau(k))}$$
follow?
Since each Kronecker delta is either $0$ or $1$, the product is $1$ exactly when both conditions hold:
$$\delta_{i,\sigma(j)}\delta_{j,\tau(k)}=1\iff i=\sigma(j)\text{ and }j=\tau(k)\iff i=\sigma(\tau(k))\iff\delta_{i,\sigma(\tau(k))}=1$$
In the sum over $j$, the factor $\delta_{j,\tau(k)}$ kills every term except the one with $j=\tau(k)$, so the sum collapses to that single term, $\delta_{i,\sigma(\tau(k))}$.
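As a sanity check, the homomorphism property $P(\sigma\tau)=P(\sigma)P(\tau)$ can be verified exhaustively for small $n$ (a sketch assuming NumPy; `perm_matrix` is a hypothetical helper implementing $P_{ij}(\sigma)=\delta_{i,\sigma(j)}$ with 0-based indices, and composition follows $(\sigma\tau)(k)=\sigma(\tau(k))$):

```python
import numpy as np
from itertools import permutations

def perm_matrix(sigma):
    """P(sigma) with P[i, j] = 1 iff i == sigma[j] (0-based)."""
    n = len(sigma)
    P = np.zeros((n, n), dtype=int)
    for j in range(n):
        P[sigma[j], j] = 1
    return P

n = 4
for sigma in permutations(range(n)):
    for tau in permutations(range(n)):
        # (sigma tau)(k) = sigma(tau(k)), matching the derivation above
        comp = tuple(sigma[tau[k]] for k in range(n))
        assert np.array_equal(perm_matrix(sigma) @ perm_matrix(tau),
                              perm_matrix(comp))
print("P(sigma tau) = P(sigma) P(tau) holds for every pair in S_4")
```

The matrix product computes exactly the collapsed sum: for each $(i,k)$, the only $j$ contributing to $\sum_j P_{ij}(\sigma)P_{jk}(\tau)$ is $j=\tau(k)$.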