For each vector $x=(x_1,\dots,x_n)$ of an $n$-dimensional vector space $V$, and for each permutation $s$ in the symmetric group $S_n$ on $n$ elements, put $s(x)=(x_{s(1)},\dots,x_{s(n)})$. Then the antisymmetrization and symmetrization operators defined below are mutually orthogonal (i.e., $AS=SA=0$).
$$A={1 \over n!}\sum_{s\in S_n}\mathrm{sgn}(s)\,s$$ $$S={1 \over n!}\sum_{s\in S_n}s$$ Here $\mathrm{sgn}$ is the sign (parity) of a permutation.
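As a quick sanity check (not part of the argument), the claimed orthogonality can be verified numerically by representing each $s\in S_n$ as a permutation matrix acting on coordinates, and building $A$ and $S$ with exact rational arithmetic. This is a sketch under the stated coordinate action $s(x)=(x_{s(1)},\dots,x_{s(n)})$:

```python
from fractions import Fraction
from itertools import permutations

def sign(p):
    # parity of a permutation via its inversion count
    inv = sum(1 for i in range(len(p)) for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def perm_matrix(p):
    # matrix of the operator x -> (x_{p(1)}, ..., x_{p(n)}):
    # row i has a single 1 in column p(i)
    n = len(p)
    return [[Fraction(j == p[i]) for j in range(n)] for i in range(n)]

def matmul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(len(N))) for j in range(len(N[0]))]
            for i in range(len(M))]

for n in (2, 3):
    perms = list(permutations(range(n)))
    fact = Fraction(len(perms))  # n!
    # A = (1/n!) sum_s sgn(s) P_s,   S = (1/n!) sum_s P_s
    A = [[sum(sign(s) * perm_matrix(s)[i][j] for s in perms) / fact
          for j in range(n)] for i in range(n)]
    S = [[sum(perm_matrix(s)[i][j] for s in perms) / fact
          for j in range(n)] for i in range(n)]
    zero = [[Fraction(0)] * n for _ in range(n)]
    assert matmul(S, A) == zero and matmul(A, S) == zero
```

(For $n\ge 3$ the operator $A$ on $V$ itself is zero, since the sign representation does not occur in the coordinate action; the $n=2$ case shows the orthogonality with $A\ne 0$.)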
I could not justify the following equation:
$$\sum_{s\in S_n}\sum_{p\in S_n}\text{sgn}(p)sp=\sum_{t\in S_n}t\sum_{p\in S_n}\text{sgn}(p)$$
Apparently $t=sp$, but why can we take $t$ outside the inner sum?
Edit: for clarity, I'm using parentheses and $\cdot$ to denote multiplication by scalars.
First, note $$\sum_{s\in S_n}\left(\sum_{p\in S_n}\mathrm{sgn}(p)\cdot sp\right) = \sum_{p\in S_n}\left(\mathrm{sgn}(p)\cdot\sum_{s\in S_n}sp\right).$$ Second, for a given $p\in S_n$ and every $t\in S_n$ there exists $s\in S_n$ such that $sp=t$ (namely, $s=tp^{-1}$), and vice versa. Therefore $$\sum_{s\in S_n}sp = \sum_{t\in S_n}t.$$
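The reindexing step rests on the fact that right-translation $s\mapsto sp$ is a bijection of $S_n$ onto itself, so the sum over $s$ of $sp$ runs over exactly the same terms as the sum over $t$ of $t$. A small check of this bijection (composing permutations as functions, $(s\circ p)(i)=s(p(i))$):

```python
from itertools import permutations

def compose(s, p):
    # composition as functions: (s . p)(i) = s(p(i))
    return tuple(s[p[i]] for i in range(len(p)))

n = 4
Sn = set(permutations(range(n)))
for p in Sn:
    # {s.p : s in S_n} is all of S_n, so the reindexed sum has the same terms
    assert {compose(s, p) for s in Sn} == Sn
```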
However, writing the last equality the other way around, i.e., $$\sum_{p\in S_n}\left(\mathrm{sgn}(p)\cdot\sum_{t\in S_n}t\right) = \sum_{t\in S_n}\left(t\sum_{p\in S_n}\mathrm{sgn}(p)\right),$$ might be confusing, since $\sum_{p\in S_n}\mathrm{sgn}(p)$ is simply the zero scalar (a sum in the field), whereas the other sums are sums of operators on $V$. It is clearer to write $$\sum_{p\in S_n}\left(\mathrm{sgn}(p)\cdot\sum_{t\in S_n}t\right) = \left(\sum_{p\in S_n}\mathrm{sgn}(p)\right)\cdot\sum_{t\in S_n}t = 0\cdot\sum_{t\in S_n}t = \underline{0},$$ where $\underline{0}$ denotes the zero operator.
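The scalar $\sum_{p\in S_n}\mathrm{sgn}(p)$ vanishes because for $n\ge 2$ there are exactly as many even as odd permutations (composing with a fixed transposition pairs them off). A quick check of this fact:

```python
from itertools import permutations

def sign(p):
    # parity of a permutation via its inversion count
    inv = sum(1 for i in range(len(p)) for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

# equally many even and odd permutations for n >= 2,
# so the scalar sum of signs is zero
for n in (2, 3, 4, 5):
    assert sum(sign(p) for p in permutations(range(n))) == 0
```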