Let ${\pmb{x}_1}$ and ${\pmb{x}_2}$ be vectors in $\mathbb{R}^2$ such that they are orthogonal and have length 1. Let $P$ be the 2x2 matrix whose column vectors are ${\pmb{x}_1}$ and ${\pmb{x}_2}$.
Now, I need to show that $P^TP=I$ and $PP^T=I$.
Showing $P^TP=I$ was easy, because row $i$ of $P^T$ is column $i$ of $P$, so when I did the multiplication it was evident that the product is just:
$\left[ \begin{array}{cc} \langle\pmb{x}_1, \pmb{x}_1\rangle & \langle\pmb{x}_1, \pmb{x}_2\rangle \\ \langle\pmb{x}_2, \pmb{x}_1\rangle & \langle\pmb{x}_2, \pmb{x}_2\rangle \end{array} \right]$
and so the identity matrix follows, since we are told that each vector has length 1 and the two vectors are orthogonal.
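A quick numerical sanity check of this computation, assuming `numpy` is available (the rotation matrix is just one hypothetical choice of orthonormal columns):

```python
import numpy as np

# Hypothetical example: the columns of a 2x2 rotation matrix
# are orthonormal, so it serves as a concrete P.
theta = 0.7
P = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Entry (i, j) of P^T P is the inner product <x_i, x_j> of columns i and j,
# so the product should be the 2x2 identity matrix.
print(np.allclose(P.T @ P, np.eye(2)))  # True
```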
However, I can't use this same approach to show that $PP^T=I$. I've tried filling in some values for $P$ and doing the multiplication, but I don't see any pattern that allows me to express the entries in terms of the inner products of the column vectors of $P$. Please help.
For square matrices it holds that whenever $AB=I$, then also $BA=I$, so you're actually done. One way to see this is that $AB=I$ certainly implies that $\det(A)\neq0$, so $A$ has inverse $A^{-1}$, and multiplying the equation $AB=I$ to the left by $A^{-1}$ gives $B=A^{-1}$, whence $BA=I$ is an instance of $A^{-1}A=I$.
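A small numerical illustration of this fact, assuming `numpy` (the random square matrix here is just a stand-in for any invertible $A$):

```python
import numpy as np

# Sketch: for a square matrix A, a right inverse is also a left inverse.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))   # generically invertible
B = np.linalg.inv(A)              # B satisfies AB = I

print(np.allclose(A @ B, np.eye(3)))  # True
print(np.allclose(B @ A, np.eye(3)))  # True as well: BA = I follows
```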
Note that this would fail for rectangular matrices. Indeed, if $P'$ is for instance obtained from an orthogonal matrix $P$ (of size larger than $2\times2$) by selecting some but not all of its columns, then your proof shows that $(P')^TP'$ is (still) an identity matrix (though a smaller one than $P^TP$). However, $P'(P')^T$ will certainly not be an identity matrix: its rank is too small for it to be invertible.
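The rectangular failure can be seen concretely, again assuming `numpy`; here $P'$ is (for simplicity) a single column taken from a $2\times2$ orthogonal matrix:

```python
import numpy as np

# P' is a 2x1 matrix with one unit column, taken from a rotation matrix.
theta = 0.7
Pp = np.array([[np.cos(theta)],
               [np.sin(theta)]])

# (P')^T P' is the 1x1 identity, since the column has length 1 ...
print(np.allclose(Pp.T @ Pp, np.eye(1)))  # True
# ... but P'(P')^T is a rank-1 2x2 matrix, not the identity.
print(np.allclose(Pp @ Pp.T, np.eye(2)))  # False
```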