Page 37 of Continuum mechanics by C. S. Jog lists the following formulae as the "spectral resolutions" of an orthogonal tensor $\bf R$ with eigenvectors ${\bf e} , \, {\bf n} , \, {\bf \hat{n}}$ and corresponding eigenvalues $1, \lambda , \hat{\lambda}$, where ${\bf n}$ and ${\bf \hat{n}}$ are complex conjugates (as are $\lambda$ and $\hat{\lambda}$).
$$ \begin{equation}\begin{aligned} {\bf I} & = {\bf e} \otimes {\bf e} + {\bf n} \otimes {\bf \hat{n}} + {\bf \hat{n}} \otimes {\bf n} \\ {\bf R} & = {\bf e} \otimes {\bf e} + \lambda \; {\bf n} \otimes {\bf \hat{n}} + \hat{\lambda} \; {\bf \hat{n}} \otimes {\bf n} \\ {\bf R}^2 & = {\bf e} \otimes {\bf e} + \lambda^2 \; {\bf n} \otimes {\bf \hat{n}} + \hat{\lambda}^2 \; {\bf \hat{n}} \otimes {\bf n} \end{aligned}\end{equation} $$
I'd appreciate help deriving these "spectral resolutions".
I finally have the answer with help from the author. What follows is a sketch of the proof.
First, because the eigenvalues $1$, $\lambda$ and $\hat{\lambda}$ are distinct, the corresponding eigenvectors ${\bf e}$, ${\bf n}$ and ${\bf \hat{n}}$ are linearly independent (see a simple proof here). Next, because $\bf R$ has three linearly independent eigenvectors it is diagonalizable, i.e. it can be expressed as the product of three matrices
$$ {\bf R} = {\bf P} \, {\bf \Lambda} \, {\bf P}^{-1} $$
(proof given here), where $\bf P$ is the matrix whose columns are the eigenvectors $\bf e$, $\bf n$ and $\bf \hat{n}$, and $\bf \Lambda$ is the diagonal matrix whose diagonal elements are the corresponding eigenvalues $1$, $\lambda$ and $\hat{\lambda}$, i.e.
$$ {\bf P} = [ \, {\bf e} \, , \, {\bf n} \, , \, {\bf \hat{n}} \, ] $$
$$ \Lambda = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \lambda & 0 \\ 0 & 0 & \hat{\lambda} \end{pmatrix} $$
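As a quick numerical sanity check (a sketch using NumPy; the specific rotation and angle below are my own illustrative example, not from the book), one can build $\bf P$ and $\bf \Lambda$ with `numpy.linalg.eig` and confirm the diagonalization:

```python
import numpy as np

# Hypothetical example (not from the book): a rotation about the z-axis.
theta = 0.7
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c,  -s,  0.0],
              [s,   c,  0.0],
              [0.0, 0.0, 1.0]])

# numpy.linalg.eig returns the eigenvalues and the matrix P whose
# columns are the corresponding (unit-norm) eigenvectors.
eigvals, P = np.linalg.eig(R)
Lam = np.diag(eigvals)

# R = P Lambda P^{-1}
assert np.allclose(R, P @ Lam @ np.linalg.inv(P))

# The eigenvalues are 1, e^{i theta}, e^{-i theta} (in some order).
assert np.allclose(np.sort(eigvals.real), np.sort([1.0, c, c]))
assert np.allclose(np.sort(eigvals.imag), np.sort([0.0, s, -s]))
```

Any proper orthogonal matrix with a nontrivial rotation angle would work equally well here.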
In dyadic form, since the columns of $\bf P$ are the eigenvectors,
$${\bf P} = {\bf e} \otimes e_1 \, + \, {\bf n} \otimes e_2 \, + \, {\bf \hat{n}} \otimes e_3$$
where $e_1 = [1 \, , \, 0 \, , \, 0]^T$, $e_2 = [0 \, , \, 1 \, , \, 0]^T$ and $e_3 = [0 \, , \, 0 \, , \, 1]^T$.
The conjugate transpose is
$${\bf P}^T = e_1 \otimes {\bf e} \, + \, e_2 \otimes {\bf \hat{n}} \, + \, e_3 \otimes {\bf n} $$
Note that the conjugate is taken because $\bf P$ is complex-valued (so ${\bf P}^T$ here denotes the conjugate transpose). Using $({\bf a} \otimes {\bf b})({\bf c} \otimes {\bf d}) = ({\bf b} \cdot {\bf c}) \, {\bf a} \otimes {\bf d}$, together with the relations ${\bf e} \cdot {\bf e} = 1$, ${\bf \hat{n}} \cdot {\bf n} = 1$, ${\bf e} \cdot {\bf n} = 0$ and ${\bf n} \cdot {\bf n} = 0$ (which hold for suitably normalized eigenvectors of a rotation), it follows that:
$${\bf P}^T {\bf P} \, = \, {\bf P} \, {\bf P}^T = \, {\bf I} = \, {\bf e} \otimes {\bf e} \, + \,{\bf n} \otimes {\bf \hat{n}} \, + \, {\bf \hat{n}} \otimes {\bf n} $$
Therefore ${\bf P}^T$ is the inverse of ${\bf P}$; its rows are the conjugated eigenvectors, i.e.
$${\bf P}^T = \, {\bf P}^{-1} = \, \begin{bmatrix} {\bf e}^T \\ {\bf \hat{n}}^T \\ {\bf n}^T \end{bmatrix}$$
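This identity can also be seen in numbers (again with my hypothetical z-axis rotation; `eig` returns unit-norm eigenvectors, which for distinct eigenvalues of an orthogonal matrix are mutually orthogonal, so $\bf P$ comes out unitary):

```python
import numpy as np

# Same hypothetical example: rotation by theta about the z-axis.
theta = 0.7
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c,  -s,  0.0],
              [s,   c,  0.0],
              [0.0, 0.0, 1.0]])

_, P = np.linalg.eig(R)

# The conjugate transpose of P is its inverse (P is unitary here because
# the eigenvalues are distinct and eig normalizes the eigenvectors).
P_star = P.conj().T
assert np.allclose(P_star @ P, np.eye(3))
assert np.allclose(P @ P_star, np.eye(3))
assert np.allclose(P_star, np.linalg.inv(P))
```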
Therefore
$$ \begin{equation}\begin{aligned} {\bf R} &= {\bf P} \, {\bf \Lambda} \, {\bf P}^T \\ &= {\bf e} \, {\bf e}^T \, + \, \lambda \, {\bf n}\, {\bf \hat{n}}^T \, + \, \hat{\lambda}\, {\bf \hat{n}} \, {\bf n}^T \\ &= {\bf e} \otimes {\bf e} \, + \, \lambda \, {\bf n} \otimes {\bf \hat{n}} \, + \, \hat{\lambda}\, {\bf \hat{n}} \otimes {\bf n} \end{aligned}\end{equation} $$
The same computation with ${\bf R}^2 = {\bf P} \, {\bf \Lambda}^2 \, {\bf P}^T$ gives the third resolution, ${\bf R}^2 = {\bf e} \otimes {\bf e} + \lambda^2 \, {\bf n} \otimes {\bf \hat{n}} + \hat{\lambda}^2 \, {\bf \hat{n}} \otimes {\bf n}$, while the resolution of $\bf I$ is just the identity ${\bf P}\,{\bf P}^T = {\bf I}$ obtained above.
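Finally, all three dyadic resolutions can be checked term by term (still using my own hypothetical z-axis rotation; note that in components ${\bf n} \otimes {\bf \hat{n}}$ is the matrix $n_i \hat{n}_j$, i.e. `np.outer` without extra conjugation):

```python
import numpy as np

# Hypothetical example: rotation by theta about the z-axis.
theta = 0.7
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c,  -s,  0.0],
              [s,   c,  0.0],
              [0.0, 0.0, 1.0]])

eigvals, P = np.linalg.eig(R)

# Pick out the real eigenvector e (eigenvalue 1) and one of the complex pair.
k = np.argmin(np.abs(eigvals - 1.0))
j = [i for i in range(3) if i != k][0]
e = np.real(P[:, k])
lam, n = eigvals[j], P[:, j]
n_hat, lam_hat = np.conj(n), np.conj(lam)   # conjugate eigenpair

dyad = np.outer                             # dyad(a, b)[i, j] = a[i] * b[j]

# I   = e(x)e +        n(x)n_hat +          n_hat(x)n
# R   = e(x)e + lam    n(x)n_hat + lam_hat  n_hat(x)n
# R^2 = e(x)e + lam^2  n(x)n_hat + lam_hat^2 n_hat(x)n
assert np.allclose(np.eye(3), dyad(e, e) + dyad(n, n_hat) + dyad(n_hat, n))
assert np.allclose(R, dyad(e, e) + lam * dyad(n, n_hat) + lam_hat * dyad(n_hat, n))
assert np.allclose(R @ R, dyad(e, e) + lam**2 * dyad(n, n_hat) + lam_hat**2 * dyad(n_hat, n))
```

The check relies on `eig` normalizing the eigenvectors to unit norm, which is exactly the normalization ${\bf \hat{n}} \cdot {\bf n} = 1$ used in the derivation.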