Consider an element $A \in$ SO($n$). I am trying to find its corresponding block diagonal form. I prove this by induction on $n$. For $n = 2$, we know that every element is a rotation matrix through some angle $\theta$. I am having trouble with SO(3) and SO(4) in a few cases.
Each eigenvalue of $A \in$ SO($n$) has modulus 1: if $v$ is an eigenvector corresponding to the eigenvalue $\lambda$, then $|Av| = |\lambda v| = |v|$, and hence $|\lambda| = 1$.
Moreover, if $n$ is odd, then $A$ necessarily has 1 as an eigenvalue, by the following argument:
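The modulus-1 claim is easy to check numerically. Below is a small sketch (the QR-based construction of a random special orthogonal matrix is my own choice here, not something from the argument above):

```python
import numpy as np

# Build a random element of SO(3): orthogonalize a random matrix via QR,
# then flip a column if needed so the determinant is +1.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1

# Every eigenvalue has modulus 1 (up to floating-point error).
eigvals = np.linalg.eigvals(Q)
print(np.abs(eigvals))
```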
$\det(I-A) = \det(A)\det(I-A) = \det(A^T)\det(I-A) = \det\big(A^T(I-A)\big) = \det(A^T - I) = \det\big((A-I)^T\big) = \det(A-I) = (-1)^n\det(I-A) = -\det(I-A)$ since $n$ is odd, which implies $\det(I-A) = 0$.
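A quick numerical sanity check of this determinant identity, for an arbitrary odd dimension (again using a QR construction of a random SO($n$) element, which is an assumption of this sketch):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5  # any odd dimension
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1  # ensure det(Q) = +1

# det(I - A) = 0, i.e. 1 is an eigenvalue, whenever n is odd.
print(np.linalg.det(np.eye(n) - Q))
```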
Suppose $n=3$. We have the cases:
(Case 1) The eigenvalues are (1,1,1).
(Case 2) The eigenvalues are (1,-1,-1).
(Case 3) The eigenvalues are (1,$e^{i\theta}$,$e^{-i\theta}$)
My question is: since 1 is a repeated eigenvalue in case 1, and $-1$ is a repeated eigenvalue in case 2, how do we ensure that we still get distinct independent eigenvectors of $A$? I have a similar question about SO(4). For instance, if the eigenvalues of an element of SO(4) are $-1,-1,1,1$, how do we ensure that we obtain four independent eigenvectors?
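For the SO(4) instance with eigenvalues $-1,-1,1,1$, one can check numerically that a full set of independent eigenvectors does appear. The matrix below is a hypothetical example I construct by conjugating $\mathrm{diag}(-1,-1,1,1)$ by a random orthogonal matrix; it is not the only such element, just a convenient one:

```python
import numpy as np

rng = np.random.default_rng(2)
# Conjugate diag(-1,-1,1,1) by a random orthogonal P: the result is in
# SO(4) with eigenvalues (-1, -1, 1, 1).
P, _ = np.linalg.qr(rng.standard_normal((4, 4)))
A = P @ np.diag([-1.0, -1.0, 1.0, 1.0]) @ P.T

vals, vecs = np.linalg.eig(A)
# The eigenvector matrix has full rank: four independent eigenvectors.
print(np.linalg.matrix_rank(vecs))
```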
An orthogonal matrix (indeed, any normal matrix) is diagonalizable over the complex numbers by the spectral theorem, so we are guaranteed independent eigenvectors even for repeated eigenvalues.
An easy way to see this is to consider the action of $A$ on the orthogonal complement of the span of its first eigenvector. It restricts to an operator on this subspace, so it has an eigenvector. Repeat the process and you have three orthogonal eigenvectors.
In more detail, suppose we know that $A$ has one eigenvalue $\lambda_1$ (which, as you point out, is guaranteed for an $n\times n$ matrix if $n$ is odd). It must have at least one eigenvector $v_1$ for that eigenvalue. Then $A$ preserves the orthogonal complement of $v_1$, so we may consider the action of $A$ restricted to this orthogonal complement, call it $A_1$.
By the way, you proved yourself in the comments that an orthogonal matrix preserves the orthogonal complement of any of its eigenvectors, but it is worth noting that it is enough to assume $A$ is normal, i.e. that it commutes with its adjoint.
In a basis with $v_1$ as first vector, the matrix representation of $A$ will be in block diagonal form, with $\lambda_1$ in the first $1\times 1$ block, and $A_1$ in the $(n-1)\times(n-1)$ block. Determinants play nice with block diagonal matrices, so we have $\det A = \lambda_1\det A_1$ and $\det(A-\lambda I)=(\lambda_1-\lambda)\det(A_1-\lambda I)$. Consequently the eigenvalues of $A$ are the eigenvalues of $A_1$, but with $\lambda_1$ added in (or increased by one in multiplicity).
With that information in hand, an iterative argument allows us to complete the job. $A$ has $n$ eigenvalues (counted with multiplicity), so it has at least one eigenvector $v_1$. Then $A_1$ has $n-1$ eigenvalues, so it has at least one eigenvector $v_2$, which is orthogonal to $v_1$. Then $A_2$ (which is $A$ restricted to the orthogonal complement of $v_1$ and $v_2$) has $n-2$ eigenvalues, so it has at least one eigenvector $v_3$, which is orthogonal to $v_1$ and $v_2$. Iterate this process until you have $n$ orthogonal eigenvectors, and Bob's your uncle.
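The iterative deflation above can be sketched in code. This is only an illustration, assuming numpy; the function name `orthonormal_eigenbasis` and the SVD-based computation of the orthogonal complement are my own choices:

```python
import numpy as np

def orthonormal_eigenbasis(A):
    """Build an orthonormal eigenbasis of a normal matrix A by repeatedly
    finding one eigenvector and restricting A to the orthogonal complement
    of the eigenvectors found so far (a sketch of the argument above)."""
    n = A.shape[0]
    A = A.astype(complex)
    basis = np.eye(n, dtype=complex)  # orthonormal columns spanning the complement
    vectors = []
    while basis.shape[1] > 0:
        # Restriction of A to the current complement, in the complement's basis.
        # This is a genuine restriction because A preserves the complement.
        A_restricted = basis.conj().T @ A @ basis
        _, vecs = np.linalg.eig(A_restricted)
        v = basis @ vecs[:, 0]          # one eigenvector, back in the big space
        v /= np.linalg.norm(v)
        vectors.append(v)
        # New complement: null space of V^*, computed via SVD.
        V = np.column_stack(vectors)
        _, _, Wh = np.linalg.svd(V.conj().T)
        basis = Wh.conj().T[:, len(vectors):]
    return np.column_stack(vectors)

# Demo on a random element of SO(5), built by QR as before (an assumption).
rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1
U = orthonormal_eigenbasis(Q)
```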
This is a proof of the "if" part of the spectral theorem for finite dimensional operators, which states that an $n\times n$ matrix over the complex numbers is diagonalizable with orthonormal eigenvectors if and only if it is normal.
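Both directions of that equivalence can be illustrated numerically. The rotation below is normal, so numpy's eigenvectors come out orthonormal; the Jordan block is not normal and has only one independent eigenvector (the specific matrices are my choices for illustration):

```python
import numpy as np

# A normal matrix (a plane rotation): eigenvectors are orthonormal.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(R @ R.T, R.T @ R)            # R commutes with its adjoint
_, U = np.linalg.eig(R)
print(np.allclose(U.conj().T @ U, np.eye(2)))   # True: orthonormal eigenvectors

# A non-normal matrix (a Jordan block): not diagonalizable.
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])
print(np.allclose(J @ J.T, J.T @ J))            # False: not normal
_, V = np.linalg.eig(J)
print(np.linalg.matrix_rank(V, tol=1e-8))       # 1: only one independent eigenvector
```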