In the lecture notes at http://www.math.ias.edu/QFT/fall/lect2.ps (page 2) there is a "standard" lemma:

In this lemma $G = \mathrm{SO}(n,\mathbb{R})$ and $G_{\mathbb{C}} = \mathrm{SO}(n,\mathbb{C})$ and $V = \mathbb{R}^n$. Actually, $G$ is the orthogonal group of signature $(1,n-1)$, but I assume this is immaterial for the validity of the lemma. The lecture notes don't contain a proof, so I've tried to prove it myself, but I've been unsuccessful so far.
My attempts so far are as follows. I assume the lemma is equivalent to showing that there are real orthonormal bases $e_i$ and $e'_i$, s.t. $$ M_{ij} = e'_i \cdot g e_j$$ is block diagonal with each block of size $\leq 2$. Then use induction over $n$. If there is a real $v$ with $v^2 = 1$ and $g v$ real, take $e_1 = gv$ and $e'_1 = v$. Then $g$ maps $\lbrace v\rbrace^\perp \rightarrow \lbrace gv\rbrace^\perp$, so we can continue with the restriction of $g$ to $\lbrace v\rbrace^\perp$ and may assume that $g v$ is properly complex for every real unit vector $v$.
Now define $ ge_i \equiv \xi_i + \mathrm{i} \eta_i$ for a real orthonormal basis $e_i$. By the previous reasoning, the $\xi_i$ and $\eta_i$ are all non-zero. Taking real and imaginary parts of $(ge_i)\cdot(ge_j) = \delta_{ij}$, orthogonality now means: $$ \xi_i \cdot \xi_j = \eta_i \cdot \eta_j + \delta_{ij}$$ $$ \eta_i \cdot \xi_j + \xi_i \cdot \eta_j = 0$$
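As a sanity check, these two relations are just the real and imaginary parts of $g^T g = 1$. A quick numerical sketch (my own, not from the notes) confirms them, using the standard fact that the exponential of a complex antisymmetric matrix is complex orthogonal:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n = 6
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
g = expm(0.5 * (B - B.T))      # exp of complex antisymmetric matrix: g^T g = 1
assert np.allclose(g.T @ g, np.eye(n))

xi, eta = g.real, g.imag       # columns: g e_j = xi_j + i eta_j
# real part of g^T g = 1:  xi_i . xi_j = eta_i . eta_j + delta_ij
assert np.allclose(xi.T @ xi - eta.T @ eta, np.eye(n))
# imaginary part:  eta_i . xi_j + xi_i . eta_j = 0
assert np.allclose(eta.T @ xi + xi.T @ eta, np.zeros((n, n)))
```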
This is where I got stuck. I know that $\xi_1$ and $\eta_1$ span a two-dimensional plane $V_1$, as they are non-zero and orthogonal (set $i = j = 1$ in the second relation). My conjecture is that the plane spanned by $\xi_i$ and $\eta_i$ for $i > 1$ is either orthogonal to $V_1$ or coincides with it, but I have been unable to prove this.
Is there anything I am missing here? I suppose the result is a statement about the double cosets $G \backslash G_{\mathbb{C}} / G$.
I've analyzed the problem further and, in hindsight, its resolution is straightforward. Let us start with the further assumption that $g$ maps no real unit vector to a real vector. In the notation of the question, this means that the $\eta_i$ are linearly independent, so the Gram matrix $\eta_i \cdot \eta_j$ is symmetric and positive definite, hence diagonalizable by an orthogonal transformation with positive eigenvalues. By the first orthogonality relation above, the same transformation also diagonalizes $\xi_i \cdot \xi_j$. We can therefore assume that: $$ \eta_i \cdot \eta_j = \lambda_i^2 \delta_{ij} $$ $$ \xi_i \cdot \xi_j = \mu_i^2 \delta_{ij}, $$ where $\mu_i^2 = \lambda_i^2 + 1$ and $\lambda_i > 0$ and $\mu_i > 0$. Because both $\xi_i/\mu_i$ and $\eta_i/\lambda_i$ are orthonormal bases, we can find an orthogonal matrix $R$, s.t. $$ \frac{\xi_i}{\mu_i} = R \frac{\eta_i}{\lambda_i}.$$
Define now $A_{ij} = \xi_i \cdot \eta_j = \mu_i \lambda_j R_{ij}$, where the numbers $R_{ij}$ denote the matrix elements of $R$ in the basis $\eta_i/\lambda_i$. By the second orthogonality relation, $A$ is antisymmetric. Define $G = \mathrm{diag}(\mu_i/\lambda_i)$; then antisymmetry of $A$ reads $$ G R = - R^T G. $$ Transposing gives $R^T G = -G R$; multiplying by $R$ and using $R R^T = 1$ gives $R G R = -G$, hence $R G = -G R^T$ and therefore $$ R G^2 = -G R^T G = -G(-G R) = G^2 R. $$ Hence, $R$ must preserve the eigenspaces of $G^2$, which are generically two-dimensional.
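The whole chain above can be checked numerically. The following sketch (my own; the generic complex orthogonal $g$ is again built as the exponential of a complex antisymmetric matrix) diagonalizes the Gram matrix of the $\eta_i$ and verifies that $A$ is antisymmetric, that $R$ is orthogonal, and that $R$ commutes with $G^2$:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
n = 6
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
g = expm(0.5 * (B - B.T))      # generic complex orthogonal matrix
xi, eta = g.real, g.imag

# orthogonally diagonalize the Gram matrix eta_i . eta_j ...
lam2, S = np.linalg.eigh(eta.T @ eta)
xi, eta = xi @ S, eta @ S
lam, mu = np.sqrt(lam2), np.sqrt(lam2 + 1.0)
# ... which diagonalizes xi_i . xi_j as well, with mu_i^2 = lam_i^2 + 1
assert np.allclose(xi.T @ xi, np.diag(mu**2))

A = xi.T @ eta                 # A_ij = xi_i . eta_j
assert np.allclose(A, -A.T)    # antisymmetric

R = A / np.outer(mu, lam)      # A_ij = mu_i lam_j R_ij
assert np.allclose(R.T @ R, np.eye(n))     # R is orthogonal

G = np.diag(mu / lam)
assert np.allclose(G @ R, -(R.T @ G))      # G R = -R^T G
assert np.allclose(G @ G @ R, R @ G @ G)   # R commutes with G^2
```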
Note that $G^2$ cannot have a one-dimensional eigenspace: $R$ would act on it as $\pm 1$, and $G R = -R^T G$ would force the corresponding eigenvalue of $G$ to vanish, which is impossible. The values $\mu_i/\lambda_i$ therefore come in equal pairs, and on each two-dimensional eigenspace $G$ is a multiple of the identity, so the restriction of $R$ is both orthogonal and antisymmetric, i.e. a rotation by $\pm\pi/2$. Reordering the indices by an orthogonal transformation (so that $\lambda_{2i+2} = \lambda_{2i+1}$ and $\mu_{2i+2} = \mu_{2i+1}$ on each pair), we find that: \begin{align} \eta_{2i+1} &= \pm \frac{\lambda_{2i+1}}{\mu_{2i+1}} \xi_{2i+2} \\ \eta_{2i+2} &= \mp \frac{\lambda_{2i+1}}{\mu_{2i+1}} \xi_{2i+1}. \end{align}
Or put differently: \begin{align} ge_{2i+1} &= \xi_{2i+1} \pm \mathrm{i} \frac{\lambda_{2i+1}}{\mu_{2i+1}} \xi_{2i+2}\\ ge_{2i+2} &= \xi_{2i+2} \mp \mathrm{i} \frac{\lambda_{2i+1}}{\mu_{2i+1}} \xi_{2i+1} \end{align}
Therefore, a generic $g$ can be written as $g = g'h$ with $h \in \mathrm{SO}(n,\mathbb{R})$, where $g'$ preserves the complexification of a real orthogonal decomposition of $\mathbb{R}^{n}$ into one- and two-dimensional blocks.
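This decomposition can also be tested end to end. In the sketch below (my own, under the same genericity assumption; $n$ is taken even, since for odd $n$ one first splits off a real eigenvector), $h$ is the real orthogonal map $e_j \mapsto \xi_j/\mu_j$, and $g' = g h^{-1}$ is checked to be complex orthogonal and block diagonal with $2\times 2$ blocks in the basis $\xi_j/\mu_j$:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)
n = 6                          # even: the pairing of eigenvalues needs it
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
g = expm(0.5 * (B - B.T))      # generic complex orthogonal matrix

# rotate so that the Grams of the xi_i and eta_i are diagonal;
# the eigenvalues lam2 are doubly degenerate, so eigh (ascending
# order) already places each pair in adjacent columns
xi, eta = g.real, g.imag
lam2, S = np.linalg.eigh(eta.T @ eta)
xi, eta = xi @ S, eta @ S
mu = np.sqrt(lam2 + 1.0)

Q = xi / mu                    # orthonormal columns xi_j / mu_j
h = Q @ S.T                    # real orthogonal: e_j -> xi_j / mu_j
assert np.allclose(h.T @ h, np.eye(n))

gp = g @ h.T                   # g' = g h^{-1}
assert np.allclose(gp.T @ gp, np.eye(n))   # complex orthogonal
assert np.allclose(gp @ h, g)              # g = g' h

# g' is block diagonal with 2x2 blocks in the basis xi_j / mu_j
M = Q.T @ gp @ Q
mask = np.kron(np.eye(n // 2), np.ones((2, 2)))
assert np.allclose(M * (1 - mask), 0)
```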
But I think there might be a more elegant proof.