Let $A=A_R+iA_I$ be an $n\times n$ complex matrix. If we want to solve a linear system with coefficient matrix $A$ but want to avoid complex arithmetic, then we often form the following real matrix: \begin{equation} \widehat{A} = \begin{bmatrix}A_R&-A_I\\ A_I&A_R \end{bmatrix}. \end{equation} Furthermore, when $A$ is Hermitian, the eigenpairs of $A$ can also be computed from the eigenpairs of $\widehat{A}$.
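For concreteness, here is a small numpy sketch of the construction (the helper name `realify` is mine, not standard):

```python
import numpy as np

def realify(A):
    """Build the 2n x 2n real matrix [[A_R, -A_I], [A_I, A_R]]
    from an n x n complex matrix A."""
    return np.block([[A.real, -A.imag], [A.imag, A.real]])

# Example: a random 3 x 3 complex matrix
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A_hat = realify(A)
assert A_hat.shape == (6, 6) and np.isrealobj(A_hat)
```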
My question is: for a general complex matrix $A$, are the eigenvalues of $A$ related to the eigenvalues of $\widehat{A}$? Numerical experiments suggest that eig($A$) is contained in eig($\widehat{A}$), and that the other $n$ eigenvalues are their conjugates. But how can this be proved, and how can eig($A$) be recovered from eig($\widehat{A}$)?
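The numerical observation is easy to reproduce; here is one such experiment in numpy (matching eigenvalues by nearest distance rather than by sorting, to avoid ordering ambiguities):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A_hat = np.block([[A.real, -A.imag], [A.imag, A.real]])

ev_A = np.linalg.eigvals(A)
ev_hat = np.linalg.eigvals(A_hat)

# Every eigenvalue of A, and also its conjugate, appears among
# the 2n eigenvalues of A_hat (up to rounding error).
for lam in ev_A:
    assert np.min(np.abs(ev_hat - lam)) < 1e-8
    assert np.min(np.abs(ev_hat - lam.conj())) < 1e-8
```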
Partial Answer
Consider the map $S: \mathbb C^n \to \mathbb R^{2n}$ that sends the $j$th coordinate $x_j + i y_j$ to $x_j$ in position $j$ and $y_j$ in position $j + n$. As an example, when $n = 2$, we have $$ S\pmatrix{3 + 4i \\ 7 - 2i} = \pmatrix{3 \\7 \\4 \\-2}. $$
This map is $\mathbb R$-linear, but not $\mathbb C$-linear. For $p$ and $q$ real numbers, and $v$ a vector in $\mathbb C^n$, we have $$ S(pv) = p S(v) $$ but $$ S(qi v) = q H(S(v)) $$ where $$ H(w) = \pmatrix{0_n & -I_n \\ I_n & 0_n}w, $$ so that $$ S((p + qi)v) = pS(v) + qH(S(v)). $$
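A quick numerical sanity check of $S$ and $H$ (a sketch; the function names are just the symbols above):

```python
import numpy as np

def S(v):
    """Stack real parts over imaginary parts: C^n -> R^{2n}."""
    return np.concatenate([v.real, v.imag])

def H(w):
    """Multiplication by i on the real side: w -> [[0, -I], [I, 0]] w."""
    n = len(w) // 2
    return np.concatenate([-w[n:], w[:n]])

v = np.array([3 + 4j, 7 - 2j])
assert np.array_equal(S(v), [3.0, 7.0, 4.0, -2.0])

# S(i v) = H(S(v)), and S((p + qi) v) = p S(v) + q H(S(v))
p, q = 2.0, 5.0
assert np.allclose(S(1j * v), H(S(v)))
assert np.allclose(S((p + q * 1j) * v), p * S(v) + q * H(S(v)))
```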
There's an obvious inverse to $S$ -- let's call it $T$ -- as well.
If $v$ is a vector in $\mathbb C^n$, I believe a little experimentation in the $n = 2$ case will convince you that $$ S(Av) = A' S(v), $$ where I'm using prime instead of "hat" because it's easier to type.
If experimentation doesn't convince you, writing out everything in terms of $A_R, A_I, v_R,$ and $v_I$ (where the latter two are the real and imaginary parts of the vector $v$) should suffice. BTW, it's possible that I have the minus sign in the wrong place in my definition of $H$.
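If you'd rather let the computer do the experimentation, a sketch like the following checks the identity $S(Av) = A'S(v)$ on a random example:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
v = rng.standard_normal(n) + 1j * rng.standard_normal(n)

A_prime = np.block([[A.real, -A.imag], [A.imag, A.real]])
S = lambda u: np.concatenate([u.real, u.imag])

# S(A v) = A' S(v)
assert np.allclose(S(A @ v), A_prime @ S(v))
```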
Now suppose that $\lambda$ is a real eigenvalue of $A$, with corresponding eigenvector $v$. Then we have $$ S(Av) = S(\lambda v) = \lambda S(v) $$ because $S$ is $\mathbb R$-linear. On the other hand, we have $$ S(Av) = A' S(v) $$ and combining, this gives us $$ A' S(v) = \lambda S(v), $$ so that a real eigenvalue of $A$ is also an eigenvalue of $A'$.
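Here's that real-eigenvalue case checked numerically, on a triangular complex matrix with one real eigenvalue (the matrix is my own example, chosen for convenience):

```python
import numpy as np

# A complex matrix with a real eigenvalue (2) and a complex one (1 + i)
A = np.array([[2.0, 1j], [0.0, 1.0 + 1j]])
A_prime = np.block([[A.real, -A.imag], [A.imag, A.real]])
S = lambda u: np.concatenate([u.real, u.imag])

lams, vecs = np.linalg.eig(A)
k = np.argmin(np.abs(lams - 2.0))   # pick the real eigenvalue
lam, v = lams[k].real, vecs[:, k]

# A' S(v) = lam S(v): the real eigenvalue of A is an eigenvalue of A'
assert np.allclose(A_prime @ S(v), lam * S(v))
```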
If we look at a pure-imaginary eigenvalue of $A$, say $ti$ with $t$ real, and a corresponding eigenvector $v = v_R + i v_I$, we can write out what it means for $Av$ to equal $tiv$:
\begin{align} Av &= A(v_R + i v_I) \\ &= (A_R + i A_I)(v_R + i v_I) \\ &= (A_Rv_R - A_I v_I) + i (A_Iv_R + A_R v_I) \\ &= ti(v_R+ iv_I) = -tv_I + itv_R. \end{align} Setting real and imaginary parts equal, we get \begin{align} A_Rv_R - A_I v_I &= -tv_I \\ A_Iv_R + A_R v_I &= tv_R, \end{align} which is the same as saying that $$ A' S(v) = t\,H(S(v)), $$ rather than $A' S(v) = t S(v)$: the factor of $i$ in the eigenvalue shows up as the map $H$ on the real side.
I therefore suspect that if you look at the $2\times 2$ complex matrix $$ \pmatrix{ i & 0 \\ 0 & 2 } $$ you may find that the eigenvalues of $A'$ and $A$ don't quite match up nicely. I've run out of time to do any more on this.
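Running that example through numpy bears the suspicion out: $A'$ picks up $-i$, which is not an eigenvalue of $A$, and the real eigenvalue $2$ appears twice.

```python
import numpy as np

A = np.array([[1j, 0.0], [0.0, 2.0]])
A_prime = np.block([[A.real, -A.imag], [A.imag, A.real]])
ev = np.linalg.eigvals(A_prime)

# eig(A) = {i, 2}, but eig(A') = {i, -i, 2, 2}: the pure-imaginary
# eigenvalue brings along its conjugate, and the real one is doubled.
for lam in (1j, -1j, 2.0):
    assert np.min(np.abs(ev - lam)) < 1e-8
assert np.count_nonzero(np.abs(ev - 2.0) < 1e-8) == 2
```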