Let $A$ and $B$ be real $n \times n$ matrices.
If we combine their entries into the following $2n \times 2n$ matrix, does it have any nice properties?
$$ \begin{pmatrix} A & -B \\ B & A \end{pmatrix} $$
I think of this matrix when studying the singularity, determinant, and eigenvalues of a complex matrix. Given a complex matrix $E$, I separate it into its real part $A$ and imaginary part $B$, so that $E = A + Bi$. Similarly, given any complex column vector of length $n$, I write out the real parts first and then the imaginary parts to get a real vector of length $2n$: for example, $x = (a+bi, c+di)$ becomes $(a, c, b, d)$.
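To make the correspondence concrete, here is a quick NumPy sanity check (the names `M`, `z`, `v` are just mine, not standard) that multiplying the real vector by the $2n \times 2n$ block matrix mimics multiplying the complex vector by $E$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))  # real part of E
B = rng.standard_normal((n, n))  # imaginary part of E
E = A + 1j * B

# the 2n x 2n real representation described above
M = np.block([[A, -B],
              [B, A]])

# a complex vector and its 2n-tuple real counterpart (real parts first)
z = rng.standard_normal(n) + 1j * rng.standard_normal(n)
v = np.concatenate([z.real, z.imag])

# multiplying by M reproduces complex multiplication by E
w = E @ z
assert np.allclose(M @ v, np.concatenate([w.real, w.imag]))
```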
By the rules of complex multiplication, shouldn't the determinant of this real matrix tell me whether the null space of the complex matrix $E$ contains more than just the $0$ vector?
It seems intuitive to me that this should work. How does this compare to the usual way of calculating the determinant of a complex matrix? Does the usual complex determinant tell us whether the matrix is singular? And if the matrix is complex, does the usual procedure for finding eigenvectors still work, namely finding the roots of the characteristic polynomial of $E - \lambda I$ and then solving the homogeneous system for a nontrivial solution?
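If it helps, the relationship can also be checked numerically. The identity $\det \begin{pmatrix} A & -B \\ B & A \end{pmatrix} = |\det(A + Bi)|^2$ holds, so the real determinant is zero exactly when the complex one is, and the eigenvalues of the block matrix are those of $E$ together with their conjugates. A small NumPy check of both claims:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
E = A + 1j * B
M = np.block([[A, -B],
              [B, A]])

# det of the real block matrix equals |det E|^2, so it vanishes
# exactly when the usual complex determinant does
assert np.isclose(np.linalg.det(M), abs(np.linalg.det(E)) ** 2)

# the eigenvalues of M are the eigenvalues of E plus their conjugates
eig_M = np.linalg.eigvals(M)
for lam in np.linalg.eigvals(E):
    assert np.min(np.abs(eig_M - lam)) < 1e-8
    assert np.min(np.abs(eig_M - lam.conjugate())) < 1e-8
```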