Let $C$ be a fixed real $n\times n$ matrix, $X$ be an arbitrary real $n\times n$ matrix. Find the minimum value of:
$$|\det(X+iC)|=\sqrt{\det(X+iC)\det(X-iC)}$$
When $n=1$ the minimum value is clearly $|C|$; for $n\ge2$, however, it seems that the minimum value is exactly zero, but I don't know how to prove it.
I have found that this is related to the minimum of $\det(I+Q^2)$, where $I$ is the identity and $Q$ is an arbitrary real matrix. If you can evaluate the minimum of either:
$|\det(X+iC)|$
$\det(I+Q^2)$
it would be appreciated!
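The link between the two quantities can be checked numerically: for real $Q$ one has $(Q+iI)(Q-iI)=Q^2+I$ and $\det(Q-iI)=\overline{\det(Q+iI)}$, so $\det(I+Q^2)=|\det(Q+iI)|^2$ (this is the case $C=I$, $X=Q$). A minimal sketch, with a random illustrative matrix `Q`:

```python
import numpy as np

# For real Q: det(I + Q^2) = det(Q + iI) det(Q - iI) = |det(Q + iI)|^2,
# linking the two quantities in the question (take C = I, X = Q).
rng = np.random.default_rng(1)  # illustrative random matrix
Q = rng.standard_normal((3, 3))
I = np.eye(3)

lhs = np.linalg.det(I + Q @ Q)
rhs = abs(np.linalg.det(Q + 1j * I)) ** 2
print(np.isclose(lhs, rhs))  # True
```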
For any $C$, one can always find an $X$ such that $\det(X+iC)=0$: if $C$ is not invertible, simply take $X=0$. If $C$ is invertible, take $X=CA$, where $A$ is any real matrix having a two-dimensional invariant subspace ${\sf span}(v_1,v_2)$ with $Av_1=v_2$ and $Av_2=-v_1$ (so that $\pm i$ are eigenvalues of $A$). Then
$$ \det(X+iC)=\det(C(A+iI_n))=\det(C)\det(A+iI_n)=\det(C)\times 0=0, $$
since $-i$ is an eigenvalue of $A$, so $A+iI_n$ is singular.
**Update.** I've just realized that the second construction works in all cases; there is no need to assume that $C$ is invertible, since the factor $\det(A+iI_n)$ vanishes regardless of $C$.
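The construction can be verified numerically. Below, the specific choices of $C$ (a random matrix) and $A$ (a $90^\circ$ rotation on ${\sf span}(e_1,e_2)$, zero elsewhere) are illustrative, not prescribed by the argument:

```python
import numpy as np

n = 4
rng = np.random.default_rng(0)
C = rng.standard_normal((n, n))  # arbitrary real C (illustrative choice)

# A acts as a 90-degree rotation on span(e1, e2): A e1 = e2, A e2 = -e1,
# and as zero on the complement, so +-i are among its eigenvalues.
A = np.zeros((n, n))
A[1, 0], A[0, 1] = 1.0, -1.0

X = C @ A  # a real matrix, as required
val = np.linalg.det(X + 1j * C)
print(abs(val))  # ~0 up to floating-point roundoff
```

Since $\det(A+iI_n)=0$ exactly, $|\det(X+iC)|$ comes out at machine-precision scale for any real $C$.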