Let $A$ be a skew-symmetric $n \times n$ matrix. What is the least possible value of the determinant $|I + \lambda A^2|$ over all real values of $\lambda$?
I verified that the least value is $0$ by taking the $2 \times 2$ skew-symmetric matrix with off-diagonal entries $1$ and $-1$. But I need a formal proof of this. Any help will be appreciated.
There are two components to this proof:

1. Show that the value $0$ is actually attained for some skew-symmetric $A$ and some real $\lambda$.
2. Show that $|I + \lambda A^2| \geq 0$ for every skew-symmetric $A$ and every real $\lambda$, so that $0$ is indeed the least value.
For the first part, your idea suffices. In particular, we can always take $$ \lambda = 1, \quad A = \pmatrix{ 0&-1&0&\cdots\\ 1&0&0&\cdots\\ 0&0&0\\ \vdots&\vdots&&\ddots} . $$ Here $A^2 = \operatorname{diag}(-1,-1,0,\dots,0)$, so $I + \lambda A^2 = \operatorname{diag}(0,0,1,\dots,1)$ has zeros on its diagonal and therefore has determinant zero.
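This example is easy to verify numerically; here is a quick NumPy check of the $4 \times 4$ case (not part of the proof, just a sanity check):

```python
import numpy as np

# The 4x4 instance of the block matrix above, with lambda = 1.
A = np.array([[0., -1., 0., 0.],
              [1.,  0., 0., 0.],
              [0.,  0., 0., 0.],
              [0.,  0., 0., 0.]])

M = np.eye(4) + 1.0 * (A @ A)   # I + lambda * A^2

print(np.diag(M))               # -> [0. 0. 1. 1.]
print(np.linalg.det(M))         # -> 0.0
```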
For the second part, one approach is to use the spectral theorem. Since $A$ is real skew-symmetric, all eigenvalues of $A$ are either $0$ or purely imaginary of the form $\pm \mu i$ (with $\mu$ real), and the nonzero ones occur in conjugate pairs. It follows that $A^2$ has real, non-positive eigenvalues, namely $0$ or $-\mu^2$, and every negative eigenvalue of $A^2$ has even multiplicity. Consequently the eigenvalues of $I + \lambda A^2$ are $1$ and $1 - \lambda\mu^2$, the latter with even multiplicity; in particular they are all real, and the negative ones have even multiplicity. So $|I + \lambda A^2|$ (which is the product of these eigenvalues) is necessarily non-negative.
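This eigenvalue structure is easy to observe numerically. The following NumPy sketch (with an arbitrary random skew-symmetric matrix; not part of the proof) checks that the eigenvalues of $A^2$ are real and non-positive, and that the determinant stays non-negative across a range of $\lambda$:

```python
import numpy as np

# An arbitrary real skew-symmetric matrix: A = B - B^T.
rng = np.random.default_rng(0)
n = 5
B = rng.standard_normal((n, n))
A = B - B.T

# A^2 is symmetric (in fact A^2 = -A^T A, which is negative semidefinite),
# so eigvalsh returns real eigenvalues, all <= 0.
eigs = np.linalg.eigvalsh(A @ A)
print(eigs)  # all non-positive

# Consequently det(I + lam * A^2) >= 0 for every real lam.
for lam in np.linspace(-3.0, 3.0, 25):
    det = np.linalg.det(np.eye(n) + lam * (A @ A))
    assert det >= -1e-6
```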
Another approach is via factorization. For $\lambda \leq 0$, since $A^T = -A$ we have $$ (I + \sqrt{|\lambda|}A)^T(I + \sqrt{|\lambda|}A) = (I - \sqrt{|\lambda|}A)(I + \sqrt{|\lambda|}A) = I - |\lambda| A^2 = I + \lambda A^2. $$ It follows that $$ |I + \lambda A^2| = |(I + \sqrt{|\lambda|}A)^T|\cdot |I + \sqrt{|\lambda|}A| = |I + \sqrt{|\lambda|}A|^2 \geq 0. $$ For $\lambda > 0$, factor over $\mathbb{C}$ instead: $I + \lambda A^2 = (I + i\sqrt{\lambda}A)(I - i\sqrt{\lambda}A)$, and since $A$ is real the two factors have complex-conjugate determinants, so $\det(I + \lambda A^2) = \left|\det(I + i\sqrt{\lambda}A)\right|^2 \geq 0$.
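The factorization for $\lambda \leq 0$ can also be sanity-checked numerically; in the NumPy sketch below, the random matrix and the value of $\lambda$ are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
B = rng.standard_normal((n, n))
A = B - B.T            # skew-symmetric
lam = -2.0             # any lambda <= 0

s = np.sqrt(abs(lam))
F = np.eye(n) + s * A  # the factor I + sqrt(|lambda|) * A

# Factorization: (I + sA)^T (I + sA) = I + lam * A^2  (for lam <= 0)
lhs = F.T @ F
rhs = np.eye(n) + lam * (A @ A)
assert np.allclose(lhs, rhs)

# Hence the determinant is a perfect square, so non-negative.
assert np.isclose(np.linalg.det(rhs), np.linalg.det(F) ** 2)
```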