Suppose $G$ is a complex $n\times n$ matrix. Could anyone help me prove the following, where the $\sigma$'s are singular values of $G$?
$\det G\ne 0 \Leftrightarrow \sigma_{\min}[G]>0$.
$\sigma_{\max}[G^{-1}]=\frac{1}{\sigma_{\min}[G]}$ if $\sigma_{\min}[G]>0$
$\sigma_{\min}[I+G]\ge 1-\sigma_{\max}[G]$
$\sigma_{\max}[G_1G_2]\le \sigma_{\max}[G_1]\sigma_{\max}[G_2]$ for complex matrices $G_1,G_2$
We prove $\det G \neq 0 \Rightarrow \sigma_{\min}[G] > 0$ by contrapositive.
$$ \begin{align} \sigma_{\min}[G] \not> 0 &\Rightarrow \sigma_{\min}[G]=0\\ &\Rightarrow \exists v \neq 0 \text{ s.t. } G^*Gv = 0\\ &\Rightarrow v^*G^*Gv = \|Gv\|^2 = 0 \Rightarrow Gv=0\\ &\Rightarrow G \text{ is not injective and hence singular}\\ &\Rightarrow \det G=0 \end{align} $$
Conversely, since $|\det G| = \prod_{i=1}^{n}\sigma_i[G]$, $\sigma_{\min}[G] > 0$ means every singular value is positive, so $\det G \neq 0$.
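As a quick numerical illustration (not a proof), a NumPy sanity check of the first two properties, using the identity $|\det G| = \prod_i \sigma_i[G]$ and the fact that the singular values of $G^{-1}$ are the reciprocals of those of $G$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random complex square matrix (almost surely nonsingular).
n = 5
G = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

s = np.linalg.svd(G, compute_uv=False)  # singular values, decreasing order
s_min, s_max = s[-1], s[0]

# Property 1: |det G| = product of singular values, so
# det G != 0  <=>  sigma_min > 0.
assert np.isclose(abs(np.linalg.det(G)), np.prod(s))
assert (abs(np.linalg.det(G)) > 1e-12) == (s_min > 1e-12)

# Property 2: sigma_max(G^{-1}) = 1 / sigma_min(G).
s_inv = np.linalg.svd(np.linalg.inv(G), compute_uv=False)
assert np.isclose(s_inv[0], 1.0 / s_min)
```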
Property 2 also follows directly from the SVD: writing $G = U\Sigma V^*$ with $\Sigma$ invertible gives $G^{-1} = V\Sigma^{-1}U^*$, so the singular values of $G^{-1}$ are the reciprocals of those of $G$, and $\sigma_{\max}[G^{-1}] = 1/\sigma_{\min}[G]$. The last two inequalities involve more complex proofs; I will leave pointers here:
This follows from a more general property: $\sigma_i(A+B) \geqslant \sigma_i(A) - \sigma_1(B)$, where the $\sigma_i$'s are in decreasing order. Taking $i=n$, $A=I$, $B=G$, it directly follows that $\sigma_{\min}[I+G] \geqslant \sigma_{\min}[I] - \sigma_1[G] = 1 - \sigma_{\max}[G]$, since all singular values of $I$ are $1$.
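A numerical sanity check of this bound (an illustration on a random matrix, not a proof):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
# Small random complex perturbation, so that 1 - sigma_max(G) is positive
# and the bound is non-trivial.
G = 0.3 * (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))

s_G = np.linalg.svd(G, compute_uv=False)
s_IG = np.linalg.svd(np.eye(n) + G, compute_uv=False)

# sigma_min(I + G) >= 1 - sigma_max(G)
lhs = s_IG[-1]
rhs = 1.0 - s_G[0]
assert lhs >= rhs - 1e-12
```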
This also follows from a more general property (sometimes referred to as Horn's lemma), which states that $\prod_{i=1}^{k}\sigma_i[G_1G_2] \leqslant \prod_{i=1}^{k}\sigma_i[G_1]\sigma_i[G_2]$. If both $G_1$ and $G_2$ are square matrices of size $n$, equality holds for $k=n$.
In your case, just taking $k=1$ proves the required result.
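Again as a numerical illustration (not a proof), a NumPy check of the $k=1$ case and of the full product inequality, including equality at $k=n$ (which follows from $|\det(G_1G_2)| = |\det G_1|\,|\det G_2|$):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
G1 = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
G2 = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# Singular values in decreasing order.
s1 = np.linalg.svd(G1, compute_uv=False)
s2 = np.linalg.svd(G2, compute_uv=False)
s12 = np.linalg.svd(G1 @ G2, compute_uv=False)

# k = 1 case: sigma_max(G1 G2) <= sigma_max(G1) * sigma_max(G2).
assert s12[0] <= s1[0] * s2[0] * (1 + 1e-9)

# Horn's lemma for all k, with equality at k = n.
for k in range(1, n + 1):
    assert np.prod(s12[:k]) <= np.prod(s1[:k] * s2[:k]) * (1 + 1e-9)
assert np.isclose(np.prod(s12), np.prod(s1 * s2))
```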
The proofs of both these inequalities can be found in: R.A. Horn, C.R. Johnson, Topics in Matrix Analysis, Cambridge University Press, 1991