Define $\gamma:\Re^{n\times n}\to\Re$ that maps a positive definite matrix to the ratio between the geometric and arithmetic mean of its eigenvalues: \begin{align} \gamma(M)=\frac{(\det M)^{1/n}}{\frac1n \operatorname{Tr}(M)}=\frac{(\prod_{i=1}^n \lambda_i)^{1/n}}{\frac1n\sum_{i=1}^n \lambda_i} && M\in\Re^{n \times n}, \text{eigenvalues}(M)=\lambda_1,\dots,\lambda_n>0 \end{align} Let $x\sim N(0,A)$ be an $n$-dimensional Gaussian and let $y=(\tanh(x_i))_{i\le n}$ be the result of applying $\tanh$ elementwise, with covariance $B:=E[yy^\top]$.
Question: Can we prove that $\gamma(A) \le \gamma(B)$?
Implication: by the AM-GM inequality, the geometric mean is always at most the arithmetic mean, therefore $\gamma(M)\le 1$ for all $M\succ 0$, with equality $\gamma(M)=1$ exactly when all $\lambda_i$ are equal. Therefore, $\gamma$ captures how close a matrix is to singular ($\gamma(M)\to 0$) or to well-conditioned ($\gamma(M)\to 1$). The claim in the question thus says that applying the hyperbolic tangent to a random normal vector does not decrease $\gamma$ of its covariance, i.e., does not make the covariance more singular.
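Before attempting a proof, the conjecture can be probed numerically. The sketch below (a Monte Carlo check, not a proof; the particular random instance of $A$ and the sample size are my own choices) estimates $B$ by sampling and compares $\gamma(A)$ with $\gamma(B)$:

```python
import numpy as np

def gamma(M):
    # Ratio of geometric to arithmetic mean of the eigenvalues of M (M assumed PD).
    lam = np.linalg.eigvalsh(M)
    n = len(lam)
    return np.prod(lam) ** (1.0 / n) / lam.mean()

rng = np.random.default_rng(0)
n = 4

# A random positive definite covariance A (an arbitrary test instance).
G = rng.standard_normal((n, n))
A = G @ G.T + 0.1 * np.eye(n)

# Monte Carlo estimate of B = E[y y^T] with y = tanh(x), x ~ N(0, A).
x = rng.multivariate_normal(np.zeros(n), A, size=200_000)
y = np.tanh(x)
B = y.T @ y / len(y)

print("gamma(A) =", gamma(A))
print("gamma(B) =", gamma(B))
print("gamma(A) <= gamma(B)?", gamma(A) <= gamma(B))
```

By AM-GM both printed values must lie in $(0,1]$; the interesting output is whether the final comparison holds across many random choices of $A$.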