So I'm given two functions $f$ and $g$, which are bounded and increasing. I need to prove that $\text{Cov}[f(G_1),g(-G_1)] \leq 0$, where $G_1$ is a standard normal random variable (so that $-G_1$ is also standard normal; it is the antithetic variate of $G_1$).
Later on in the problem I need to extend the statement to the multivariate case, so $f,g: \mathbb{R}^n\to\mathbb{R}$ are bounded and increasing in every variable. I need to show that $\text{Cov}[f(G_1,G_2,\dots,G_n),g(-G_1,-G_2,\dots,-G_n)]\leq 0$ by induction. But I can't even figure out how to prove the single-variable case; perhaps after I understand that proof I will be able to guess the inductive step.
This can be tackled in the one-dimensional case using integration by parts, as follows.
Suppose $f(x)$ is an increasing and $h(x)$ a decreasing function, both of which are bounded and have zero mean, $E[f(G)]=E[h(G)]=0$; assume also, for simplicity, that $f$ is differentiable. (Throughout, the normalization constant $1/\sqrt{2\pi}$ is dropped, since it does not affect signs.) Consider the function
$$ I_h(x)=\int_{-\infty}^{x}h(t)e^{-t^2/2}dt$$
We note that $I_h(-\infty)=0$ trivially, and $I_h(\infty)=0$ because $E[h(G)]=0$. Also $I_h'(x)=h(x)e^{-x^2/2}$.
Since $\int_{-\infty}^{\infty} h(x)e^{-x^2/2}dx=0$, we conclude that $h$ has to change sign. Since it is decreasing (and, say, continuous), it changes sign at exactly one point, $x=a$, where $h(a)=0$. Then $h(x)>0$ for $x<a$ and $h(x)<0$ for $x>a$. Given this observation we see that
$$I_h(a)=\int_{-\infty}^a h(t)e^{-t^2/2}dt>0$$
but since $I_h'(a)=0$, we conclude that $I_h$ attains its maximum there. Since $I_h$ is increasing for $x<a$, decreasing for $x>a$, and tends to $0$ at $\pm\infty$, we infer that $I_h(x)>0$ for all finite $x$.
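The positivity of $I_h$ can also be checked numerically. A small sketch, using $h(x)=-\tanh(x)$ as an illustrative decreasing, bounded, odd (hence zero-mean) choice, and a plain cumulative Riemann sum for the integral:

```python
import numpy as np

# h(x) = -tanh(x): decreasing, bounded, odd => E[h(G)] = 0.
t = np.linspace(-8.0, 8.0, 8001)           # e^{-t^2/2} is negligible beyond |t| = 8
dt = t[1] - t[0]
integrand = -np.tanh(t) * np.exp(-t**2 / 2)

# I_h(x) = \int_{-inf}^{x} h(t) e^{-t^2/2} dt via a cumulative Riemann sum.
I_h = np.cumsum(integrand) * dt

# I_h should be strictly positive on the bulk of the real line and
# return to ~0 in the right tail (reflecting I_h(inf) = 0).
bulk = np.abs(t) <= 5
print(I_h[bulk].min() > 0, abs(I_h[-1]) < 1e-6)
```

This is only a spot check for one admissible $h$, not part of the proof.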
Now perform integration by parts on the following quantity
$$E[f(G)h(G)]=\int_{-\infty}^{\infty}f(t)h(t)e^{-t^2/2}dt=\Big[f(t)I_h(t)\Big]_{-\infty}^{\infty}-\int_{-\infty}^{\infty}f'(t)I_h(t)dt=-\int_{-\infty}^{\infty}f'(t)I_h(t)dt\leq 0$$
whence the desired result follows: the boundary terms vanish because $f$ is bounded and $I_h(\pm\infty)=0$, and the remaining integral is $\leq 0$ since $f'\geq 0$ and $I_h>0$ (with strict inequality when $f$ is strictly increasing).
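The integration-by-parts identity itself can be sanity-checked with numerical quadrature. A sketch with the illustrative choices $f(t)=\tanh(t)$ (so $f'(t)=\operatorname{sech}^2 t$) and $h(t)=-\tanh(t)$:

```python
import numpy as np

t = np.linspace(-8.0, 8.0, 16001)
dt = t[1] - t[0]
w = np.exp(-t**2 / 2)                       # Gaussian weight, 1/sqrt(2*pi) omitted

f, df = np.tanh(t), 1 / np.cosh(t)**2       # f and f' = sech^2
h = -np.tanh(t)
I_h = np.cumsum(h * w) * dt                 # I_h(x) = int_{-inf}^{x} h(t) e^{-t^2/2} dt

lhs = np.sum(f * h * w) * dt                # E[f(G) h(G)], up to the constant
rhs = -np.sum(df * I_h) * dt                # -int f'(t) I_h(t) dt
print(lhs, rhs)                             # both negative, and (nearly) equal
```

Both sides agree to quadrature accuracy and are negative, as the sign argument predicts.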
It is straightforward to generalize to functions with non-zero means by noting that the random variables $A=f(G)-E[f(G)]$ and $B=h(G)-E[h(G)]$ satisfy the conditions of the lemma proven above, and thus
$$E[AB]=E[(f(G)-E[f(G)])(h(G)-E[h(G)])]=\text{Cov}[f(G),h(G)]\leq 0$$
The one-dimensional result quoted above follows by noting that if $g(x)$ is increasing and bounded, then $h(x)=g(-x)$ is decreasing and bounded, and thus the pair $f,h$ satisfies the hypotheses of the lemma.
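Finally, both the one-dimensional inequality and the multivariate extension asked about can be spot-checked by Monte Carlo. Below, $f$ and $g$ are illustrative bounded functions increasing in every coordinate (sums of $\tanh$ and $\arctan$); this is a numerical sanity check under those sample choices, not a proof:

```python
import numpy as np

rng = np.random.default_rng(0)
n, samples = 3, 200_000
G = rng.standard_normal((samples, n))       # rows: i.i.d. N(0, I_n) draws

def f(x):
    # bounded, increasing in each coordinate
    return np.tanh(x).sum(axis=1)

def g(x):
    # bounded, increasing in each coordinate
    return np.arctan(x).sum(axis=1)

a, b = f(G), g(-G)                          # g evaluated at the antithetic draws
cov = np.mean((a - a.mean()) * (b - b.mean()))
print(cov)                                  # negative, up to Monte Carlo noise
```

Setting `n = 1` reduces this to the one-dimensional statement proved above.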