Let $X$ be a random variable and $f, g: \mathbb{R} \rightarrow \mathbb{R}$ be increasing functions. Show that $\operatorname{cov}(f(X), g(X)) \ge 0$.
The following hint was also provided: Assume $X$, $Y$ are independent and identically distributed, then show $E[(f(X)-f(Y))(g(X)-g(Y))] \ge 0$.
My attempt: Since $f$ and $g$ are increasing, $f(x) - f(y)$ and $g(x) - g(y)$ always have the same sign, so $(f(x) - f(y))(g(x) - g(y)) \ge 0$ for all $x, y \in \mathbb{R}$. Thus $E[(f(X) - f(Y))(g(X) - g(Y))] \ge 0$ by the monotonicity of expectation.
Expanding, we get $E[f(X)g(X)]-E[f(X)g(Y)]-E[f(Y)g(X)]+E[f(Y)g(Y)] \ge 0$. By independence, the left-hand side becomes $E[f(X)g(X)]-E[f(X)]E[g(Y)]-E[f(Y)]E[g(X)]+E[f(Y)g(Y)]$, and since $X$ and $Y$ are identically distributed, this further reduces to $2E[f(X)g(X)]-2E[f(X)]E[g(X)]$. Putting it together, $2\operatorname{cov}(f(X), g(X)) \ge 0$ and we are done.
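As a quick Monte Carlo sanity check of the identity $E[(f(X)-f(Y))(g(X)-g(Y))] = 2\operatorname{cov}(f(X), g(X))$ (not part of the proof; the standard normal distribution and the increasing functions $\tanh$ and $t^3$ are arbitrary illustrative choices):

```python
import numpy as np

# Sanity check: with X, Y i.i.d. and f, g increasing,
# E[(f(X)-f(Y))(g(X)-g(Y))] should equal 2*cov(f(X), g(X)) and be >= 0.
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.normal(size=n)   # samples of X ~ N(0, 1), an arbitrary choice
y = rng.normal(size=n)   # samples of Y, independent, same distribution

f = np.tanh              # two increasing functions, chosen for illustration
def g(t):
    return t ** 3

lhs = np.mean((f(x) - f(y)) * (g(x) - g(y)))
cov = np.mean(f(x) * g(x)) - np.mean(f(x)) * np.mean(g(x))
print(lhs, 2 * cov)      # the two values should agree up to Monte Carlo error
```

Both printed values are nonnegative and agree up to sampling noise, matching the algebra above.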
My main query: This whole proof relies on $X$ and $Y$ being i.i.d. How can we simply assume this? Without that assumption, we would never have been able to factor the expectations and collect like terms. Is this proof correct, or do I need a proof that does not rely on $X$ and $Y$ being i.i.d.?
Your proof is correct and complete.
The essential point is that whenever a random variable $X$ is defined on some probability space $\langle\Omega,\mathcal A,P\rangle$, it is always possible to construct a new probability space $\langle\Omega_1,\mathcal A_1,P_1\rangle$ together with random variables $X_1:\Omega_1\to\mathbb R$ and $Y_1:\Omega_1\to\mathbb R$ such that $X_1$ and $Y_1$ are independent and each has the same distribution as $X:\Omega\to\mathbb R$.
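One standard way to realize this (a sketch of the product-space construction, which the answer leaves implicit) is:

```latex
% Two independent copies of X, built on the product space
\Omega_1 = \Omega \times \Omega, \qquad
\mathcal{A}_1 = \mathcal{A} \otimes \mathcal{A}, \qquad
P_1 = P \otimes P,
```
```latex
X_1(\omega, \omega') = X(\omega), \qquad
Y_1(\omega, \omega') = X(\omega').
```

Under the product measure $P_1$, the coordinates $X_1$ and $Y_1$ are independent, and each has the same distribution as $X$.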
(This should be considered "knowledge" resting in our probability-backpack, and also in the probability-backpacks of others (e.g. our teachers), and it does not have to be proved again and again.)
In that context it can be proved, exactly the way you did, that $\mathsf{Cov}(f(X_1),g(X_1))\geq0$.
And of course this also proves that $\mathsf{Cov}(f(X),g(X))\geq0$: since $X_1$ and $X$ have the same distribution, so do the pairs $(f(X_1),g(X_1))$ and $(f(X),g(X))$, hence the two covariances are equal.