Suppose X is a random variable and f and g are functions. Is there a theorem giving conditions under which f(X) and g(X) must be dependent, so that cov(f(X), g(X)) is non-zero? (I know that non-zero covariance implies dependence; what I'm after is the other direction: conditions guaranteeing the covariance is non-zero.)
The particular problem I'm dealing with is slightly more complicated. Consider a function f(e, x), where e is a deterministic variable and x is a realization of the random variable X. I want to know whether cov(f(e, X), g(e, X)) is non-zero, where g(e, x) = f_e(e, x) denotes the partial derivative of f with respect to e.
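For what it's worth, a quick Monte Carlo experiment suggests no general theorem is possible: the covariance can be non-zero or exactly zero depending on f. Below is a sketch with two hypothetical choices of f (my own examples, not from any source), with X standard normal and e = 2. For f(e, x) = e x², the partial f_e = x² gives cov(eX², X²) = e·Var(X²) ≠ 0; for f(e, x) = e + x, the partial f_e = 1 is constant, so the covariance is zero.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1_000_000)  # X ~ N(0, 1)
e = 2.0

# Case 1: f(e, x) = e * x**2, so f_e(e, x) = x**2.
# Then cov(e*X^2, X^2) = e * Var(X^2) = e * 2 for standard normal X.
f1 = e * x**2
g1 = x**2
print(np.cov(f1, g1)[0, 1])  # ≈ 4, clearly non-zero

# Case 2: f(e, x) = e + x, so f_e(e, x) = 1, a constant.
# Covariance with a constant is identically zero.
f2 = e + x
g2 = np.ones_like(x)
print(np.cov(f2, g2)[0, 1])  # ≈ 0
```

So any theorem would need extra hypotheses ruling out cases where f_e is (almost surely) constant in x, at minimum.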