As is well known, there are many examples of pairs of random variables that have zero covariance but are not independent.
However, I'm wondering whether there are any general theorems about what having covariance zero does imply. Are there theorems that say something like, "Covariance zero, under additional hypothesis H, implies the following relationship between $X$ and $Y$," or is there essentially no possible general relationship?
By the way, although for certain families of distributions (e.g., jointly Gaussian random variables) the implication "covariance zero implies independence" does hold, that is not the kind of theorem I am asking about. What relationship can there be short of full independence? Any reasonable hypothesis H would interest me, as would references to online resources or standard texts.
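To make the premise concrete, here is a minimal sketch of one classic zero-covariance, dependent pair: $X$ uniform on $\{-1, 0, 1\}$ and $Y = X^2$. (The choice of this particular pair is illustrative, not taken from the question.)

```python
from fractions import Fraction

# X uniform on {-1, 0, 1}, Y = X^2: a classic zero-covariance, dependent pair.
support = [-1, 0, 1]
p = Fraction(1, 3)  # each point has probability 1/3

E_X = sum(p * x for x in support)              # E[X]   = 0
E_Y = sum(p * x * x for x in support)          # E[X^2] = 2/3
E_XY = sum(p * x * (x * x) for x in support)   # E[X^3] = 0

cov = E_XY - E_X * E_Y
print(cov)  # 0 -> covariance is exactly zero

# Yet X and Y are clearly dependent:
# P(Y = 1 | X = 1) = 1, while P(Y = 1) = 2/3.
```

Exact rational arithmetic (via `fractions`) makes the zero covariance an identity rather than a numerical accident.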
The covariance is a single number computed from two random variables, so you cannot expect it to capture much information.

Saying that the covariance is zero is one equation. By comparison, saying that two random variables are independent is an enormous family of equations, one for each pair of real numbers: $F_{X,Y}(x,y) = F_X(x)\,F_Y(y)$ for all $x, y$. Independence is a far stricter condition.
All you can say when the covariance is zero is that the best linear predictor of one variable from the other is constant: the population least-squares slope is $\operatorname{Cov}(X,Y)/\operatorname{Var}(X) = 0$, so a linear regression on a large enough sample will give you an (approximately) horizontal line. You cannot predict the value of one variable from the other using a linear model.
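This can be checked numerically. The sketch below uses an assumed illustrative pair, $X$ standard normal and $Y = X^2$ (so $\operatorname{Cov}(X,Y) = E[X^3] = 0$ even though $Y$ is a deterministic function of $X$), and estimates the least-squares slope from a sample.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Assumed illustrative pair: X standard normal, Y = X^2.
# Cov(X, Y) = E[X^3] = 0, yet Y is a deterministic function of X.
x = rng.standard_normal(n)
y = x ** 2

# Sample version of the population slope Cov(X, Y) / Var(X).
slope = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
print(slope)  # close to 0: the best linear fit is a horizontal line
```

The fitted slope hovers near zero for large samples, even though knowing $X$ determines $Y$ completely, which is exactly the gap between "no linear relationship" and "no relationship at all."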
I would be happy to learn otherwise, but I'm afraid that not much more can be said.