I have looked through some questions here, but could not find any that answers my question. So, here it goes.
I have two random variables, $X$ and $Y$, and their marginal probability density functions (PDF), $p_{X}(x)$ and $p_{Y}(y)$, are known. Their joint PDF $p_{XY}(x, y)$, however, is unknown.
I know that $X$ and $Y$ are independent if and only if $p_{XY}(x, y) = p_{X}(x)p_{Y}(y)$. I also know that if $X$ and $Y$ are jointly Gaussian, independence is equivalent to their being uncorrelated, i.e. $E\left[XY\right] = E\left[X\right]E\left[Y\right]$.
But what if $X$ and $Y$ are not Gaussian? Independence still implies $E\left[XY\right] = E\left[X\right]E\left[Y\right]$, but I am not sure whether the converse holds when the random variables are not Gaussian.
If not, is there a way to find out whether they are independent without knowing $p_{XY}(x,y)$?
Thank you.
There is no way to determine independence without knowing the joint distribution: independence is a property of the joint, and the marginals alone do not determine it. Different joint distributions can share the same marginals, some of them independent and some not.
You could possibly show dependence without the joint distribution. For example, if you could show that the variables were correlated, this would imply they were dependent. But showing they are uncorrelated does not necessarily imply independence.
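A quick numerical illustration of that last point (a standard counterexample, not from the question itself): take $X \sim N(0,1)$ and $Y = X^2$. Then $E[XY] = E[X^3] = 0 = E[X]E[Y]$, so the variables are uncorrelated, yet $Y$ is a deterministic function of $X$ and hence maximally dependent on it.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
y = x**2  # deterministic function of x, so X and Y are dependent

# Sample correlation is near zero: E[XY] = E[X^3] = 0 = E[X]E[Y]
corr = np.corrcoef(x, y)[0, 1]
print(corr)  # close to 0

# Yet the dependence is visible in conditional behaviour:
print(y.mean())                 # overall mean of Y, close to E[X^2] = 1
print(y[np.abs(x) > 1].mean())  # conditional mean of Y given |X| > 1, well above 1
```

So a nonzero correlation estimate is evidence of dependence, but a zero one tells you nothing either way unless you have extra structure (such as joint Gaussianity) that licenses the converse.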