Let $X$, $Y$ be real-valued random variables. Assume that for all $i,j\in\mathbb{N}$ we know $\mathbb{E}(X^iY^j)$. Is there a way to express $\mathbb{E}(X\,1\{Y>0\})$ in terms of these joint moments? ($1\{\cdot\}$ is the indicator function.)
Is there any assumption about the distribution of $X$ and $Y$ that can simplify this problem?
EDIT: Assume also that $\mathbb{N}$ includes zero.
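Just to fix ideas, here is the quantity I am after in a quick Monte Carlo sketch; the bivariate normal choice for $(X,Y)$ below is only an illustrative example, not an assumption of the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Example joint distribution for (X, Y): a correlated bivariate normal,
# chosen only for illustration; the question concerns general (X, Y).
mean = [0.5, -0.2]
cov = [[1.0, 0.6],
       [0.6, 2.0]]
X, Y = rng.multivariate_normal(mean, cov, size=1_000_000).T

# The quantity I would like to express through the joint moments E[X^i Y^j].
target = np.mean(X * (Y > 0))
print("E[X 1{Y>0}] ~", target)

# A few of the joint moments that are assumed known.
print("E[X]    ~", X.mean())
print("E[XY]   ~", (X * Y).mean())
print("E[XY^2] ~", (X * Y**2).mean())
```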
EDIT (2nd): I have a half-formed idea that lets me make some progress, but there are several points of difficulty, some possibly severe, that compromise the whole thing.
Take the step function $H(x)$ (a Heaviside step shifted to $1$), where
$H(x)=\frac{d}{dx}\max(x,1)=\mathbf{1}[x>1]$ (for $x\neq 1$).
Note that this targets $\mathbb{E}[X\,\mathbf{1}\{Y>1\}]$ rather than $\mathbb{E}[X\,\mathbf{1}\{Y>0\}]$; the same construction with $\max(x,0)$ gives the indicator $\mathbf{1}[x>0]$ of the original question.
Therefore, for $X\geq 0$ (so that $X\max(Y,1)=\max(XY,X)$), and since $\min(-a,-b)=-\max(a,b)$:
$\mathbb{E}[XH(Y)]=\mathbb{E}\left[\frac{d}{dY}\max(XY,X)\right]=-\mathbb{E}\left[\frac{d}{dY}\min(-XY,-X)\right]$ (1)
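A quick numerical sanity check of the pointwise identity behind (1); this only checks $\frac{d}{dy}\max(xy,x)=x\,\mathbf{1}[y>1]=-\frac{d}{dy}\min(-xy,-x)$ for a fixed $x>0$ away from the kink at $y=1$, and says nothing about exchanging the derivative with the expectation, which is exactly where I am stuck:

```python
import numpy as np

def max_term(x, y):
    return np.maximum(x * y, x)

def min_term(x, y):
    return np.minimum(-x * y, -x)

x = 1.7          # the identity x*max(y,1) = max(xy, x) needs x >= 0
ys = np.array([-2.0, 0.3, 0.999, 1.001, 2.5])   # avoid the kink at y = 1
h = 1e-6

# d/dy max(xy, x) via central differences
d_max = (max_term(x, ys + h) - max_term(x, ys - h)) / (2 * h)
# d/dy min(-xy, -x) via central differences
d_min = (min_term(x, ys + h) - min_term(x, ys - h)) / (2 * h)

print("d/dy max(xy, x) :", d_max)          # ~  x * 1{y > 1}
print("x * 1{y > 1}    :", x * (ys > 1))
print("d/dy min(-xy,-x):", d_min)          # ~ -x * 1{y > 1}, i.e. the sign flips
```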
Now, if we assume $X$ and $XY$ are bivariate normal, we have [1]:
$\mathbb{E}[\min(-XY,-X)]=-\theta\,\phi\left(\frac{\mathbb{E}[XY]-\mathbb{E}[X]}{\theta}\right)-\mathbb{E}[XY]\,\Phi\left(\frac{\mathbb{E}[XY]-\mathbb{E}[X]}{\theta}\right)-\mathbb{E}[X]\,\Phi\left(\frac{\mathbb{E}[X]-\mathbb{E}[XY]}{\theta}\right)$ (2)
where
$\theta = \sqrt{\mathrm{Var}[X]-2\,\mathrm{Cov}[X,XY]+\mathrm{Var}[XY]}.$
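Formula (2) can at least be checked numerically for a pair that genuinely is bivariate normal (which $(-XY,-X)$ in general is not; that is the assumption above). A small Monte Carlo sketch of Cain's mean-of-the-minimum formula, with $(U,V)$ standing in for $(-XY,-X)$; note the minus sign on the $\theta\,\phi(\cdot)$ term:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

# A genuinely bivariate normal pair (U, V), standing in for (-XY, -X).
mu_u, mu_v = -0.8, 0.4
sd_u, sd_v, rho = 1.5, 0.9, 0.3
cov = [[sd_u**2, rho * sd_u * sd_v],
       [rho * sd_u * sd_v, sd_v**2]]
U, V = rng.multivariate_normal([mu_u, mu_v], cov, size=2_000_000).T

# Closed form for E[min(U, V)] as in (2) (Cain 1994); the phi term enters
# with a minus sign.
theta = np.sqrt(sd_u**2 - 2 * rho * sd_u * sd_v + sd_v**2)
closed_form = (-theta * norm.pdf((mu_u - mu_v) / theta)
               + mu_u * norm.cdf((mu_v - mu_u) / theta)
               + mu_v * norm.cdf((mu_u - mu_v) / theta))

print("Monte Carlo :", np.minimum(U, V).mean())
print("Closed form :", closed_form)
```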
Even with the above assumption, this is far from complete. I have not been able to connect formulas (1) and (2), since I do not see how to justify interchanging the derivative and the expectation (the Leibniz rule) when the differentiation is with respect to a random variable.
[1] M. Cain, The moment-generating function of the minimum of bivariate normal random variables, The American Statistician, 1994