I am attempting to create an inference model, such that given any $y$, I can output an estimated probability density function of $x$.
Given samples $X,Y$ with estimated densities $P_X$ and $P_Y$ (in my application, obtained through kernel density estimation), I can compute the estimated conditional density of $x$ given $Y=y$ as:
$$P_{X|Y=y} = \frac{\left[P_{X,Y}(X_1,y),\; P_{X,Y}(X_2,y),\; \dots,\; P_{X,Y}(X_n,y)\right]}{P_Y(y)}$$
where $P_Y(y)$ is the marginal density of $Y$ at $y$.
For example, in the following (taken from here):
$$X=[1,2,0],Y=[1,0,0]$$
$$\implies P_{X|Y=0} = \frac{\left[P_{X,Y}(1,0),\; P_{X,Y}(2,0),\; P_{X,Y}(0,0)\right]}{2/3}$$
$$= \left[0,\frac{1}{2},\frac{1}{2}\right]$$
We can then say that, given $Y=0$, the most likely values of $x$ are $0$ and $2$, each with probability $\frac{1}{2}$.
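The discrete computation above can be checked with a few lines of NumPy (the variable names here are my own, just for illustration):

```python
import numpy as np

X = np.array([1, 2, 0])
Y = np.array([1, 0, 0])
y = 0

# Empirical joint density P_{X,Y}(x_i, y): fraction of samples equal to (x_i, y)
joint = np.array([np.mean((X == xi) & (Y == y)) for xi in X])

# Empirical marginal P_Y(0) = 2/3
marginal = np.mean(Y == y)

conditional = joint / marginal  # ≈ [0, 1/2, 1/2], matching the example
```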
However, my question is: how can I make a prediction given a $y \notin Y$, for example $P_{X|Y=1.5}$? I think this may be possible with a kernel estimate, but I can't seem to find any good references on how to do it. Any help/hints would be very appreciated!
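For concreteness, here is one way I imagine this could work (a sketch, not a vetted method): estimate the joint density $P_{X,Y}$ with a 2-D KDE and the marginal $P_Y$ with a 1-D KDE, then take their ratio at any $y$, including values not in the sample. The toy data and the `conditional_density` helper are my own invention:

```python
import numpy as np
from scipy.stats import gaussian_kde

# Toy data: x depends on y, plus noise (stands in for the real samples).
rng = np.random.default_rng(0)
Y = rng.uniform(0, 3, size=500)
X = Y + rng.normal(0, 0.5, size=500)

joint_kde = gaussian_kde(np.vstack([X, Y]))  # estimate of P_{X,Y}
marginal_kde = gaussian_kde(Y)               # estimate of P_Y

def conditional_density(x_grid, y):
    """Estimated P_{X|Y=y}(x) evaluated on a grid of x values."""
    pts = np.vstack([x_grid, np.full_like(x_grid, y)])
    return joint_kde(pts) / marginal_kde(y)

x_grid = np.linspace(-1, 4, 200)
dens = conditional_density(x_grid, 1.5)  # works even though 1.5 is not in Y
```

Note the ratio only integrates to approximately 1 over $x$, because the joint and marginal KDEs use different bandwidths; renormalising over the grid fixes that if needed.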
OP here: in case anyone has a similar question, I believe this can be done with parameter estimation via the expectation-maximisation (EM) algorithm, see here.
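To illustrate the EM idea, here is a minimal sketch of EM for a two-component 1-D Gaussian mixture in plain NumPy (my own toy implementation, not from the linked reference; for the conditional-density problem one would fit a mixture to $(x, y)$ jointly and then condition on $y$, but the core loop is the same):

```python
import numpy as np

def em_gmm_1d(x, n_iter=100):
    """Minimal EM for a two-component 1-D Gaussian mixture (a sketch)."""
    x = np.sort(np.asarray(x, dtype=float))
    # Crude initialisation: means from the lower/upper half of the sorted data.
    mu = np.array([x[: len(x) // 2].mean(), x[len(x) // 2 :].mean()])
    sigma = np.array([x.std(), x.std()]) + 1e-6
    pi = np.array([0.5, 0.5])  # mixture weights

    for _ in range(n_iter):
        # E-step: responsibilities r[i, k] = P(component k | x_i)
        d = x[:, None] - mu[None, :]
        lik = pi * np.exp(-0.5 * (d / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        r = lik / lik.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and spreads from responsibilities
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        d = x[:, None] - mu[None, :]
        sigma = np.sqrt((r * d**2).sum(axis=0) / nk) + 1e-6
    return pi, mu, sigma

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0, 1, 300), rng.normal(5, 1, 300)])
weights, means, sds = em_gmm_1d(data)  # means recover roughly 0 and 5
```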