Assume we have the marginal pdfs $f_{X}(x)$ and $f_{Y}(y)$, but not necessarily their analytical expressions; in my case closed forms do not exist, although I can compute the values numerically. $Y$ is always positive.
EDIT: after analyzing my problem more carefully, I think we can assume that $X$ and $Y$ are independent (the numerical approximations of the pdfs do not share any parameters).
Now, we have a new random variable $Z := XY + d$, where $d$ is just a constant. How can I calculate values of $f_Z(z)$ using $f_X(x)$ and $f_Y(y)$?
I started with the reasoning that $$f_Z(z) = \int_{\mathbb{R}}f_{Z,Y}(z,y)\,dy = \int_{\mathbb{R}}f_{Z\mid Y}(z\mid y)f_Y(y)\,dy$$
The function $g_1(x) := f_X(x-d)$ is a pdf, but the function $g_2(x) := f_X\left(\frac{x-d}{y}\right)$ is not, so to obtain a proxy for $f_{Z\mid Y}(z\mid y)$ I seem to need to normalize by the integral $\int_{\mathbb{R}}f_{X}\left(\frac{x-d}{y}\right)dx$. (Is there any easier way to calculate the normalization constant?)
And then I am using $$f_{Z\mid Y}(z\mid y) = \frac{f_{X}\left(\frac{z-d}{y}\right)}{\int_{\mathbb{R}}f_{X}\left(\frac{z'-d}{y}\right)dz'}$$
Finally, I was hoping this would imply that
$$f_Z(z) = \int_{\mathbb{R}} \left[\frac{f_{X}\left(\frac{z-d}{y}\right)}{\int_{\mathbb{R}}f_{X}\left(\frac{z'-d}{y}\right)dz'} f_Y(y)\right]dy$$
And that the joint density would be written as
$$f_{Z,Y}(z,y) = \frac{f_{X}\left(\frac{z-d}{y}\right)}{\int_{\mathbb{R}}f_{X}\left(\frac{z'-d}{y}\right)dz'} f_Y(y)$$
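As a quick sanity check on that normalization constant, here is a minimal numerical sketch; the Gaussian $f_X$ and the particular values of $d$ and $y$ are illustrative assumptions, not part of the problem:

```python
import numpy as np
from scipy import stats, integrate

# Illustrative assumption (not from the problem): X ~ N(0, 1); d and y are fixed values.
d, y = 2.0, 1.7
f_X = stats.norm(0, 1).pdf

# Normalization constant c(y) = int f_X((z' - d)/y) dz' from the denominator above.
c = integrate.quad(lambda zp: f_X((zp - d) / y), -np.inf, np.inf)[0]
print(c)  # the substitution u = (z' - d)/y shows c(y) = |y|; here 1.7
```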
This came up because I have numerical approximations for $f_X(x)$ and $f_Y(y)$. The expression above means I have to calculate many integrals for the normalization constants because of $Y$, but I managed to make that work with a discrete grid approximation, so it is fine for calculating a single value $f_Z(z^*)$. The real problem is that when I do a cubature to calculate the integral of the joint density, it does not add up to one, so I do not trust my results. This cubature is computationally prohibitive in my application (fitting a given model by evaluating the likelihood point by point), so I am wondering whether there is an error in my reasoning and that is why I am not getting a pdf at the end.
Should I instead use the formula from Distribution of the product of two random variables (stated there for $d = 0$), $$f_{Z}(z)=\int _{\mathbb{R}}f_{X}\left(\frac{z}{y}\right){\frac {f_{Y}(y)}{|y|}}\,dy\,?$$ If so, why would my approach not work, as opposed to this one? (I have yet to try it.)
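For concreteness, here is a minimal numerical sketch of that product formula (shifted by $d$). The Gaussian $X$ and lognormal $Y$ are illustrative assumptions, chosen only so that $Y > 0$ as in the problem:

```python
import numpy as np
from scipy import stats, integrate

# Illustrative densities (an assumption, not from the problem):
# X ~ N(0, 1) and Y ~ LogNormal(0, 0.5), so Y > 0 as stated.
d = 2.0
f_X = stats.norm(0, 1).pdf
f_Y = stats.lognorm(0.5).pdf  # supported on (0, inf)

def f_Z(z):
    """Density of Z = X*Y + d via f_Z(z) = int f_X((z - d)/y) f_Y(y) / |y| dy."""
    integrand = lambda y: f_X((z - d) / y) * f_Y(y) / y  # y > 0 here
    return integrate.quad(integrand, 0, np.inf)[0]

# Sanity check: the density should integrate to 1 (tails beyond this range are negligible).
total = integrate.quad(f_Z, -10.0, 15.0)[0]
print(total)  # approximately 1
```

With these example densities the resulting $f_Z$ does integrate to one, which makes the formula a useful baseline for cross-checking a grid-based implementation.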
General rule of thumb: if you don't know exactly what you are doing, don't do any manipulation directly with densities. Always pass via CDFs and/or probabilities of events.
Here, you want $f_Z$, so first get $F_Z$ and then differentiate it. By the law of total probability we can connect it to the conditional CDF:
$$F_Z(z)=\int F_{Z \mid Y}(z \mid y) f_Y(y) dy.$$
Next we connect the conditional CDF back to the CDF of $X$. Note that because you have to divide by $Y$ to get an inequality with $X$ isolated, the formula depends on the sign of $y$:
$$F_{Z \mid Y}(z \mid y)=P(XY+d \leq z \mid Y=y) \\ =\begin{cases} P(X \leq (z-d)/y \mid Y=y)=F_X((z-d)/y) & y>0 \\ P(X \geq (z-d)/y \mid Y=y)=1-F_X((z-d)/y) & y<0 \end{cases}.$$
Note that the last equations in the chain used independence. Now you can substitute into our first equation to get
$$F_Z(z)=\int_0^\infty F_X((z-d)/y) f_Y(y) dy + \int_{-\infty}^0 (1-F_X((z-d)/y)) f_Y(y) dy.$$
Now you can differentiate both sides with respect to $z$ (under the integral sign) to recover $f_Z$; the $1/y$ from the chain rule and the sign flip on the $y<0$ branch combine into a $1/|y|$ factor, giving $$f_Z(z)=\int_{\mathbb{R}} f_X\!\left(\frac{z-d}{y}\right)\frac{f_Y(y)}{|y|}\,dy,$$ which is exactly the product formula you quoted, shifted by $d$.
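As a numerical cross-check of this derivation, one can compare the CDF formula against a Monte Carlo estimate and recover the density by finite differences. The Gaussian $X$ and lognormal $Y$ below are illustrative assumptions; since $Y > 0$ in the question, only the $y > 0$ branch is needed:

```python
import numpy as np
from scipy import stats, integrate

# Illustrative densities (an assumption, not from the problem): X ~ N(0, 1), Y ~ LogNormal(0, 0.5).
d = 2.0
X, Y = stats.norm(0, 1), stats.lognorm(0.5)

def F_Z(z):
    """CDF of Z = X*Y + d; only the y > 0 branch is needed since Y > 0."""
    return integrate.quad(lambda y: X.cdf((z - d) / y) * Y.pdf(y), 0, np.inf)[0]

def f_Z(z, h=1e-4):
    """Recover the density by a central finite difference of F_Z."""
    return (F_Z(z + h) - F_Z(z - h)) / (2 * h)

# Monte Carlo cross-check: the empirical CDF of X*Y + d should match F_Z.
rng = np.random.default_rng(0)
samples = X.rvs(size=100_000, random_state=rng) * Y.rvs(size=100_000, random_state=rng) + d
print(F_Z(2.5), np.mean(samples <= 2.5))  # the two values should agree to ~2 decimals
```

Working through the CDF this way sidesteps the normalization issue entirely, since $F_{Z\mid Y}$ is built directly from $F_X$.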