Let $X \sim \mathcal{N}(\mu_x,\sigma_x^2)$ and $Y \sim \mathcal{N}(\mu_y,\sigma_y^2)$ be two independent Gaussian random variables, and consider the product $Z=XY$.
My question is, what is the differential entropy $h(Z)$?
The differential entropy is defined as
$h(Z)=-\int f(z) \log (f(z))\, dz$,
where $f(z)$ is the probability density function of $Z$.
In the zero-mean case ($\mu_x=\mu_y=0$), the probability density function can be computed to be $f(z)=\frac{1}{\pi\sigma_x\sigma_y} K_0\!\left(\frac{|z|}{\sigma_x\sigma_y}\right)$, where $K_0$ is the modified Bessel function of the second kind of order zero (see Wikipedia: product distribution, Wolfram: normal product distribution).
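As a quick numerical sanity check of this density (a sketch assuming the zero-mean, unit-variance case, using `scipy.special.k0` for $K_0$):

```python
import numpy as np
from scipy.special import k0
from scipy.integrate import quad

# Density of Z = X*Y for independent zero-mean Gaussians;
# the K_0 formula above covers the case mu_x = mu_y = 0.
def product_pdf(z, sx=1.0, sy=1.0):
    s = sx * sy
    return k0(np.abs(z) / s) / (np.pi * s)

# K_0 has a logarithmic (integrable) singularity at 0, so split the
# integral there and use the symmetry f(-z) = f(z).
total = 2 * (quad(product_pdf, 0, 1)[0] + quad(product_pdf, 1, np.inf)[0])

# Cross-check against a Monte Carlo sample of X*Y.
rng = np.random.default_rng(0)
z = rng.standard_normal(200_000) * rng.standard_normal(200_000)
p_emp = np.mean(np.abs(z) < 0.5)          # empirical P(|Z| < 0.5)
p_ana = 2 * quad(product_pdf, 0, 0.5)[0]  # same probability from f(z)
print(total, p_emp, p_ana)
```

The density integrates to 1 and matches the empirical distribution of sampled products, so at least the formula itself checks out.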
However, I couldn't find any information on the entropy of a density involving the modified Bessel function. There seems to be a special case for $K_{\frac{1}{2}}$, but I could not find a closed-form solution for $K_0$. Does anybody know how to compute $h(Z)$? A good approximation of $K_0$ such that $h(Z)$ can be lower bounded would be sufficient, in case an exact solution is known to be intractable.
It seems that even the most basic lower bounds fail to yield much insight, for example:
$h(XY)\geq h(XY|Y) = \int f(y)\, h(Xy|Y=y)\, dy = \int f(y) \left(\log|y|+ h(X)\right)dy=h(X)+\int f(y) \log |y|\,dy$, where the relation to $h(X)$ is obscured by the last term.
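For what it's worth, in the standard case the last term does have a closed form: for $Y \sim \mathcal{N}(0,1)$, $\mathbb{E}[\log|Y|] = -\frac{\gamma + \log 2}{2}$ (via $\log Y^2 \sim \log\chi^2_1$ and $\psi(1/2) = -\gamma - 2\log 2$), so the bound is finite, just loose. A short check of this claim (assuming $X, Y$ iid $\mathcal{N}(0,1)$, logs in nats):

```python
import numpy as np

# Lower bound h(XY) >= h(X) + E[log|Y|] for X, Y iid N(0,1).
gamma = 0.5772156649015329  # Euler-Mascheroni constant

h_X = 0.5 * np.log(2 * np.pi * np.e)      # entropy of N(0,1)
E_log_absY = -(gamma + np.log(2)) / 2     # closed form for Y ~ N(0,1)
bound = h_X + E_log_absY
print(bound)  # ~0.784 nats

# Compare the closed form with a Monte Carlo estimate of E[log|Y|].
rng = np.random.default_rng(1)
y = rng.standard_normal(1_000_000)
mc_estimate = np.mean(np.log(np.abs(y)))
print(mc_estimate)
```

So the bound evaluates to roughly $0.78$ nats here, well below the Gaussian entropy $h(X)\approx 1.42$, which is what I mean by the relation to $h(X)$ being obscured.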
Thanks in advance,
Rick