Likelihood of a function of different types of random variables


Is there a general way of expressing the likelihood of a known but non-trivial function of several random variables? For example, suppose we need to estimate the parameters of a process $Y_t$ given by $Y_t = b_1 \frac{X_t^m}{Z_t} + e_t$. Suppose that $X$ is Gaussian with parameters $\mu_X$ and $\sigma_X$; $Z$ is Gaussian with $\mu_Z$ and $\sigma_Z$; and $e$ is Gaussian noise with mean $0$ and standard deviation $\sigma_e$.

We don't observe $X_t$ and $Z_t$ -- only $Y_t$. We want to estimate $b_1, m, \mu_X, \sigma_X, \mu_Z, \sigma_Z$ and $\sigma_e$.

The difficulty I am having is that I cannot always express the density of $Y$ in closed form, so I cannot write down the likelihood and use it to estimate the parameters via MLE, or to simulate from the posterior distribution via MCMC. In some cases, when the function is trivial (e.g. a sum of Gaussian variables), the likelihood is easy to write because the distribution of the sum is known. That is not the case here.

Note that in the example above, even taking the logarithm of $Y$ would give us a sum of non-normal random variables, and the density of such a sum has no known closed form (I think?). Is there a general way of approaching this kind of problem?

Best answer

To compute the MLE of the parameters $\theta = (b_1, m, \mu_X, \sigma_X, \mu_Z, \sigma_Z, \sigma_e)$, you would need to marginalize out the latent variables $(X, Z)$, i.e. evaluate
$$p(Y \mid \theta) = \iint p(Y \mid X, Z, \theta)\, p(X \mid \theta)\, p(Z \mid \theta)\, dX\, dZ,$$
which is difficult here. However, MCMC does not require this. With MCMC, you can jointly sample the parameters and the latent variables $(X, Z)$ from $p(\theta, X, Z \mid Y)$, then discard the latent draws; the retained draws of $\theta$ are samples from the marginal posterior $p(\theta \mid Y)$.
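A minimal sketch of this data-augmentation idea, using a plain random-walk Metropolis sampler over the joint state (parameters and latents). All the numbers, step sizes, and priors here are illustrative assumptions: for brevity, $m$ is fixed at $1$, the means and variances are held at their data-generating values, and only $b_1$ and the latent paths $(X, Z)$ are sampled. In practice a blocked or gradient-based sampler (e.g. in Stan or PyMC) would mix far better in this dimension.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- simulate data from Y_t = b1 * X_t / Z_t + e_t   (m fixed at 1 for brevity)
T = 50
b1_true, mu_X, sig_X, mu_Z, sig_Z, sig_e = 2.0, 1.0, 0.3, 5.0, 0.5, 0.1
X_true = rng.normal(mu_X, sig_X, T)
Z_true = rng.normal(mu_Z, sig_Z, T)   # mu_Z well away from 0 so X/Z is stable
Y = b1_true * X_true / Z_true + rng.normal(0.0, sig_e, T)

def log_post(b1, X, Z):
    # joint log density of (b1, X, Z) given Y, up to a constant
    # (flat prior on b1; X, Z have their Gaussian priors)
    ll_y = -0.5 * np.sum((Y - b1 * X / Z) ** 2) / sig_e ** 2
    ll_x = -0.5 * np.sum((X - mu_X) ** 2) / sig_X ** 2
    ll_z = -0.5 * np.sum((Z - mu_Z) ** 2) / sig_Z ** 2
    return ll_y + ll_x + ll_z

# --- random-walk Metropolis over the joint state (b1, X_1..X_T, Z_1..Z_T)
b1, X, Z = 1.0, np.full(T, mu_X), np.full(T, mu_Z)
lp = log_post(b1, X, Z)
draws = []
for it in range(5000):
    b1_p = b1 + 0.05 * rng.normal()
    X_p = X + 0.02 * rng.normal(size=T)
    Z_p = Z + 0.02 * rng.normal(size=T)
    lp_p = log_post(b1_p, X_p, Z_p)
    if np.log(rng.uniform()) < lp_p - lp:    # Metropolis accept/reject
        b1, X, Z, lp = b1_p, X_p, Z_p, lp_p
    draws.append(b1)                         # keep b1, discard the latents

b1_hat = np.mean(draws[2500:])               # posterior mean after burn-in
print(b1_hat)
```

The key point is the last line of the loop: only `b1` is stored, so the latent draws are marginalized out simply by being ignored, which is exactly what the direct MLE computation could not do in closed form.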