Linear combination of non-identically distributed, independent exponential random variables


I am working on the following homework assignment:

Under the assumptions of the Normal Simple Linear Regression model, $Y_i|X_i \sim N(\beta_0 + \beta_1 X_i, \sigma^2)$. Consider the model where $Y_i|X_i \sim Exp(\frac{1}{\beta_1 X_i})$, that is, where $E[Y_i|X_i] = \beta_1 X_i$. Let $(X_i, Y_i), \,i=1,\dots,n$ be a random sample.

  1. Find the Maximum Likelihood Estimator, $\hat{\beta_1}$, for $\beta_1$
  2. What is the distribution of $\hat{\beta_1}$?

I have managed to solve the first part of the assignment. However, I am struggling with the second part.

For the first part, my likelihood function was given by

$$\cal{L}(\beta_1) = \prod_{i=1}^n f_{Y_i|X_i}(y_i) = \prod_{i=1}^n \frac{1}{\beta_1 x_i}e^{-\frac{y_i}{\beta_1 x_i}} = \left(\frac{1}{\beta_1}\right)^n \left(\prod_{i=1}^n \frac{1}{x_i}\right) \left(e^{-\sum_{i=1}^n \frac{y_i}{\beta_1 x_i}} \right)$$

for $y_i > 0$. I took the natural logarithm, differentiated with respect to $\beta_1$, set the derivative equal to zero and got

$$\hat{\beta_1} = \frac{1}{n} \sum_{i=1}^n \frac{y_i}{x_i} = \sum_{i=1}^n k_iy_i, \qquad k_i=\frac{1}{nx_i}$$
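For completeness, the intermediate log-likelihood steps are

$$\ell(\beta_1) = \ln \cal{L}(\beta_1) = -n\ln\beta_1 - \sum_{i=1}^n \ln x_i - \frac{1}{\beta_1}\sum_{i=1}^n \frac{y_i}{x_i}$$

$$\frac{d\ell}{d\beta_1} = -\frac{n}{\beta_1} + \frac{1}{\beta_1^2}\sum_{i=1}^n \frac{y_i}{x_i} = 0 \implies \hat{\beta_1} = \frac{1}{n}\sum_{i=1}^n \frac{y_i}{x_i}$$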

Now, for the second part, we have

$$ \hat{\beta_1} = \frac{1}{n} \sum_{i=1}^n \frac{Y_i}{x_i} = \sum_{i=1}^n k_iY_i, \qquad k_i=\frac{1}{nx_i} $$

where $x_i$ is an observation, $Y_i \sim Exp(\frac{1}{\beta_1 x_i})$ and $\beta_1 x_i > 0$. I have searched online and found that a mixture of exponential random variables is a hyper-exponential random variable when the mixing coefficients are probabilities.

I'd love for this to be the case here; however, I don't have enough evidence to justify that each $k_i \in [0,1]$. In fact, I can't even affirm that $k_i > 0$ for $i = 1,\dots,n$, so I think it's safe to say that $\hat{\beta_1}$ is not a hyper-exponential random variable.

Best answer:

Using the moment generating function and the independence of the $Y_i$:

$$M_{\hat{\beta_1}}(t) = E[e^{\hat{\beta_1}t}] = E\left[\prod_{i=1}^ne^{k_i Y_i t}\right] = \prod_{i=1}^n E[e^{k_i Y_i t}] = \prod_{i=1}^n M_{Y_i}(k_i t) = \prod_{i=1}^n (1-k_i \beta_1 x_i t)^{-1}$$

Finally, substituting $k_i = \frac{1}{nx_i}$, I get

$$M_{\hat{\beta_1}}(t) = \left(1 - \dfrac{\beta_1}{n}t \right)^{-n}$$

So $\hat{\beta_1} \sim Gamma\left(n, \frac{\beta_1}{n}\right)$, with shape $n$ and scale $\frac{\beta_1}{n}$, since this is exactly the MGF of that Gamma distribution. In particular, $E[\hat{\beta_1}] = n \cdot \frac{\beta_1}{n} = \beta_1$, so the MLE is unbiased.
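As a quick sanity check (not part of the original answer), one can simulate the estimator and compare it against the claimed Gamma distribution. The values of $\beta_1$, $n$, and the design points $x_i$ below are illustrative assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
beta1 = 2.0                          # assumed true parameter (illustrative)
n = 5                                # sample size per replication
x = rng.uniform(1.0, 3.0, size=n)    # fixed design points (illustrative)
reps = 20_000

# Each Y_i | x_i is exponential with mean beta1 * x_i;
# the MLE is the average of Y_i / x_i over the sample.
Y = rng.exponential(scale=beta1 * x, size=(reps, n))
beta1_hat = (Y / x).mean(axis=1)

# Gamma(shape=n, scale=beta1/n) has mean beta1 and variance beta1^2 / n.
print(beta1_hat.mean())              # should be close to beta1 = 2.0
print(beta1_hat.var())               # should be close to beta1^2 / n = 0.8

# Kolmogorov-Smirnov test against the claimed Gamma distribution;
# a non-small p-value is consistent with the derivation above.
ks = stats.kstest(beta1_hat, stats.gamma(a=n, scale=beta1 / n).cdf)
print(ks.pvalue)
```

With the seed fixed, the empirical mean and variance land close to $\beta_1$ and $\beta_1^2/n$, and the KS test does not reject the Gamma fit.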