I am currently working on a problem where I need to derive the maximum likelihood estimators for a linear model with an exponential error term. Here's the problem:
A machine sequentially performs two tasks. The first task requires an unobservable, exponentially distributed amount of time $\epsilon$ with unknown mean $\frac{1}{\lambda}$. The second task requires an amount of time proportional to an observable positive-valued variable $x$, which is independent of $\epsilon$; the proportionality constant is positive but otherwise unknown, and is denoted $\beta$. The total time $y$ required to process a single job is the sum of the times needed to complete the two tasks.
Given the problem, I've been tasked with:
- Writing down a linear statistical model relating $y$ to $x$.
- Deriving maximum likelihood estimators for $\beta$ and $\lambda$.
For the first part, I believe the model is $y_i = \beta x_i + \epsilon_i$, where $\epsilon_i$ is the error term. For the second part, I derived an estimator for $\lambda$:
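To make the setup concrete, here is a small simulation sketch of that model (the values $\beta = 2.0$, $\lambda = 1.5$, and the uniform distribution for $x$ are made-up choices, just for illustration):

```python
import numpy as np

# Simulate y_i = beta * x_i + eps_i with exponential errors.
# beta_true, lam_true, and the distribution of x are illustrative assumptions.
rng = np.random.default_rng(0)
n = 100_000
beta_true, lam_true = 2.0, 1.5

x = rng.uniform(0.5, 3.0, size=n)                     # observable positive covariate
eps = rng.exponential(scale=1.0 / lam_true, size=n)   # E[eps] = 1/lam_true
y = beta_true * x + eps

# Sanity check: the mean residual should be close to 1/lam_true ≈ 0.667
print((y - beta_true * x).mean())
```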
$L(\lambda, \beta) = \prod_{i=1}^n \lambda e^{-\lambda (y_i - \beta x_i)}$
$L(\lambda, \beta) = \lambda^n e^{-\lambda \sum (y_i - \beta x_i)}$
$\ln L(\lambda, \beta) = n \ln(\lambda) - \lambda \sum_{i=1}^n (y_i - \beta x_i)$
$\hat{\lambda} = \frac{n}{\sum_{i=1}^n (y_i - \beta x_i)}$
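As a quick numerical check of this closed form (for a fixed, known $\beta$), the formula $\hat{\lambda} = n / \sum_i (y_i - \beta x_i)$ should beat nearby values of $\lambda$ in log-likelihood. The simulated data below uses illustrative values, not anything from the problem statement:

```python
import numpy as np

# Check that lam_hat = n / sum(y - beta*x) maximizes the log-likelihood
# for a fixed beta. beta and lam below are illustrative values.
rng = np.random.default_rng(1)
n, beta, lam = 5_000, 2.0, 1.5
x = rng.uniform(0.5, 3.0, size=n)
y = beta * x + rng.exponential(scale=1.0 / lam, size=n)

resid = y - beta * x
lam_hat = n / resid.sum()

def loglik(l):
    # n*ln(lambda) - lambda * sum(y_i - beta*x_i)
    return n * np.log(l) - l * resid.sum()

# lam_hat should not be beaten by nearby values of lambda
assert loglik(lam_hat) >= loglik(lam_hat * 1.01)
assert loglik(lam_hat) >= loglik(lam_hat * 0.99)
print(lam_hat)  # should be near the true lam = 1.5
```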
But I don't know how to estimate $\beta$. When I differentiate the log-likelihood with respect to $\beta$ and set the derivative equal to zero, I get:

$\lambda \sum_{i=1}^n x_i = 0$
Which doesn't make sense to me. Any help on how to tackle the estimation for β or insights into the problem would be greatly appreciated!