If we observe data $\{x_t\}_{t=1}^n$ from the model $X_t \sim \mathrm{Pois}(t\mu)$ independently.
Then how would I derive the maximum likelihood estimate of parameter $\mu$?
So far, I have done this:
Let $$X_t \sim \mathrm{Pois}(\lambda_t), \qquad \lambda_t = t\mu$$
Then $$f(x_t; t\mu) = \frac{(t\mu)^{x_t} \exp(-t\mu)}{x_t!}$$
Thus I determined the Likelihood function to be
$$ L(\mu) = \prod_{t=1}^{n} \frac{(t\mu)^{x_t} \exp(-t\mu)}{x_t!}$$
So my next step would be finding the log-likelihood. However, I am confused about the notation for the log-likelihood function. Would I write it as $l(\mu)$, since $\mu$ is the unknown parameter?
Thus I got $$ l(\mu)= \log (\mu)\sum_{t=1}^n x_t - \mu\sum_{t=1}^n t +c$$ where $c$ collects all terms that do not involve $\mu$.
How would I maximize the log-likelihood to obtain the MLE? Also, would that mean that the expectation of the MLE is just $\mu$? Thank you in advance.
The likelihood function is (dropping terms that do not depend on $\mu$)
$$ L(\mu)=\prod_{i=1}^{n}\frac{\left(i\mu\right)^{x_{i}}e^{-i\mu}}{x_{i}!}\propto \mu^{\sum_{i=1}^{n}x_{i}}e^{-\mu\sum_{i=1}^{n}i} $$
Now take logs:
$$ \log L(\mu)=\log \mu \sum_{i=1}^{n}x_{i}-\mu\sum_{i=1}^{n}i $$
Set the derivative to zero,
$$ \frac{d}{d\mu}\log L(\mu)=\frac{\sum_{i=1}^{n}x_{i}}{\mu}-\sum_{i=1}^{n}i=0, $$
to see that
$$ \hat\mu_{MLE}=\frac{\sum_{i=1}^{n}x_{i}}{\sum_{i=1}^{n}i} $$
And yes: $E\left[\hat\mu_{MLE}\right]=\frac{\sum_{i=1}^{n}i\mu}{\sum_{i=1}^{n}i}=\mu$, so the MLE is unbiased here.

Another way to see it is that if $X_{i}\stackrel{indep}{\sim}Po(\lambda_{i})$ then $T=\sum_{i=1}^{n}X_{i}\sim Po\left(\lambda_{T}\right)$ with $\lambda_{T}=\sum_{i=1}^{n}\lambda_{i}$. In this case $\lambda_{T}=\mu\sum_{i=1}^{n}i$, and since $T$ is a sufficient statistic for $\lambda_{T}$, the MLE of $\lambda_{T}$ is $T$. By invariance of the MLE, divide by $\sum_{i=1}^{n}i$ to get the same estimator as before.
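If you want a quick numerical sanity check of this derivation, here is a stdlib-only Python sketch (the sampler, the grid bounds, and the chosen $\mu$, $n$ are illustrative assumptions, not part of the question): it simulates $X_t \sim \mathrm{Pois}(t\mu)$, computes the closed-form estimate $\sum x_t / \sum t$, and confirms it matches a grid maximization of the log-likelihood.

```python
import math
import random

random.seed(42)

def pois(lam):
    """Knuth's Poisson sampler; fine for the small rates used here."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

mu_true, n = 0.5, 20          # illustrative choices
t = list(range(1, n + 1))
x = [pois(ti * mu_true) for ti in t]  # X_t ~ Pois(t * mu) independently

# Closed-form MLE: sum(x_t) / sum(t)
mu_hat = sum(x) / sum(t)

# Cross-check: maximize log L(mu) = log(mu) * sum(x) - mu * sum(t) + const
def loglik(mu):
    return math.log(mu) * sum(x) - mu * sum(t)

grid = [i / 1000 for i in range(1, 2001)]   # mu in (0, 2]
mu_grid = max(grid, key=loglik)

print(mu_hat, mu_grid)  # the two should agree to grid resolution
```

Since the log-likelihood is concave in $\mu$, the grid maximizer lands on a grid point next to the closed-form solution, and both sit near the true $\mu$ used to simulate.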