I need to find $\frac{\partial\, l(\hat{\lambda}_\beta, \beta)}{\partial \beta}$, where $l(\lambda, \beta)$ is the log-likelihood function for the model $X_i \sim \mathrm{Po}(\lambda_i)$ with $\lambda_i=\lambda \cdot \exp(\beta \cdot i)$ for $i = 1,2,\dots,12$.
Using a formula from my textbook, I can calculate $\hat{\lambda}_\beta$ as $\hat{\lambda}_\beta= \frac{x_\cdot}{\sum\limits_{i=1}^{12} \exp(\beta i)}=\frac{x_\cdot}{m_\cdot}$, where $x_\cdot = \sum_{i=1}^{12} x_i$ is the sum of the observations and $m_\cdot = \sum_{i=1}^{12} \exp(\beta i)$.
So far I have the likelihood function and log-likelihood function as
$L(\lambda, \beta)= \prod\limits_{i=1}^{12} e^{-\lambda_i} \frac{\lambda_i ^{x_i}}{x_i !} = \prod\limits_{i=1}^{12} e^{-\lambda e^{\beta i}} \frac{(\lambda e^{\beta i})^{x_i}}{x_i !}$
$l(\lambda,\beta)= \sum_{i=1}^{12}\left( -\lambda e^{\beta i}+x_i \log(\lambda e^{\beta i})-\log(x_i!) \right)= -\lambda m_\cdot+x_\cdot\log(\lambda)+\beta\sum_{i=1}^{12} i\, x_i-\sum_{i=1}^{12}\log(x_i !)$
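As a sanity check on the textbook formula, $\hat{\lambda}_\beta$ follows from setting the $\lambda$-derivative of this log-likelihood to zero:

$$\frac{\partial l}{\partial \lambda} = -m_\cdot + \frac{x_\cdot}{\lambda} = 0 \quad\Longrightarrow\quad \hat{\lambda}_\beta = \frac{x_\cdot}{m_\cdot}.$$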
From here I'm stuck, as my differentiation always seems to lead to strange results.
EDIT: since the rendered text is small, note that it is $\beta \cdot i$, not $\beta_i$, throughout.
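Writing out what substituting $\hat{\lambda}_\beta = x_\cdot/m_\cdot$ should give (if I'm not mistaken; using $\frac{dm_\cdot}{d\beta} = \sum_{i=1}^{12} i\, e^{\beta i}$), the profile log-likelihood is

$$l(\hat{\lambda}_\beta, \beta) = -x_\cdot + x_\cdot \log\frac{x_\cdot}{m_\cdot} + \beta \sum_{i=1}^{12} i\, x_i - \sum_{i=1}^{12} \log(x_i!),$$

and differentiating with respect to $\beta$,

$$\frac{\partial\, l(\hat{\lambda}_\beta, \beta)}{\partial \beta} = \sum_{i=1}^{12} i\, x_i - x_\cdot\, \frac{\sum_{i=1}^{12} i\, e^{\beta i}}{\sum_{i=1}^{12} e^{\beta i}}.$$

(Only the $-x_\cdot \log m_\cdot$ term depends on $\beta$; the $-x_\cdot$ and $\sum \log(x_i!)$ terms drop out.)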
From your log-likelihood I just took the derivative in the usual way and got $$\beta i = \log(x_i)-\log(\lambda).$$ What is wrong with this derivation?
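A quick numerical check of $\frac{\partial\, l(\hat{\lambda}_\beta, \beta)}{\partial \beta}$ may help: compare an analytic candidate for the derivative against a finite difference of the profile log-likelihood. This is only a sketch; the data are simulated from made-up parameter values, and the evaluation point $\beta = 0.05$ is arbitrary.

```python
import numpy as np

i = np.arange(1, 13)                    # i = 1, ..., 12
rng = np.random.default_rng(0)
x = rng.poisson(2.0 * np.exp(0.1 * i))  # made-up data from the model

def profile_loglik(beta):
    """l(lambda_hat_beta, beta), dropping the beta-free log(x_i!) term."""
    m = np.exp(beta * i).sum()          # m. = sum exp(beta*i)
    lam_hat = x.sum() / m               # lambda_hat_beta = x./m.
    return -lam_hat * m + x.sum() * np.log(lam_hat) + beta * (i * x).sum()

def profile_score(beta):
    """Candidate analytic derivative: sum i*x_i - x. * sum(i e^{bi}) / sum(e^{bi})."""
    w = np.exp(beta * i)
    return (i * x).sum() - x.sum() * (i * w).sum() / w.sum()

beta = 0.05
h = 1e-6
numeric = (profile_loglik(beta + h) - profile_loglik(beta - h)) / (2 * h)
print(numeric, profile_score(beta))     # the two values should agree closely
```

If the central difference and the analytic expression agree to several digits across a range of $\beta$ values, the derivative is almost certainly right.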