I am given $n$ random variables
$ y_1 = \theta_1 \theta_2 + e_1$
$ y_i = \theta_1 + e_i \quad \text{for } i = 2,\dots,n$
where $\theta = [\theta_1 \ \theta_2]$ is the parameter vector and the $e_i$ are independent Gaussian variables, $e_i \sim N(0,1)$.
Find the maximum likelihood estimator $\theta^{ML}$ of $\theta$
To solve it, I would proceed as follows:
$\theta^{ML} = \arg\max_\theta p(y_1,\dots,y_n \mid \theta) = \arg\min_\theta \left[-\log p(y_1,\dots,y_n \mid \theta)\right]$
Since the $e_i$ are independent, each with mean $0$ and variance $1$, the likelihood factorizes as
$p(y_1,\dots,y_n \mid \theta) = \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2}(y_1-\theta_1\theta_2)^2} \cdot \prod_{i=2}^n \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2}(y_i-\theta_1)^2}$
so the negative log-likelihood is
$-\log p(y_1,\dots,y_n \mid \theta) = \frac{n}{2}\log(2\pi)+\frac{1}{2}(y_1-\theta_1\theta_2)^2 + \frac{1}{2}\sum_{i=2}^n(y_i-\theta_1)^2$
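As a small sanity check of the algebra (not part of the exercise), the closed-form negative log-likelihood can be compared against $-\log$ of the product of unit-variance Gaussian densities at an arbitrary test point; the values $n=5$, $\theta_1=1.5$, $\theta_2=-0.5$ below are hypothetical choices for the check.

```python
import math
import random

random.seed(1)
n = 5
theta1, theta2 = 1.5, -0.5          # arbitrary test point, not estimates
y = [random.gauss(0, 1) for _ in range(n)]  # arbitrary data for the check

def norm_pdf(x, mu):
    """Density of N(mu, 1) at x."""
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

# Likelihood as a product of unit-variance Gaussian densities:
# y_1 has mean theta1*theta2, y_2..y_n have mean theta1
lik = norm_pdf(y[0], theta1 * theta2)
for yi in y[1:]:
    lik *= norm_pdf(yi, theta1)

# Closed-form negative log-likelihood from the derivation above
nll = (n / 2) * math.log(2 * math.pi) \
    + 0.5 * (y[0] - theta1 * theta2) ** 2 \
    + 0.5 * sum((yi - theta1) ** 2 for yi in y[1:])

print(abs(-math.log(lik) - nll))  # ~0 up to floating-point error
```

The two quantities agree to floating-point precision, which confirms the expansion of the log.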
Now I need to find the minimum, so I take the first partial derivatives:
$\frac{\partial}{\partial\theta_1}\left[-\log p(y \mid \theta)\right] = -\theta_2(y_1-\theta_1\theta_2) - \sum_{i=2}^n(y_i-\theta_1) = \theta_1\theta_2^2-\theta_2 y_1 +(n-1)\theta_1 -\sum_{i=2}^n y_i$
$\frac{\partial}{\partial\theta_2}\left[-\log p(y \mid \theta)\right] = -\theta_1(y_1-\theta_1\theta_2) = \theta_1^2\theta_2-\theta_1 y_1$
Now, setting both derivatives equal to zero, I can solve for $\theta_1$ and $\theta_2$.
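Solving the stationary conditions: the second equation gives $\theta_1\theta_2 = y_1$ (assuming $\theta_1 \neq 0$), and substituting into the first leaves $(n-1)\theta_1 = \sum_{i=2}^n y_i$, so $\hat\theta_1 = \frac{1}{n-1}\sum_{i=2}^n y_i$ and $\hat\theta_2 = y_1/\hat\theta_1$. A quick simulation can check that this point indeed zeroes the gradient; the true values $\theta_1=2$, $\theta_2=3$ and $n=1000$ below are arbitrary choices for the check, not part of the exercise.

```python
import random

random.seed(0)

# Hypothetical ground-truth parameters for the simulation
theta1_true, theta2_true = 2.0, 3.0
n = 1000

# Simulate the model: y_1 = theta1*theta2 + e_1, y_i = theta1 + e_i
y = [theta1_true * theta2_true + random.gauss(0, 1)]
y += [theta1_true + random.gauss(0, 1) for _ in range(n - 1)]

# Candidate ML estimates from the stationary equations (theta1 != 0)
theta1_hat = sum(y[1:]) / (n - 1)
theta2_hat = y[0] / theta1_hat

# Gradient of the negative log-likelihood at the candidate point,
# using the two derivative expressions derived above
g1 = -theta2_hat * (y[0] - theta1_hat * theta2_hat) \
     - sum(yi - theta1_hat for yi in y[1:])
g2 = -theta1_hat * (y[0] - theta1_hat * theta2_hat)

print(theta1_hat, theta2_hat)  # should be close to 2.0 and 3.0
print(g1, g2)                  # both ~0: a stationary point of the NLL
```

With large $n$, $\hat\theta_1$ concentrates around the true $\theta_1$, while $\hat\theta_2 = y_1/\hat\theta_1$ remains noisy because it depends on the single observation $y_1$.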
I would like to know whether this exercise is solved correctly.