I have a question concerning the proof that "something" is the maximum likelihood estimator. I have no idea how to continue with this problem; so far I have written down the normal distribution, rewritten it as a logarithm, and my idea was then to take the partial derivative to obtain the maximum.
Can someone give me a hint on how to continue? Thanks a lot, and please see the attached file.
Firstly note that $\log \prod_{k=0}^{n-1} \frac1{\sqrt{2\pi\sigma^2}}=\log \left( \left( \frac1{\sqrt{2\pi\sigma^2}}\right)^n\right)=n\cdot \log \left( \frac1{\sqrt{2\pi\sigma^2}}\right)$
But this is not important for the result, since this constant disappears when you differentiate. At your second-to-last line you have
$$\log L=\log\left(\prod\limits_{k=0}^{n-1}\frac1{\sqrt{2\pi \sigma^2}}\cdot e^{-\frac{(x_k-A)^2}{2\sigma^2}}\right)=\log \left( \left( \frac1{\sqrt{2\pi\sigma^2}}\right)^n\cdot e^{-\frac{\sum_{k=0}^{n-1} (x_k-A)^2}{2\sigma^2}}\right)$$
Now use the rule $\log(a\cdot b)=\log(a)+\log(b)$
$$\log L=n\cdot \log \left( \frac1{\sqrt{2\pi\sigma^2}}\right)-\frac{\sum_{k=0}^{n-1} (x_k-A)^2}{2\sigma^2}$$
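As a quick numerical sketch (not part of the original derivation, and assuming the usual normal density with normalizing constant $1/\sqrt{2\pi\sigma^2}$), you can verify that the log of the product equals the expanded sum form:

```python
import numpy as np

# Check numerically: log of the product of normal densities equals
# n*log(1/sqrt(2*pi*sigma^2)) - sum_k (x_k - A)^2 / (2*sigma^2).
rng = np.random.default_rng(0)
n, A, sigma = 8, 2.0, 1.3
x = rng.normal(A, sigma, size=n)  # sample x_0, ..., x_{n-1}

densities = np.exp(-(x - A) ** 2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
lhs = np.log(np.prod(densities))                       # log of the product
rhs = n * np.log(1 / np.sqrt(2 * np.pi * sigma**2)) \
    - np.sum((x - A) ** 2) / (2 * sigma**2)            # expanded form

assert np.isclose(lhs, rhs)
```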
Now differentiate with respect to $A$ and set the derivative to zero:
$$\frac{d \log L}{dA}=\frac1{\sigma^2}\cdot \sum_{k=0}^{n-1} (x_k-A)=0$$
The constant factor $\frac1{\sigma^2}$ can be omitted.
$$\sum_{k=0}^{n-1} (x_k-A)=0$$
$$\sum\limits_{k=0}^{n-1} x_k-\sum_{k=0}^{n-1}A=0$$
$$\sum_{k=0}^{n-1} x_k-n\cdot A=0$$
$$\sum_{k=0}^{n-1} x_k=n\cdot A$$
Now solve for $A$:
$$A=\frac1n\sum_{k=0}^{n-1} x_k=\bar x,$$
the sample mean.
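To convince yourself of the result, here is an illustrative numerical check (my own sketch, not from the original answer): scanning candidate values of $A$ on a grid, the log-likelihood peaks at the sample mean.

```python
import numpy as np

# Simulate data from N(A_true, sigma^2), then find the A that maximizes
# the log-likelihood over a fine grid of candidates.
rng = np.random.default_rng(42)
sigma = 1.5
x = rng.normal(loc=3.0, scale=sigma, size=200)

def log_likelihood(A):
    n = len(x)
    return n * np.log(1 / np.sqrt(2 * np.pi * sigma**2)) \
        - np.sum((x - A) ** 2) / (2 * sigma**2)

# Grid search: the maximizer should coincide with x.mean().
grid = np.linspace(x.mean() - 2, x.mean() + 2, 10001)
best_A = grid[np.argmax([log_likelihood(a) for a in grid])]

assert abs(best_A - x.mean()) < 1e-3
```

The grid spacing here is about $4\times 10^{-4}$, so the grid maximizer agrees with $\bar x$ to within that resolution.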