We know the pdf is $p(x)=\frac{1+\theta x}{2}$, and the first step is to find the likelihood $L(x)$. I've started with:
- $L(x) = \prod_{i=1}^n\frac{1+\theta x_i}{2}$
- $L(x) = \frac{1}{2^n}\prod_{i=1}^n(1+\theta x_i)$
And there I am stuck: I don't know how to simplify the product so that I can apply the log for $\ell(x)=\log(L(x))$, and from there take the first derivative, which I already know I won't be able to set equal to $0$ and solve analytically.
This is an example given in class that ends in Newton-Raphson for exactly that reason.
Since the teacher did not work it through to the first derivative, I wanted to try it myself, but I am stuck.
$L(x_1,\ldots,x_n;\theta) = \prod_{i=1}^n\frac{1+\theta x_i}{2} = \left(\frac{1}{2}\right)^n\prod_{i=1}^n(1+\theta x_i)$
We only care about terms involving $\theta$, so we can ignore the constant factor; the log-likelihood is then proportional to
$$ \ell(x_1,\ldots,x_n;\theta)\propto \log\left(\prod_{i=1}^n (1+\theta x_i)\right)=\sum_{i=1}^n\log(1+\theta x_i)$$ With derivative: $$ \frac{\partial \ell}{\partial \theta} = \sum_{i=1}^n\frac{x_i}{1+\theta x_i}$$ Setting this equal to $0$ gives an equation with no closed-form solution in general, so we need a numerical method such as Newton-Raphson, where you can use a moment estimator to give an initial value.
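Newton-Raphson also needs the second derivative, $\ell''(\theta) = -\sum_{i=1}^n \frac{x_i^2}{(1+\theta x_i)^2}$. Here is a minimal sketch of the iteration in Python, assuming the support is $[-1,1]$ (the usual version of this example), so that $E[X]=\theta/3$ and the moment estimator $\hat\theta_0 = 3\bar{x}$ serves as the starting value:

```python
def score(theta, xs):
    # l'(theta) = sum_i x_i / (1 + theta * x_i)
    return sum(x / (1 + theta * x) for x in xs)

def score_deriv(theta, xs):
    # l''(theta) = -sum_i x_i^2 / (1 + theta * x_i)^2
    return -sum(x * x / (1 + theta * x) ** 2 for x in xs)

def newton_raphson_mle(xs, tol=1e-10, max_iter=100):
    # Initial value from the moment estimator: E[X] = theta/3
    # (assumes support [-1, 1]), so theta_0 = 3 * sample mean.
    theta = 3 * sum(xs) / len(xs)
    for _ in range(max_iter):
        # Newton update: theta <- theta - l'(theta) / l''(theta)
        step = score(theta, xs) / score_deriv(theta, xs)
        theta -= step
        if abs(step) < tol:
            break
    return theta
```

Since $\ell''(\theta) < 0$ everywhere, the log-likelihood is concave, so Newton-Raphson converges quickly from a reasonable start; in practice you may also want to clamp the iterates to the valid parameter range so $1+\theta x_i$ stays positive.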