Solving a cubic equation for maximum likelihood estimation


In maximum likelihood (ML) estimation of parameters, suppose the distribution of $x$ is given by $$f(x;\theta) = \frac{\theta}{(x+\theta)^2}, \qquad 0<x<\infty.$$ The likelihood function is then $$L(\theta) = \prod_{j=1}^{n} \frac{\theta}{(x_j+\theta)^2}.$$

When solving for $n=3$, I got the following equation (if I'm not wrong): $$3\theta^3+\theta^2(x_1+x_2+x_3)-\theta(x_1x_2+x_2x_3+x_1x_3)-3x_1x_2x_3 = 0$$

Any suggestions on how to solve this equation for $\theta$?


There are 2 best solutions below

Answer 1

We want to maximize the log-likelihood $$\sum_{j=1}^{n} \log\frac{\theta}{(x_j+\theta)^2}= n\log\theta - 2\sum_{j=1}^n \log (x_j+\theta).$$

Differentiating with respect to $\theta$ and setting the derivative to zero gives

$$\frac{n}{\theta}=\sum_{j=1}^n\frac{2}{x_j+\theta},$$

or equivalently

$$\frac{n}{2}=\sum_{j=1}^n\frac{1}{\frac{x_j}{\theta}+1}.$$

From here I would resort to a numerical method.
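As an illustration of that numerical step, here is a minimal sketch (in Python; the helper names and the sample `x = [1.0, 2.0, 3.0]` are made up for the example, not from the original post). It bisects the score function $g(\theta) = n/\theta - \sum_j 2/(x_j+\theta)$, which is positive as $\theta \to 0^+$ and negative for large $\theta$, so a sign change is guaranteed to exist:

```python
# Solve the score equation  n/theta = sum_j 2/(x_j + theta)  by bisection.

def score(theta, x):
    """g(theta) = n/theta - sum_j 2/(x_j + theta); the MLE is a root of g."""
    n = len(x)
    return n / theta - sum(2.0 / (xj + theta) for xj in x)

def mle_theta(x, tol=1e-12):
    lo = 1e-12                     # g(lo) > 0: n/theta dominates near 0
    hi = 1.0
    while score(hi, x) > 0:       # double hi until the root is bracketed
        hi *= 2.0
    while hi - lo > tol:          # plain bisection on the bracket
        mid = 0.5 * (lo + hi)
        if score(mid, x) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

x = [1.0, 2.0, 3.0]               # example data, n = 3
theta_hat = mle_theta(x)          # approx. 1.82 for this sample
```

Bisection is slow but robust here; Newton's method on $g$ would converge faster once a reasonable starting point is found.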

Answer 2

Short of numerics, you have little option but the general Cardano formulas. Note the sign of the first-degree term: clearing denominators in $\frac{3}{\theta}=\sum_{j}\frac{2}{x_j+\theta}$ gives $$3\theta^3+(x_1+x_2+x_3)\theta^2-(x_1x_2+x_2x_3+x_1x_3)\theta-3x_1x_2x_3=0.$$ Writing this in $\lambda=1/\theta$ yields another cubic,

$$\lambda^3+\frac{x_1x_2+x_2x_3+x_1x_3}{3x_1x_2x_3}\lambda^2-\frac{x_1+x_2+x_3}{3x_1x_2x_3}\lambda-\frac{1}{x_1x_2x_3}=0,$$ and in either variable the usual shift $\lambda\mapsto\lambda-\frac{b}{3}$ (with $b$ the coefficient of the quadratic term) brings the equation to the standard depressed form $$\lambda^3+p\lambda+q=0,$$ to which Cardano's formulas apply.
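For completeness, the Cardano route can be sketched in code (a hedged Python sketch, not part of the original answer; the function name `cardano` and the sample $x=(1,2,3)$, whose likelihood cubic is $3\theta^3+6\theta^2-11\theta-18=0$, are made up for the illustration). Complex arithmetic transparently handles the casus irreducibilis, where all three roots are real:

```python
# Cardano's formula for a general cubic  a*t^3 + b*t^2 + c*t + d = 0.
import cmath

def cardano(a, b, c, d):
    # Depress the cubic: t = s - b/(3a) gives  s^3 + p*s + q = 0.
    p = (3*a*c - b*b) / (3*a*a)
    q = (2*b**3 - 9*a*b*c + 27*a*a*d) / (27*a**3)
    disc = (q/2)**2 + (p/3)**3
    u = (-q/2 + cmath.sqrt(disc)) ** (1/3)   # a complex cube root of -q/2 + sqrt(disc)
    w = complex(-0.5, 3**0.5 / 2)            # primitive cube root of unity
    roots = []
    for k in range(3):                       # the three cube roots u, uw, uw^2
        uk = u * w**k
        s = uk - p / (3*uk)                  # assumes p, q not both zero (uk != 0)
        roots.append(s - b / (3*a))          # undo the depressing shift
    return roots

# Likelihood cubic for the sample x = (1, 2, 3): 3 th^3 + 6 th^2 - 11 th - 18 = 0
roots = cardano(3, 6, -11, -18)
theta_hat = max(r.real for r in roots if abs(r.imag) < 1e-9)
```

The MLE is the positive real root, here $\hat\theta \approx 1.82$; the other two real roots are negative and lie outside the parameter space $\theta > 0$.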