
Apologies for the formatting. I didn't really know how to format this.

Let $Y_1,\ldots, Y_n$ be independent and identically distributed (i.i.d.) random variables with distribution function: $$P(Y_i \le y \mid \alpha,\beta) = \begin{cases}0 & \text{if } y < 0 \\ (y/\beta)^\alpha & \text{if } 0 \le y \le \beta \\ 1 & \text{if } y > \beta\end{cases} $$

where the parameters $\alpha$ and $\beta$ are positive.

Find the Maximum Likelihood Estimators (MLEs) of $\alpha$ and $\beta$.

I integrated the original function to get $y^{\alpha+1}/[\beta^\alpha(\alpha+1)]$. I take the likelihood function and then take its logarithm to get

$-n\alpha\ln(\beta) - n\ln(\alpha+1) + (\alpha+1) \sum_{i=1}^n \ln(y_i)$

I then take the partial derivatives with respect to $\alpha$ and $\beta$. When I differentiate with respect to $\beta$, I get $0=-\alpha n/\beta$, and I do not know what to do with this, if anything can be done at all.

Similarly, when I take the partial derivative with respect to $\alpha$, I run into the same problem and do not know how to proceed. I would appreciate it if someone could show me how to get to the solution.


As hinted in the comments, to compute the density, you have to differentiate the distribution function: $$f_i(y\mid\alpha,\beta) := \partial_y \mathbb P(Y_i \le y \mid \alpha,\beta) =\alpha \frac{ y^{\alpha-1}}{\beta^\alpha} \mathbf{1}_{0\le y \le \beta} $$ Now, the likelihood is the product of the densities evaluated at each observation, and, as is common, we consider the log-likelihood instead, which converts the product into a sum: $$ \begin{align}\mathcal L(\alpha,\beta) &= \ln\left(\prod_{i=1}^n f_i(Y_i\mid\alpha,\beta)\right) \\ &= \ln\left(\prod_{i=1}^n \alpha \frac{ Y_i^{\alpha-1}}{\beta^\alpha} \mathbf{1}_{0\le Y_i \le \beta}\right) \\ &= n\ln\left(\frac{\alpha}{\beta^\alpha}\right) + (\alpha - 1)\sum_{i=1}^n \ln\left(Y_i \mathbf{1}_{0\le Y_i \le \beta}\right) \end{align} $$
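As a numerical sanity check (not part of the original answer), the log-likelihood above can be evaluated directly; `loglik` is a hypothetical helper name, and the sample values below are made up for illustration:

```python
import math

def loglik(alpha, beta, ys):
    """Log-likelihood of the density f(y) = alpha * y**(alpha-1) / beta**alpha
    on [0, beta]; any observation outside the support forces the value to -inf,
    mirroring the indicator inside the logarithm."""
    if any(y < 0 or y > beta for y in ys):
        return float("-inf")
    n = len(ys)
    return n * math.log(alpha / beta**alpha) + (alpha - 1) * sum(math.log(y) for y in ys)
```

For $\alpha = \beta = 1$ the density is uniform on $[0,1]$, so the log-likelihood of any in-support sample is $0$, which gives a quick consistency check.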

The MLEs of $\alpha$ and $\beta$ are the values $\hat \alpha$ and $\hat \beta$ which maximize $\mathcal L(\alpha, \beta)$.

  • For $\beta$: for the log-likelihood not to equal $- \infty$, we necessarily need $\hat \beta \ge \max\{Y_1,\ldots,Y_n\}$. Since $\partial_\beta \mathcal L(\alpha,\beta) = -n\alpha/\beta < 0$, the log-likelihood is strictly decreasing in $\beta$, so $$\hat \beta = \max\{Y_1,\ldots,Y_n\}$$ (the minimal admissible value) is the MLE of $\beta$.

  • For $\alpha$: $\partial_\alpha \mathcal L(\alpha, \beta) = \frac{n}{\alpha} - n\ln(\beta) + \sum_{i=1}^n \ln(Y_i \mathbf{1}_{0\le Y_i\le \beta})$
    Setting this derivative to zero at $\beta = \hat\beta$ (where every indicator equals $1$) gives $\frac{n}{\alpha} = n\ln(\hat\beta) - \sum_{i=1}^n \ln(Y_i)$, hence $$\hat \alpha = \frac{n}{\ln\left(\frac{{\hat \beta}^n}{\prod_{i=1}^n Y_i}\right)}$$
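The two closed-form estimators can be checked by simulation. Inverting the distribution function gives the sampler $Y = \beta\, U^{1/\alpha}$ with $U$ uniform on $(0,1)$; the function names below are illustrative, not from the original post:

```python
import math
import random

def sample(alpha, beta, n, rng):
    # Inverse CDF: P(Y <= y) = (y/beta)**alpha  =>  y = beta * u**(1/alpha)
    return [beta * rng.random() ** (1.0 / alpha) for _ in range(n)]

def mle(ys):
    # Closed-form MLEs derived above:
    # beta_hat is the sample maximum (minimal admissible beta),
    # alpha_hat = n / ln(beta_hat**n / prod(Y_i)).
    n = len(ys)
    beta_hat = max(ys)
    alpha_hat = n / (n * math.log(beta_hat) - sum(math.log(y) for y in ys))
    return alpha_hat, beta_hat

rng = random.Random(0)
ys = sample(alpha=2.0, beta=3.0, n=100_000, rng=rng)
a_hat, b_hat = mle(ys)
```

With a large sample, `a_hat` and `b_hat` should land close to the true values $2$ and $3$; note that $\hat\beta \le \beta$ always, since the sample maximum cannot exceed the support's endpoint.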