Find joint likelihood function of observations $x_1, x_2, \ldots, x_n$ and $y_1, y_2, \ldots, y_m$


Let $x_1,\ldots,x_n$ be observations from a normal distribution with mean $0$ and standard deviation $s_1$.

Similarly, let $y_1,\ldots,y_m$ be observations from a normal distribution with mean $0$ and standard deviation $s_2$.

Find the combined likelihood.

I've written down the joint likelihood of the $x$ observations multiplied by the joint likelihood of the $y$ observations, but I don't think this is correct. Many thanks.

That is correct if $(x_1,\ldots,x_n)$ is independent of $(y_1,\ldots,y_m)$.

For the following to be valid, one must assume all $n+m$ observations are independent. One has \begin{align} L(s_1,s_2) & = \prod_{i=1}^n \left(\frac1{\sqrt{2\pi}\,s_1} \exp\left(\frac{-1}2 \left( \frac{x_i}{s_1} \right)^2 \right)\right) \prod_{j=1}^m \left(\frac1{\sqrt{2\pi}\,s_2} \exp\left(\frac{-1}2 \left( \frac{y_j}{s_2} \right)^2 \right)\right) \\[10pt] & = \frac 1 {(2\pi)^{(n+m)/2} s_1^n s_2^m} \exp\left( \frac{-1}2 \left( \frac 1 {s_1^2} \sum_{i=1}^n x_i^2 + \frac 1 {s_2^2} \sum_{j=1}^m y_j^2 \right) \right), \end{align} so $$ \ell(s_1,s_2) = -n\log s_1 - m \log s_2 -\frac 1 2\left( \frac A{s_1^2} + \frac B {s_2^2} \right) + \text{constant} $$ where $A$ and $B$ are of course the sums of squares of the $x$s and $y$s.
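As a sanity check on the factorization above, here is a short numerical sketch (assuming NumPy is available; the simulated data and function names are mine, not from the question): it evaluates the log-likelihood directly as a sum of normal log-densities and via the closed form, and the two agree.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 20, 30
s1, s2 = 1.5, 2.0
x = rng.normal(0.0, s1, size=n)
y = rng.normal(0.0, s2, size=m)

def log_lik_direct(s1, s2, x, y):
    """Sum of log N(0, s^2) densities over both samples."""
    def logpdf(z, s):
        return -0.5 * np.log(2 * np.pi) - np.log(s) - 0.5 * (z / s) ** 2
    return logpdf(x, s1).sum() + logpdf(y, s2).sum()

def log_lik_closed(s1, s2, x, y):
    """Closed form, including the -(n+m)/2 log(2 pi) constant."""
    n, m = len(x), len(y)
    A, B = np.sum(x**2), np.sum(y**2)
    return (-(n + m) / 2 * np.log(2 * np.pi)
            - n * np.log(s1) - m * np.log(s2)
            - 0.5 * (A / s1**2 + B / s2**2))

assert np.isclose(log_lik_direct(s1, s2, x, y), log_lik_closed(s1, s2, x, y))
```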

PS in response to comments: Under the null hypothesis of equal variances, the log-likelihood is $$ \ell(s) = -(n+m)\log s -\frac 1 2 \left( \frac{A+B}{s^2} \right). $$ Then we have $$ \ell'(s) = \frac{-(n+m)}s + \frac{A+B}{s^3} = \frac{A+B-s^2(n+m)}{s^3} \begin{cases} >0 & \text{if }s^2<\frac{A+B}{n+m}, \\[10pt] = 0 & \text{if }s^2=\frac{A+B}{n+m}, \\[10pt] <0 & \text{if }s^2>\frac{A+B}{n+m}. \end{cases} $$ Therefore the MLE under the null hypothesis is $\widehat{s}= \sqrt{\frac{A+B}{n+m}}$. We have $$ \ell(\widehat{s}) = -(n+m)\log\widehat{s} - \frac 1 2 \frac{A+B}{\widehat{s}^2} = -\frac{n+m}2 \log\frac{A+B}{n+m} - \frac 1 2 \frac{A+B}{\left(\frac{A+B}{n+m}\right)}. \tag 1 $$ Under the alternative hypothesis we have $\widehat{s}_1=\sqrt{A/n}$ and $\widehat{s}_2=\sqrt{B/m}$, and then \begin{align} \ell(\widehat{s}_1,\widehat{s}_2) & = -n\log\widehat{s}_1-m\log\widehat{s}_2 -\frac 1 2 \left( \frac{A}{\widehat{s}_1^2} + \frac{B}{\widehat{s}_2^2}\right) \\[10pt] & = -\frac n 2 \log\frac A n - \frac m 2 \log \frac B m - \frac 1 2\left(\frac A {A/n} + \frac B {B/m} \right) . \tag 2 \end{align} Subtracting $(2)$ from $(1)$ we get the logarithm of the likelihood ratio: $$ -\frac{n+m} 2 \log\frac{A+B}{n+m} + \frac n 2 \log\frac A n + \frac m 2 \log \frac B m + \text{terms not depending on $A$ and $B$}. $$ It is easy (if a bit tedious) to show that this is a monotone function of $$ \frac{A/n}{B/m} $$ and that this ratio has an F-distribution with $n$ and $m$ degrees of freedom: under the null, $A/s^2\sim\chi^2_n$ and $B/s^2\sim\chi^2_m$ independently, and since the means are known to be $0$ no degrees of freedom are lost. You reject if this is either too big or too small.
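The test above can be sketched numerically as follows (assuming NumPy and SciPy; the simulated data, level $\alpha=0.05$, and variable names are mine): it computes the pooled and separate MLEs, checks numerically that $\widehat{s}$ maximizes the null log-likelihood, and forms the two-sided F-test.

```python
import numpy as np
from scipy.stats import f

rng = np.random.default_rng(1)
n, m = 25, 40
# Simulated data with equal standard deviations, so the null holds
x = rng.normal(0.0, 1.0, size=n)
y = rng.normal(0.0, 1.0, size=m)

A, B = np.sum(x**2), np.sum(y**2)

# MLEs: pooled scale under the null, separate scales under the alternative
s_hat = np.sqrt((A + B) / (n + m))
s1_hat, s2_hat = np.sqrt(A / n), np.sqrt(B / m)

# Numerical check that s_hat maximizes the null log-likelihood
def ell(s):
    return -(n + m) * np.log(s) - 0.5 * (A + B) / s**2

assert ell(s_hat) >= max(ell(0.9 * s_hat), ell(1.1 * s_hat))

# Test statistic: under the null, (A/n)/(B/m) ~ F(n, m)
# (means are known to be 0, so no degrees of freedom are lost)
F = (A / n) / (B / m)

# Two-sided rejection region at level alpha
alpha = 0.05
lo, hi = f.ppf(alpha / 2, n, m), f.ppf(1 - alpha / 2, n, m)
reject = bool(F < lo or F > hi)
```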