Finding asymptotic confidence interval if variances are unknown


For two normal distributions with common mean, $\mathrm{N}(\mu, \sigma_1^2)$ and $\mathrm{N}(\mu, \sigma_2^2)$, let $X_1 ,...,X_m$ and $Y_1, ..., Y_n$ be two independent random samples from the distributions. Here $\sigma_1^2, \sigma_2^2>0$ and $-\infty<\mu<\infty$.

If the two variances $\sigma_1^2,\sigma_2^2$ are unknown, give an estimator for $\mu$ and an asymptotic confidence interval for $\mu$ based on that estimator. Here $m,n \to \infty$ with $\dfrac{m}{m+n} \to \lambda$, where $\lambda$ is a known constant such that $0<\lambda<1$.
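For what it's worth, one natural candidate (my own guess, not necessarily the intended answer) is the precision-weighted combination of the two sample means, with the sample variances plugged in for the unknown $\sigma_1^2,\sigma_2^2$. A quick simulation sketch, with all numeric parameters chosen arbitrarily for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical simulation parameters (not from the problem statement)
mu_true, sigma1, sigma2, m, n = 2.0, 1.0, 3.0, 500, 800

X = rng.normal(mu_true, sigma1, m)
Y = rng.normal(mu_true, sigma2, n)

# precision-weighted combination of the two sample means,
# with sample variances substituted for the unknown variances
w1 = m / X.var(ddof=1)
w2 = n / Y.var(ddof=1)
mu_hat = (w1 * X.mean() + w2 * Y.mean()) / (w1 + w2)

# plug-in estimate of the asymptotic standard error and a 95% CI
se = 1.0 / np.sqrt(w1 + w2)
ci = (mu_hat - 1.96 * se, mu_hat + 1.96 * se)
print(mu_hat, ci)
```

By Slutsky's theorem, replacing the true variances by consistent estimators should not change the limiting distribution, which is why the plug-in interval is plausible; whether this estimator is the one the exercise has in mind, I don't know.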

My attempt: I first tried to find the MLE of $(\mu, \sigma_1^2, \sigma_2^2)$ by maximizing the log-likelihood $l(\mu, \sigma_1^2 , \sigma_2^2)$, i.e. solving the likelihood equation $\nabla l=0$, but this requires solving a cubic equation in $\mu$, so I gave up. Also, finding the MLE via properties of exponential families doesn't work here: this is a curved exponential family, and the parameter space is not open in $\mathbb{R}^4$.
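To spell out where the cubic comes from: profiling out the variances gives $\hat\sigma_1^2(\mu)=\frac{1}{m}\sum_i(X_i-\mu)^2$ and $\hat\sigma_2^2(\mu)=\frac{1}{n}\sum_j(Y_j-\mu)^2$, and substituting these into the score equation $\frac{m(\bar X-\mu)}{\sigma_1^2}+\frac{n(\bar Y-\mu)}{\sigma_2^2}=0$ and clearing denominators yields a polynomial of degree 3 in $\mu$. A small symbolic check (sample values are arbitrary, chosen only to exhibit the degree):

```python
import sympy as sp

mu = sp.symbols('mu')
# arbitrary small samples, just to exhibit the degree of the equation
X = [sp.Rational(1), sp.Rational(2), sp.Rational(3)]
Y = [sp.Rational(1, 2), sp.Rational(3, 2)]
m, n = len(X), len(Y)

# profile MLEs of the two variances for a fixed mu
s1 = sum((x - mu)**2 for x in X) / m
s2 = sum((y - mu)**2 for y in Y) / n

xbar = sum(X) / m
ybar = sum(Y) / n

# profiled score in mu, with denominators cleared:
#   m*(xbar - mu)*s2 + n*(ybar - mu)*s1 = 0
num = sp.expand(m * (xbar - mu) * s2 + n * (ybar - mu) * s1)
deg = sp.degree(sp.Poly(num, mu))
print(deg)  # 3
```

The leading coefficient is $-(m+n)\neq 0$, so the profiled likelihood equation is genuinely cubic in $\mu$, confirming why a closed-form MLE is awkward here.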

Does anyone have ideas? Any hints or advice would help a lot!