"Amount of information about parameters in random variables" means what?


I can't figure out what the following question is asking, nor what an answer would look like.

Given two i.i.d. normal random variables $X_1$ and $X_2$ with mean $\mu$ and variance $\sigma^2$: "find the amount of information about $\mu$ and $\sigma^2$ in $X_1 + X_2$ and $X_1 - X_2$."

I have shown that $U = X_1 + X_2$ and $V = X_1 - X_2$ are independent normal random variables with $U \sim N(2\mu, 2\sigma^2)$ and $V \sim N(0, 2\sigma^2)$, if that helps.
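(For what it's worth, the computation I used is the standard one: $U$ and $V$ are jointly normal with

$$\operatorname{Cov}(U, V) = \operatorname{Cov}(X_1 + X_2,\ X_1 - X_2) = \operatorname{Var}(X_1) - \operatorname{Var}(X_2) = 0,$$

and jointly normal plus uncorrelated implies independent.)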

I know the Fisher information formula,

$$I(\theta) = E\!\left[\left(\frac{\partial}{\partial \theta}\log f(X \mid \theta)\right)^{2}\right],$$

which I think I could calculate, though its meaning is still clear as mud to me.
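If it is Fisher information they mean, here is roughly what I'd compute for $\mu$ using $U$ (assuming the usual definition above): since $U \sim N(2\mu, 2\sigma^2)$,

$$\log f_U(u) = -\frac{(u - 2\mu)^2}{4\sigma^2} + \text{const}, \qquad \frac{\partial}{\partial \mu}\log f_U(u) = \frac{u - 2\mu}{\sigma^2},$$

so

$$I_U(\mu) = E\!\left[\left(\frac{U - 2\mu}{\sigma^2}\right)^{2}\right] = \frac{\operatorname{Var}(U)}{\sigma^4} = \frac{2\sigma^2}{\sigma^4} = \frac{2}{\sigma^2},$$

which equals the Fisher information about $\mu$ in the full sample $(X_1, X_2)$, namely $2 \cdot \tfrac{1}{\sigma^2}$; meanwhile the density of $V$ does not depend on $\mu$ at all, so $I_V(\mu) = 0$.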

Is that what they want? Or are they talking about something else?

Also, if anyone can point me to a textbook that describes this explicitly, not just a footnote or a paragraph or a hand wave, I would be grateful.

1 Answer


I might be wrong, but I believe it is a very clumsy way of asking you to show that $\mu$ and $\sigma^2$ can be deduced from $(X_1+X_2, X_1-X_2)$; hence this transformation of $(X_1, X_2)$ induces no loss of information about the determination of their mean and variance.
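To make that concrete (my reading of the intent, not something stated in the question): the transformation is invertible, since

$$X_1 = \frac{(X_1+X_2) + (X_1-X_2)}{2}, \qquad X_2 = \frac{(X_1+X_2) - (X_1-X_2)}{2},$$

so $(X_1+X_2, X_1-X_2)$ determines $(X_1, X_2)$ exactly, and any inference about $\mu$ and $\sigma^2$ that could be drawn from the original pair can equally be drawn from the transformed pair.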