A problem in probability theory


Here is the problem:


$g(t)$ is a monotonically decreasing, non-negative, continuous function defined on $[0,\infty)$ satisfying
$$\int_{0}^{\infty} g(t) \,{\rm d}t=a,\qquad \int_{0}^{\infty} t\,g(t) \,{\rm d}t=b,\qquad 0<a,b<\infty.$$
Suppose the joint density function of the random variables $X$ and $Y$ is $p(x,y)=g(x^2+y^2)$, $-\infty<x,y<\infty$.
Assuming that $X$ and $Y$ are independent, prove that $X$ and $Y$ are both normally distributed.
I have calculated the value of $a$, the expectation and variance of $X$ and $Y$, and the covariance of $X$ and $Y$, but I have no idea how to prove that $X$ and $Y$ are both normal r.v.'s.
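As a numerical sanity check on those quantities (my own illustration, not part of the problem), one can take the concrete choice $g(t)=e^{-t}/\pi$, for which $p(x,y)=\frac{1}{\pi}e^{-(x^2+y^2)}$ factors into two $N(0,\tfrac12)$ densities. In polar coordinates the normalization gives $\pi a = 1$ and $\operatorname{Var}(X)=\mathbb{E}[X^2]=\pi b/2$, which the sketch below verifies with `scipy`:

```python
import numpy as np
from scipy import integrate

# Concrete example: g(t) = (1/pi) * exp(-t), so that
# p(x, y) = (1/pi) * exp(-(x^2 + y^2)) is a product of two N(0, 1/2) densities.
g = lambda t: np.exp(-t) / np.pi

# a = int_0^inf g(t) dt  and  b = int_0^inf t g(t) dt
a, _ = integrate.quad(g, 0, np.inf)
b, _ = integrate.quad(lambda t: t * g(t), 0, np.inf)

# Normalization: integral of g(x^2+y^2) over the plane equals pi*a (polar
# coordinates), and must equal 1 for a density.
total, _ = integrate.dblquad(lambda y, x: g(x**2 + y**2),
                             -np.inf, np.inf, -np.inf, np.inf)

# Var(X) = E[X^2] = integral of x^2 g(x^2+y^2), which equals pi*b/2.
var_x, _ = integrate.dblquad(lambda y, x: x**2 * g(x**2 + y**2),
                             -np.inf, np.inf, -np.inf, np.inf)

# The marginal density of X should match the N(0, pi*b/2) density.
sigma2 = np.pi * b / 2
xs = (0.0, 0.7, 1.5)
marginals, normals = [], []
for x in xs:
    m, _ = integrate.quad(lambda y: g(x**2 + y**2), -np.inf, np.inf)
    marginals.append(m)
    normals.append(np.exp(-x**2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2))

print(np.pi * a, total)      # normalization checks
print(var_x, np.pi * b / 2)  # variance identity
print(marginals, normals)    # marginal vs. normal density
```

Of course this only checks one admissible $g$; the point of the problem is that rotational symmetry plus independence forces this Gaussian form for every admissible $g$.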
It occurs to me that there is the following proposition:
Suppose the r.v.'s $\xi_1,\dots,\xi_n$ are independent. If for every orthogonal matrix $\boldsymbol U$ the random vector $\boldsymbol \xi=(\xi_1,\dots,\xi_n)'$ has the same distribution as $\boldsymbol {U\xi}$, then $\xi_k\sim N(0,\sigma^2)$, $k=1,\dots,n$.
This proposition seems to apply to this problem, so all that remains is to prove it.
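Here is a sketch of how the proof might go in the case $n=2$ (my own reconstruction of the standard characteristic-function argument, so the details should be checked). Let $\varphi_k(t)=\mathbb{E}\,e^{it\xi_k}$. By independence the joint characteristic function is $\varphi_1(t_1)\varphi_2(t_2)$, and the hypothesis $\boldsymbol{U\xi}\overset{d}{=}\boldsymbol\xi$ for every orthogonal $\boldsymbol U$ makes it invariant under rotations of $(t_1,t_2)$, so it depends only on $t_1^2+t_2^2$:
$$\varphi_1(t_1)\,\varphi_2(t_2)=h(t_1^2+t_2^2).$$
Setting $t_2=0$ gives $\varphi_1(t)=h(t^2)$, and likewise $\varphi_2(t)=h(t^2)$ (note this forces $\varphi_k(-t)=\varphi_k(t)=\overline{\varphi_k(t)}$, so the $\varphi_k$ are real). Substituting back yields the Cauchy-type functional equation
$$h(u)\,h(v)=h(u+v),\qquad u,v\ge 0,$$
with $h$ continuous and $h(0)=1$. Since $h(u)=h(u/2)^2\ge 0$, and a zero of $h$ at some $u_0$ would propagate to zeros at $u_0/2^n\to 0$, contradicting continuity at $0$, we get $h>0$ and hence $h(u)=e^{cu}$. Because $|\varphi_k|\le 1$, necessarily $c\le 0$; writing $c=-\sigma^2/2$ gives $\varphi_k(t)=e^{-\sigma^2 t^2/2}$, the characteristic function of $N(0,\sigma^2)$ (degenerating to the point mass at $0$ when $\sigma^2=0$). The general $n$ should follow the same way, since setting all but one coordinate of $\boldsymbol t$ to zero again identifies every $\varphi_k$ with the same $h$.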