Random variables expressed in polar coordinates: exercise


Consider $(X,Y)$ uniformly distributed over the disk with center $(0,0)$ and radius $m$. Let $(R,\Theta)$ be random variables with $R\in \mathbb{R}^+$, $\Theta\in[0,2\pi)$ such that $$ \begin{cases} X = R\cos(\Theta)\\ Y=R\sin(\Theta)\end{cases}.$$ Find the joint pdf of $(R,\Theta)$. Are $R$ and $\Theta$ independent? Calculate $\operatorname{Var}(X)$.

My attempt:

Solving for $(R,\Theta)$ gives $\begin{cases} R=\sqrt{X^2+Y^2} \\ \Theta=\operatorname{arctan}(Y/X)\end{cases}$. Thus $$ f_{R,\Theta}(r,\theta)=f_{X,Y}\left(\sqrt{x^2+y^2},\operatorname{arctan}\left(\frac y x\right)\right)\cdot \left| \det\begin{pmatrix} \frac{x}{\sqrt{ x^2+y^2}} & \frac{y}{\sqrt{x^2+y^2}} \\\frac{-y}{x^2+y^2} & \frac x {x^2+y^2}\end{pmatrix}\right|.$$

  • The last factor is equal to $\frac{1}{\sqrt{x^2+y^2}} = \frac 1r $.

  • The first one is equal to $\frac 1{\pi m^2}$ if $x^2+y^2+\operatorname{arctan}^2(y/x)\le m^2 \iff r^2+\theta^2\le m.$

This results in $$ f_{R,\Theta}(r,\theta)=\frac{1}{\pi r m^2}I(r^2+\theta^2\le m).$$

To check independence, we compute the marginal distributions:

$$ f_R(r)=\frac{2\sqrt{m-r^2}}{\pi r m^2}I(r^2\le m).$$ $$ f_{\Theta}(\theta)=\frac 1{\pi m^2}[\ln\sqrt{m-\theta^2}-\ln(-\sqrt{m-\theta^2})]I(\theta^2\le m).$$

Thus $R$ and $\Theta$ are dependent.

For the calculation of $\operatorname{Var}(X)$, I would just calculate $E[X^2]$ and $E[X]$. The problem is that we need information about $X$ alone here. Do I use $X=R\cos(\Theta)$ and the joint pdf of $(R,\Theta)$ to find the expectations?

Also, can we deduce whether or not $X$ and $Y$ are independent? I know that $f_{X,Y}(x,y)=\frac1{\pi m^2}I(x^2+y^2\le m)$. The marginal densities will contain square roots that won't disappear in the product, and therefore $f_Xf_Y\ne f_{X,Y}$. Is this correct?


Best answer:

Density of $(X,Y)$ is $$f_{X,Y}(x,y)=\frac1{\pi m^2}\mathbf1_{x^2+y^2< m^2}$$

Applying the polar change of variables $(x,y)\to (r,\theta)$, the Jacobian of the transformation is $r$.

Clearly, $$x^2+y^2< m^2\implies r^2< m^2\implies 0<r<m$$

Note that $(x,y)$ ranges over the whole disk, so the angle $\theta$ sweeps out the full circle. This means $$0<\theta<2\pi$$

Hence the density of $(R,\Theta)$ is $$f_{R,\Theta}(r,\theta)=\frac{r}{\pi m^2}\mathbf1_{0<r<m,0<\theta<2\pi}$$

That is, $$f_{R,\Theta}(r,\theta)=\frac{2r}{m^2}\mathbf1_{0<r<m}\frac1{2\pi}\mathbf1_{0<\theta<2\pi}$$

As the joint density factors into two marginal densities, $R$ and $\Theta$ are independent.
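A quick Monte Carlo check (a sketch only, taking $m=1$ for concreteness) is consistent with these marginals and with the independence: $f_R(r)=2r/m^2$ gives $\operatorname E[R]=2m/3$, and $\Theta$ uniform on $(0,2\pi)$ gives $\operatorname E[\Theta]=\pi$.

```python
import numpy as np

rng = np.random.default_rng(0)
m = 1.0  # disk radius (illustrative choice)

# Sample uniformly on the disk by rejection sampling from the bounding square.
pts = rng.uniform(-m, m, size=(200_000, 2))
pts = pts[pts[:, 0]**2 + pts[:, 1]**2 < m**2]

# Polar coordinates of the accepted points; fold the angle into [0, 2*pi).
r = np.hypot(pts[:, 0], pts[:, 1])
theta = np.arctan2(pts[:, 1], pts[:, 0]) % (2 * np.pi)

print(r.mean())                        # close to E[R] = 2m/3
print(theta.mean())                    # close to E[Theta] = pi
print(np.corrcoef(r, theta)[0, 1])     # close to 0, consistent with independence
```

A near-zero correlation does not prove independence on its own, but together with the factored density above it is a cheap sanity check on the derivation.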

And as you can see this standard approach is not quite as intuitive here as working with the distribution function of $(R,\Theta)$.

Now since $R$ and $\Theta$ are independent, so are $R$ and $\cos\Theta$. Therefore expectation of $X$ is $$\operatorname E[R\cos\Theta]=\operatorname E[R] \operatorname E[\cos\Theta]$$

A quick calculation would give $\operatorname E[\cos\Theta]=\frac1{2\pi}\int_0^{2\pi} \cos\theta\,\mathrm{d}\theta=0$, which is also clear from the fact that distribution of $X$ is symmetric about $0$. Similarly find $\operatorname E[X^2]=\operatorname E[R^2]\operatorname E[\cos^2\Theta]$ and hence the variance of $X$.
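For completeness, carrying out the remaining integrals with the marginals found above gives $$\operatorname E[R^2]=\int_0^m r^2\,\frac{2r}{m^2}\,\mathrm{d}r=\frac{m^2}{2},\qquad \operatorname E[\cos^2\Theta]=\frac1{2\pi}\int_0^{2\pi}\cos^2\theta\,\mathrm{d}\theta=\frac12,$$ so $$\operatorname{Var}(X)=\operatorname E[X^2]-\operatorname E[X]^2=\frac{m^2}{2}\cdot\frac12-0=\frac{m^2}{4}.$$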