proof of Box–Muller transform (polar form)


There are proofs of the Box–Muller transform available online, but my book (Pattern Recognition and Machine Learning) seems to present it in a different form.

[Book excerpt: PRML equations (11.10)–(11.12), the polar form of the Box–Muller transform]

I didn't follow the derivation of equation (11.12); can anyone please help? Thanks!

EDIT
As mentioned in Nadiels's answer, there is a mistake in formulas (11.10) and (11.11): since $r^2 \le 1$, $\ln r^2$ is non-positive, so the sign must be flipped for the argument of the square root to be non-negative (see the PRML errata).
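The erratum matters numerically: the rejection step keeps only points inside the unit circle, so $r^2 \in (0, 1)$ and $\ln r^2 < 0$, and only the corrected factor $-2\ln r^2$ keeps the square root real. A quick check (the value of `r2` below is just an illustrative example):

```python
import math

# Any point accepted by the rejection step satisfies 0 < r2 < 1.
r2 = 0.42  # hypothetical example value inside the unit disc

# ln(r2) is negative, so sqrt(2*ln(r2)/r2) as printed in the book would fail...
assert math.log(r2) < 0

# ...while the corrected factor -2*ln(r2)/r2 is positive, so the sqrt is real.
scale = math.sqrt(-2.0 * math.log(r2) / r2)
print(scale)  # a positive real number
```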


Best answer:

First, there is an error in equations $(11.10)$ and $(11.11)$: the transformations should in fact read
$$ y_i = z_i \left( \frac{-2 \ln r^2 }{r^2 } \right)^{1/2}, $$
and in particular we have
\begin{align*} \exp\left( -\frac{1}{2} \left(y_1^2 + y_2^2 \right) \right) &=\exp\left( \left( z_1^2 +z_2^2\right)\frac{\ln(r^2)}{r^2} \right) = r^2. \end{align*}

By the inverse function theorem, if
$$ \mathbf{J} =\begin{bmatrix} \frac{\partial y_1}{\partial z_1} & \frac{\partial y_1}{\partial z_2} \\ \frac{\partial y_2}{\partial z_1} & \frac{\partial y_2}{\partial z_2}\end{bmatrix}, $$
then to get the desired result we want to show that $\left| \operatorname{det}(\mathbf{J}) \right| = 2/r^2$. Since the cross partials are equal, $\frac{\partial y_1}{\partial z_2} = \frac{\partial y_2}{\partial z_1}$, this amounts to
$$ \left| \left(\frac{\partial y_1}{\partial z_1}\right)\left(\frac{\partial y_2}{\partial z_2}\right) - \left(\frac{\partial y_1}{\partial z_2}\right)^2 \right|= \frac{2}{r^2}. $$

Write
$$ y_i = z_i h(r^2), \qquad \mbox{where } h(r^2) = \left(-\frac{2\ln r^2}{r^2} \right)^{1/2}. $$
Then
\begin{align*} \left(\frac{\partial y_1}{\partial z_1}\right)\left(\frac{\partial y_2}{\partial z_2}\right) - \left(\frac{\partial y_1}{\partial z_2}\right)^2 &= h(r^2)^2+2r^2h'(r^2)h(r^2) \\ &=h(r^2)^2 + \frac{2}{r^2}\left( \ln(r^2) - 1 \right)\\ &=\frac{-2\ln(r^2)}{r^2} + \frac{2 \ln r^2}{r^2} - \frac{2}{r^2} \\ &= -\frac{2}{r^2}, \end{align*}
as desired.
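The Jacobian calculation above can be checked symbolically; a sketch with SymPy (the variable names are mine, not from the book):

```python
import sympy as sp

z1, z2 = sp.symbols('z1 z2', positive=True)
r2 = z1**2 + z2**2

# Corrected transformation from (11.10)-(11.11): y_i = z_i * sqrt(-2*ln(r^2)/r^2)
h = sp.sqrt(-2 * sp.log(r2) / r2)
y1 = z1 * h
y2 = z2 * h

# Jacobian determinant of (y1, y2) with respect to (z1, z2).
J = sp.Matrix([[sp.diff(y1, z1), sp.diff(y1, z2)],
               [sp.diff(y2, z1), sp.diff(y2, z2)]])
det = sp.simplify(J.det())

print(det)  # simplifies to -2/r^2, i.e. -2/(z1**2 + z2**2)
```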

Another answer:

As a general rule, a pdf in one set of variables is given by the pdf in another set of variables multiplied by the absolute value of the determinant of the Jacobian matrix of the change of variables.

In your case, you must calculate: $$\left|\det\begin{pmatrix}\partial z_1 / \partial y_1 & \partial z_2 / \partial y_1 \\ \partial z_1 / \partial y_2 & \partial z_2 / \partial y_2\end{pmatrix}\right|$$ Now, this looks a bit complicated, since you are given $y_1,y_2$ in terms of $z_1,z_2$ and not the other way round. Luckily, we can use the inverse function theorem to get that: $$\left|\frac{\partial (z_1, z_2)}{\partial (y_1, y_2)}\right|= \left|\left(\frac{\partial (y_1, y_2)}{\partial (z_1, z_2)}\right)^{-1}\right|=\left|\frac{\partial (y_1, y_2)}{\partial (z_1, z_2)}\right|^{-1}$$

So you should be able to calculate the derivatives and then take the reciprocal of the determinant.
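Putting the two answers together, the polar Box–Muller method is straightforward to implement; a minimal sketch (function and variable names are mine), with a rough numerical sanity check that the output looks standard normal:

```python
import math
import random

def polar_box_muller(rng=random):
    """Draw one pair of independent N(0, 1) samples via the polar method."""
    while True:
        # Step 1: uniform point in the square, rejected unless inside the unit disc.
        z1 = rng.uniform(-1.0, 1.0)
        z2 = rng.uniform(-1.0, 1.0)
        r2 = z1 * z1 + z2 * z2
        if 0.0 < r2 < 1.0:
            break
    # Step 2: the corrected transformation y_i = z_i * sqrt(-2*ln(r^2)/r^2).
    scale = math.sqrt(-2.0 * math.log(r2) / r2)
    return z1 * scale, z2 * scale

# Sanity check: sample mean should be near 0 and variance near 1.
random.seed(0)
samples = [y for _ in range(50_000) for y in polar_box_muller()]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)
```

This is the Marsaglia polar variant: it trades the rejection loop (which discards about 21% of candidate points) for avoiding the sine and cosine calls of the basic Box–Muller form.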