Problem: Show that if $X$ and $Y$ are independent $N(0,1)$-distributed random variables, then $X/Y \in C(0,1)$, i.e., $X/Y$ is standard Cauchy.
Question: I don't know how to proceed below; I want to prove that the PDF of $X/Y$ is the standard Cauchy density. PS: I looked on Wikipedia, and there the Cauchy distribution $C$ is defined as the distribution of such a ratio $X/Y$, which is exactly what we are asked to prove.
Attempt: 
The expression looked hairy and Wolfram could not integrate it.
Note to self: problem 8.
Let $Z = X/Y$. For a bounded test function $f$, substituting $x = yz$ (so $dx = |y|\,dz$ for fixed $y$) gives
$$ Ef(Z) = \iint f(x/y)\,n(x)n(y)\,dx\,dy = \iint f(z)\,n(yz)n(y)\,|y|\,dy\,dz, $$
where $n(x) = \frac{1}{\sqrt{2\pi}} \exp(-x^2/2)$ is the standard normal density. Writing $\sigma(z)^2 = 1/(1+z^2)$,
$$ n(yz)n(y) = \frac{1}{2\pi} \exp\!\left(-(1+z^2)\frac{y^2}{2}\right) = \frac{1}{2\pi} \exp\!\left(-\frac{y^2}{2\sigma(z)^2}\right), $$
so, since the integrand is even in $y$,
$$ \int n(yz)n(y)\,|y|\,dy = 2\int_0^\infty \frac{1}{2\pi} \exp\!\left(-\frac{y^2}{2\sigma(z)^2}\right) y\,dy = \frac{\sigma(z)^2}{\pi} \int_0^\infty \exp\!\left(-\frac{y^2}{2\sigma(z)^2}\right) \frac{y}{\sigma(z)}\,\frac{dy}{\sigma(z)} = \frac{\sigma(z)^2}{\pi}, $$
using the substitution $u = y/\sigma(z)$ and $\int_0^\infty e^{-u^2/2}\,u\,du = 1$. Therefore
$$ Ef(Z) = \int f(z)\,\frac{\sigma(z)^2}{\pi}\,dz = \int f(z)\,\frac{dz}{\pi(1+z^2)}, $$
so $Z$ has the standard Cauchy density $\frac{1}{\pi(1+z^2)}$, i.e., $X/Y \in C(0,1)$.
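As a sanity check (not part of the proof), the final identity can be verified numerically, assuming NumPy and SciPy are available: the integral $\int n(yz)n(y)|y|\,dy$ should match the Cauchy PDF $1/(\pi(1+z^2))$, and samples of $X/Y$ should be indistinguishable from Cauchy samples.

```python
import numpy as np
from scipy import integrate, stats

def n(x):
    """Standard normal density."""
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def ratio_density(z):
    """Numerically evaluate the integral from the derivation above."""
    val, _ = integrate.quad(lambda y: n(y * z) * n(y) * abs(y), -np.inf, np.inf)
    return val

# Check against the standard Cauchy PDF 1/(pi (1 + z^2)) at a few points.
for z in [-2.0, 0.0, 0.5, 3.0]:
    assert abs(ratio_density(z) - stats.cauchy.pdf(z)) < 1e-7

# Monte Carlo cross-check: X/Y samples should pass a KS test against Cauchy.
rng = np.random.default_rng(0)
samples = rng.standard_normal(100_000) / rng.standard_normal(100_000)
p_value = stats.kstest(samples, stats.cauchy.cdf).pvalue
assert p_value > 0.001
```

Both checks only corroborate the algebra; the proof itself is the change of variables and the Gaussian integral above.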