I am trying to figure out how to find the distribution of the inverse of a random variable, say $Y=X^{-1}$, where $X$ can take negative values.
The two ways I know to find the distribution of a random variable Y=g(X) are:
- $F_Y(y) = P(Y \le y) = P(X^{-1} \le y)$. Splitting the event on the sign of $X$ gives $$F_Y(y) = \left\{ \begin{array}{lr} P(X < 0) + P(X \ge 1/y)&: y>0\\ P(1/y \le X < 0)&: y<0 \end{array} \right. \;\;\;\;\;\;\;\;\;\;$$ since for $y>0$ every negative $X$ satisfies $X^{-1} \le y$, while for $y<0$ only $X \in [1/y, 0)$ does. Then take the derivative.
- Transformation theorem: $f_Y(y) = f_X(g^{-1}(y))\left|\frac{d\, g^{-1}(y)}{dy}\right|$ when $g$ is monotone.
If I do it way #1, I have indicator functions all over the place, which are not differentiable.
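The case analysis does produce a usable CDF, though. A minimal sketch (the choice $X \sim N(0,1)$ and all names here are my own, for concreteness), checking the sign-split CDF against simulation:

```python
import math
import random

def phi(x):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def F_Y(y):
    """CDF of Y = 1/X for X ~ N(0,1), assembled from the sign-split cases."""
    if y > 0:
        # negative X always satisfies 1/X <= y; positive X needs X >= 1/y
        return phi(0.0) + (1.0 - phi(1.0 / y))
    if y < 0:
        # only 1/y <= X < 0 satisfies 1/X <= y
        return phi(0.0) - phi(1.0 / y)
    return phi(0.0)  # y = 0: the event is exactly {X < 0}

# Monte Carlo sanity check (seeded for reproducibility)
random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]
for y in (-2.0, -0.5, 0.5, 2.0):
    empirical = sum(1 for x in samples if 1.0 / x <= y) / len(samples)
    print(f"y = {y:+.1f}: formula {F_Y(y):.4f}, empirical {empirical:.4f}")
```

The simulated CDF matches the formula, so the case split itself is fine; the trouble only starts when differentiating through the resulting indicator structure.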
I can't use way #2 directly because $g(x) = 1/x$ is not monotone on the entire real line; it is only monotone on $(-\infty, 0)$ and $(0, \infty)$ separately.
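For contrast, here is what way #2 computes when $g$ *is* monotone. A sketch under my own example choices ($X \sim N(0,1)$, $g(x) = e^x$, so $g^{-1}(y) = \ln y$ and $|d\,g^{-1}/dy| = 1/y$):

```python
import math

def f_X(x):
    """Standard normal density (example choice of f_X)."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def f_Y(y):
    """Transformation theorem for the monotone map g(x) = e^x:
    g^{-1}(y) = ln(y), |d g^{-1}/dy| = 1/y, so f_Y(y) = f_X(ln y) / y."""
    return f_X(math.log(y)) / y  # defined for y > 0

# Sanity check: f_Y should be a genuine density, so numerically
# integrate it over (0, 50] with the trapezoid rule.
n = 200_000
a, b = 1e-6, 50.0
h = (b - a) / n
total = h * (0.5 * (f_Y(a) + f_Y(b)) + sum(f_Y(a + i * h) for i in range(1, n)))
print(f"integral of f_Y over ({a}, {b}) is about {total:.4f}")  # close to 1
```

The one-line density is exactly what makes the theorem attractive; the question is how to get something like it when $g$ is not globally monotone.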
Assume that $P[X=0]=0$; otherwise $Y$ is not well defined.
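One standard workaround I have seen (the piecewise-monotone extension of the transformation theorem, not something I can verify applies in full generality here) is to apply way #2 on each monotone branch of $g$ and add the contributions. For $g(x) = 1/x$ each branch has inverse $x = 1/y$ with $|dx/dy| = 1/y^2$, and for each $y \neq 0$ exactly one branch contributes, suggesting $f_Y(y) = f_X(1/y)/y^2$. A quick deterministic check using the standard Cauchy, for which $1/X$ is known to be standard Cauchy again:

```python
import math

def f_X(x):
    """Standard Cauchy density (example choice; any f_X works the same way)."""
    return 1.0 / (math.pi * (1.0 + x * x))

def f_Y(y):
    """Piecewise application of the transformation theorem to g(x) = 1/x:
    on each branch x < 0 and x > 0, the inverse is x = 1/y with
    |dx/dy| = 1/y^2, and for each y != 0 exactly one branch contributes,
    so f_Y(y) = f_X(1/y) / y^2."""
    return f_X(1.0 / y) / (y * y)

# If X is standard Cauchy, 1/X is too, so f_Y should coincide with f_X.
for y in (-3.0, -0.5, 0.25, 2.0):
    print(f"y = {y:+.2f}: f_Y = {f_Y(y):.6f}, Cauchy pdf = {f_X(y):.6f}")
```

The branch-sum formula reproduces the known closed form exactly, which is at least consistent with the piecewise approach being the right way to rescue way #2.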