Density of $ Y = X + \frac{1}{X}$ when $X\sim U(a,b)$


Let $ X $ be a continuous random variable uniformly distributed in $ \left[a, b\right] $, that is $X \sim U(a, b)$. Suppose $ 0 < a < b $. We wish to find the density function of $$ Y = X + \frac{1}{X} $$ Therefore, we want $ f_Y(y) $.
It is obvious that $$ y\in (-2, 2) \implies f_Y(y) = 0, $$ because $X$ takes real values and the function $ f(x) = x + \frac{1}{x} $ has range $(-\infty, -2]\cup[2, \infty)$. (In fact, since $0 < a < b$ here, AM–GM gives $x + \frac{1}{x} \geq 2$ for $x > 0$, so $Y \geq 2$ and $f_Y$ vanishes on all of $(-\infty, 2)$.)
I tried working pointwise for $y\in(-\infty, -2]\cup[2, \infty)$: $$ P(Y = y) = P\left(X + \frac{1}{X} = y\right) = P(X^2 - yX + 1 = 0) = \cdots = \\P\left(\left(X - \frac{y + \sqrt{y^2 - 4}}{2} \right)\left(X - \frac{y - \sqrt{y^2 - 4}}{2} \right) = 0\right) = \\ P\left(X = \frac{y + \sqrt{y^2 - 4}}{2} \right) + P\left(X = \frac{y - \sqrt{y^2 - 4}}{2} \right) $$ but this seems like a dead end: since $X$ is continuous, both point probabilities are zero, which tells us nothing about the density.
I also tried considering the random variable $Z = \frac{1}{X}$. It is easy to derive that $ f_Z(z) = \frac{z^{-2}}{b - a} $ for $b^{-1} < z < a^{-1} $, so we can write $ Y = X + Z $ and attempt a convolution-type formula: $$ f_Y(y) = \int f_X(y - k)\, f_Z(k)\, dk $$ But firstly the random variables $X$ and $Z$ are dependent ($Z$ is a deterministic function of $X$), so this factorization is invalid, and secondly the whole approach seems wrong. Any ideas?


A possible path

Let $F_X(x)$ and $f_X(x)$ be the c.d.f. and p.d.f. of $X\sim\mathcal U(a,b)$. Let also

$$g(x) = \frac1x + x,$$ $$\alpha(y) = \frac{y-\sqrt{y^2-4}}{2},$$ $$\beta(y) = \frac{y+\sqrt{y^2-4}}{2}.$$
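(Not part of the argument — just a quick numerical sanity check that $\alpha(y)$ and $\beta(y)$ are the two roots of $x^2 - yx + 1 = 0$, i.e. the two branches of $g^{-1}$, with product $1$:)

```python
import math

def g(x):
    return x + 1.0 / x

def alpha(y):
    # smaller root of x^2 - y*x + 1 = 0; lies in (0, 1]
    return (y - math.sqrt(y * y - 4)) / 2

def beta(y):
    # larger root; lies in [1, infinity)
    return (y + math.sqrt(y * y - 4)) / 2

for y in [2.0, 2.5, 5.0, 100.0]:
    assert abs(g(alpha(y)) - y) < 1e-8          # alpha inverts g on (0, 1]
    assert abs(g(beta(y)) - y) < 1e-8           # beta inverts g on [1, infinity)
    assert abs(alpha(y) * beta(y) - 1) < 1e-9   # product of the roots is 1
```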

Case 1 ($a\geq 1$)

$g(x)$ is monotonically increasing on $[a,b]$. Thus

\begin{eqnarray} P(Y<y) &=& \begin{cases} 0 & (y < g(a))\\ P(X<\beta(y)) & (g(a) \leq y \leq g(b))\\ 1 & (y > g(b)) \end{cases}\\ &=&\begin{cases} 0 & (y < g(a))\\ F_X(\beta(y)) & (g(a) \leq y \leq g(b))\\ 1 & (y > g(b)) \end{cases} \end{eqnarray}

Consequently, by differentiation you derive the p.d.f. of $Y=\frac1X+X$ as

\begin{eqnarray} f_Y(y) &=& \begin{cases} f_X(\beta(y)) \cdot \frac{d\beta}{d y} & (g(a) \leq y \leq g(b))\\ 0 &\mbox{(otherwise)} \end{cases}\\ &=&\begin{cases} \frac1{b-a}\left( \frac12+\frac{y}{2\sqrt{y^2-4}}\right) & (g(a) \leq y \leq g(b))\\ 0 &\mbox{(otherwise)} \end{cases} \end{eqnarray}
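(Again not part of the answer — a Monte Carlo sanity check of the Case 1 c.d.f., with the arbitrary illustrative choice $a=1$, $b=2$, so that $g(a)=2$ and $g(b)=2.5$:)

```python
import math
import random

a, b = 1.0, 2.0            # Case 1: a >= 1 (illustrative choice)
random.seed(0)
samples = [random.uniform(a, b) for _ in range(200_000)]
Y = [x + 1.0 / x for x in samples]

def beta(y):
    return (y + math.sqrt(y * y - 4)) / 2

def F_Y(y):
    # closed-form c.d.f. F_X(beta(y)), valid for g(a) <= y <= g(b)
    return (beta(y) - a) / (b - a)

for y in [2.1, 2.3, 2.45]:
    emp = sum(v < y for v in Y) / len(Y)   # empirical P(Y < y)
    assert abs(emp - F_Y(y)) < 0.01
```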

Case 2 ($b\leq 1$)

You proceed in a very similar manner, noting that now $g(x)$ is monotonically decreasing on $[a,b]$, so $g(X) < y \iff X > \alpha(y)$. Thus

\begin{eqnarray} P(Y<y) = \begin{cases} 0 & (y < g(b))\\ P(X>\alpha(y)) = 1 - F_X(\alpha(y)) & (g(b) \leq y \leq g(a))\\ 1 & (y > g(a)) \end{cases} \end{eqnarray}
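Since $g$ is decreasing here, $P(Y<y) = 1 - F_X(\alpha(y))$ on $[g(b), g(a)]$; differentiating (and noting $\frac{d\alpha}{dy} = \frac12 - \frac{y}{2\sqrt{y^2-4}} < 0$) gives the density, mirroring Case 1 — this step is not spelled out above but follows directly from the same formulas:

$$ f_Y(y) = \begin{cases} \frac1{b-a}\left( \frac{y}{2\sqrt{y^2-4}} - \frac12\right) & (g(b) \leq y \leq g(a))\\ 0 &\mbox{(otherwise)} \end{cases} $$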

Case 3 ($0<a < 1 <b$ and $ab>1$)

In this case you have $g(b) > g(a) > 2$, since $g(b) - g(a) = \frac{(b-a)(ab-1)}{ab} > 0$. For $g(a) < y < g(b)$ we have $\alpha(y) < a$, so

$$ P(Y<y) = \begin{cases} 0 & (y \leq 2)\\ P(\alpha(y) <X< \beta(y)) & (2<y\leq g(a))\\ P(X<\beta(y)) & (g(a)<y\leq g(b))\\ 1 & (y > g(b)) \end{cases} $$
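(A Monte Carlo check of the Case 3 c.d.f., with the arbitrary illustrative choice $a=0.5$, $b=3$, so $ab = 1.5 > 1$, $g(a)=2.5$, $g(b)=10/3$:)

```python
import math
import random

a, b = 0.5, 3.0            # Case 3: 0 < a < 1 < b, ab > 1 (illustrative choice)
random.seed(1)
Y = [x + 1.0 / x for x in (random.uniform(a, b) for _ in range(200_000))]

def alpha(y):
    return (y - math.sqrt(y * y - 4)) / 2

def beta(y):
    return (y + math.sqrt(y * y - 4)) / 2

def F_X(x):
    # uniform c.d.f., clipped to [0, 1]
    return min(max((x - a) / (b - a), 0.0), 1.0)

def F_Y(y):
    ga, gb = a + 1 / a, b + 1 / b      # g(a) = 2.5, g(b) = 10/3
    if y <= 2:
        return 0.0
    if y <= ga:
        return F_X(beta(y)) - F_X(alpha(y))   # P(alpha(y) < X < beta(y))
    if y <= gb:
        return F_X(beta(y))                   # P(X < beta(y))
    return 1.0

for y in [2.2, 2.8, 3.1]:
    emp = sum(v < y for v in Y) / len(Y)
    assert abs(emp - F_Y(y)) < 0.01
```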

Case 4 ($0<a < 1 <b$ and $0<ab<1$)

In this case you have $2 < g(b) < g(a)$, since now $g(b) - g(a) = \frac{(b-a)(ab-1)}{ab} < 0$. For $g(b) < y < g(a)$ we have $\beta(y) > b$, so the event $\{Y<y\}$ reduces to $\{X > \alpha(y)\}$:

$$ P(Y<y) = \begin{cases} 0 & (y \leq 2)\\ P(\alpha(y) <X< \beta(y)) & (2<y\leq g(b))\\ P(X>\alpha(y)) & (g(b)<y\leq g(a))\\ 1 & (y > g(a)) \end{cases} $$
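(And a matching Monte Carlo check for Case 4, with the arbitrary illustrative choice $a=0.25$, $b=2$, so $ab = 0.5 < 1$, $g(b)=2.5$, $g(a)=4.25$; note that for $g(b) < y < g(a)$ we have $\beta(y) > b$, so the event is $\{X > \alpha(y)\}$:)

```python
import math
import random

a, b = 0.25, 2.0           # Case 4: 0 < a < 1 < b, ab < 1 (illustrative choice)
random.seed(2)
Y = [x + 1.0 / x for x in (random.uniform(a, b) for _ in range(200_000))]

def alpha(y):
    return (y - math.sqrt(y * y - 4)) / 2

def beta(y):
    return (y + math.sqrt(y * y - 4)) / 2

def F_X(x):
    # uniform c.d.f., clipped to [0, 1]
    return min(max((x - a) / (b - a), 0.0), 1.0)

def F_Y(y):
    ga, gb = a + 1 / a, b + 1 / b      # g(a) = 4.25, g(b) = 2.5
    if y <= 2:
        return 0.0
    if y <= gb:
        return F_X(beta(y)) - F_X(alpha(y))   # P(alpha(y) < X < beta(y))
    if y <= ga:
        return 1.0 - F_X(alpha(y))            # P(X > alpha(y)), since beta(y) > b
    return 1.0

for y in [2.2, 3.0, 4.0]:
    emp = sum(v < y for v in Y) / len(Y)
    assert abs(emp - F_Y(y)) < 0.01
```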