Determine the law of $F^{-1}(U)$, $U$ uniformly distributed on $[0,1]$


I'm trying to understand the following problem:

Let $X$ be a real random variable with distribution function $F(t):=\Bbb{P}(X\le t)$, $\forall t\in \Bbb{R}$. Define the right-continuous inverse by \begin{equation}F^{-1}(u) := \inf\{x\in \Bbb{R}:F(x)>u\} \qquad (a)\end{equation} If $U$ is uniformly distributed on $[0,1]$, find the law of $F^{-1}(U)$.

I tried to solve the problem in the following way: what we are looking for is $F_{F^{-1}(U)}(y)$: $$F_{F^{-1}(U)}(y) = P(F^{-1}(U)\le y) =P(U\le F(y)) = \int_0^{F(y)}1\, dt = F(y)$$

However, I think I'm making some error above, because the solution to the problem states:

We have

$$P(F^{-1}(U)\le a) \overset{(1)}{=} P\left( \bigcap_{n\ge 1} \{U < F(a+\frac{1}{n})\}\right)=\lim_{n\to \infty}P\left(U<F\left(a+\frac{1}{n}\right)\right) = \lim_{n\to \infty}F(a+\frac{1}{n}) = F(a)$$

Is what I've done wrong? The results are equal, but I don't see the reasoning behind $(1)$. Can someone explain step $(1)$ to me? I think $(1)$ comes from the definition $(a)$, but I can't prove it. Thank you in advance for any help.
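As a sanity check on the claimed result, inverse-transform sampling can be simulated. The sketch below uses a hypothetical three-atom distribution (my own choice, not from the problem) so that $F$ has jumps, and compares the empirical CDF of $F^{-1}(U)$ with $F$:

```python
import random
from bisect import bisect_right

# Hypothetical discrete distribution (not from the problem): X takes
# values -1, 0, 2 with probabilities 0.2, 0.5, 0.3, so F has jumps.
values = [-1.0, 0.0, 2.0]
cum = [0.2, 0.7, 1.0]  # cumulative probabilities F(values[i])

def F(t):
    """CDF: P(X <= t)."""
    result = 0.0
    for v, c in zip(values, cum):
        if v <= t:
            result = c
    return result

def F_inv(u):
    """Generalized inverse: inf{x : F(x) > u}."""
    # F(x) > u first happens at the smallest atom whose cumulative
    # probability strictly exceeds u; bisect_right finds exactly that.
    return values[bisect_right(cum, u)]

random.seed(0)
n = 100_000
samples = [F_inv(random.random()) for _ in range(n)]

# The empirical CDF of F^{-1}(U) should match F at every point.
for t in [-1.0, 0.5, 2.0]:
    emp = sum(s <= t for s in samples) / n
    print(f"t = {t}: empirical = {emp:.3f}, F(t) = {F(t):.3f}")
```

Note that at a flat stretch of $F$, e.g. $u = 0.2$ exactly, the definition with the strict inequality $F(x) > u$ sends $u$ to the next atom, which `bisect_right` reproduces.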


There are 4 answers below.

BEST ANSWER

As requested:

Well, the first probability equals $P(U\leq F(a)) = F(a)$, by monotonicity and right continuity of $F$.

Claim:

$$\{U<F(a)\}\subset\bigcap_{n\ge 1} \{U < F(a+\tfrac{1}{n})\}\subset\{U\leq F(a)\}$$

Since $P(U=F(a))=0$ for $U$ uniform, all three events then have probability $F(a)$.

Proof: the first inclusion is immediate: if $U(\omega)<F(a)$, then by monotonicity $U(\omega)<F(a)\le F(a+\frac{1}{n})$ for every $n$. (The stronger inclusion $\{U\leq F(a)\}\subset \{U < F(a+\frac{1}{n})\}$ can fail when $F$ is constant on $[a,a+\frac{1}{n}]$ and $U=F(a)$, which is why we only claim equality of probabilities.)

To see the reverse inclusion $\bigcap_{n\ge 1} \{U < F(a+\frac{1}{n})\}\subset\{U\leq F(a)\}$:

Take $\omega \in \bigcap_{n\ge 1} \{U < F(a+\frac{1}{n})\}$ and suppose $U(\omega)>F(a)$. Since $F$ is right continuous, $F(a+\frac{1}{n})\to F(a)$ as $n\to\infty$, so there exists $n$ with $F(a+\frac{1}{n})<U(\omega)$, contradicting $U(\omega) < F(a+\frac{1}{n})$. Hence $U(\omega)\le F(a)$.

ANOTHER ANSWER

Your approach worked in the case of the uniformly distributed random variable, because there $F$ is continuous on $[0,1]$ and has a density -- so you are able to integrate the density function.

For a random variable in general the distribution function is non-decreasing and right-continuous -- but could have jump discontinuities.

The reasoning behind $(1)$ is continuity of probability from above: if $E_1 \supseteq E_2 \supseteq \cdots$ is a decreasing (nested) sequence of events, then

$$P\left(\bigcap_{n=1}^{\infty}E_n\right)=\lim_{n \rightarrow \infty}P(E_n)$$
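This continuity property can be illustrated numerically. The sketch below uses a hypothetical example of my own: $U$ uniform on $[0,1]$, $a = 0.4$, and the nested events $E_n = \{U < a + \frac{1}{n}\}$, whose intersection is $\{U \le a\}$:

```python
import random

# Hypothetical example (not from the problem): U uniform on [0, 1]
# and E_n = {U < a + 1/n}, a decreasing sequence of events whose
# intersection is {U <= a}.
random.seed(1)
a = 0.4
n_samples = 200_000
us = [random.random() for _ in range(n_samples)]

for n in [1, 10, 100, 1000]:
    p_n = sum(u < a + 1 / n for u in us) / n_samples
    print(f"P(E_{n}) ~ {p_n:.4f}")  # decreases toward a = 0.4

# Estimate of the intersection's probability, P(U <= a) = a:
p_inter = sum(u <= a for u in us) / n_samples
print(f"P(intersection) ~ {p_inter:.4f}")
```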

ANOTHER ANSWER

Conceptually, $F^{-1}(U)$ always has the distribution of the underlying random variable $X$, whatever that variable is. It just happens that in the simple case where you already start with a standard uniform, $F$ is the identity on $[0,1]$, so you get a uniform back. This is just from the definition of a CDF.

Also, I don't see why you think you got a different answer: you've concluded that the probability that the inverse transform of a uniform is less than a value equals the CDF of the random variable at that value, and so did the solution. The only difference is that it uses $a$ instead of $y$. Also, you relied on integration while the solution relied on limits, which is the more general approach. But you are not wrong in this case.

ANOTHER ANSWER

To compute $\mathbf P\left(F^{-1}(U) \le a\right)$, we should understand the event $\{F^{-1}(U) \le a\}$ better. Notice that if $F^{-1}(U) \equiv \inf\{y : F(y) > U\}$, then by definition of the infimum, for every $n\ge 1$ we have
$$ F\left(F^{-1}(U)+\frac{1}{n}\right) > U. $$

If $F^{-1}(U) \le a$, then by monotonicity, for every $n\ge 1$,
$$ F\left(a + \frac{1}{n}\right) \ge F\left(F^{-1}(U)+\frac{1}{n}\right) > U. $$

Conversely, if $F\left(a+\frac{1}{n}\right) > U$ for every $n$, then $a+\frac{1}{n} \ge \inf\{y : F(y) > U\}$ for every $n$, and letting $n\to\infty$ gives $a\ge F^{-1}(U)$. Hence the two events are equal, and
$$ \mathbf P\left(F^{-1}(U) \le a\right) = \mathbf P\left(\bigcap_{n=1}^\infty\left\{U<F\left(a+\frac{1}{n}\right)\right\}\right). $$

Now, the events in the intersection are nested (decreasing in $n$), so
\begin{align*} \mathbf P\left(\bigcap_{n=1}^\infty\left\{U<F\left(a+\frac{1}{n}\right)\right\}\right) &= \lim_{n\to\infty}\mathbf P\left(U < F\left(a+\frac{1}{n}\right)\right)\\ &= \lim_{n\to\infty}F\left(a+\frac{1}{n}\right) \\ &= F(a), \end{align*}
where the second equality uses the fact that $U$ is uniform on $[0,1]$ and the last equality follows by right-continuity of the distribution function $F$.
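The set identity used in the first step can also be checked pointwise for a concrete step CDF. Below is a sketch with a hypothetical two-atom distribution of my own choosing ($P(X=0)=P(X=1)=\frac{1}{2}$), approximating the infinite intersection by $n \le 1000$:

```python
# Pointwise check of {F^{-1}(u) <= a} = intersection over n of
# {u < F(a + 1/n)}, for a hypothetical two-atom distribution:
# P(X = 0) = P(X = 1) = 1/2.
def F(t):
    """Step CDF of the two-atom distribution."""
    if t < 0:
        return 0.0
    if t < 1:
        return 0.5
    return 1.0

def F_inv(u):
    """inf{x : F(x) > u} for this step CDF."""
    return 0.0 if u < 0.5 else 1.0

# Compare both sides of the identity on a grid of (u, a) pairs,
# truncating the intersection at n = 1000.
for u in [0.0, 0.25, 0.5, 0.75, 0.99]:
    for a in [-0.5, 0.0, 0.5, 1.0, 1.5]:
        lhs = F_inv(u) <= a
        rhs = all(u < F(a + 1 / n) for n in range(1, 1001))
        assert lhs == rhs, (u, a)
print("identity holds on the whole grid")
```

Note the case $u = 0.5$, $a = 0.5$: the flat stretch of $F$ on $[0,1)$ makes $u < F(a + \frac{1}{n})$ fail for large $n$, exactly as the infimum definition requires.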