Let $F(x)$ be a distribution function and let $G(x)$ be $F(x)$ truncated to the interval $(a,b]$. I want to show that:
$$G(x)=\frac{F(x)-F(a)}{F(b)-F(a)}, \qquad a<x \leq b$$
I want to do this by using conditional probabilities. I know that
$$ G(x)=Pr(X\leq x \mid a<X \leq b) = \frac{Pr(a<X \leq b \mid X \leq x)}{Pr(a<X \leq b)}\cdot Pr(X \leq x) $$
It seems to me that I have to argue that
$$Pr(a<X \leq b \mid X \leq x)\cdot Pr(X \leq x)=Pr(a<X \leq x), \qquad a<x\leq b$$
How can I do that?
$$G(x)=Pr(X\leq x \mid a<X\leq b) = \frac{Pr(X\leq x,\; a<X\leq b)}{Pr(a<X\leq b)} = \frac{Pr(a<X\leq x)}{Pr(a<X\leq b)}$$
The last step uses $\{X\leq x\}\cap\{a<X\leq b\}=\{a<X\leq x\}$, which holds because $x\leq b$. The numerator simplifies to $F(x)-F(a)$ and the denominator to $F(b)-F(a)$. That's the answer.
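As a quick sanity check of the formula, here is a small simulation sketch using a standard normal for $F$ (my own choice of distribution; the values of $a$, $b$, and $x$ are arbitrary). It compares the empirical CDF of samples truncated to $(a,b]$ against $\frac{F(x)-F(a)}{F(b)-F(a)}$:

```python
import math
import random

def F(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def G(x, a, b):
    # Truncated CDF from the formula above
    return (F(x) - F(a)) / (F(b) - F(a))

random.seed(0)
a, b = -1.0, 2.0
# Sample X ~ N(0,1) and keep only draws that land in (a, b]
samples = [z for z in (random.gauss(0, 1) for _ in range(200_000)) if a < z <= b]

x = 0.5
# Empirical CDF of the truncated sample at x
empirical = sum(s <= x for s in samples) / len(samples)
print(abs(empirical - G(x, a, b)) < 0.01)  # → True
```

The agreement is close for any $x$ in $(a,b]$, as the derivation predicts.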