Why does interval not matter for cdf of continuous random variable?


If $X$ is a continuous random variable then $$F(b) - F(a) = P(a \leq X \leq b) = P(a \lt X \leq b) = P(a \leq X \lt b) = P(a \lt X \lt b)$$

Why does it not matter whether the endpoints are included? And how would things change if $X$ weren't continuous on the interval $[a, b]$?


Because $P(X=c)=0$ for every $c\in\mathbb R$ when $X$ is a continuous random variable: its CDF $F$ is continuous, so no single point carries positive probability.

Observe, for instance, that consequently: $$P(a\leq X\leq b)=P(X=a)+P(a<X\leq b)=0+P(a<X\leq b)=P(a<X\leq b)$$

If $X$ were not continuous on $[a,b]$, say a discrete random variable with a point mass at $a$ or $b$, then $P(X=a)$ or $P(X=b)$ could be positive, and the four probabilities would differ by exactly those point masses.
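To see the contrast numerically, here is a small Python sketch using only the standard library. The standard normal and Poisson(3) distributions are illustrative choices, not from the original answer; the point is that including the left endpoint changes nothing in the continuous case but adds exactly the point mass $P(X=a)$ in the discrete case.

```python
import math

def norm_cdf(x):
    """Standard normal CDF, written via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Continuous case: X ~ Normal(0, 1). F is continuous, so P(X = a) = 0
# and all four interval probabilities collapse to F(b) - F(a).
a, b = -1.0, 2.0
p_cont = norm_cdf(b) - norm_cdf(a)

# Discrete case: X ~ Poisson(3) has point masses, so endpoints matter.
lam = 3.0
def pois_pmf(k):
    """P(X = k) for X ~ Poisson(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

a_d, b_d = 1, 4
p_closed = sum(pois_pmf(k) for k in range(a_d, b_d + 1))         # P(a <= X <= b)
p_open_left = sum(pois_pmf(k) for k in range(a_d + 1, b_d + 1))  # P(a <  X <= b)

# The two discrete probabilities differ by exactly P(X = a) > 0.
assert math.isclose(p_closed - p_open_left, pois_pmf(a_d))
```

Dropping the endpoint from the discrete interval removes the mass `pois_pmf(a_d)`, whereas in the continuous case the corresponding correction is zero.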