Prove that (1 − λ)F + λG is also a distribution function.


Suppose λ ∈ [0, 1]. Suppose also F and G are distribution functions. Prove that (1 − λ)F + λG is also a distribution function.

I know a function $F$ is a distribution function if $F(x) = P(X \leq x)$ for some random variable $X$,

and for $\lambda$ anywhere in $[0, 1]$ (endpoints included) the result is just a weighted combination of $F$ and $G$, except that when $\lambda = 0$ the $G$ term drops out entirely,

what does multiplying a distribution function by a constant do to it?


A function $F:\mathbb R \to \mathbb R$ is a distribution function iff it has the following properties:

1) $x \leq y$ implies $F(x) \leq F(y)$

2) Limit of $F(y)$ as $y$ decreases to $x$ is $F(x)$ for each $x$

3) $F(x) \to 0$ as $x \to -\infty$

4) $F(x) \to 1$ as $x \to \infty$
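As a numeric sanity check (a sketch only, assuming the standard normal CDF as $F$ and the Exp(1) CDF as $G$ — any two distribution functions would do), you can verify that the mixture is monotone and has the right limits:

```python
import math

def F(x):
    # Standard normal CDF, written via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2)))

def G(x):
    # Exponential(1) CDF
    return 1.0 - math.exp(-x) if x >= 0 else 0.0

lam = 0.3  # any value in [0, 1]

def H(x):
    return (1 - lam) * F(x) + lam * G(x)

xs = [i / 10 for i in range(-100, 101)]

# Property 1): H is non-decreasing on a grid of points
assert all(H(a) <= H(b) + 1e-12 for a, b in zip(xs, xs[1:]))

# Properties 3) and 4): H(x) -> 0 as x -> -inf, H(x) -> 1 as x -> +inf
assert H(-50) < 1e-9
assert H(50) > 1 - 1e-9
```

Right-continuity (property 2) holds here automatically since both $F$ and $G$ are continuous; in general it is a limit argument, not something a pointwise grid check can confirm.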

From this you can answer the question easily.
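For completeness, here is a sketch of the verification, writing $H = (1-\lambda)F + \lambda G$:

```latex
Let $H(x) = (1-\lambda)F(x) + \lambda G(x)$ with $\lambda \in [0,1]$.

1) If $x \leq y$, then $F(x) \leq F(y)$ and $G(x) \leq G(y)$. Since
   $1-\lambda \geq 0$ and $\lambda \geq 0$,
   $$H(x) = (1-\lambda)F(x) + \lambda G(x)
          \leq (1-\lambda)F(y) + \lambda G(y) = H(y).$$

2) As $y \downarrow x$, $F(y) \to F(x)$ and $G(y) \to G(x)$, so
   $$H(y) \to (1-\lambda)F(x) + \lambda G(x) = H(x).$$

3) As $x \to -\infty$, $F(x) \to 0$ and $G(x) \to 0$, so
   $H(x) \to (1-\lambda)\cdot 0 + \lambda \cdot 0 = 0$.

4) As $x \to \infty$, $F(x) \to 1$ and $G(x) \to 1$, so
   $H(x) \to (1-\lambda)\cdot 1 + \lambda \cdot 1 = 1$.
```

Each step uses only the corresponding property of $F$ and $G$ together with the fact that both weights $1-\lambda$ and $\lambda$ are non-negative and sum to $1$.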