Show that $\int_{(a,b]}G(x)\,dF(x)+\int_{(a,b]}F(x)\,dG(x)=F(b)G(b)-F(a)G(a)$ where F and G are distribution functions


$\mathbf {The \ Problem \ is:}$ If $F$ and $G$ are two bounded distribution functions on the interval $[a,b]$ with no common points of discontinuity, then show that $\int_{(a,b]} G(x)\,dF(x) + \int_{(a,b]} F(x)\,dG(x) = F(b)G(b) - F(a)G(a)$.
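(Not part of the problem, but a quick way to convince oneself the identity is plausible: for purely atomic $F$ and $G$ with disjoint jump points, both integrals reduce to finite sums, so the identity can be checked numerically. In the sketch below, the jump locations, masses, and the helper `cdf` are all made-up illustrative choices, not anything from the problem.)

```python
import math

# Quick numerical sanity check of the identity (not a proof): take F and G
# purely atomic with disjoint jump points, so each integral over (a, b]
# reduces to a finite sum over the integrator's jump points.
# The jump locations and masses below are made-up illustrative values.

F_atoms = {0.2: 0.3, 0.5: 0.4, 0.9: 0.3}   # jumps of F: location -> mass
G_atoms = {0.1: 0.5, 0.6: 0.2, 0.8: 0.3}   # jumps of G, disjoint from F's

def cdf(atoms, x):
    """Right-continuous step distribution function built from point masses."""
    return sum(mass for point, mass in atoms.items() if point <= x)

a, b = 0.0, 1.0

# For a step integrator, int_{(a,b]} G dF = sum of G(t) * (jump of F at t)
# over the jump points t of F lying in (a, b], and symmetrically for F dG.
int_G_dF = sum(cdf(G_atoms, t) * m for t, m in F_atoms.items() if a < t <= b)
int_F_dG = sum(cdf(F_atoms, t) * m for t, m in G_atoms.items() if a < t <= b)

lhs = int_G_dF + int_F_dG
rhs = cdf(F_atoms, b) * cdf(G_atoms, b) - cdf(F_atoms, a) * cdf(G_atoms, a)
print(lhs, rhs)                    # both sides are 1.0 up to rounding
assert math.isclose(lhs, rhs)
```

If the two jump sets shared a point $t$, the left side would exceed the right by $\Delta F(t)\,\Delta G(t)$ under these right-continuous conventions, which is one way to see why the no-common-discontinuities hypothesis is needed.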

$\mathbf {My \ approach:}$ Actually, I was trying to apply the integration-by-parts formula for the Riemann-Stieltjes integral, since $F$ is non-decreasing on $[a,b]$.

But I am confused about where to apply the fact that their sets of discontinuities are disjoint.

A hint would be much appreciated; thanks in advance.
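(A remark on where the hypothesis bites along the asker's route, stated with the usual mesh-refinement definition of the Riemann-Stieltjes integral: the existence of $\int_a^b F\,dG$ already forces $F$ and $G$ to have no common points of discontinuity, and whenever $\int_a^b F\,dG$ exists, so does $\int_a^b G\,dF$, with
$$\int_a^b F\,dG+\int_a^b G\,dF=F(b)G(b)-F(a)G(a).$$
So in the integration-by-parts approach the disjointness hypothesis is what makes the integrals exist at all. The answer below takes a measure-theoretic route instead, where the hypothesis enters through left limits; see the sketch after the hint.)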

Best answer:

Hints: Let $\mu$ and $\nu$ be the measures associated with $F$ and $G$. Apply the Fubini-Tonelli theorem to compute $\mu\times\nu(A)=\iint 1_A(x,y)\,d\mu(x)\,d\nu(y)$, where $A = (a,b]\times (a,b]$. Write $A = A_1 \cup A_2$, where $A_1 = \{(x,y)\in A\mid x\leq y\}$ and $A_2 =\{(x,y)\in A \mid x>y\}$.
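For completeness, here is a sketch of how the hint plays out, assuming the usual conventions that $F,G$ are right-continuous with $\mu((c,d])=F(d)-F(c)$ and $\nu((c,d])=G(d)-G(c)$. On one hand,
$$\mu\times\nu(A)=\mu\big((a,b]\big)\,\nu\big((a,b]\big)=\big(F(b)-F(a)\big)\big(G(b)-G(a)\big).$$
On the other hand, computing the two pieces by Fubini-Tonelli, slicing $A_1$ in $x$ and $A_2$ in $y$:
$$\mu\times\nu(A_1)=\int_{(a,b]}\nu\big([x,b]\big)\,d\mu(x)=\int_{(a,b]}\big(G(b)-G(x-)\big)\,dF(x),$$
$$\mu\times\nu(A_2)=\int_{(a,b]}\mu\big((y,b]\big)\,d\nu(y)=\int_{(a,b]}\big(F(b)-F(y)\big)\,dG(y).$$
The hypothesis enters exactly once: the discontinuity points of $G$ form a countable set, each of which is a continuity point of $F$ and hence a $\mu$-null singleton, so $G(x-)=G(x)$ for $\mu$-a.e. $x$ and we may replace $G(x-)$ by $G(x)$ above. Adding the two pieces and equating with the product,
$$\big(F(b)-F(a)\big)\big(G(b)-G(a)\big)=G(b)\big(F(b)-F(a)\big)-\int_{(a,b]}G\,dF+F(b)\big(G(b)-G(a)\big)-\int_{(a,b]}F\,dG,$$
which rearranges to $\int_{(a,b]}G\,dF+\int_{(a,b]}F\,dG=F(b)G(b)-F(a)G(a)$.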