Suppose that $F,G \in BV[a, b]$, where $-\infty<a<b<\infty$. I want to prove that if there are no points on $[a,b]$ where $F$ and $G$ are both discontinuous, then $$\int_{[a,b]} F d G+\int_{[a,b]} G dF= F(b)G(b)-F(a-)G(a-)$$
I know that $$\int_{[a,b]}\frac{F(x)+F(x-)}{2}\,dG(x)+\int_{[a,b]} \frac{G(x)+G(x-)}{2}\,dF(x)=F(b)G(b)-F(a-)G(a-).$$
How can I use the hypothesis that there are no points in $[a,b]$ where $F$ and $G$ are both discontinuous?
Recall that any right-continuous function $F$ of finite variation on an interval $[\alpha,\beta]$ generates a unique (possibly signed) measure $\mu_F$ of locally finite variation such that $\mu_F((a,b])=F(b)-F(a)$. This is the so-called Lebesgue-Stieltjes measure associated with $F$ (see for example Klenke, A., Probability Theory, Universitext, Springer-Verlag, London, 2008, pp. 26-27).
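For a concrete example, take a single jump at some point $c\in(\alpha,\beta)$:
$$F(x)=x+\mathbf{1}_{[c,\infty)}(x)\qquad\Longrightarrow\qquad \mu_F=\lambda+\delta_c,$$
with $\lambda$ Lebesgue measure, since $\mu_F((a,b])=b-a+\mathbf{1}_{\{a<c\le b\}}=F(b)-F(a)$. In particular $\mu_F(\{c\})=F(c)-F(c-)=1$: the atoms of $\mu_F$ sit exactly at the discontinuity points of $F$, which is the feature the hypothesis of the question acts on below.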
A proof can be obtained using Fubini's theorem:
$$\begin{aligned}
\big(F(b)-F(a)\big)\big(G(b)-G(a)\big)&=\int_{(a,b]\times(a,b]}\mu_F\otimes\mu_G(dt,ds)\\
&=\int_{(a,b]}\Big(\int_{(a,s]}\mu_F(dt)\Big)\mu_G(ds)+\int_{(a,b]}\Big(\int_{(s,b]}\mu_F(dt)\Big)\mu_G(ds)\\
&=\int_{(a,b]}\Big(\int_{(a,s]}\mu_F(dt)\Big)\mu_G(ds)+\int_{(a,b]}\Big(\int_{(a,t)}\mu_G(ds)\Big)\mu_F(dt)\\
&=\int_{(a,b]}\big(F(s)-F(a)\big)\,\mu_G(ds)+\int_{(a,b]}\big(G(t-)-G(a)\big)\,\mu_F(dt).
\end{aligned}$$
(The second equality splits $(a,b]^2$ into $\{t\le s\}$ and $\{t>s\}$; the third swaps the order of integration in the second term.) Algebraic simplification yields the result in the theorem.
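To spell out the simplification: expanding the last line and cancelling the terms involving $F(a)$ and $G(a)$ gives
$$\int_{(a,b]}F(s)\,\mu_G(ds)+\int_{(a,b]}G(t-)\,\mu_F(dt)=F(b)G(b)-F(a)G(a).$$
This is where the hypothesis enters: the atoms of $\mu_F$ are exactly the discontinuity points of $F$, while $G(t-)\neq G(t)$ only on the countable set of discontinuity points of $G$, which by assumption contains no atom of $\mu_F$. Hence $G(t-)=G(t)$ for $\mu_F$-almost every $t$, and the second integral equals $\int_{(a,b]}G\,dF$. Running the same computation over $[a,b]\times[a,b]$, where $\mu_F([a,s])=F(s)-F(a-)$, yields the version over $[a,b]$ stated in the question, with $F(a-)G(a-)$ in place of $F(a)G(a)$.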
If in addition $F$ and $G$ are continuous at $a$, then $F(a)=F(a-)$ and $G(a)=G(a-)$. Hence $\mu_F(\{a\})=\mu_G(\{a\})=0$, and we may replace $(a,b]$ by $[a,b]$ in the integrals.
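Not part of the proof, but here is a quick numerical sanity check of the formula in the question: a minimal Python sketch using pure-jump, right-continuous BV functions whose jump locations are disjoint, so the no-common-discontinuity hypothesis holds. All names (`F_left`, `tF`, the jump counts, the base values) are ad hoc choices for this illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = 0.0, 1.0

# Pure-jump, right-continuous BV functions with (almost surely) disjoint
# jump locations, so F and G share no discontinuity points.
tF = np.sort(rng.uniform(a - 0.2, b + 0.2, 5))   # jump locations of F
tG = np.sort(rng.uniform(a - 0.2, b + 0.2, 5))   # jump locations of G
jF = rng.normal(size=5)                          # signed jump sizes of F
jG = rng.normal(size=5)                          # signed jump sizes of G
F0, G0 = 1.3, -0.7                               # values before all jumps

F      = lambda x: F0 + jF[tF <= x].sum()        # F(x), right-continuous
F_left = lambda x: F0 + jF[tF <  x].sum()        # F(x-)
G      = lambda x: G0 + jG[tG <= x].sum()
G_left = lambda x: G0 + jG[tG <  x].sum()

# dG charges only the jump points of G, so the Lebesgue-Stieltjes integral
# over the *closed* interval [a, b] reduces to a finite sum over jumps.
inG = (tG >= a) & (tG <= b)
int_F_dG = sum(F(s) * w for s, w in zip(tG[inG], jG[inG]))
inF = (tF >= a) & (tF <= b)
int_G_dF = sum(G(t) * w for t, w in zip(tF[inF], jF[inF]))

lhs = int_F_dG + int_G_dF
rhs = F(b) * G(b) - F_left(a) * G_left(a)
print(lhs, rhs)   # the two numbers agree up to floating-point error
```

Note that the integrands use $F(s)$ and $G(t)$ rather than the symmetrized averages from the question: since the jump sets are disjoint, each integrand is continuous at every atom of the measure it is integrated against, which is exactly the point of the hypothesis.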