How to prove this relation between joint PDF and partial derivatives?


Given $X_{1,2,\cdots,d}:(\Omega,\mathcal F,\Bbb P)\to \Bbb R$ are random variables with absolutely continuous distributions $F_i(x_i)$. Consider their joint "CDF": $$F(x_1,\cdots,x_d):=\Bbb P(X_1\le x_1,\cdots,X_d\le x_d).$$ Then $F$ induces a probability measure on $\Bbb R^d$. Now my question is, how to prove that there exists a unique non-negative, measurable function $f:\Bbb R^d\to\Bbb R_{\ge 0}$ such that:
1). $$f(x_1,\cdots,x_d)=p(x_1,\cdots,x_d)=\dfrac{\partial^d F(x_1,\cdots,x_d)}{\partial x_1\cdots\partial x_d}$$ almost everywhere;
2). $$ F(x_1,\cdots,x_d)=\int_{(-\infty,x_1]\times\dots\times(-\infty,x_d]}f(t_1,\cdots,t_d)\,\mathrm dt_1\cdots\mathrm d{t_d}.$$
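(Not part of the proof, but for intuition: here is a minimal numerical sanity check of both properties on a hypothetical concrete example, two independent exponentials with rates $1$ and $2$, so that $F(x,y)=(1-e^{-x})(1-e^{-2y})$ and $f(x,y)=2e^{-x}e^{-2y}$ are known in closed form.)

```python
import math

# Hypothetical concrete example: X ~ Exp(1), Y ~ Exp(2), independent,
# so the joint CDF factorises on the positive quadrant.
def F(x, y):
    return (1 - math.exp(-x)) * (1 - math.exp(-2 * y))

# Known joint density, for comparison.
def f(x, y):
    return 2 * math.exp(-x) * math.exp(-2 * y)

# Property 1): the mixed partial d^2 F / dx dy, approximated by a
# central finite difference, should recover f (a.e.).
def mixed_partial(x, y, h=1e-4):
    return (F(x + h, y + h) - F(x + h, y - h)
            - F(x - h, y + h) + F(x - h, y - h)) / (4 * h * h)

x0, y0 = 1.0, 0.5
assert abs(mixed_partial(x0, y0) - f(x0, y0)) < 1e-6

# Property 2): integrating f over (0, x0] x (0, y0] (the density
# vanishes on the negative axes) should recover F(x0, y0).
# A midpoint Riemann sum suffices for a sanity check.
n = 400
hx, hy = x0 / n, y0 / n
total = sum(f((i + 0.5) * hx, (j + 0.5) * hy)
            for i in range(n) for j in range(n)) * hx * hy
assert abs(total - F(x0, y0)) < 1e-4
print("both properties hold numerically")
```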

I think once we prove that the integrals of $p$ and of $f$ coincide on rectangles, uniqueness follows. So we actually only have to show that $\int p = F$, where the integration is over $(-\infty, x_1]\times\cdots\times(-\infty,x_d]$. It looks easy, but I still have no clue. Any suggestion? Thanks!
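Here is my rough sketch of where I expect the argument to go, assuming (beyond the hypotheses above) that the joint law $\mu_F$ is absolutely continuous with respect to Lebesgue measure $\lambda_d$ on $\Bbb R^d$; corrections welcome:

```latex
% d = 1: if F is absolutely continuous, then F' exists a.e. and
\[
  F(x)=\int_{-\infty}^{x}F'(t)\,\mathrm dt,
\]
% so f := F' works.
%
% d > 1: if \mu_F \ll \lambda_d, set f := d\mu_F/d\lambda_d (Radon--Nikodym).
% Then for every "rectangle"
\[
  F(x_1,\dots,x_d)
  = \mu_F\bigl((-\infty,x_1]\times\cdots\times(-\infty,x_d]\bigr)
  = \int_{(-\infty,x_1]\times\cdots\times(-\infty,x_d]} f\,\mathrm d\lambda_d,
\]
% which is property 2). Differentiating this iterated integral once per
% coordinate (Fubini plus the Lebesgue differentiation theorem) should give
\[
  \frac{\partial^d F}{\partial x_1\cdots\partial x_d} = f
  \quad\text{a.e.},
\]
% which is property 1).
```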


EDIT As per NCh's comment, the mixed partial derivative may fail to exist even when each component's CDF is absolutely continuous. So let us assume this pathology away, i.e., assume the partial derivative exists a.e.