Let $F:\mathbb{R}\rightarrow \mathbb{R}$ be a distribution function (CDF).
In this case, we can define the generalized inverse $X(\omega)=\inf\{x\in\mathbb{R}:F(x)\geq\omega\}$, which is a random variable on $(0,1)$ (equipped with Lebesgue measure) such that $F_X=F$.
Hence, every distribution function can be viewed as the CDF of a random variable on $(0,1)$.
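As a concrete illustration of this construction (a sketch, not part of the question itself), the generalized inverse $\inf\{x : F(x)\geq u\}$ can be approximated numerically by bisection; the bracketing bounds `lo`, `hi` and the exponential example below are assumptions chosen for the demo.

```python
import math
import random

def generalized_inverse(F, u, lo=-1e6, hi=1e6, tol=1e-9):
    """Approximate F^{-1}(u) = inf{x : F(x) >= u} by bisection.

    Assumes F is a CDF and the infimum lies in [lo, hi];
    the bounds and tolerance are illustrative choices.
    """
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if F(mid) >= u:
            hi = mid  # mid is an upper bound for the infimum
        else:
            lo = mid  # the infimum lies strictly above mid
    return hi

# Example: exponential CDF F(x) = 1 - e^{-x} for x >= 0
F = lambda x: 1 - math.exp(-x) if x >= 0 else 0.0

u = random.random()            # a point of (0,1) under Lebesgue measure
x = generalized_inverse(F, u)  # X(u) is then a sample with CDF F
```

This is the standard inverse-transform construction: pushing Lebesgue measure on $(0,1)$ forward through $F^{-1}$ yields a random variable with CDF $F$.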
Is there an analogous result for joint distribution functions (CDFs)?
That is, for a fixed $n$, does there exist a probability space $(\Omega,\mathscr{F},P)$ such that every joint distribution function $F:\mathbb{R}^n\rightarrow \mathbb{R}$ is $F_X$ for some $n$-dimensional random vector $X$ on $(\Omega,\mathscr{F},P)$?
The answer to your question is yes. As in the one-dimensional case, you can take the probability space $([0,1], \mathcal F, \mathcal L)$, where $\mathcal F$ is the Borel $\sigma$-algebra and $\mathcal L$ is Lebesgue measure: for any joint CDF $F:\mathbb{R}^n\rightarrow\mathbb{R}$ there exists a measurable $X: [0,1]\to\mathbb R^n$ such that $X$ has CDF $F$.
This is an instance of the more general fact that any Borel probability measure on a complete separable metric space is the pushforward of Lebesgue measure on $[0,1]$. See Chapter 13 of Dudley's *Real Analysis and Probability*.
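One way to see why a single uniform variable suffices even for $n > 1$ is the classical digit-splitting trick: dealing out the binary digits of one uniform round-robin produces $n$ independent uniforms, which can then be transformed coordinatewise (for product measures; general joint CDFs need the conditional-quantile, i.e. Knothe–Rosenblatt, construction). The sketch below is illustrative and truncates the binary expansion, so it is only approximate; the function name and the `digits` parameter are my own.

```python
def split_uniform(u, n=2, digits=32):
    """Split one u in [0,1) into n uniforms by dealing out its
    binary digits round-robin: digit k of u goes to output k mod n.

    Illustrates the measure isomorphism [0,1] ~ [0,1]^n; truncating
    the expansion at `digits` bits per output makes it approximate.
    """
    # Extract the first n * digits binary digits of u.
    bits = []
    x = u
    for _ in range(n * digits):
        x *= 2
        b = int(x)
        bits.append(b)
        x -= b
    # Reassemble every n-th digit into one output uniform.
    outs = []
    for i in range(n):
        v = 0.0
        for j, b in enumerate(bits[i::n]):
            v += b / 2 ** (j + 1)
        outs.append(v)
    return outs

u1, u2 = split_uniform(0.5)   # binary 0.1000... splits into 0.5 and 0.0
```

Under Lebesgue measure the digits of $u$ are i.i.d. fair coin flips, so the $n$ reassembled numbers are independent uniforms; composing with one-dimensional quantile transforms then realizes any product distribution on $\mathbb{R}^n$ from the single space $([0,1],\mathcal F,\mathcal L)$.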