I have some doubts about the relation between the joint cumulative distribution function and its marginals.
Consider a random vector $X$ of dimension $L\times 1$ with absolutely continuous cumulative distribution function $F$. Let $F_1,..., F_L$ denote the marginal cdf's. Let $\mathcal{F}$ denote the space of all possible $L$-dimensional cdf's $F$, and let $\mathcal{F}_l$ denote the space of all possible one-dimensional cdf's $F_l$ for $l=1,...,L$.
If we fix $F$, then $F_1,..., F_L$ are uniquely determined.
My doubts are on the other way around: suppose we fix $F_1,..., F_L$; is it true that any $F\in \mathcal{F}$ can be "compatible" with those marginals?
I am tempted to say that the answer is no: suppose we fix $F_1,...,F_L$ to be uniform on $[0,1]$. Now take, for example, $F\in \mathcal{F}$ to be the $L$-dimensional normal with mean zero and variance-covariance matrix equal to the identity matrix. Can this $F$ be compatible with $F_1,..., F_L$? I think it cannot. But I'm very confused about the correct way of thinking through this argument.
Your thinking is spot on and your example is good.
It is clear that if $X_i$ is concentrated on a set $B_i$, i.e. $\mathbb{P}(X_i \in B_i)=1$, then $X=(X_1,\ldots,X_L)$ must be concentrated on $B_1 \times \cdots \times B_L$. So discrete marginal distributions imply a discrete joint distribution, marginals concentrated on $[0,1]$ require a joint distribution concentrated on $[0,1]^L$, etc. This already rules out your example: the $L$-dimensional standard normal puts positive mass outside $[0,1]^L$, so it cannot have marginals that are uniform on $[0,1]$.
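This concentration argument can be checked numerically. The sketch below (my own illustration, not part of the argument above) uses the fact that under the standard $L$-dimensional normal with identity covariance the components are independent standard normals, so $\mathbb{P}(X \in [0,1]^L)$ factorizes:

```python
# If each marginal were Uniform[0,1], the joint would have to put ALL of
# its mass on [0,1]^L.  The standard L-dimensional normal does not:
from scipy.stats import norm

L = 3
# P(X_i in [0,1]) under the standard normal marginal of N(0, I):
p_i = norm.cdf(1.0) - norm.cdf(0.0)
# With identity covariance the components are independent, so
# P(X in [0,1]^L) = p_i ** L.
p_joint = p_i ** L
print(f"P(X_i in [0,1])  = {p_i:.4f}")      # ~0.3413, far from 1
print(f"P(X in [0,1]^L) = {p_joint:.4f}")   # even smaller
```

Since a uniform-$[0,1]$ marginal would force $\mathbb{P}(X_i \in [0,1]) = 1$, the mismatch is immediate.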
But the structure is actually even stronger. With $F$ the cumulative distribution function and $F_i$ the marginal distribution functions, one can show that \begin{align*} \lim_{x_2 \to \infty, \ldots, x_L \to \infty} F(x_1,x_2,\ldots,x_L) = F_1(x_1). \end{align*} (The LHS turns out to be $\mathbb{P}(X_1 \in (-\infty,x_1], X_2 \in \mathbb{R},\ldots , X_L \in \mathbb{R})$ by interchanging the limit and the probability measure.)
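A quick numerical sketch of this limit (my own example, using a correlated bivariate normal): sending the second argument of the joint cdf to a large value recovers the first marginal.

```python
# The joint cdf evaluated at (x1, M) approaches the marginal F_1(x1)
# as M grows, for an (assumed) correlated bivariate normal example.
from scipy.stats import multivariate_normal, norm

cov = [[1.0, 0.5], [0.5, 1.0]]
joint = multivariate_normal(mean=[0.0, 0.0], cov=cov)

x1 = 0.7
for M in (1.0, 3.0, 10.0):
    print(M, joint.cdf([x1, M]))   # increases toward the limit below
print("F_1(x1) =", norm.cdf(x1))   # the marginal value
```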
Some further understanding can be obtained from Sklar's theorem. It essentially states that any cumulative distribution function $F$ is described (uniquely, when the marginals are continuous) by its marginals $F_i$ and a so-called copula $C$ through the relation \begin{align*} F(x) = C(F_1(x_1),\ldots,F_L(x_L)). \end{align*} A copula is a cumulative distribution function with uniform marginals. Sklar's theorem tells you that any cumulative distribution function is composed of two aspects: a dependency structure given by the copula and the marginal distributions. If you have decided upon the marginal distributions already, the class of possible cumulative distribution functions is given by all possible copulas (dependency structures).
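To make this concrete, here is a sketch (an assumed construction, not the only one) using the Gaussian copula $C_\rho(u,v) = \Phi_\rho(\Phi^{-1}(u), \Phi^{-1}(v))$: each value of the correlation parameter $\rho$ gives a different joint distribution, yet all of them have the same uniform marginals.

```python
# Gaussian copula: different rho => different joint probabilities,
# but the marginal C(u, 1) = u holds for every rho.
from scipy.stats import multivariate_normal, norm

def gaussian_copula_cdf(u, v, rho):
    mvn = multivariate_normal(mean=[0.0, 0.0],
                              cov=[[1.0, rho], [rho, 1.0]])
    return mvn.cdf([norm.ppf(u), norm.ppf(v)])

# Joint probabilities at (0.5, 0.5) differ across dependency structures:
for rho in (0.0, 0.5, 0.9):
    print(rho, gaussian_copula_cdf(0.5, 0.5, rho))

# ...but the first marginal is uniform for every rho
# (v close to 1 stands in for the limit v -> 1):
for rho in (0.0, 0.5, 0.9):
    print(rho, gaussian_copula_cdf(0.3, 1.0 - 1e-12, rho))  # ~0.3 each time
```

So with the marginals fixed, varying the copula sweeps out exactly the family of compatible joint distributions.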