Suppose that $(\Omega,\mathcal F, P)$ is a probability space, and let $Z = (X,Y)$ be an $\mathbb R^2$-valued random variable. Let $F_X$ and $F_Y$ denote the respective marginal distribution functions of $X$ and $Y$. Suppose that $X_1, X_2, \dots, X_n$ form an independent and identically distributed sample from $F_X$, and $Y_1, Y_2, \dots, Y_n$ form an independent and identically distributed sample from $F_Y$.
The empirical distribution functions $$\hat F_X = n^{-1}\sum_{j=1}^n\mathsf 1_{[X_j,\infty)}\qquad\text{and}\qquad \hat F_Y = n^{-1}\sum_{j=1}^n \mathsf 1_{[Y_j,\infty)}$$ are used to estimate $F_X$ and $F_Y$, respectively. Donsker's Theorem states that $\sqrt n (\hat F_X - F_X)$ and $\sqrt n (\hat F_Y - F_Y)$ each converge in distribution to a Gaussian process with mean function $t\mapsto 0$ and variance-covariance function $(s,t)\mapsto \min\{G(s),G(t)\} - G(s)G(t)$, where $G$ is $F_X$ or $F_Y$, respectively.
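For concreteness, the empirical distribution function $\hat F_X(t) = n^{-1}\#\{j : X_j \leq t\}$ can be computed as follows; this is a minimal numpy sketch, and the names `ecdf` and `F_hat` are my own.

```python
import numpy as np

def ecdf(sample):
    """Empirical CDF: t -> n^{-1} * #{j : sample_j <= t}, returned as a callable."""
    x = np.sort(np.asarray(sample))
    n = len(x)
    def F_hat(t):
        # side="right" counts the sample points that are <= t
        return np.searchsorted(x, t, side="right") / n
    return F_hat

rng = np.random.default_rng(0)
sample = rng.uniform(size=1000)  # Uniform(0,1), so the true CDF is F(t) = t
F_hat = ecdf(sample)
print(F_hat(0.5))  # should be close to F(0.5) = 0.5
```

By the Glivenko-Cantelli theorem, $\sup_t|\hat F_X(t) - F_X(t)| \to 0$ almost surely, so the printed value concentrates around $0.5$ as the sample size grows.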
But what can I say about the joint distribution of $$G_n := \begin{bmatrix} \sqrt n (\hat F_X - F_X) \\ \sqrt n(\hat F_Y - F_Y) \end{bmatrix}?$$ If $X$ and $Y$ are independent, I suspect that the mean function is $$t\mapsto \begin{bmatrix} 0 \\ 0 \end{bmatrix}$$ and the variance-covariance function is $$ (s,t)\mapsto \begin{bmatrix} F_X(\min\{s,t\}) - F_X(s)F_X(t) & 0 \\ 0 & F_Y(\min\{s,t\}) - F_Y(s)F_Y(t) \end{bmatrix}.$$ Correct? But what if $X$ and $Y$ are not independent? What about the covariances then?
Edit 3: Fixed a (really stupid) copy-paste error. It's $F_{X,Y}(x,y)$ instead of $\min\{F_X(x),F_Y(y)\}$. The latter expression is only valid if $X$ and $Y$ are independent. I copy-pasted the matrix above and forgot the adjustment...
Edit 2: I suspect that if $X$ and $Y$ are not independent, then $G_n$ converges to a Gaussian process with mean function $$t\mapsto\begin{bmatrix} 0 \\ 0 \end{bmatrix}$$ and variance-covariance function $$(s,t)\mapsto \begin{bmatrix} F_{X}(\min\{s,t\}) - F_X(s)F_X(t) & F_{X,Y}(s, t) - F_X(s)F_Y(t) \\ F_{X,Y}(t,s) - F_X(t)F_Y(s) & F_Y(\min\{s,t\}) - F_Y(s)F_Y(t) \end{bmatrix}.$$ Here $F_{X,Y}$ is the joint distribution function of $X$ and $Y$, i.e. $F_{X,Y}(s,t) = P(\{X\leq s\}\cap\{Y\leq t\}).$
"Proof" (ignoring some technicalities): Let $z_1,z_2,\dots,z_m\in\mathbb R$ for $m\in\mathbb N$, and consider $$\begin{bmatrix} \begin{bmatrix} \sqrt n(\hat F_X(z_1) - F_X(z_1)) \\ \sqrt n(\hat F_Y(z_1) - F_Y(z_1)) \end{bmatrix} \\ \begin{bmatrix} \sqrt n(\hat F_X(z_2) - F_X(z_2)) \\ \sqrt n(\hat F_Y(z_2) - F_Y(z_2)) \end{bmatrix} \\ \vdots \\ \begin{bmatrix} \sqrt n(\hat F_X(z_m) - F_X(z_m)) \\ \sqrt n(\hat F_Y(z_m) - F_Y(z_m)) \end{bmatrix}\end{bmatrix} = \begin{bmatrix} G_n(z_1) \\ G_n(z_2) \\ \vdots \\ G_n(z_m) \end{bmatrix} =:\mathcal G_n.$$ This is a vector in $\mathbb R^{2m}$. Applying the multidimensional CLT yields that $\mathcal G_n$ converges in distribution to a normally distributed random variable with mean zero and variance-covariance matrix $$\begin{bmatrix} \Sigma(z_1,z_1) & \Sigma(z_1,z_2) & \cdots & \Sigma(z_1,z_m) \\ \Sigma(z_2,z_1) & \Sigma(z_2,z_2) & \cdots & \Sigma(z_2,z_m) \\ \vdots & \vdots & \ddots & \vdots \\ \Sigma(z_m,z_1) & \Sigma(z_m,z_2) & \cdots & \Sigma(z_m,z_m) \end{bmatrix},$$ where
$$ \Sigma(z_i,z_j) = \begin{bmatrix} F_X(\min\{z_i,z_j\}) - F_X(z_i)F_X(z_j) & F_{X,Y}(z_i,z_j) - F_X(z_i)F_Y(z_j) \\ F_{X,Y}(z_j,z_i) - F_X(z_j)F_Y(z_i) & F_Y(\min\{z_i,z_j\}) - F_Y(z_i)F_Y(z_j) \end{bmatrix}.$$ Hence, by the definition of a Gaussian process, $G_n$ converges to a Gaussian process with the above-mentioned mean and variance-covariance functions. Correct?
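The off-diagonal entry can be sanity-checked by Monte Carlo in a toy case where the joint CDF is known in closed form: take $X \sim \mathrm{Uniform}(0,1)$ and $Y = X$ (comonotone), so that $F_{X,Y}(s,t) = \min\{s,t\}$. This is only an illustrative sketch; the sample sizes and variable names are my own choices.

```python
import numpy as np

# Comonotone toy pair: X ~ Uniform(0,1), Y = X, hence F_{X,Y}(s,t) = min(s,t).
rng = np.random.default_rng(1)
n, reps = 500, 4000        # sample size per replication, number of replications
s, t = 0.3, 0.7            # evaluation points for the two marginal processes

x = rng.uniform(size=(reps, n))                 # reps independent samples of size n
gx = np.sqrt(n) * ((x <= s).mean(axis=1) - s)   # sqrt(n)(Fhat_X(s) - F_X(s))
gy = np.sqrt(n) * ((x <= t).mean(axis=1) - t)   # sqrt(n)(Fhat_Y(t) - F_Y(t)), as Y = X

mc_cov = np.cov(gx, gy)[0, 1]                   # Monte Carlo cross-covariance
theory = min(s, t) - s * t                      # F_{X,Y}(s,t) - F_X(s)F_Y(t) = 0.09
print(mc_cov, theory)
```

The simulated covariance should match $F_{X,Y}(s,t) - F_X(s)F_Y(t) = 0.09$ up to Monte Carlo error, while the independent-case formula $0$ would be far off, which is consistent with the conjectured off-diagonal entry.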
Edit 1: If $X$ and $Y$ are assumed to be independent, I (implicitly) also assume that $X_i$ and $Y_i$ are independent. Conversely, if $X$ and $Y$ are assumed to be dependent, then $X_i$ and $Y_i$ are assumed to be dependent as well.