Let $\mathcal{X}$ denote a set of inputs, suppose we are given an i.i.d. sample $X_1,\dots,X_n$ drawn from some unknown distribution $P$ over $\mathcal{X}$, and suppose further that we are given two symmetric kernels $k:\mathcal{X}^l \to \mathbb{R}$, $h:\mathcal{X}^m \to \mathbb{R}$ for integers $l, m$. I am interested in the asymptotic distribution of the following vector:
$$ (U_n, V_n)^T $$ where $$ U_n = \frac{1}{{n \choose l}} \sum_{C_{l, n}} k(X_{i_1},\dots, X_{i_l}) $$ and $C_{l, n}$ is the set of all ${n \choose l}$ combinations of $l$ distinct indices $i_1,\dots, i_l$ chosen from $[n]$. $V_n$ is defined similarly, with $l, k(\cdot)$ replaced by $m, h(\cdot)$ respectively. $U_n$ and $V_n$ are U-statistics for the parameters $\theta$ and $\gamma$ respectively.
Taking $U_n$ as an example, define the sequence of projections $$ k_c(x_1,\dots,x_c) = \mathbb{E}\, k(x_1,\dots,x_c, X_{c+1}, \dots, X_l), \quad c=0,\dots, l, $$ and the corresponding variances $\sigma^2_c = \text{Var}(k_c(X_1,\dots,X_c))$. Theorem 2 here states that if $\sigma^2_l < \infty$ then $\sqrt{n} (U_n - \theta) \implies N(0, l^2 \sigma^2_1)$, and a similar result holds for $V_n$; I would like to know whether there is a comparably simple result for the asymptotic joint distribution.
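As a quick sanity check of the univariate result, here is a small Monte Carlo sketch. The kernel $k(x, y) = xy$, the distribution $N(\mu, 1)$, and the sample sizes are my own illustrative choices, not taken from the theorem; for this kernel $\theta = \mu^2$, $k_1(x) = \mu x$, $\sigma_1^2 = \mu^2$, so the limiting variance is $l^2 \sigma_1^2 = 4\mu^2$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (my choice): kernel k(x, y) = x * y of order l = 2,
# X ~ N(mu, 1). Then theta = mu^2, k_1(x) = mu * x, sigma_1^2 = mu^2,
# so sqrt(n) * (U_n - theta) should have variance close to l^2 sigma_1^2 = 4 mu^2.
mu, n, reps = 1.0, 500, 4000

def u_stat(x):
    """U-statistic for k(x, y) = x * y in closed form:
    (1 / C(n, 2)) * sum_{i < j} x_i x_j = (sum^2 - sum of squares) / (n (n - 1))."""
    s, s2 = x.sum(), (x ** 2).sum()
    return (s * s - s2) / (len(x) * (len(x) - 1))

vals = np.array([u_stat(rng.normal(mu, 1.0, n)) for _ in range(reps)])
emp_var = np.var(np.sqrt(n) * (vals - mu ** 2))
print(emp_var)  # close to 4 * mu^2 = 4
```

Note that $U_n$ is unbiased for $\theta = \mu^2$ here, so the only discrepancy from $4\mu^2$ is the $O(1/n)$ variance correction plus Monte Carlo error.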
**Update** I recently came across this set of notes, in which Theorem 10.14 states the following: for a random sample $(X_i)_{i=1}^n$ and symmetric kernels $\phi:\mathcal{X}^a \to \mathbb{R}$, $\psi:\mathcal{X}^b \to \mathbb{R}$ with finite variance, define
$$ U_n^{(1)} = \frac{1}{{n \choose a}} \sum \phi(X_{i_1},\dots, X_{i_a}), \quad U_n^{(2)} = \frac{1}{{n \choose b}} \sum \psi(X_{i_1},\dots, X_{i_b}) $$
and
$$ \theta_1 = \mathbb{E}[U_n^{(1)}], \quad \theta_2 = \mathbb{E}[U_n^{(2)}] $$
then
$$ \sqrt{n} \left ( \begin{bmatrix} U_n^{(1)}\\ U_n^{(2)} \end{bmatrix} - \begin{bmatrix} \theta_1 \\ \theta_2 \end{bmatrix} \right ) \implies N \left ( \begin{bmatrix} 0\\0 \end{bmatrix}, \begin{bmatrix} a^2 \tau_1^2 & ab \alpha_{11}\\ab \alpha_{11} & b^2 \tau^2_2\end{bmatrix} \right ) $$
where
$$ \alpha_{dd} = \text{Cov} \{ \phi_d(X_1,\dots,X_d), \psi_d(X_1,\dots, X_d) \} = \text{Cov} \{ \phi(X_1,\dots, X_a), \psi(X_1,\dots,X_d, X_{a+1}, \dots, X_{a + (b-d)}) \}, $$ where in the second covariance the two kernels share exactly their first $d$ arguments, and where $\tau_1^2 = \text{Var}(\phi_1(X_1))$ and $\tau_2^2 = \text{Var}(\psi_1(X_1))$, with $\phi_i$ denoting the $i$-th projection:
$$ \phi_i(x_1,\dots,x_i) = \mathbb{E} [\phi(x_1,\dots,x_i, X_{i+1},\dots, X_a)], $$ and $\psi_j$ defined analogously.
This pretty much solves my problem, but there are no references in the attached notes, so a link to a proof would be appreciated.
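For what it's worth, the theorem is easy to check numerically. The sketch below uses my own illustrative choices, not anything from the notes: $X \sim \text{Exp}(1)$, $\phi(x, y) = xy$ ($a = 2$) and $\psi(x, y) = (x - y)^2/2$ ($b = 2$). A direct computation gives $a^2 \tau_1^2 = 4$, $b^2 \tau_2^2 = 8$, and $ab\,\alpha_{11} = 4$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative setup (my choice): X ~ Exp(1), phi(x, y) = x * y (a = 2),
# psi(x, y) = (x - y)^2 / 2 (b = 2). Direct computation gives
#   phi_1(x) = x                  =>  a^2 tau_1^2 = 4,
#   psi_1(x) = ((x - 1)^2 + 1)/2  =>  b^2 tau_2^2 = 8,
#   alpha_11 = Cov(phi_1(X), psi_1(X)) = 1  =>  a b alpha_11 = 4,
# so the limiting covariance matrix is [[4, 4], [4, 8]].
n, reps = 500, 4000

def u_pair(x):
    """Both U-statistics in closed form: for phi, U1 = (sum^2 - sum of
    squares) / (n (n - 1)); for psi, U2 is exactly the unbiased sample variance."""
    s, s2 = x.sum(), (x ** 2).sum()
    u1 = (s * s - s2) / (len(x) * (len(x) - 1))
    u2 = x.var(ddof=1)
    return u1, u2

draws = np.array([u_pair(rng.exponential(1.0, n)) for _ in range(reps)])
scaled = np.sqrt(n) * (draws - np.array([1.0, 1.0]))  # theta_1 = theta_2 = 1
emp_cov = np.cov(scaled.T)
print(emp_cov)  # close to [[4, 4], [4, 8]]
```

Both statistics have closed forms for these kernels, so the simulation is cheap; the empirical covariance of the scaled pair matches the predicted matrix to within Monte Carlo error.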