On a probability space $(\Omega, \mathcal{F}, \mathbb{P})$, consider the joint law of two non-negative random variables $X$ and $Y$ (non-negativity ensures the Laplace transforms below are defined): $$\nu(B_1\times B_2):=\mathbb{P}(X\in B_1,\, Y \in B_2), \qquad B_i\in\mathcal{B}(\mathbb{R}).$$ Does the following identity hold: $$ (\mathcal{L}\{\nu\}(s,t))^n =\mathcal{L}\{\nu^{*n}\}(s,t)=\int_{\mathbb{R}^+}\int_{\mathbb{R}^+}e^{-sx}e^{-ty}\,\mathbb{P}(X_1+\dots+X_n\in dx,\, Y_1+\dots+Y_n \in dy), $$ where $$ \mathcal{L}\{\nu\}(s,t)=\int_{\mathbb{R}^+}\int_{\mathbb{R}^+}e^{-sx}e^{-ty}\,\mathbb{P}(X\in dx,\, Y \in dy), $$ and the pairs $(X_i, Y_i)$ are i.i.d. copies of $(X, Y)$ (so the $X_i$ are i.i.d. and the $Y_i$ are i.i.d., but $X_i$ and $Y_i$ are not necessarily independent of each other)?
Intuitively this statement seems right, but I couldn't find a rigorous proof. I know it holds when the joint law admits a two-dimensional density (which can be proved by interchanging the order of integration), but I am not sure whether it remains true in general (for instance, when $\nu$ is a Dirac measure, or when no density exists).
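As a quick numerical sanity check (not a proof), here is a sketch that tests the identity on a purely atomic law, where no density exists. The support points and probabilities below are made up for illustration; the transform of the $n$-fold sum is computed by brute-force enumeration of $\nu^{*n}$ and compared against the $n$-th power of the one-pair transform.

```python
import math
from itertools import product

# Hypothetical discrete law for the pair (X, Y): three atoms, with the
# two coordinates dependent on each other (no joint density exists).
support = [(0.0, 1.0), (1.0, 1.0), (2.0, 0.5)]   # atoms (x_k, y_k)
probs = [0.2, 0.5, 0.3]                          # P((X, Y) = (x_k, y_k))

def transform(s, t):
    """Bivariate Laplace transform E[exp(-sX - tY)] of one pair."""
    return sum(p * math.exp(-s * x - t * y)
               for (x, y), p in zip(support, probs))

def transform_of_sum(s, t, n):
    """Transform of (X_1+...+X_n, Y_1+...+Y_n) for i.i.d. pairs,
    by enumerating all n-tuples of atoms (i.e. the measure nu^{*n})."""
    total = 0.0
    for combo in product(range(len(support)), repeat=n):
        x_sum = sum(support[k][0] for k in combo)
        y_sum = sum(support[k][1] for k in combo)
        p = math.prod(probs[k] for k in combo)
        total += p * math.exp(-s * x_sum - t * y_sum)
    return total

n, s, t = 4, 0.7, 1.3
lhs = transform(s, t) ** n
rhs = transform_of_sum(s, t, n)
assert abs(lhs - rhs) < 1e-12   # the two values agree
```

The check passes because, for i.i.d. pairs, $e^{-sS_n - tT_n}$ factorizes over the pairs regardless of the dependence between $X_i$ and $Y_i$ within a pair, which is exactly the point of the question.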
Please let me know if you have any ideas about the proof. Thanks in advance.