I am always considering non-discrete/non-finite probability spaces $\Omega$. For everything that follows feel free to assume $\Omega = \mathbb{R}^n$.
Say you have two random variables $X_1,X_2 :\Omega \rightarrow \mathbb{R}$. An event-based definition of $X_1$ and $X_2$ being ``independent" is as follows: ``$X_1$ and $X_2$ are independent random variables if for all $x,y \in \mathbb{R}$ we have $\mathbb{P}((X_1 \leq x)\cap(X_2 \leq y)) = \mathbb{P}(X_1 \leq x)\,\mathbb{P}(X_2 \leq y)$." (Am I right?)
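As a concrete sanity check of this definition, here is a small Monte Carlo sketch. The setup is my own illustrative choice (not from the question): take $\Omega = (0,1)^2$ with the uniform (Lebesgue) measure and let $X_1, X_2$ be the coordinate projections; the code estimates both sides of the defining identity at one point $(x,y)$.

```python
import random

# Illustrative (assumed) setup: Omega = (0,1)^2 with uniform measure,
# X1 and X2 the coordinate projections.  Estimate both sides of
#   P(X1 <= x, X2 <= y) = P(X1 <= x) * P(X2 <= y)
# at a single point (x, y) by sampling points omega of Omega.

random.seed(0)
N = 200_000
x, y = 0.3, 0.7

count_x = count_y = count_joint = 0
for _ in range(N):
    omega = (random.random(), random.random())  # a sample point of Omega
    X1, X2 = omega                              # coordinate projections
    count_x += X1 <= x
    count_y += X2 <= y
    count_joint += (X1 <= x) and (X2 <= y)

p_x, p_y, p_joint = count_x / N, count_y / N, count_joint / N
print(p_joint, p_x * p_y)  # both should be close to x*y = 0.21
```

Of course a simulation only checks the identity at one point up to sampling error; for this pair of projections the identity holds exactly, since $\mathbb{P}(X_1 \leq x,\, X_2 \leq y)$ is the area of a rectangle.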
Are there natural examples of pairs of independent random variables whose descriptions can be given explicitly as maps $\Omega \rightarrow \mathbb{R}$?
Is the above setup enough to ensure that there exists a joint-distribution of $X_1$ and $X_2$? If yes, how?
When would one prefer to use a joint-distribution based definition of ``independence" as opposed to such an event-based definition, and vice versa?
On $(0,1)$ with Lebesgue measure, if you define $X_n(\omega)$ as the $n$-th digit in the binary expansion of $\omega$ (dyadic rationals, which form a null set, have two expansions; fix either convention), then the random variables $X_1,X_2,\cdots$ are independent. The statement $P\{X\leq x,Y\leq y\}=P\{X\leq x\}P\{Y\leq y\}$ for all $x,y$ is equivalent to the statement $P(X\in A, Y \in B)=P(X\in A)P( Y \in B)$ for all Borel sets $A$ and $B$ in $\mathbb R$. (I suppose this is what you mean by an event-based definition of independence.)
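The binary-digit example above can be checked numerically. A sketch, with a helper `digit` of my own naming: the $n$-th binary digit of $\omega$ is $\lfloor \omega \cdot 2^n \rfloor \bmod 2$, each $X_n$ is Bernoulli$(1/2)$, and for instance $P(X_1 = 1, X_2 = 1)$ should factor as $P(X_1 = 1)\,P(X_2 = 1)$.

```python
import random

def digit(omega, n):
    # n-th binary digit of omega in (0,1): shift n places, take parity
    return int(omega * 2**n) % 2

# Empirical check (sketch) that X_1 and X_2 look independent:
# sample omega uniformly from (0,1) and tally digit frequencies.
random.seed(0)
N = 200_000
c1 = c2 = c12 = 0
for _ in range(N):
    w = random.random()
    b1, b2 = digit(w, 1), digit(w, 2)
    c1 += b1
    c2 += b2
    c12 += b1 and b2

p1, p2, p12 = c1 / N, c2 / N, c12 / N
print(p1, p2, p12)  # roughly 0.5, 0.5, 0.25
```

The same tally extends to any finite collection of digits: each of the $2^k$ patterns of $(X_1,\dots,X_k)$ corresponds to a dyadic interval of length $2^{-k}$, which is exactly why the digits are independent.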