I am looking for a notion encoding a "dependence structure" for identically distributed random variables. More precisely, I would like some kind of structure that lets me take any number of identically distributed random variables and say that their joint distribution is "always the same". A couple of examples that might help clarify what I mean:
- Being independent: any number of i.d. random variables which are independent have a joint distribution that behaves "the same way".
- Having a joint Gaussian distribution: the distribution behaves the same way for any number of variables.
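To make the second example concrete, here is one way the Gaussian case is "the same in every dimension": an equicorrelated (exchangeable) Gaussian vector can be built from a single shared factor, and the same correlation parameter $\rho \ge 0$ defines a valid joint distribution for every dimension $d$. This is only an illustrative sketch; the function name and parameters are my own choices.

```python
import numpy as np

def exchangeable_gaussian(n, d, rho, rng=None):
    """Sample n equicorrelated Gaussian vectors of dimension d via a
    shared factor: X_i = sqrt(rho)*Z + sqrt(1-rho)*eps_i.
    Each X_i is standard normal, Corr(X_i, X_j) = rho for i != j,
    and the same rho works for every d (when rho >= 0), so the
    family is consistent under adding or marginalizing coordinates."""
    rng = np.random.default_rng(rng)
    Z = rng.standard_normal((n, 1))    # common factor, shared by all coordinates
    eps = rng.standard_normal((n, d))  # independent idiosyncratic noise
    return np.sqrt(rho) * Z + np.sqrt(1.0 - rho) * eps
```

Marginalizing out any subset of coordinates leaves an exchangeable Gaussian of the same form with the same $\rho$, which is exactly the kind of dimension-free consistency I am after.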
One possible approach would be to use Archimedean copulas to define the joint distribution of any number of i.d. random variables. Another might be to define some infinite-dimensional analogue of copulas on the product probability space $[0,1]^\mathbb{N}$ and require some nice properties (such as symmetry in all the variables, and that taking the expected value over one of the variables gives back the same distribution in one fewer variable). However, I am afraid that these approaches are either too restrictive (e.g. Archimedean copulas cannot model Gaussian dependence) or technically not well behaved.
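The Archimedean idea above can be sketched concretely: a single generator defines the copula in every dimension. For instance, the Clayton copula with generator $\psi(t) = (1+t)^{-1/\theta}$ can be sampled in any dimension $d$ via the Marshall-Olkin (frailty) construction with a Gamma frailty, and marginalizing coordinates yields a lower-dimensional Clayton copula with the same $\theta$. This is a sketch under those assumptions; the function name is my own.

```python
import numpy as np

def clayton_sample(n, d, theta, rng=None):
    """Sample n points from the d-dimensional Clayton copula (theta > 0)
    via the Marshall-Olkin frailty construction: with V ~ Gamma(1/theta)
    and E_i iid Exp(1), U_i = psi(E_i / V) where psi is the Laplace
    transform of V, psi(t) = (1 + t)^(-1/theta).  The same generator
    defines the copula in every dimension d, so the family is
    consistent under marginalization."""
    rng = np.random.default_rng(rng)
    V = rng.gamma(shape=1.0 / theta, size=(n, 1))  # shared frailty
    E = rng.exponential(size=(n, d))               # iid Exp(1) variates
    return (1.0 + E / V) ** (-1.0 / theta)         # Uniform(0,1) marginals
```

Each coordinate is marginally Uniform$(0,1)$, and any subset of coordinates again follows a Clayton copula with the same parameter, which is the dimension-free behaviour in question.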
Is there any standard way to do this? What are possible approaches?