I am trying to consider whether this is possible and/or reasonable:
Let $X_n:\Delta_n \to \mathbb{R}$ be a sequence of random variables, where each $X_n$ is defined on its own space $\Delta_n \subseteq \Omega$, and suppose $\Delta_n \to \Delta$ in some sense of convergence of metric spaces. Is it reasonable to ask whether the $X_n$ converge in distribution to some $X:\Delta \to \mathbb{R}$?
I haven't found this described this way anywhere, but it seems like a reasonable idea to me (surely I haven't just made it up?!): convergence in distribution should be convergence of the distribution functions $F_n(t) = \mu_n \{ \omega \in \Delta_n \mid X_n(\omega) \le t \}$, where $\mu_n$ is the conditional probability measure associated to $\Delta_n$.
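For comparison, the standard definition of convergence in distribution for real-valued random variables makes no reference to the source spaces at all; it only requires pointwise convergence of the distribution functions at continuity points of the limit:

```latex
X_n \xrightarrow{\;d\;} X
\quad\Longleftrightarrow\quad
F_n(t) \;\longrightarrow\; F(t)
\quad \text{for every } t \in \mathbb{R} \text{ at which } F \text{ is continuous,}
```

where $F_n(t) = \mu_n\{\omega \in \Delta_n \mid X_n(\omega) \le t\}$ and $F$ is the distribution function of $X$.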
If so, can anybody point me to anything that would help me compute the distribution of $X$ for some particular examples I have, or to references that discuss this idea and prove some of its basic properties? I am struggling to get my head around it all.
Convergence in distribution refers to a (well defined) convergence of the distributions, that is, of certain probability measures defined on the target space. Thus the target space must be fixed (in your case this is the real line for every $n$, so everything is fine), but the source spaces are simply irrelevant.
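A minimal numerical sketch of this point, using a hypothetical example of my own choosing: take $X_n$ uniform on the finite space $\Delta_n = \{1/n, 2/n, \dots, 1\}$, so each $X_n$ lives on a different source space of a different size, yet the laws on $\mathbb{R}$ converge to the Uniform$(0,1)$ distribution. Only the CDFs on the common target space matter.

```python
import numpy as np

# Hypothetical example: X_n is the identity on the finite source space
# Delta_n = {1/n, 2/n, ..., n/n} with the uniform measure mu_n.
# The source spaces all differ, but the pushforward laws on the real
# line converge to Uniform(0, 1).

def cdf_Xn(t, n):
    """Exact CDF of X_n: F_n(t) = (1/n) * #{k in {1,...,n} : k/n <= t}."""
    return min(max(np.floor(n * t), 0.0), n) / n

# The Uniform(0,1) CDF is F(t) = t on [0, 1]; the sup-norm distance
# between F_n and F is at most 1/n, so it shrinks as n grows.
ts = np.linspace(0, 1, 1001)
for n in (10, 100, 1000):
    err = max(abs(cdf_Xn(t, n) - t) for t in ts)
    print(n, err)
```

The answer's point shows up here directly: the code never needs to compare the spaces $\Delta_{10}$, $\Delta_{100}$, $\Delta_{1000}$ to each other, only the functions $F_n$ on $\mathbb{R}$.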