This is a question about theorem 3.2.2 of James Norris's notes on advanced probability (http://www.statslab.cam.ac.uk/~james/Lectures/ap.pdf).
Let $(\Omega,\mathcal{F}, (\mathcal{F}_n)_{n∈ \mathbb{N}}, \mathbb{P})$ be a filtered probability space with $\mathcal{F} = \mathcal{F}_{∞}$ and let $(X_n)_{n∈ \mathbb{N}}$ be a martingale on this probability space. The theorem asserts (the extension being constructed via Carathéodory's extension theorem) that $\mathbb{E}[X_T] = 1$ for all "finite" stopping times $T$ if and only if there exists a probability measure $\tilde{\mathbb{P}}$ on the filtered space $(\Omega,\mathcal{F}, (\mathcal{F}_n)_{n∈ \mathbb{N}})$ such that
$$\tilde{\mathbb{P}}(A) = \mathbb{E}[X_n\mathbb{1}_{A}]\qquad \mbox{for all $A∈ \mathcal{F}_n$}$$
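(As a sanity check, the martingale property is exactly what makes this definition unambiguous: if $A ∈ \mathcal{F}_n ⊆ \mathcal{F}_{n+1}$, the tower property gives
$$\mathbb{E}[X_{n+1}\mathbb{1}_{A}] = \mathbb{E}\big[\mathbb{E}[X_{n+1}\mid \mathcal{F}_n]\,\mathbb{1}_{A}\big] = \mathbb{E}[X_n\mathbb{1}_{A}],$$
so the value assigned to $A$ does not depend on which $n$ with $A ∈ \mathcal{F}_n$ is used.)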
My first question is about the meaning of "finite" here. Is it correct that this should be understood strictly, in the sense that $T:\Omega→ \mathbb{N}$ (i.e. $T$ is finite everywhere, not merely almost surely)?
Secondly, this suggests that the extension depends solely on the statistical properties of $(X_n)_{n∈ \mathbb{N}}$: is it correct that in this case the extension does not depend on the structure of $\Omega$? E.g. say two martingales $X = (X_n)_{n∈ \mathbb{N}}, X' = (X'_n)_{n∈ \mathbb{N}}$ are "defined identically" on two different probability spaces $(\Omega,\mathcal{F}, (\mathcal{F}_n)_{n∈ \mathbb{N}}, \mathbb{P})$ and $(\Omega',\mathcal{F}', (\mathcal{F}'_n)_{n∈ \mathbb{N}}, \mathbb{P}')$, i.e. for every Borel measurable subset $A$ of $\mathbb{R}^{\mathbb{N}}$,
$$\mathbb{P}(X∈A)=\mathbb{P}'(X'∈ A)$$
and set
$$\tilde{\mathbb{P}}_n(B) = \mathbb{E}[X_n\mathbb{1}_B]\qquad \text{for all $B ∈ \mathcal{F}_n$}$$
$$\tilde{\mathbb{P}}_n'(B') = \mathbb{E}[X_n'\mathbb{1}_{B'}]\qquad \text{for all $B' ∈ \mathcal{F}'_n$}$$
Now, say an extension $\tilde{\mathbb{P}}$ of $(\tilde{\mathbb{P}}_n)_{n∈ \mathbb{N}}$ is achieved by other means (say, the Kolmogorov extension theorem) on $\Omega$. Does that automatically imply that an extension $\tilde{\mathbb{P}}'$ of $(\tilde{\mathbb{P}}_n')_{n∈ \mathbb{N}}$ can be achieved on $\Omega'$?
If not, are there conditions so that this can be done (without any condition of uniform integrability on $(X_n)_{n∈ \mathbb{N}}$)?
Norris's proof uses everywhere-finite $T$ only for the "only if" part of the assertion; the "if" part is true even for a.s. finite $T$.
As to your second question, consider the following example in the simplest case $X_n=1$ for all $n$. Let $\Omega:=\Bbb R^{\Bbb N}$, with coordinate maps $Y_k(\omega) = \omega_k$ for $\omega=(\omega_1,\omega_2,\ldots)\in\Omega$. Let $\mathcal F_n:=\sigma(Y_1,\ldots,Y_n)$. Let $\Bbb P$ be the probability measure on $(\Omega,\mathcal F)$ under which $Y_1,Y_2,\ldots$ are i.i.d. standard normal random variables.

For $\Omega'$ I take $\{\omega\in\Omega:\lim_k\omega_k=0\}$, and then $\mathcal F'_n$ to be the trace of $\mathcal F_n$ onto $\Omega'$. For each $n\in\Bbb N$ there is a unique probability measure $\Bbb P'_n$ defined on $(\Omega',\mathcal F'_n)$ under which $(Y'_1,\ldots,Y'_n)$ are i.i.d. standard normal. (Here $Y'_k$ is the restriction of $Y_k$ to $\Omega'$.) The family $(\Bbb P'_n)$ is consistent in the sense of the Kolmogorov extension theorem.

But there is no probability measure on $\mathcal F'_\infty:=\sigma(\cup_{n\in \Bbb N}\mathcal F'_n)$ whose restriction to $\mathcal F'_n$ is equal to $\Bbb P'_n$. If there were such a probability, call it $\Bbb P'$, then under $\Bbb P'$ the coordinates $Y'_1,Y'_2,\ldots$ would be i.i.d. standard normal and would satisfy $\lim_k Y'_k=0$ everywhere on $\Omega'$. But i.i.d. standard normals satisfy $|Y_k|>1$ infinitely often almost surely (by the second Borel–Cantelli lemma, since $\Bbb P(|Y_k|>1)$ is a positive constant), so they cannot converge to $0$: indeed $\Bbb P(\Omega')=0$. This is a contradiction.
There are conditions (on the filtered measurable space) under which the extension of a consistent family of probability measures is automatic; see Theorem 4.1 on page 141 of Parthasarathy's Probability Measures on Metric Spaces.