What is meant by a filtration "contains the information" until time $t$?

I have problems understanding the concept of a filtration in stochastic calculus. I understand that, for example, the natural filtration $F_t$ contains only outcomes up to time $t$, but since it is a sigma algebra it contains all possible events. For instance, for $X_s$, $s<t$, it should contain all possible outcomes of $X_s$, and even all subsets of possible outcomes of $X_s$, right? How can it then contain information about which events occurred and which did not, when it contains all possible events up to time $t$? What really is meant when people write that "the filtration contains the information of outcomes up to time $t$" and use $\mid F_t$ to indicate conditional expectation values? Or have I completely misunderstood what is meant when one says that a filtration is a sigma algebra?
In order to understand the intuition behind filtrations, it's a good idea to start with a very particular case: the $\sigma$-algebra generated by a single random variable $X:\Omega \to \mathbb{R}$, i.e.
$$\sigma(X) = \{ \{X \in B\}; B \in \mathcal{B}(\mathbb{R})\} \tag{1}$$
which is the smallest $\sigma$-algebra $\mathcal{F}$ on $\Omega$ such that $X: (\Omega,\mathcal{F}) \to (\mathbb{R},\mathcal{B}(\mathbb{R}))$ is measurable. First of all, you should notice that $\sigma(X)$ does not, in general, determine the random variable $X$:
Example: For a measurable set $A \subseteq \Omega$ consider the random variables $$X := 2 \cdot 1_A \quad \text{and} \quad Y := 5 \cdot 1_{A^c}.$$ It follows from the very definition $(1)$ that $$\sigma(X) = \sigma(Y) = \{\emptyset, \Omega,A,A^c\},$$ i.e. the two random variables generate the same $\sigma$-algebra although they are quite different. In particular, we cannot expect to use $\sigma(X)$ to reconstruct the random variable $X$.
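To see why only these four sets can occur, it helps to spell out the preimages in definition $(1)$ (just a worked check of the example above, not an additional claim): since $X$ only takes the values $0$ (on $A^c$) and $2$ (on $A$),
$$\{X \in B\} = \begin{cases} \Omega, & \text{if } 0 \in B \text{ and } 2 \in B, \\ A, & \text{if } 2 \in B \text{ and } 0 \notin B, \\ A^c, & \text{if } 0 \in B \text{ and } 2 \notin B, \\ \emptyset, & \text{if } 0 \notin B \text{ and } 2 \notin B, \end{cases}$$
and the analogous computation for $Y = 5 \cdot 1_{A^c}$ (with the roles of $A$ and $A^c$ swapped) produces exactly the same four sets.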
This leads to the natural question in which sense we can understand $\sigma(X)$ as "information" about a random variable $X$. There is the following characterization of $\sigma(X)$: intuitively, a set $A$ belongs to $\sigma(X)$ if, and only if, once we have observed the value $X(\omega)$ we can decide whether $\omega \in A$ or not.
Example: Let $U,V$ be two independent random variables, each taking the values $0$ and $1$ with probability $\frac{1}{2}$, and set $R:= U+V$. Then $\{U=1\}$ is not contained in $\sigma(R)$. Why? Once we have observed $R(\omega)$, we cannot tell whether $\omega \in \{U=1\}$; for instance, if $R(\omega)=1$ we do not know whether $U(\omega)=1$ or $V(\omega)=1$.
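In this particular example we can even write $\sigma(R)$ down completely (a small verification, assuming the underlying space is rich enough that all four outcomes of $(U,V)$ actually occur): $R$ only takes the values $0,1,2$, so $\sigma(R)$ consists of $\emptyset$ and all unions of the three blocks
$$\{R=0\} = \{U=0,V=0\}, \qquad \{R=1\} = \{U=1,V=0\} \cup \{U=0,V=1\}, \qquad \{R=2\} = \{U=1,V=1\}.$$
The set $\{U=1\} = \{U=1,V=0\} \cup \{U=1,V=1\}$ cuts the block $\{R=1\}$ in half, so it cannot be written as such a union and hence does not belong to $\sigma(R)$.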
The so-called factorization lemma states that a random variable $Y$ is $\sigma(X)$-measurable if, and only if, there exists a measurable function $h$ such that $$Y=h(X).$$ Intuitively this means that a random variable $Y$ is $\sigma(X)$-measurable if and only if, after observing our random variable $X(\omega)$, we have all the necessary information to determine $Y(\omega)$. The $\sigma$-algebra $\sigma(X)$ hence stores the information about what additional "knowledge" we can obtain once we have observed the outcome $X(\omega)$ of the random variable.

We can consequently read the conditional expectation $$\mathbb{E}(Y \mid \sigma(X))$$ as the expectation of $Y$ given that we have observed $X$. For instance, if $Y$ is $\sigma(X)$-measurable, then $$\mathbb{E}(Y \mid \sigma(X)) = Y$$ because - according to our intuition - $Y(\omega)$ is fully determined by $X(\omega)$ (which we already observed).
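To tie this back to the example above (a short computation, using that $U$ and $V$ are independent fair $0$-$1$ variables as assumed there): the conditional expectation of $U$ given $\sigma(R)$ is a measurable function of the observed value of $R$,
$$\mathbb{E}(U \mid \sigma(R)) = h(R) \qquad \text{with} \qquad h(0)=0, \quad h(1)=\tfrac{1}{2}, \quad h(2)=1.$$
Having observed $R(\omega)=1$ we still cannot pin down $U(\omega)$, so the best prediction is the average $\tfrac{1}{2}$; this is exactly the factorization lemma at work.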
For the canonical filtration $$\mathcal{F}_t := \sigma(X_s; s \leq t)$$ the situation is not that much different. Similar to the characterization for $\sigma(X)$ we have the following result (see here for a rigorous statement): intuitively, a set $A$ belongs to $\mathcal{F}_t$ if, and only if, once we have observed the path $(X_s(\omega))_{s \leq t}$ we can decide whether $\omega \in A$ or not.
Example: Let $U$ be an exponentially distributed random variable and define $$X_t(\omega) := 1_{(U(\omega),\infty)}(t) = \begin{cases} 0, & \text{if $t \leq U(\omega)$} \\ 1, & \text{if $t > U(\omega)$}. \end{cases}$$ Then $$\tau := \sup\{t \geq 0; X_t = 0\}$$ is not a stopping time with respect to the canonical filtration $(\mathcal{F}_t)_{t \geq 0}$, i.e. $\{\tau \leq t\} \notin \mathcal{F}_t$. Why? Say we have observed our process for some time $t$ and it equals zero up to time $t$, i.e. $X_s(\omega)=0$ for all $s \leq t$. Can we decide whether $\omega \in \{\tau \leq t\}$ or not? No, because we do not know the future values $X_s(\omega)$, $s>t$: the process might jump to $1$ directly after our final observation, or it might stay at zero for another period of time.
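To make this intuition slightly more explicit (a sketch rather than a full proof): by construction $\tau = U$, so
$$\{\tau \leq t\} = \{U \leq t\}.$$
Now compare an outcome $\omega$ with $U(\omega) = t$ to an outcome $\omega'$ with $U(\omega') > t$: both produce exactly the same observed path, namely $X_s = 0$ for all $s \leq t$. An event in $\mathcal{F}_t$ can only depend on this observed path, so it cannot contain $\omega$ without also containing $\omega'$; since $\omega \in \{\tau \leq t\}$ but $\omega' \notin \{\tau \leq t\}$, the set $\{\tau \leq t\}$ cannot belong to $\mathcal{F}_t$.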
According to the above characterization, we can understand the conditional expectation
$$\mathbb{E}(Y \mid \mathcal{F}_t)$$
as the expectation of $Y$ given that we have already observed $X_s$ for $s \leq t$. The "extreme" cases are clearly that $\mathbb{E}(Y \mid \mathcal{F}_t) = Y$ if $Y$ is $\mathcal{F}_t$-measurable (the observations up to time $t$ already determine $Y$) and $\mathbb{E}(Y \mid \mathcal{F}_t) = \mathbb{E}(Y)$ if $Y$ is independent of $\mathcal{F}_t$ (the observations up to time $t$ tell us nothing about $Y$).
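As a quick illustration in the single-random-variable setting of the first example (re-using $U$, $V$ and $R = U+V$ as defined there): observing $U$ determines the summand $U$ but tells us nothing about the independent summand $V$, so
$$\mathbb{E}(R \mid \sigma(U)) = U + \mathbb{E}(V) = U + \tfrac{1}{2},$$
which combines the two extremes: the $\sigma(U)$-measurable part is kept as it is, while the part that is independent of $\sigma(U)$ is replaced by its plain expectation.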