Understanding the dependence of conditional probability on filtrations

Given a sequence of i.i.d. random variables $\{X_n\}$ (not necessarily discrete), define the conditional probability $\mathbb{P}_{\mathcal{F}_n}(A)(\omega)$, where the filtration is the natural one and $A$ is an event in the probability space. I am interested in understanding the connection between the conditional probability and $\omega$. The usual example is given in the discrete case (or, more generally, for a partition): if $\omega\in\{X_1=x_1,\ldots,X_n=x_n\}$, then $\mathbb{P}_{\mathcal{F}_n}(A)(\omega)=\mathbb{P}(A|X_1=x_1,\ldots,X_n=x_n)$, because the events $\{X_1=x_1,\ldots,X_n=x_n\}$ form a partition generating $\mathcal{F}_n$.

My question is essentially whether this relationship extends beyond partitions to general measurable events $H\in\mathcal{F}_n$: when do we have $\mathbb{P}_{\mathcal{F}_n}(A)(\omega)=\mathbb{P}(A|H)$ for all $\omega\in H$? Thanks for any insight provided.

EDIT: As an example, the following relationship, reminiscent of the above, appears in a paper I read: if $\{X_i\}_{i=1}^N$ are i.i.d. uniform on $(0,1)$, then $\mathbb{P}_{\mathcal{F}_n}(X_n=\max\{X_1,\ldots,X_N\})=1_{\{X_n=\max\{X_1,\ldots,X_n\}\}}X_n^{N-n}$. What properties have been used to derive it?
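To make the partition example above concrete, here is a minimal numerical sketch; the fair-coin setup, the choice $A=\{X_1+X_2+X_3\ge 2\}$ with $n=2$, and all names in the code are hypothetical choices of mine, not from any source.

```python
import numpy as np

# Minimal sketch of the partition case: X_1, X_2, X_3 are fair coin
# flips, A = {X_1 + X_2 + X_3 >= 2}, and F_2 = sigma(X_1, X_2).
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(1_000_000, 3))   # one million sample paths
A = X.sum(axis=1) >= 2                        # the event A along each path

# On the atom {X_1 = x1, X_2 = x2}, P_{F_2}(A)(omega) is the constant
# P(A | X_1 = x1, X_2 = x2).
for x1 in (0, 1):
    for x2 in (0, 1):
        atom = (X[:, 0] == x1) & (X[:, 1] == x2)
        print(f"P(A | X1={x1}, X2={x2}) ~ {A[atom].mean():.3f}")
```

The four printed values are close to the exact ones $0,\ 1/2,\ 1/2,\ 1$: the conditional probability depends on $\omega$ only through the atom of $\mathcal{F}_2$ containing it.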
This is generally true if $H$ is an atom of $\mathcal F_n$, i.e. if $\mathbb{P}(H)>0$ and every $G \in \mathcal F_n$ with $G \subseteq H$ satisfies $\mathbb{P}(G)=0$ or $\mathbb{P}(G)=\mathbb{P}(H)$. In that case $\mathbb{P}_{\mathcal F_n}(A)$, being $\mathcal F_n$-measurable, is a.s. constant on $H$, and integrating the defining relation $\mathbb{E}[\mathbb{P}_{\mathcal F_n}(A)1_H]=\mathbb{P}(A\cap H)$ identifies the constant as $\mathbb{P}(A\cap H)/\mathbb{P}(H)=\mathbb{P}(A|H)$. Otherwise, if $H = H_1 \cup H_2$ for disjoint $H_1,H_2 \in \mathcal F_n$ of positive probability, the identity would force $\mathbb{P}(A|H)=\mathbb{P}(A|H_1)=\mathbb{P}(A|H_2)$, which fails in general. This splitting argument causes no trouble for atoms: any $\mathcal F_n$-measurable decomposition of an atom leaves a piece of probability zero, and conditional probabilities aren't defined for events of probability zero.
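To see the failure for non-atoms concretely, here is a sketch reusing the hypothetical coin-flip setup from the question's snippet: $H=\{X_1+X_2\ge 1\}$ is a union of three atoms of $\mathcal F_2$, and no single constant can serve as $\mathbb{P}_{\mathcal F_2}(A)$ on all of $H$.

```python
import numpy as np

# Same hypothetical setup as in the question's sketch: three fair coin
# flips, A = {X_1 + X_2 + X_3 >= 2}, F_2 = sigma(X_1, X_2).
rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(1_000_000, 3))
A = X.sum(axis=1) >= 2

# H is a union of three atoms of F_2, hence not an atom itself.
H = (X[:, 0] + X[:, 1]) >= 1
print("P(A | H)          ~", A[H].mean())     # exact value: 2/3

# But on the sub-atom {X_1 = 1, X_2 = 1} of H, P_{F_2}(A) = 1 != 2/3.
sub = (X[:, 0] == 1) & (X[:, 1] == 1)
print("P(A | X1=1, X2=1) ~", A[sub].mean())   # exact value: 1
```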
For the specific example you asked about, we have \begin{align} \mathbb{P}_{\mathcal F_n}(X_n = \max(X_1,...,X_N)) &= \mathbb{E}_{\mathcal F_n}[1_{X_n = \max(X_1,...,X_N)}] \\ &= \mathbb{E}_{\mathcal F_n}[1_{X_n = \max(X_1,...,X_n)} 1_{X_{n+1}\le X_n} \cdots 1_{X_N \le X_n}] \\ &= 1_{X_n = \max(X_1,...,X_n)}\mathbb{E}_{\mathcal F_n}[1_{X_{n+1}\le X_n}] \cdots \mathbb{E}_{\mathcal F_n}[ 1_{X_N \le X_n}] \\ &= 1_{X_n = \max(X_1,...,X_n)} X_n \cdots X_n \\ &= 1_{X_n = \max(X_1,...,X_n)} X_n^{N-n}, \end{align} where the third equality uses "taking out what is known" (the first indicator is $\mathcal F_n$-measurable) together with the conditional independence of $X_{n+1},\ldots,X_N$ given $\mathcal F_n$ (they are i.i.d. and independent of $\mathcal F_n$), and the fourth uses the freezing lemma: since $X_k$ is independent of $\mathcal F_n$ and $X_n$ is $\mathcal F_n$-measurable, $\mathbb{E}_{\mathcal F_n}[1_{X_k \le X_n}] = \mathbb{P}(X_k \le x)\big|_{x=X_n} = X_n$ for $k>n$, by the $U(0,1)$ cdf. So the properties used are exactly these three: taking out what is known, independence of the future coordinates from $\mathcal F_n$, and the uniform distribution of the $X_i$.
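As a sanity check on the closed form, here is a hedged Monte Carlo sketch; the values $N=5$, $n=2$, and the frozen point $(x_1,x_2)=(0.3,0.7)$ are arbitrary choices of mine. The idea is to freeze $(X_1,\ldots,X_n)$, simulate the remaining $N-n$ uniforms, and compare the empirical frequency of $\{X_n=\max(X_1,\ldots,X_N)\}$ with $1_{\{x_n=\max(x_1,\ldots,x_n)\}}x_n^{N-n}$.

```python
import numpy as np

# Monte Carlo check of E[1_{X_n = max(X_1,...,X_N)} | F_n]
# = 1_{X_n = max(X_1,...,X_n)} * X_n**(N - n) at one point of F_n.
rng = np.random.default_rng(2)
N, n, trials = 5, 2, 1_000_000
x = np.array([0.3, 0.7])                  # frozen (X_1, X_2); here x_2 = max(x_1, x_2)

tail = rng.uniform(size=(trials, N - n))  # fresh draws of X_3, ..., X_N
is_max_so_far = x[n - 1] == x.max()       # the indicator 1_{X_n = max(X_1,...,X_n)}
freq = (is_max_so_far & (tail.max(axis=1) <= x[n - 1])).mean()
closed_form = float(is_max_so_far) * x[n - 1] ** (N - n)
print(freq, closed_form)                  # both ~ 0.7**3 = 0.343
```

Swapping in $x=(0.7,0.3)$ makes the indicator vanish, and both numbers drop to $0$, matching the formula.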