Let $(X_n)_{n\in\mathbb{N}}$ be a sequence of Bernoulli distributed random variables defined on a probability space $(\Omega,\mathcal{F},P)$, and let $(\mathcal{F}_n)_{n\in\mathbb{N}}$ be a filtration such that, for all $n$, $X_n$ is measurable with respect to $\mathcal{F}_{n+1}$ and $E\lbrack X_n|\mathcal{F}_n\rbrack>1-\delta$ for some fixed $\delta\in(0,1)$. I want to construct a probability space $(\mathcal{X},\mathcal{A},\mu)$ and sequences $(Y_n)_{n\in\mathbb{N}}$ and $(Z_n)_{n\in\mathbb{N}}$ of random variables defined on this probability space such that $(Z_n)_{n\in\mathbb{N}}\overset{d}{=}(X_n)_{n\in\mathbb{N}}$, the variables $Y_1,Y_2,\ldots$ are i.i.d. Bernoulli distributed with parameter $1-\delta$, and $Z_n\geq Y_n$ for all $n$.

Note that by the assumptions above, we can conclude that, for all $n\in\mathbb{N}$ and $i\in\{0,1\}^n$, we have $P(X_1=i_1,\ldots,X_n=i_n,X_{n+1}=1)\geq (1-\delta)P(X_1=i_1,\ldots,X_n=i_n).$

First I will give you the basic idea that I have: my idea is to define $Y_n$ by using i.i.d. random variables $(U_n)_{n\in\mathbb{N}}$ which are uniformly distributed on $(0,1)$ and setting $Y_n=1_{\{U_n<1-\delta\}}$ for all $n$. Now I want to define $(Z_n)_{n\in\mathbb{N}}$ recursively: on the set $\{Y_1=1\}$ we set $Z_1=1$. I want $Z_1$ to have the same distribution as $X_1$, so I must define $Z_1$ on the set $\{Y_1=0\}$ so that $\mu(Y_1=0,Z_1=1)=P(X_1=1)-(1-\delta)$. At this point I am not sure how to do this. I was thinking about using a random variable $V_1$, which is also uniformly distributed on $(0,1)$ and independent of $(U_n)_{n\in\mathbb{N}}$, and setting \begin{align*}Z_1=1_{\{Y_1=1\}}+1_{\{Y_1=0,V_1<(P(X_1=1)-(1-\delta))/\delta\}}. \end{align*} Then $\mu(Z_1=1)=P(X_1=1)$; is this construction correct? Next, I would construct $Z_2$.
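To convince myself that the $Z_1$ step works, I checked the arithmetic numerically. The sketch below uses hypothetical numbers $p = P(X_1=1) = 0.9$ and $\delta = 0.3$ (any $p > 1-\delta$ behaves the same way) and evaluates $\mu(Z_1=1) = (1-\delta) + \delta\cdot\frac{p-(1-\delta)}{\delta}$, which telescopes to $p$:

```python
# Sanity check of the proposed Z_1 (hypothetical numbers: p and delta below
# are made up; any p > 1 - delta works the same way).

def prob_Z1_equals_1(p, delta):
    """Compute mu(Z_1 = 1) for the construction
       Z_1 = 1 on {Y_1 = 1} and on {Y_1 = 0, V_1 < (p - (1-delta))/delta},
       with U_1, V_1 independent Uniform(0,1) and Y_1 = 1_{U_1 < 1-delta}."""
    threshold = (p - (1 - delta)) / delta   # lies in [0, 1] since p > 1 - delta
    # mu(Y_1 = 1) = 1 - delta; on {Y_1 = 0} (prob. delta), Z_1 = 1 with prob. threshold
    return (1 - delta) + delta * threshold  # algebraically equal to p

print(prob_Z1_equals_1(0.9, 0.3))   # approximately 0.9, up to floating-point rounding
```

So $\mu(Z_1=1)=P(X_1=1)$ whenever $P(X_1=1) \geq 1-\delta$, which holds here because $E[X_1\mid\mathcal{F}_1]>1-\delta$ forces $E[X_1]\geq 1-\delta$.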
My idea is to use the same idea as in the construction of $Z_1$: let $V_2,V_3$ be two random variables which are uniformly distributed on $(0,1)$, independent of each other, of $V_1$, and of $(U_n)_{n\in\mathbb{N}}$. Then I would set \begin{align*} Z_2=1_{\{Y_2=1,Z_1=0\}}+1_{\{Y_2=1,Z_1=1\}}+1_{\{Y_2=0,Z_1=0,V_2<(P(X_1=0,X_2=1)-(1-\delta)P(X_1=0))/(\delta P(X_1=0))\}}+1_{\{Y_2=0,Z_1=1,V_3<(P(X_1=1,X_2=1)-(1-\delta)P(X_1=1))/(\delta P(X_1=1))\}}. \end{align*} By this construction, $(Z_1,Z_2)$ should have the same distribution as $(X_1,X_2)$, if I did not make a mistake. Analogously, I would now define $Z_3,Z_4$ and so on. Ultimately, this leads to a process $(Z_n)_{n\in\mathbb{N}}$ such that for all $n$ we have $(Z_1,\ldots,Z_n)\overset{d}{=}(X_1,\ldots,X_n)$, i.e. $(Z_n)_{n\in\mathbb{N}}\overset{d}{=}(X_n)_{n\in\mathbb{N}}$ by Kolmogorov's extension theorem.

The problem that I have is that I do not know how to make this construction rigorous. In the $n$-th step of the construction we need $2^{n-1}+1$ uniformly distributed random variables: one for the construction of $Y_n$ and $2^{n-1}$ more to complete the construction of $Z_n$. That is, I would define $(\Omega_{n,k},\mathcal{F}_{n,k},P_{n,k}):=((0,1),\mathcal{B}(0,1),\text{Uni}(0,1))$ for all $n\in\mathbb{N}$ and $k=0,\ldots,2^{n-1}$ and set \begin{align*} (\mathcal{X},\mathcal{A},\mu):=\left(\times_{n\in\mathbb{N}}\left(\times_{k=0}^{2^{n-1}}\Omega_{n,k}\right),\otimes_{n\in\mathbb{N}}\left(\otimes_{k=0}^{2^{n-1}}\mathcal{F}_{n,k}\right),\otimes_{n\in\mathbb{N}}\left(\otimes_{k=0}^{2^{n-1}}P_{n,k}\right)\right). \end{align*} Now I am not sure how to carry out the construction of the processes rigorously. If we assume that we have already constructed $Y_1,\ldots,Y_n$ and $Z_1,\ldots,Z_n$, I would define \begin{align*} Y_{n+1}(x)=1_{\{x_{n+1,0}<1-\delta\}}(x), \end{align*} where for $x\in\mathcal{X}$, $x_{m,j}$ denotes the coordinate of $x$ that belongs to $\Omega_{m,j}$.
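For the $Z_2$ step one can also check the claimed identity $\mu(Z_1=z_1,Z_2=1)=\mu(Z_1=z_1)\,\big((1-\delta)+\delta q_{z_1}\big)=P(X_1=z_1,X_2=1)$, using that $Y_2,V_2,V_3$ are independent of $Z_1$. Here is a sketch in exact rational arithmetic, with a hypothetical joint law of $(X_1,X_2)$ (the numbers are made up, chosen so that both conditional probabilities of a $1$ exceed $1-\delta$):

```python
from fractions import Fraction as F

delta = F(1, 4)  # hypothetical delta; 1 - delta = 3/4
# Hypothetical joint law P(X_1 = i, X_2 = j) (made-up numbers summing to 1):
joint = {(1, 1): F(3, 4), (1, 0): F(1, 10), (0, 1): F(1, 8), (0, 0): F(1, 40)}

p1 = joint[(1, 1)] + joint[(1, 0)]   # P(X_1 = 1) = 17/20 > 1 - delta
marg = {1: p1, 0: 1 - p1}            # marginal law of X_1

# Thresholds for V_2 (case Z_1 = 0) and V_3 (case Z_1 = 1) from the question:
#   q_i = (P(X_1 = i, X_2 = 1) - (1-delta) P(X_1 = i)) / (delta P(X_1 = i))
q = {i: (joint[(i, 1)] - (1 - delta) * marg[i]) / (delta * marg[i]) for i in (0, 1)}

# By the Z_1 step, mu(Z_1 = z1) = P(X_1 = z1); since Y_2, V_2, V_3 are
# independent of Z_1,
#   mu(Z_1 = z1, Z_2 = 1) = mu(Z_1 = z1) * ((1 - delta) + delta * q[z1]).
for z1 in (0, 1):
    assert 0 <= q[z1] <= 1           # thresholds are valid probabilities
    assert marg[z1] * ((1 - delta) + delta * q[z1]) == joint[(z1, 1)]
print("mu(Z_1 = z1, Z_2 = 1) = P(X_1 = z1, X_2 = 1) for z1 in {0, 1}")
```

Since the marginal of $Z_1$ already matches that of $X_1$, matching these two joint probabilities pins down the full law of $(Z_1,Z_2)$.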
But now I am not sure how to formally construct $Z_{n+1}$. Does anyone have an idea?
Construction of a coupling of a sequence of Bernoulli random variables
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
Construction.
Let $(U_n)_{n\in\mathbb{N}}$ be a sequence of i.i.d. random variables which are uniformly distributed on $(0, 1)$.
Then, define $p_n : \{0, 1\}^{n-1} \to [0, 1]$ by \begin{align*} p_n(x_{[1:n)}) &= \mathbf{P}(X_n = 1 \mid X_{[1:n)} = x_{[1:n)}) \\ &= \mathbf{E}[X_n \mid X_{[1:n)}=x_{[1:n)}], \end{align*} where $x_{[1:n)} = (x_1, \ldots, x_{n-1})$ and likewise for $X_{[1:n)}$. (Here, $x_{[1:1)}$ is regarded as an empty list $\varnothing$, so that $p_1(\varnothing) = \mathbf{P}(X_1 = 1) = \mathbf{E}[X_1]$.)
Finally, define $Y_n$ and $Z_n$ by \begin{align*} Y_n = \mathbf{1}_{\{U_n \leq 1-\delta\}} \qquad\text{and}\qquad Z_n = \mathbf{1}_{\{ U_n \leq p_n(Z_{[1:n)}) \}}. \end{align*}
Observations. From the above construction, we observe:
- $(Y_n)_{n\in\mathbb{N}}$ is a sequence of i.i.d. $\text{Bernoulli}(1-\delta)$-random variables.
- $Z_n \geq Y_n$ holds pointwise, since $p_n(z_{[1:n)}) \geq 1 - \delta$ for every $z_{[1:n)}$: indeed, $\sigma(X_{[1:n)}) \subseteq \mathcal{F}_n$, so the tower property and $\mathbf{E}[X_n \mid \mathcal{F}_n] > 1-\delta$ give $p_n(X_{[1:n)}) = \mathbf{E}\big[\mathbf{E}[X_n \mid \mathcal{F}_n] \,\big|\, X_{[1:n)}\big] \geq 1-\delta$ almost surely.
- For any $n \in \mathbb{N}$, note that $Z_{[1:n)}$ is a deterministic function of $U_{[1:n)}$; in particular, $Z_{[1:n)}$ is independent of $U_n$. Then, for any $z_{[1:n)} \in \{0, 1\}^{n-1}$, we have \begin{align*} \mathbf{P}(Z_n = 1 \mid Z_{[1:n)} = z_{[1:n)}) &= \mathbf{P}(U_n \leq p_n(Z_{[1:n)}) \mid Z_{[1:n)} = z_{[1:n)}) \\ &= p_n(z_{[1:n)}) \\ &= \mathbf{P}(X_n = 1 \mid X_{[1:n)} = z_{[1:n)}). \end{align*} So by induction, $(Z_1, \ldots, Z_n) \stackrel{d}= (X_1, \ldots, X_n)$ holds for all $n \in \mathbb{N}$, which in turn implies that $(Z_n)_{n\in\mathbb{N}} \stackrel{d}= (X_n)_{n\in\mathbb{N}}$.
Conclusion. The sequences $(Z_n)_{n\in\mathbb{N}}$ and $(Y_n)_{n\in\mathbb{N}}$ constructed above satisfy all the desired properties.
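As an illustration (not part of the proof), the recursion above is easy to simulate. The sketch below uses a made-up Markov example for the law of $(X_n)$: $p_1(\varnothing)=0.85$, and $p_n$ depends only on the previous bit ($0.9$ after a $1$, $0.8$ after a $0$), with $\delta=0.25$, so $p_n > 1-\delta = 0.75$ for every history; all of these numbers are hypothetical.

```python
import random

DELTA = 0.25  # hypothetical delta; 1 - DELTA = 0.75

def p_n(history):
    """Hypothetical conditional law P(X_n = 1 | X_{[1:n)} = history):
    a Markov example in which only the last bit matters."""
    if not history:                 # p_1(empty) = P(X_1 = 1)
        return 0.85
    return 0.9 if history[-1] == 1 else 0.8

def couple(n_steps, rng):
    """One path of (Y_1..Y_n, Z_1..Z_n) driven by a single stream (U_n)."""
    ys, zs = [], []
    for _ in range(n_steps):
        u = rng.random()                        # U_n ~ Uniform(0, 1)
        ys.append(1 if u <= 1 - DELTA else 0)   # Y_n = 1_{U_n <= 1 - delta}
        zs.append(1 if u <= p_n(zs) else 0)     # Z_n = 1_{U_n <= p_n(Z_{[1:n)})}
    return ys, zs

rng = random.Random(0)
paths = [couple(50, rng) for _ in range(2000)]

# Domination holds pathwise, because p_n(...) >= 1 - DELTA for every history.
assert all(z >= y for ys, zs in paths for y, z in zip(ys, zs))

# Empirical frequency of Y_n = 1 over all 100000 draws should be near 0.75.
freq_y = sum(y for ys, _ in paths for y in ys) / (2000 * 50)
assert abs(freq_y - 0.75) < 0.02
print("Z_n >= Y_n on every path; empirical P(Y_n = 1) =", round(freq_y, 3))
```

The point of the construction is visible in `couple`: the same driving variable $U_n$ generates both $Y_n$ and $Z_n$, which is exactly what makes the monotone coupling $Z_n \geq Y_n$ automatic.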