Consider an ergodic process $Y_1^n$ that depends on an i.i.d. process $X_1^n$. We can estimate the entropy rate $\overline{H}(Y) = \lim_{n \rightarrow \infty} (1/n)H(Y_1^n)$ using the Shannon-McMillan-Breiman theorem, which says $g_n = -(1/n) \log P(Y_1^n) \rightarrow \overline{H}(Y)$ almost surely. Instead, consider estimating this entropy rate using the conditional probability $Q(Y_1^n) = \mathbb{P}(Y_1^n \mid X_1^n \in A_{\epsilon}^{(n)})$, where $A_{\epsilon}^{(n)}$ is the $\epsilon$-typical set of $X_1^n$, with $\mathbb{P}(X_1^n \in A_{\epsilon}^{(n)}) > 1-\epsilon$. That is, we want $\tilde{g}_n = -(1/n)\log Q(Y_1^n)$ to be asymptotically close to $g_n$ with high probability. How can we prove this with minimal assumptions about $Y_1^n$? It seems likely that bounding the divergence $D(P\|Q)$ would be useful, but I'm not sure how to do this. Any help would be greatly appreciated.
Estimating entropy rates conditioned on typical sets
Asked by rey_pato (https://math.techqa.club/user/rey-pato/detail) on 2025-01-13 · 42 views
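To make the setup concrete, here is a small numerical sketch of the unconditional estimator $g_n = -(1/n)\log_2 P(Y_1^n)$ in the simplest case where $Y_1^n$ is itself i.i.d. Bernoulli($p$), so the Shannon-McMillan-Breiman theorem reduces to the law of large numbers and the entropy rate is just $H(p)$. The parameter $p$ and sample size are hypothetical choices for illustration, not part of the question's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

p = 0.3          # Bernoulli parameter (hypothetical example process)
n = 100_000
y = rng.random(n) < p  # i.i.d. sample path Y_1^n

# For an i.i.d. process, log2 P(Y_1^n) factorizes, so
# g_n = -(1/n) log2 P(Y_1^n) is the sample mean of -log2 P(Y_i).
logp = np.where(y, np.log2(p), np.log2(1 - p))
g_n = -logp.mean()

# True entropy rate: for an i.i.d. process it equals H(p).
H = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

print(f"g_n = {g_n:.4f} bits, H(p) = {H:.4f} bits")
```

With $n = 10^5$ the estimate lands within a few hundredths of a bit of $H(p) \approx 0.881$; the conditional estimator $\tilde{g}_n$ asked about above would replace $P$ with $Q$, and the question is how far that substitution can move this limit.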