Computing the limit of an entropy with an additive term that goes to zero?

I have some confusion about a limiting entropy rate problem I am working on, and I don't know how to proceed.

Let's say I have some discrete random process such that:

$$\lim_{n\rightarrow \infty}\frac{1}{f(n)}H(X_n) = 1,$$

where $H(\cdot)$ is the Shannon entropy. Here $f(n)$ is a relatively simple function, something like $\sqrt{n}\log(n)$. Now suppose I also have $Y_n \rightarrow 0$ in probability (i.e., $\lim_{n\rightarrow \infty}\Pr(Y_n = 0) = 1$). Can I then say the following?

$$\lim_{n\rightarrow \infty}\frac{1}{f(n)}H(X_n + Y_n) = 1 \;\;(?)$$

My intuition tells me that the statement above should be correct, but I'm not sure how to prove it.
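Here is my own partial attempt (I'm not sure it leads anywhere). Since $X_n + Y_n$ is a function of $(X_n, Y_n)$, and $X_n = (X_n + Y_n) - Y_n$ is a function of $(X_n + Y_n, Y_n)$, subadditivity gives

$$H(X_n + Y_n) \le H(X_n, Y_n) \le H(X_n) + H(Y_n),$$

$$H(X_n) \le H(X_n + Y_n, Y_n) \le H(X_n + Y_n) + H(Y_n),$$

so that

$$\left|H(X_n + Y_n) - H(X_n)\right| \le H(Y_n),$$

and the claim would follow whenever $H(Y_n)/f(n) \rightarrow 0$. What worries me is that $\Pr(Y_n = 0) \rightarrow 1$ alone doesn't seem to bound $H(Y_n)$: for instance, if $Y_n$ is $0$ with probability $1 - 1/n$ and uniform on a set of $2^{n^2}$ nonzero values otherwise, then $H(Y_n) \ge \frac{1}{n} \cdot n^2 \log 2$, which grows much faster than $\sqrt{n}\log(n)$. So perhaps some extra assumption on $Y_n$ is needed?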