Source coding with 2 distinct distributions and entropies


I'm learning about source coding, and many of the books/resources I've read give examples where the source $X^n$ is a sequence of iid random variables. What about when the sequence is independent but drawn from $2$ distinct distributions (e.g. $P_X$ when $i$ is odd, and $P_{X'}$ when $i$ is even), with $2$ distinct entropies ($H(X)$ and $H(X')$)?

Where can I find resources that talk about such problems, including how to define the typical set in such cases?

One more question - when I'm asked to "provide a coding scheme" or "design a coding scheme", what exactly am I expected to give?

ETA: To clarify, many of the books I've read on source coding talk about the discrete memoryless source $X^n$ as a sequence of iid random variables, with a single distribution and a single entropy.

What about when the source is made up of a sequence of random variables that are independent, but not identically distributed? For example, $P(X_i = x) = P_X(x)$ when $i$ is odd and $P(X_i = x) = P_{X'}(x)$ when $i$ is even, with $2$ distinct entropies ($H(X)$ and $H(X')$).

How do we define the typical set then? How does this differ from the case where $X^n$ is a sequence of iid random variables? Are there any online resources where I can learn more about this?
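For what it's worth, a quick simulation suggests what the typical set should look like in this setting: for an independent (but not identically distributed) alternating source, $-\frac{1}{n}\log_2 p(X^n)$ concentrates around the *average* of the two entropies, $\frac{1}{2}\bigl(H(X) + H(X')\bigr)$, so one would expect the typical set to be defined relative to that average. Here is a minimal sketch I put together, assuming two made-up binary distributions (the specific alphabets and probabilities are just illustrative, not from any particular textbook):

```python
import math
import random

# Hypothetical alternating source: P_odd at odd positions, P_even at even
# positions (assuming 1-based indexing, as in the question).
P_odd  = {"a": 0.5, "b": 0.5}   # H(X)  = 1 bit
P_even = {"a": 0.9, "b": 0.1}   # H(X') ~ 0.469 bits

def entropy(p):
    """Shannon entropy in bits of a distribution given as {symbol: prob}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def sample(p):
    """Draw one symbol from the distribution p by inverse-CDF sampling."""
    r, acc = random.random(), 0.0
    for sym, q in p.items():
        acc += q
        if r < acc:
            return sym
    return sym  # guard against floating-point round-off

random.seed(0)
n = 200_000
total = 0.0
for i in range(1, n + 1):
    p = P_odd if i % 2 == 1 else P_even
    x = sample(p)
    total += -math.log2(p[x])     # accumulate -log2 p(x_i)

rate = total / n                  # empirical -(1/n) log2 p(x^n)
avg_H = 0.5 * (entropy(P_odd) + entropy(P_even))
print(f"empirical rate: {rate:.4f}  average entropy: {avg_H:.4f}")
```

Running this, the empirical rate lands close to the average entropy, which matches the AEP intuition for independent, non-identically distributed sequences: the per-symbol log-likelihood averages the per-position entropies.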