If I multiply a conditional probability matrix with a Markov matrix on the left, in what sense is the resulting signal more noisy?


Let me denote the conditional probabilities of a signal by a matrix $P$, where $P_{ij}$ denotes the probability of realization $j$ conditional on state $i$.

I know that if $\tilde P = P M$ for some Markov (row-stochastic) matrix $M$, then $P$ is Blackwell more informative than $\tilde P$, because you can obtain $\tilde P$ from $P$ simply by garbling the messages.
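As a quick numerical sanity check (the matrices below are made-up examples, not from any particular source), right-multiplying by a row-stochastic $M$ mixes the *columns* of $P$, and each row of $PM$ is still a valid conditional distribution over messages:

```python
import numpy as np

# Hypothetical 2-state, 2-message signal: row i gives P(message j | state i).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Row-stochastic (Markov) matrix garbling the messages:
# draw message j from P, then re-draw a final message from row j of M.
M = np.array([[0.7, 0.3],
              [0.4, 0.6]])

P_tilde = P @ M  # the garbled signal

print(P_tilde)
print(P_tilde.sum(axis=1))  # each row still sums to 1
```

Because $M$ acts only on the messages, every decision an agent could make with $\tilde P$ can be replicated by observing $P$ and garbling privately, which is exactly the Blackwell ordering.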

But what if I multiply from the other side, i.e. $\tilde P = M P$? Intuitively, $\tilde P$ is also noisier than $P$: if you think of a signal as a "random messaging machine" that takes the state as input and spits out a message, then you can generate $\tilde P$ from $P$ by first garbling the true state and then feeding the garbled state into the machine. But is there a formal property that captures the sense in which $\tilde P$ is "noisier" than $P$?
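To make the contrast concrete (again with made-up example matrices), left-multiplying mixes the *rows* of $P$: row $i$ of $MP$ is the convex combination $\sum_k M_{ik} P_{k\cdot}$, i.e. the message distribution you would get if the machine were fed state $k$ drawn from row $i$ of $M$ instead of the true state $i$:

```python
import numpy as np

# Same hypothetical 2-state, 2-message signal as before.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Row-stochastic matrix M, now garbling the *state* before it enters the signal.
M = np.array([[0.7, 0.3],
              [0.4, 0.6]])

P_tilde = M @ P  # state garbling: rows of M @ P are convex mixes of rows of P

# Row 0 of P_tilde equals 0.7 * P[0] + 0.3 * P[1]:
# the message is drawn as if the machine had received a garbled state.
print(P_tilde)
```

Note that this garbling happens "upstream" of the signal, so unlike the $PM$ case it is not something the receiver could replicate on their own from $P$'s messages, which is why it does not immediately fit the standard Blackwell argument.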