I am trying to reconstruct a Markov process from Shannon's paper "A Mathematical Theory of Communication". My question concerns Figure 3 on page 8 and the corresponding example sequence (message) from series (B) on page 5. I used R to create a Markov chain, but only the transition matrix below matters here (I found no answer on Stack Overflow). I just want to check whether my transition matrix matches the figure in Shannon's paper (I am new to this). I cannot post an image of the figure, so here is a link to Shannon's paper: https://people.math.harvard.edu/~ctm/home/text/others/shannon/entropy/entropy.pdf
Here is my transition matrix:
MessageABCDE = c("A", "B", "C", "D", "E")  # state labels

MessageTransitionMatrix = matrix(c(.4, .1, .2, .2, .1,
                                   .4, .1, .2, .2, .1,
                                   .4, .1, .2, .2, .1,
                                   .4, .1, .2, .2, .1,
                                   .4, .1, .2, .2, .1),
                                 nrow = 5,
                                 byrow = TRUE,
                                 dimnames = list(MessageABCDE, MessageABCDE))
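For what it's worth, here is the sanity check I ran on the matrix (a self-contained base-R sketch that redefines the matrix under its own names, so it runs on its own): every row should sum to 1, and because all rows are identical, multiplying any starting distribution by the matrix should give back exactly Shannon's series (B) probabilities .4, .1, .2, .2, .1.

```r
# Shannon's series (B) probabilities for A, B, C, D, E
p <- c(A = .4, B = .1, C = .2, D = .2, E = .1)

# Transition matrix with every row equal to p
P <- matrix(rep(p, 5), nrow = 5, byrow = TRUE,
            dimnames = list(names(p), names(p)))

# Each row must be a probability distribution
stopifnot(all(abs(rowSums(P) - 1) < 1e-12))

# With identical rows, any start distribution maps straight onto p,
# so p is the stationary distribution of the chain:
pi0 <- rep(1/5, 5)                                  # arbitrary start
stopifnot(all(abs(drop(pi0 %*% P) - p) < 1e-12))
```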
My goal was also to create a sequence (message) from that chain. I am mostly uncertain about the transition matrix, where I inferred that the probabilities had to be the same in every row to represent independence. The sequence itself (generated with the markovchain package) turned out OK, I guess:
> markovchainSequence(n = 20, markovchain = MCmessage, t0 = "A")
[1] "D" "E" "A" "D" "A" "A" "B" "D" "E" "C" "A" "A" "E" "C" "C" "D" "D" "D"
[19] "A" "C"
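Side note on the independence point above: because every row of the matrix is identical, the next symbol never depends on the current one, so (if I understand this right) the same message distribution can be produced with a plain `sample()` call in base R. Here is a sketch that simulates a long message this way and compares the empirical symbol frequencies against .4/.1/.2/.2/.1:

```r
set.seed(1)  # for reproducibility
p <- c(A = .4, B = .1, C = .2, D = .2, E = .1)

# Identical rows mean the chain ignores its current state,
# so independent draws give the same process:
msg <- sample(names(p), size = 1e5, replace = TRUE, prob = p)

freq <- table(msg) / length(msg)
print(round(freq, 3))  # should be close to .4 .1 .2 .2 .1
```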