Say we have a recursive encoder, and we can write the codewords $x^j$ in terms of the input bits $u^i$ and the binary delay operator $D$ as
$x^1 = \frac{D}{1-D^3}u^1 + \frac{D^2}{1-D^3}u^2$.
Maybe this is really obvious, but I'm confused by this delay operator, which we haven't justified in any rigorous sense. How does a delay operator in the denominator act on the input bits? I don't see how it's even well defined; for instance, what if $D^3 = 1$?
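To make the question concrete, here is a sketch of the two interpretations I can imagine, assuming all arithmetic is over $\mathrm{GF}(2)$ (the function names are my own): either read $(1-D^3)\,x^1 = D\,u^1 + D^2\,u^2$ as a feedback recursion $x^1_t = u^1_{t-1} \oplus u^2_{t-2} \oplus x^1_{t-3}$, or expand $\frac{1}{1-D^3}$ as the formal power series $1 + D^3 + D^6 + \cdots$ and apply each delay term directly. Are these the same thing?

```python
def encode_feedback(u1, u2):
    """Read 1/(1-D^3) as feedback: x_t = u1_{t-1} + u2_{t-2} + x_{t-3} (mod 2)."""
    n = len(u1)
    x = [0] * n
    for t in range(n):
        a = u1[t - 1] if t >= 1 else 0  # D u^1
        b = u2[t - 2] if t >= 2 else 0  # D^2 u^2
        c = x[t - 3] if t >= 3 else 0   # feedback tap from D^3 x^1
        x[t] = a ^ b ^ c
    return x

def encode_series(u1, u2):
    """Read 1/(1-D^3) as the formal series 1 + D^3 + D^6 + ...,
    so x^1 = sum_k (D^{3k+1} u^1 + D^{3k+2} u^2)."""
    n = len(u1)
    x = [0] * n
    for t in range(n):
        s, k = 0, 0
        while 3 * k + 1 <= t:
            s ^= u1[t - (3 * k + 1)]
            if 3 * k + 2 <= t:
                s ^= u2[t - (3 * k + 2)]
            k += 1
        x[t] = s
    return x

# Both give the impulse response D + D^4 + D^7 + ... for u^1 = (1,0,0,...):
u1, u2 = [1, 0, 0, 0, 0, 0, 0, 0], [0] * 8
print(encode_feedback(u1, u2))  # → [0, 1, 0, 0, 1, 0, 0, 1]
print(encode_feedback(u1, u2) == encode_series(u1, u2))  # → True
```

If that equivalence is the intended justification, i.e. the denominator is shorthand for a (causal, infinite) power series in $D$, then it would also explain why $D^3 = 1$ isn't a worry: $D$ is a formal indeterminate acting on zero-initialized streams, not a number.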