Let's say I have two functions, $f(x)$ and $g(x)$. For simplicity let's say both functions are linear.
Let $C_m(x)$ be a composition of $m$ functions, each chosen uniformly at random from $\{f, g\}$ (so $C_m(x)$ is a random variable ranging over a space of $2^m$ possible functions).
Is there any theory which can help predict $E[C_m(1)]$?
EXAMPLE
If $m=2$, then $C_2(x)$ can be any of $4$ possible functions, namely $f\circ g$, $f\circ f$, $g\circ f$, $g\circ g$.
Supposing $f(x)=\frac{x}{2}$ and $g(x)=3x+1$, we can compute each composition: the distribution of $C_2(x)$ is $\{0.25x,\; 1.5x+1,\; 1.5x+0.5,\; 9x+4\}$, from which we find $E[C_2(1)] = \frac{0.25 + 2.5 + 2 + 13}{4} = 4.4375$.
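As a quick sanity check, here is a brute-force sketch that enumerates all $2^m$ compositions of the example maps and averages their value at $x$:

```python
from itertools import product

# The two example maps from the question.
f = lambda x: x / 2
g = lambda x: 3 * x + 1

def expected_C(m, x):
    """Average of all 2^m length-m compositions of f and g, evaluated at x."""
    total = 0.0
    for choice in product([f, g], repeat=m):
        y = x
        for h in choice:  # apply the chosen maps in sequence
            y = h(y)
        total += y
    return total / 2 ** m

print(expected_C(2, 1))  # 4.4375
```

This is exponential in $m$, of course, which is what makes the recursive solution below the example worth having.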
Even harder, how can we predict, say,
$$E\left[\frac{b}{1-a}\right], \quad\text{where}\quad a = \frac{C_m(1)-C_m(-1)}{2}, \quad b = \frac{C_m(1)+C_m(-1)}{2}\,?$$
(One will note that this is the expected value of the fixed point $x$ satisfying $C_m(x)=x$: since $f$ and $g$ are both linear, so is $C_m$, i.e. $C_m(x)=ax+b$ with $a$ and $b$ as above, and the fixed point is $x=\frac{b}{1-a}$ whenever $a\neq 1$.)
Here is a simple solution to the first part of the problem.
Note that for any $x \in \mathbb{R}$ the sequence of random variables $C_1(x), C_2(x), C_3(x), \dots$ is a Markov chain: $C_m(x)$ is obtained from $C_{m-1}(x)$ by applying $f$ or $g$, each with probability $\frac{1}{2}$, independently of the past. Because $f$ and $g$ are affine, expectation passes through them ($\mathbb{E}[f(Y)] = f(\mathbb{E}[Y])$), so the expectation can be computed recursively: $$\mathbb{E}[C_m(x)] = \mathbb{E}[C_1(\mathbb{E}[C_{m-1}(x)])] = \frac{f(\mathbb{E}[C_{m-1}(x)])+g(\mathbb{E}[C_{m-1}(x)])}{2}.$$
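The recursion is linear in $m$ rather than exponential. A minimal sketch with the question's example maps, which reproduces $E[C_2(1)] = 4.4375$:

```python
f = lambda x: x / 2
g = lambda x: 3 * x + 1

def expected_recursive(m, x):
    """E[C_m(x)] via the recursion: feed the previous expectation through
    the average of f and g (valid because f and g are affine)."""
    e = x
    for _ in range(m):
        e = (f(e) + g(e)) / 2
    return e

print(expected_recursive(2, 1))  # 4.4375
```

For the example, the intermediate step is $\mathbb{E}[C_1(1)] = \frac{0.5 + 4}{2} = 2.25$, and then $\frac{f(2.25)+g(2.25)}{2} = \frac{1.125 + 7.75}{2} = 4.4375$.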