The Iterated Function System at node $i$ is a discrete time Markov chain on the state space ${\cal S}_i=\mathbb{R}^d$.
The chain is specified by an integer $m$, a collection of maps $f_j^{(i)}: S_i \rightarrow S_i$, $j=1,\dots,m$, and probability functions $\{p_j^{(i)}: S_i \rightarrow [0,1]\}$ with $\sum_{j=1}^{m} p_j^{(i)}(x) = 1$ for all $x \in S_i$.
Could anyone explain the above statements to me with an example? I don't understand them at all. Thanks.
Given all this data you can define a Markov chain with state space $\{1,2,\dots,m\}\times\mathbb R^d$, as follows. If the current state is $(i,x)$, you move to state $(j,f_j^{(i)}(x))$ with probability $p_j^{(i)}(x).$
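To make the transition rule concrete, here is a small simulation sketch. It uses a single node (so I drop the superscript $(i)$) with $d = 1$ and $m = 2$; the specific maps and probability functions are made up for illustration, not taken from your text.

```python
import random

# Hypothetical example: one node, state space R (d = 1), m = 2 maps.
# These affine contractions and the probability rule are illustrative only.
f = [
    lambda x: 0.5 * x,        # f_1: contract toward 0
    lambda x: 0.5 * x + 0.5,  # f_2: contract toward 1
]

def p(x):
    """State-dependent probabilities (p_1(x), p_2(x)); they sum to 1
    for every x. Here, the closer x is to 1, the likelier we apply f_1."""
    p1 = min(max(x, 0.0), 1.0)
    return [p1, 1.0 - p1]

def step(x, rng=random):
    """One transition of the IFS Markov chain: draw an index j with
    probability p_j(x), then move to the new state f_j(x)."""
    j = rng.choices(range(len(f)), weights=p(x))[0]
    return f[j](x)

def simulate(x0, n, seed=0):
    """Run n transitions from initial state x0 with a fixed seed."""
    rng = random.Random(seed)
    x = x0
    for _ in range(n):
        x = step(x, rng)
    return x
```

Since both maps send $[0,1]$ into itself, a trajectory started in $[0,1]$ stays there; each step first randomizes *which* map to apply (with state-dependent weights) and then applies it deterministically, which is exactly the two-stage transition described above.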
But there is some fishiness in the statement's notation and terminology: the notation $S_i$ seems a little odd, so I am not 100% sure my construction is what your author had in mind. Where did this come up?