So, I asked a question earlier about an interesting Markov chain that I found, and I learnt that when $k\leq\frac12$ we start getting an ordinary (absolutely continuous) distribution on a measurable set.
But I was wondering: what exactly is this steady-state distribution? My current way of thinking about it is that I start with the uniform distribution over $(0,1)$ and then keep running the following intuitive recipe:
- At a given point $x$, take an infinitesimal slice from $x$ to $x+\delta x$ and transport a fraction $p$ of its mass to the interval from $k+(1-k)x$ to $k+(1-k)(x+\delta x)$, and the remaining fraction $1-p$ to the interval from $(1-k)x$ to $(1-k)(x+\delta x)$.
- Repeat indefinitely.
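For concreteness, here is a minimal Monte Carlo sketch of this recipe, assuming the particle-level update $x \mapsto k+(1-k)x$ with probability $p$ and $x \mapsto (1-k)x$ otherwise (the function name and parameter defaults are my own choices):

```python
import random

def simulate_chain(k=0.3, p=0.5, steps=200, n_samples=10000, seed=0):
    """Sketch of the recipe above: each particle starts uniform on (0,1);
    at every step it jumps to k + (1-k)*x with probability p,
    and to (1-k)*x otherwise."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        x = rng.random()  # uniform start on (0, 1)
        for _ in range(steps):
            if rng.random() < p:
                x = k + (1 - k) * x
            else:
                x = (1 - k) * x
        samples.append(x)
    return samples

samples = simulate_chain()
print(sum(samples) / len(samples))  # empirical mean, close to p
```

A histogram of `samples` for small `k` is the kind of Beta-looking picture described below.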
I don't really have the analysis tools to think about what this does in various scenarios. I'm particularly interested in what happens when $k$ is pretty small, since my simulations show something that looks like a Beta distribution. There appears to be a lot of complex behaviour hiding in this idea, and I would like to know more about it, if people can explain it or point to where it's been discussed before.
Thanks again for all your help!
So, I think a rigorous answer drops out of SmallDeviation's answer and John Dawkin's comment beneath it. A good way to define the Markov chain in question is via the step $X_{t+1}=(1-k)X_t+k\epsilon_{t+1}$, which unrolls to:
$$X_t=(1-k)^t X_0+k\sum_{j=1}^{t} (1-k)^{t-j}\epsilon_{j},$$
where the $\epsilon_j$ are i.i.d. Bernoulli variables with mean $p$, and the factor $k$ is the jump size of the step. Since the $\epsilon_j$ are i.i.d., the sum has the same distribution as $k\sum_{j=1}^{t}(1-k)^{j-1}\epsilon_j$, so as $t\to \infty$ we have, in distribution:
$$X_\infty = k\sum_{j=1}^\infty (1-k)^{j-1}\epsilon_j$$
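The limit can be sampled directly by truncating this geometric series; up to an overall affine rescaling (which does not affect questions of singularity vs. absolute continuity), a sketch looks like this (function name and truncation depth are my own choices):

```python
import random

def sample_limit(k=0.3, p=0.5, terms=60, rng=None):
    """Sample the limiting series by truncation.  The neglected tail
    is bounded by (1-k)**terms, which is negligible here.  The overall
    scale factor is immaterial for singularity vs. absolute continuity."""
    rng = rng or random.Random()
    return k * sum((1 - k) ** j * (rng.random() < p)
                   for j in range(terms))

rng = random.Random(1)
draws = [sample_limit(rng=rng) for _ in range(10000)]
print(min(draws), max(draws))  # the support lies inside [0, 1]
```

Histograms of `draws` and of the iterated-chain samples should agree, which is one way to sanity-check the series representation numerically.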
This general form, with $\mathbb{E}[\epsilon_j]=1/2$ and $\lambda=1-k$, is (up to an affine change of variables) called a Bernoulli convolution, and it is a well-studied object.
Theorem 1 (Shmerkin). The set $\{ k \in (0, 1/2) : X_\infty\text{ is singular}\}$ has Hausdorff dimension $0$.
Theorem 2 (Solomyak). The Bernoulli convolution $X_\infty$ is absolutely continuous with a density in $L^2$ for almost every $k\in(0,1/2)$.
This turns out to be a rich subject of research in its own right, so asking whether the limit is exactly a normal or a Beta distribution is probably the wrong question, even though that is what it looks like in simulations.
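For what it's worth, the first two moments of the stationary law follow exactly from the distributional fixed point $X \stackrel{d}{=} (1-k)X + k\epsilon$: taking expectations gives $\mathbb{E}[X]=p$, and taking variances gives $\operatorname{Var}(X)=(1-k)^2\operatorname{Var}(X)+k^2p(1-p)$, i.e. $\operatorname{Var}(X)=kp(1-p)/(2-k)$. A sketch of matching a Beta distribution to those moments (the Beta comparison is only heuristic, per the theorems above; the function name is my own):

```python
def beta_moment_match(k, p):
    """Fit Beta(a, b) to the exact stationary moments:
        E[X]   = p
        Var[X] = k*p*(1-p) / (2 - k)
    using mean = a/(a+b) and var = mean*(1-mean)/(a+b+1)."""
    mean = p
    var = k * p * (1 - p) / (2 - k)
    s = mean * (1 - mean) / var - 1   # a + b
    return mean * s, (1 - mean) * s   # a, b

a, b = beta_moment_match(k=0.1, p=0.5)
print(a, b)  # both close to 9 for these parameters
```

For small $k$ the matched shape parameters are large, which is consistent with the smooth, Beta-looking (indeed nearly normal) histograms in that regime, even though the true law is not exactly Beta.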