A conjecture in need of a proof
Let $P$ be a probability distribution (satisfying countable additivity) with outcome space $\mathbb{N}=\{1,2,3,\ldots\}$. Let $N:\mathbb{N}\to\mathbb{N}$ be a random variable. For each infinite sequence of independent outcomes from $P$, $\alpha=(x_1,x_2,x_3,\ldots)$, let another infinite sequence $\alpha'$ be formed as follows: the first $N(x_1)$ terms are each the number $N(x_1)$, the next $N(x_2)$ terms are each the number $N(x_2)$, the next $N(x_3)$ terms are each the number $N(x_3)$, and so on. For each $n\in\mathbb{N}$, let $\alpha'_n$ be the initial segment of $\alpha'$ consisting of the "contributions" from $x_1,x_2,\ldots,x_n$.
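For concreteness, the construction of $\alpha'_n$ can be sketched in code (the function names are mine, chosen for illustration):

```python
from typing import Callable, Sequence

def alpha_prime_n(x: Sequence[int], N: Callable[[int], int], n: int) -> list:
    """Initial segment of alpha': each outcome x_i contributes a block
    of N(x_i) copies of the number N(x_i), for i = 1, ..., n."""
    segment = []
    for xi in x[:n]:
        segment.extend([N(xi)] * N(xi))
    return segment

def even_frequency(seq: Sequence[int]) -> float:
    """Fraction of even terms in a nonempty finite sequence."""
    return sum(1 for t in seq if t % 2 == 0) / len(seq)

# With N(n) = 2^n, the outcomes (1, 2) contribute blocks [2, 2] and [4, 4, 4, 4]:
print(alpha_prime_n([1, 2], lambda k: 2 ** k, 2))  # → [2, 2, 4, 4, 4, 4]
```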
Conjecture: There exist $P$ and $N$ such that the probability is 1 that the frequency of even numbers in $\alpha'_n$ does not have a limit for $n$ tending to infinity.
Why I think the conjecture is true
Consider first the $P$ and $N$ such that $P(n)=2^{-n}$ and $N(n)=2^n$. This choice means that if $n$ is sufficiently large, then an initial segment, $\alpha'_n$, can easily contain as many (or more) instances of a large number, say $2^{100}$, as it contains instances of a small number like $2^1$. This is because, while instances of $2^1$ have a high probability of showing up and instances of $2^{100}$ do not, when instances of $2^{100}$ do show up, lots of them do at once.
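The balance can also be seen in expectation: a single trial contributes $N(n)=2^n$ copies of $2^n$ with probability $2^{-n}$, so each value of $n$ contributes the same expected number of terms per trial, namely $2^{-n}\cdot 2^n=1$. A quick Monte Carlo sketch (illustration only, not part of the argument):

```python
import random

random.seed(0)
trials = 100_000
counts: dict = {}  # value N(x) -> total number of terms it contributes to alpha'
for _ in range(trials):
    x = 1
    while random.random() >= 0.5:  # geometric sampling: P(x = n) = 2^{-n}
        x += 1
    counts[2 ** x] = counts.get(2 ** x, 0) + 2 ** x  # block of 2^x copies of 2^x

# Every value of x contributes about `trials` terms in expectation,
# so rare large numbers carry as much total weight as common small ones.
for value in sorted(counts):
    print(value, counts[value])
```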
Now modify $P$ in stages. First, pick some $m_1\in\mathbb{N}$ (say, 100), set $P(n)=0$ for every odd $n$ smaller than $m_1$, and renormalize the probabilities of the remaining values. This makes it very likely that initial segments $\alpha'_n$ for relatively small $n$ have a frequency of even numbers close to 1. Next, pick some $m_2>m_1$, set $P(n)=0$ for every even $n$ larger than $m_1$ and smaller than $m_2$, and renormalize again. If $m_2$ is chosen large enough, this should make it very likely that $\alpha'_n$ for $n$ of an appropriate size (neither too small nor too large) has a frequency of even numbers close to 0. Then pick some $m_3>m_2$, set $P(n)=0$ for every odd $n$ larger than $m_2$ and smaller than $m_3$, and renormalize again; if $m_3$ is large enough, $\alpha'_n$ for $n$ of an appropriate still larger size is very likely to have a frequency of even numbers close to 1. Continue in this way, alternating parities.
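The staged construction can be expressed as a single mask over the base weights $2^{-n}$. The sketch below is my own encoding of the stages, with made-up cutoffs; it renormalizes once at the end rather than after each stage, which yields the same final distribution:

```python
def modified_P(ms: list, n_max: int) -> dict:
    """Probabilities proportional to 2^{-n} on {1, ..., n_max}, with
    alternating parities zeroed out between the cutoffs m_1 < m_2 < ...:
    odd n below m_1, even n strictly between m_1 and m_2, and so on."""
    bounds = [0] + list(ms)
    weights = {}
    for n in range(1, n_max + 1):
        w = 2.0 ** -n
        for i in range(len(ms)):
            # Stage i+1 kills odd numbers when i is even, even numbers when i is odd.
            if bounds[i] < n < bounds[i + 1] and n % 2 == (i + 1) % 2:
                w = 0.0
        weights[n] = w
    total = sum(weights.values())
    return {n: w / total for n, w in weights.items()}

P = modified_P([4, 8], 10)        # cutoffs m_1 = 4, m_2 = 8 (made up)
assert P[1] == P[3] == P[6] == 0  # odd below 4 and even in (4, 8) are gone
```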
What I need help with
I suspect that I would be able to develop these considerations into a proof of the conjecture. However, it would probably become a long and inelegant one. I also suspect that there is a much more elegant way of proving the conjecture - perhaps it can even be made to drop out of some theorem in the literature in a few easy steps. But I don't think I will be able to figure that out myself, and am therefore asking for help.
I think your distribution should work, but exponential growth is probably too slow to prove it easily.
The idea is to make $N$ grow so fast that, infinitely often, a single block is longer than the entire sequence before it. Also, by Kolmogorov's zero-one law, the probability that the limit exists is either 0 or 1, so it is enough to show that the probability of non-existence is greater than 0.
Consider a Bernoulli scheme with three outcomes: one with probability $p$, a second with probability $p\cdot q$, and a third with probability $1 - p - p\cdot q$. There is a monotone function $q(\varepsilon) > 0$ such that for any $p > 0$, the probability that the outcome with probability $p$ occurs before the outcome with probability $p\cdot q(\varepsilon)$ is at least $1 - \varepsilon$. (Indeed, $q(\varepsilon) = \varepsilon/(1-\varepsilon)$ works: the first of these two outcomes to occur is the one with probability $p$ with probability $\frac{p}{p + pq} = \frac{1}{1+q}$, independently of $p$.)
Let's start with a rapidly decreasing distribution $P$. First define an unnormalized version: $P'(1) = 1$, $P'(n + 1) = q(2^{-10-n})\, P'(n)$. Let $P$ be the normalization of $P'$. Then the probability that $1$ first occurs before $2$ is at least $1 - 2^{-11}$, the probability that $2$ first occurs before $3$ is at least $1 - 2^{-12}$, and so on.
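These pairwise bounds can be checked numerically with the explicit choice $q(\varepsilon)=\varepsilon/(1-\varepsilon)$ (one valid choice; the answer only needs some such $q$):

```python
def q(eps: float) -> float:
    # With this choice, the p-outcome precedes the (p*q)-outcome
    # with probability 1/(1 + q) = 1 - eps, independently of p.
    return eps / (1 - eps)

def unnormalized_P(n_max: int) -> list:
    """P'(1) = 1, P'(n+1) = q(2^{-10-n}) * P'(n)."""
    w = [1.0]
    for n in range(1, n_max):
        w.append(q(2.0 ** (-10 - n)) * w[-1])
    return w

Pp = unnormalized_P(5)
# Which of n and n+1 first occurs depends only on the ratio of their weights:
for n in range(4):
    p_first = Pp[n] / (Pp[n] + Pp[n + 1])
    print(f"{n + 1} before {n + 2}: {p_first}")  # approx 1 - 2^-11, 1 - 2^-12, ...
```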
Combining these, the probability that every number first appears before any larger number is at least $1 - (2^{-11} + 2^{-12} + \cdots) = 1 - 2^{-10}$.
Now choose constants $C_k$ such that the probability that the number $k$ appears within the first $C_k$ trials is at least $1 - 2^{-10-k}$. Combining this with the previous bound, with probability at least $1 - 2^{-9}$ every number $k$ first shows up within the first $C_k$ trials and no larger number precedes it; call such an outcome good.
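Such $C_k$ exist because $k$ fails to appear in $C$ independent trials with probability $(1 - P(k))^C \to 0$; the smallest valid choice can be computed explicitly (a sketch, with a made-up per-trial probability):

```python
import math

def C_k(k: int, p_k: float) -> int:
    """Smallest C with 1 - (1 - p_k)^C >= 1 - 2^{-10-k}, where p_k = P(k)
    is the per-trial probability of the number k."""
    return math.ceil((10 + k) * math.log(2) / -math.log1p(-p_k))

c = C_k(3, 0.01)  # hypothetical P(3) = 0.01, just to exercise the formula
assert 1 - (1 - 0.01) ** c >= 1 - 2 ** -13
```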
Now it is simple to define $N$: let $N(1) = 1$, and for $k \ge 2$ let $N(k)$ have the same parity as $k$ and satisfy $N(k) > 2 \cdot C_k \cdot N(k - 1)$.
Let's see what happens when the outcome is good. First we get $1$, and the frequency of even numbers is $0$. Eventually we get $2$, and immediately after its block the frequency of even numbers is at least $\frac{2}{3}$: before the first $2$ there were at most $C_2 - 1$ trials, each contributing at most $N(1)$ terms, while the block of $2$s has length $N(2) > 2\, C_2\, N(1)$. Eventually we get $3$, and the frequency drops to at most $\frac{1}{3}$. And so on: right after the first block of the next even number the frequency is at least $\frac{2}{3}$, and right after the first block of the next odd number it is at most $\frac{1}{3}$, because each new block is more than twice as long as everything before it. So if the outcome is good, the limit of the frequency does not exist.
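Assuming the outcome is good, the oscillation claim reduces to a deterministic inequality about the constants, which can be checked end to end. The explicit $q(\varepsilon)=\varepsilon/(1-\varepsilon)$ and the cutoff $K$ below are my own choices, for illustration only:

```python
import math

K = 6  # verify the frequency bound for the first blocks of 2, ..., K

def q(eps: float) -> float:
    # One valid choice: the p-outcome precedes the (p*q)-outcome
    # with probability 1/(1 + q) = 1 - eps.
    return eps / (1 - eps)

# P'(1) = 1, P'(n+1) = q(2^{-10-n}) P'(n); P is its normalization.
Pp = [1.0]
for n in range(1, K):
    Pp.append(q(2.0 ** (-10 - n)) * Pp[-1])
total = sum(Pp)
P = [w / total for w in Pp]

def C(k: int) -> int:
    # Smallest C with (1 - P(k))^C <= 2^{-10-k}, so that k appears within
    # the first C trials with probability at least 1 - 2^{-10-k}.
    return math.ceil((10 + k) * math.log(2) / -math.log1p(-P[k - 1]))

# N(1) = 1; N(k) has the parity of k and exceeds 2 * C_k * N(k-1).
N = [None, 1]
for k in range(2, K + 1):
    m = 2 * C(k) * N[k - 1] + 1          # odd, since 2 * C_k * N(k-1) is even
    N.append(m if m % 2 == k % 2 else m + 1)

# In a good outcome, before k first appears there are at most C_k - 1 trials,
# each contributing at most N(k-1) terms, so the existing prefix has length
# at most (C_k - 1) * N(k-1) < N(k) / 2.  Right after the first block of k,
# the frequency of terms with k's parity therefore exceeds 2/3 (exact integer check):
for k in range(2, K + 1):
    worst_prefix = (C(k) - 1) * N[k - 1]
    assert 3 * N[k] > 2 * (N[k] + worst_prefix)
```

The constants blow up extremely fast (as the answer anticipates with "exponential growth is probably too slow"), which is why the check works with the block lengths as exact integers rather than simulating trials.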