I've checked this in numerical experiments, but I find it counterintuitive.
A player starts with $a_0>0$ dollars; let $a_n$ denote the amount of money after $n$ rounds.
In each round, the player bets $10\,\%$ of his current money; he has a $51\,\%$ chance to win even money and a $49\,\%$ chance to lose the bet.
Claim: $a_n\rightarrow 0$ as $n\rightarrow \infty$
Proof sketch: let $b_n=\ln(a_n)$. In each round, $b_n$ increases by $\ln(1.1)$ with probability $51\,\%$ and by $\ln(0.9)$ with probability $49\,\%$, so the expected change per round is $0.51\ln(1.1)+0.49\ln(0.9)\approx-0.003<0$. Therefore $b_n$ drifts to $-\infty$ in the long run, which means $a_n$ drops to $0$.
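The numerical experiment mentioned above might look like the following minimal Python sketch (not from the original post; the parameters match the setup: $10\,\%$ bet, $51\,\%$ win probability):

```python
import math
import random

def simulate(a0=1.0, rounds=10_000, seed=0):
    """Simulate one run of the game: bet 10% of the bankroll each round,
    win even money with probability 0.51, lose the bet otherwise."""
    rng = random.Random(seed)
    a = a0
    for _ in range(rounds):
        if rng.random() < 0.51:
            a *= 1.1   # won the 10% bet
        else:
            a *= 0.9   # lost the 10% bet
    return a

# Expected per-round drift of ln(a_n):
mu = 0.51 * math.log(1.1) + 0.49 * math.log(0.9)
print(mu)          # about -0.003, i.e. negative
print(simulate())  # typically far below the starting stake
```

Running this with various seeds, the final bankroll is almost always astronomically small, consistent with the claim.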
Could someone explain this weird thing?
A simpler case would be a player who bets $100\,\%$ of his money at each turn. It is quite intuitive that this will sooner or later lead to bankruptcy, even if his EV is positive at each bet: a single loss ends the game, so the probability of still being solvent after $n$ rounds is $0.51^n \to 0$. This suggests that you shouldn't try to maximize your EV in dollars at each bet but something else (see for example Kelly betting). For example, if you had $100\,000\,000$ dollars, you probably wouldn't want to bet everything on a 1 in 9 chance of getting $1\,000\,000\,000$ dollars.
Analysis of the case in this question: Note that $$ \lim_{n\to\infty} \mathrm{E}[\ln a_n] = -\infty $$ does not imply that $\lim_{n\to\infty} \mathrm{E}[a_n] = 0.$
The expected value (in dollars, not logarithm) of the $i$th bet, given $a_{i-1}$, is $$ .51 \cdot 1.1 a_{i-1} + .49 \cdot 0.9 a_{i-1} - a_{i-1} = 0.002 a_{i-1}, $$ which is positive. The expected value of your bankroll after the $i$th bet is $$ \begin{align*} \mathrm{E}(a_{i}) &= \sum_x \mathrm{P}(a_{i-1} = x) \cdot .51 \cdot 1.1 x + \mathrm{P}(a_{i-1} = x) \cdot .49 \cdot 0.9 x \\ & = \sum_x 1.002\, x\, \mathrm{P}(a_{i-1} = x) = 1.002\, \mathrm{E}(a_{i-1}). \end{align*} $$ Here the sum is taken over all possible values $x$ of $a_{i-1}$, so the first equals sign is simply writing the expected value as probabilities times values, summed over all cases (all values of $a_{i-1}$ and both outcomes of the $i$th bet). Therefore, $$ \lim_{n\to\infty} \mathrm{E}(a_n) = \lim_{n\to\infty} 1.002^n a_0 = \infty. $$
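The recursion $\mathrm{E}(a_i) = 1.002\,\mathrm{E}(a_{i-1})$ can be checked by Monte Carlo (a sketch, not from the original post; the number of rounds and trials are arbitrary choices):

```python
import random

def mean_bankroll(n=100, trials=50_000, a0=1.0, seed=1):
    """Monte Carlo estimate of E[a_n] for the 10%-bet game."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        a = a0
        for _ in range(n):
            a *= 1.1 if rng.random() < 0.51 else 0.9
        total += a
    return total / trials

print(mean_bankroll())  # close to the exact value below
print(1.002 ** 100)     # exact E[a_100] for a0 = 1: about 1.221
```

So the *average* bankroll grows geometrically even though, as shown below, almost every individual path goes to zero: the average is dominated by a vanishingly small set of extremely lucky paths.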
Edit, thanks to @did's comments on the question:
Let $b_n = \ln a_n$ for each $n$. Because $b_n$ is a sum of independent, identically distributed random variables (as you have shown in the question), we can apply the strong law of large numbers: $$ \mathrm{P}\left(\lim_{n\to\infty} \frac{b_n}{n} = \mu\right) = 1, $$ where $\mu \approx -0.003$ is the expected value (in logarithm of the bankroll) of each bet.
Because $\exp$ is a continuous function, we can transform this to $$ 1 = \mathrm{P}\left(\lim_{n\to\infty} e^{b_n/n} = e^\mu\right) = \mathrm{P}\left(\lim_{n\to\infty} \sqrt[n]{a_n} = e^\mu\right) . $$
Let's forget the probabilities for a moment and think about the limit itself. Because $e^\mu < 1$, there is an $\epsilon_0>0$ such that $e^\mu + \epsilon_0 < 1$. Therefore, $$ \begin{align*} &\lim_{n\to\infty} \sqrt[n]{a_n} = e^\mu \\ \Rightarrow&\ \forall \epsilon>0\ \exists N : n>N \rightarrow |\sqrt[n]{a_n} - e^\mu| < \epsilon\\ \Rightarrow&\ \exists N : n>N \rightarrow |\sqrt[n]{a_n} - e^\mu| < \epsilon_0 \\ \Rightarrow&\ \exists N : n>N \rightarrow \sqrt[n]{a_n} < e^\mu + \epsilon_0 \\ \Rightarrow&\ \exists N : n>N \rightarrow a_n < \left(e^\mu + \epsilon_0\right)^n. \\ \end{align*} $$ The last line says that for $n>N$, $a_n$ is bounded above by $\left(e^\mu + \epsilon_0\right)^n$, and this upper bound tends to zero because $e^\mu + \epsilon_0 < 1$. Since $a_n > 0$, it follows that $$ \lim_{n\to\infty} \sqrt[n]{a_n} = e^\mu \Rightarrow \lim_{n\to\infty} a_n = 0. $$
Going back to the original question, we have proved that $$ 1 = \mathrm{P}\left(\lim_{n\to\infty} \sqrt[n]{a_n} = e^\mu\right) \leq \mathrm{P}\left(\lim_{n\to\infty} a_n = 0\right) \leq 1, $$ where the first inequality holds because the first limit implies the second, so the second event is at least as probable; and since a probability is at most $1$, the latter probability must equal $1$. This means that for any positive amount of money, for example $0.000001$ dollars, you will with probability $1$ (i.e., almost surely) at some point drop below that amount and stay there.