Let $\xi \in X^{\omega}$ be an infinite sequence and denote by $\xi[1\ldots n]$ its length $n$ initial segment. Then (due to Martin-Löf) the following holds:
For every $\xi \in X^{\omega}$ there exist some $c \in \mathbb N$ and infinitely many $n \in \mathbb N$ such that $$ K(\xi[1\ldots n]) \le n - \log_r(n) + c, $$ where $K(w)$ denotes the Kolmogorov complexity of $w$ (with respect to some fixed universal computing mechanism) and $r = |X|$.
Does anybody have a proof of this fact?
The result you are thinking of holds for the plain complexity $C$ rather than the prefix-free complexity $K$. It is proved in Section 3.11 of Downey and Hirschfeldt's book *Algorithmic Randomness and Complexity*. The proof is somewhat long to copy here. In general, that book should be the first place to look for results about algorithmic randomness.
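The key idea behind such "complexity oscillation" results can be sketched as follows (a hedged sketch of the standard trick, not the proof from Downey and Hirschfeldt verbatim): choose $n$ so that the first $k$ symbols of $\xi$ spell out the number $n - k$; then the suffix $\xi[k+1\ldots n]$ *alone* describes $\xi[1\ldots n]$, because its length reveals $n - k$ and hence the missing prefix. This gives a description of length $n - k \approx n - \log_r n$. A toy decoder for the binary case ($r = 2$), ignoring the leading-zero corner case that a full proof must patch:

```python
def decode(w: str) -> str:
    """Hypothetical decoder illustrating the complexity-oscillation
    trick (a sketch, not the textbook proof).  Assumed convention:
    n was chosen so that the first k bits of xi are the binary
    representation of n - k = len(w).  Then xi[1..n] is rebuilt
    from the (n - k)-bit suffix w alone, so the plain complexity of
    xi[1..n] is at most n - k + O(1), with k roughly log2(n)."""
    prefix = bin(len(w))[2:]   # recover xi[1..k] from |w| alone
    return prefix + w          # reconstruct xi[1..n]

# Example: if xi begins 1 0 1 1 1 0 1 0 ..., then xi[1..3] = "101"
# is binary for 5, so take n = 3 + 5 = 8 and w = xi[4..8] = "11010":
assert decode("11010") == "10111010"
```

Since `decode` needs no side information beyond `w` itself, the prefix $\xi[1\ldots k]$ comes "for free", which is exactly the $\log_r(n)$ saving in the statement.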
To see why the result stated above fails for $K$, recall that, by definition, a real $\xi$ is 1-random if there is a $c_0$ such that, for all $n$, $$K(\xi\upharpoonright n) \geq n - c_0.$$ Now fix a 1-random real $\xi$. If it were also true that there are infinitely many $n$ with $$ K(\xi\upharpoonright n) \leq n - \log n + c_1, $$ then we would have infinitely many $n$ with $$ n - c_0 \leq n - \log n + c_1, $$ that is, $\log n \leq c_0 + c_1$, which is impossible since $\log n \to \infty$.