Let $\{X_i\}_{i=1}^{\infty}$ be i.i.d. random variables. Define $$ L_n = \frac{1}{n}\sum_{i=1}^n X_i \quad \forall n \in \{1, 2, 3, …\} $$ Using the central limit theorem, it can be shown that if $E[X_i]=0$ and $0<Var(X_i)<\infty$ then: $$ \lim_{n\rightarrow\infty} P[L_n\leq x] = \left\{ \begin{array}{ll} 1 &\mbox{ if $x > 0$} \\ c & \mbox{ if $x=0$}\\ 0 & \mbox{ if $x<0$} \end{array} \right.$$ where $c=1/2$. If the variance is infinite then the law of large numbers implies a similar structure for the cases $x>0$ and $x<0$, but the case $x=0$ is unclear to me.
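The finite-variance case can be sanity-checked numerically. Below is a small Monte Carlo sketch (assuming numpy; the choice of a centered exponential for $X_i$ is mine, picked because it is skewed, to show the limit at $x=0$ is still $1/2$):

```python
import numpy as np

# Monte Carlo check: for mean-zero, finite-variance X_i, the CLT gives
# P[L_n <= 0] -> Phi(0) = 1/2, even when the X_i are skewed.
# Here X_i = Exp(1) - 1 (mean 0, variance 1, skewness 2).
rng = np.random.default_rng(0)

n = 500          # sample size per average
trials = 20_000  # number of independent averages L_n

samples = rng.exponential(scale=1.0, size=(trials, n)) - 1.0
L_n = samples.mean(axis=1)

p_hat = np.mean(L_n <= 0.0)
print(f"estimated P[L_n <= 0] = {p_hat:.3f}")  # close to 1/2
```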
Questions: For infinite variance, can we get different behavior in the case $x=0$, such as $c=1/3$? Can we get a related step-function limit when the mean does not exist, but with different behavior in the case $x=0$?
Notes:
We can get such a limiting function with $c=1/3$ for random sequences with a different structure, such as $L_n= A/n$ with $P[A=1]=2/3, P[A=-1]=1/3$: here $L_n\leq 0$ if and only if $A=-1$, so $P[L_n\leq 0]=1/3$ for every $n$, while $P[L_n\leq x]\rightarrow 1$ for $x>0$ and $P[L_n\leq x]\rightarrow 0$ for $x<0$.
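For what it's worth, this toy example can be checked empirically (a minimal sketch, assuming numpy):

```python
import numpy as np

# Toy example L_n = A/n with P[A=1] = 2/3, P[A=-1] = 1/3:
# the event {L_n <= 0} is exactly {A = -1}, independent of n,
# so the limiting CDF jumps from 0 to 1/3 at x = 0.
rng = np.random.default_rng(1)

trials = 90_000
A = rng.choice([1.0, -1.0], size=trials, p=[2 / 3, 1 / 3])

for n in (1, 10, 100):
    L_n = A / n
    print(n, np.mean(L_n <= 0.0))  # ~ 1/3 for every n
```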
I came up with this question while reflecting on the question here: Why does a C.D.F need to be right-continuous?
Yes, it is possible for $c$ to take any value strictly between $0$ and $1$. The point is that there exist mean-zero stable distributions which are not symmetric about $0$ (of course, such a stable distribution cannot be Gaussian, and so it must have infinite variance). You may look at the Wikipedia page to see how some of these stable distributions look.
Specifically, if $\alpha \in (1,2)$ and $\beta \in [-1,1]$, then it turns out that there exists a random variable $X$ whose characteristic function looks like $$\phi_X(t) = e^{-|t|^{\alpha}\big(1-i\beta \tan(\frac{\pi\alpha}{2})\,\mathrm{sign}(t)\big)}.$$ As it turns out, this distribution has mean zero, and moreover (by varying $\alpha$ and $\beta$), $P(X<0)$ can be any predefined number $c\in(0,1)$. Furthermore, for i.i.d. copies $X_1,\dots,X_n$ of $X$, one may check directly from the characteristic function that $n^{-1/\alpha}(X_1+\cdots+X_n)$ has the same distribution as $X_1$. From this we can easily conclude that $$P(L_n<0) =P(n^{1-1/\alpha}L_n<0)= P(X<0)=c \in (0,1),$$ for all $n$, as desired. I do not know if $c=0$ or $c=1$ is a possible limit for nonzero random variables $X_i$, though it'd be interesting to find out.
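This can be illustrated numerically. The sketch below assumes scipy's `levy_stable`, whose default "S1" parameterization matches a characteristic function of the above form (the sign convention for $\beta$ may differ, which only flips $P(X<0)$ to the other side of $1/2$):

```python
import numpy as np
from scipy.stats import levy_stable

# For alpha in (1, 2) and beta != 0, the stable law is mean-zero but
# asymmetric, so P(X < 0) is bounded away from 1/2.
rng = np.random.default_rng(2)
alpha, beta = 1.5, 0.9

x_sym = levy_stable.rvs(alpha, 0.0, size=100_000, random_state=rng)
x_skw = levy_stable.rvs(alpha, beta, size=100_000, random_state=rng)

p_sym = np.mean(x_sym < 0)  # ~ 1/2 in the symmetric case (beta = 0)
p_skw = np.mean(x_skw < 0)  # away from 1/2 when beta != 0
print(p_sym, p_skw)

# Stability check: n^{-1/alpha} (X_1 + ... + X_n) has the same law as X_1,
# so the fraction of mass below 0 is unchanged by this scaled summation.
n, m = 20, 20_000
sums = levy_stable.rvs(alpha, beta, size=(m, n), random_state=rng).sum(axis=1)
p_sum = np.mean(n ** (-1 / alpha) * sums < 0)
print(p_sum)
```

So empirically $P(L_n<0)$ stays at the same constant $c\neq 1/2$ for all $n$, consistent with the exact argument above.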