I solved a nice exercise which consists of proving a converse to the strong law of large numbers: if $(X_i)_{i \in\mathbb{N}^*}$ are iid random variables such that $$ \frac{X_1+\ldots+X_n}{n} \to Y$$ almost surely for some random variable $Y$, then $X_1$ is integrable and thus $Y = \mathbb{E}[X_1]$ almost surely.
I want to show that the same statement is false if almost sure convergence is replaced by convergence in probability. I want to check the following counterexample. Let $(X_i, i\in\mathbb{N}^*)$ be iid integer-valued random variables with law $$ \mathbb{P}(X = k) = \mathbb{P}(X = -k) = \frac{C}{k^2 \log k}, \qquad \forall k\geqslant 2, $$ where $C$ is a normalizing constant. Note that $\mathbb{E}[\lvert X\rvert]=2C\sum_{k\geqslant 2}\frac{1}{k\log k}=\infty$, so $X_1$ is not integrable; the claim is that nevertheless $\frac{X_1+\ldots+X_n}{n}$ converges to $0$ in probability.
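As a quick sanity check (not a proof, of course), one can watch the truncated first moment $\mathbb E\bigl[\lvert X\rvert\mathbf 1_{\lvert X\rvert\leqslant N}\bigr]=2C\sum_{k=2}^N \frac1{k\log k}\approx 2C\log\log N$ grow without bound. The truncation level `K` used to approximate $C$ below is an arbitrary choice:

```python
import numpy as np

# Numerical illustration (not a proof): the truncated first moment
# E[|X| 1_{|X| <= N}] = 2C * sum_{k=2}^{N} 1/(k log k) ~ 2C log log N
# grows without bound, so X is not integrable.
K = 10**6  # truncation level for approximating C (assumption)
ks = np.arange(2, K + 1, dtype=np.float64)
half_pmf = 1.0 / (ks**2 * np.log(ks))     # un-normalized P(X = k), k >= 2
C = 1.0 / (2.0 * half_pmf.sum())          # symmetric law => factor 2

terms = 2.0 * C / (ks * np.log(ks))       # contribution of +/-k to E|X|
cumsum = np.cumsum(terms)
truncated_means = {10**d: cumsum[10**d - 2] for d in range(2, 7)}
for N, m in truncated_means.items():
    print(N, round(m, 4))                 # keeps growing, roughly like 2C log log N
```

The growth is doubly logarithmic, hence extremely slow, but the increments never die out.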
The key point is that $$\tag{*}\lim_{t\to\infty}t\,\mathbb P(\lvert X\rvert\geqslant t)=0.$$ Indeed, it suffices to show that $n\mathbb P(X\geqslant n)\to 0$ (by monotonicity of the tail function and the symmetry of $X$). We then use the elementary bounds $$ n\mathbb P(X\geqslant n)=Cn\sum_{k=n}^\infty\frac 1{k^2\log k}\leqslant \frac{Cn}{\log n}\sum_{k=n}^\infty\frac 1{k^2}\leqslant \frac{Cn}{\log n}\sum_{k=n}^\infty\frac 1{k(k-1)}= \frac{Cn}{(n-1)\log n}, $$ where the last sum telescopes to $1/(n-1)$.
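The chain of inequalities above can be checked numerically. The sketch below (truncating the series at a level `K` of my choosing, which is an approximation) computes $n\,\mathbb P(X\geqslant n)$ and compares it with the bound $Cn/((n-1)\log n)$:

```python
import numpy as np

# Numerical check of (*): n * P(X >= n) decreases to 0 and stays below
# the bound C*n / ((n-1) log n) derived above.
# Assumption: truncating the series at K approximates the law well.
K = 10**6
ks = np.arange(2, K + 1, dtype=np.float64)
half_pmf = 1.0 / (ks**2 * np.log(ks))          # un-normalized P(X = k)
C = 1.0 / (2.0 * half_pmf.sum())

# suffix[i] = sum_{k >= i+2} 1/(k^2 log k), so P(X >= n) = C * suffix[n-2]
suffix = np.cumsum(half_pmf[::-1])[::-1]

t_vals, bounds = [], []
for n in (10, 100, 1000, 10000):
    t_vals.append(n * C * suffix[n - 2])        # n * P(X >= n)
    bounds.append(C * n / ((n - 1) * np.log(n)))
    print(n, round(t_vals[-1], 5), round(bounds[-1], 5))
```

Both columns shrink like $C/\log n$, which is consistent with $(*)$ but also shows how slow the decay is.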
For a fixed $n$, define $S_n=X_1+\dots+X_n$, $Y_i=X_i\mathbf{1}_{\lvert X_i\rvert\leqslant n}$ and $Z_i:=X_i\mathbf{1}_{\lvert X_i\rvert\gt n}$. For a fixed $\varepsilon>0$, $$ \mathbb P(\lvert S_n\rvert/n>\varepsilon)\leqslant \mathbb P\left(\left\lvert \sum_{i=1}^nY_i\right\rvert>n\varepsilon/2\right) +\mathbb P\left(\left\lvert \sum_{i=1}^nZ_i\right\rvert>n\varepsilon/2\right).$$ For the first term, note that $\mathbb E[Y_i]=0$ by symmetry, so Chebyshev's inequality bounds it by $4(n\varepsilon^2)^{-1}\mathbb E\left[X^2\mathbf{1}_{\lvert X\rvert\leqslant n}\right]$, which goes to $0$ by the Cesàro lemma: indeed $\mathbb E[X^2\mathbf{1}_{\lvert X\rvert\leqslant n}]=2C\sum_{k=2}^n\frac 1{\log k}$, so we end up with averages of $1/\log k$, which tend to $0$. [This actually holds under $(*)$ alone, but the proof is a bit more technical.] For the second term, the involved event is contained in $\bigcup_{i=1}^n\left\{\lvert X_i\rvert>n\right\}$: use a union bound and $(*)$.
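To see the convergence in probability "happen", here is a small Monte Carlo sketch. The truncation level `K`, the threshold `eps` and the number of replications are arbitrary choices of mine; truncating the support only lightens the tails, so this is an illustration rather than evidence for the exact law:

```python
import numpy as np

# Monte Carlo sanity check (not a proof): estimate P(|S_n / n| > eps)
# for growing n. The support of |X| is truncated at K (assumption).
rng = np.random.default_rng(0)
K = 10**5
ks = np.arange(2, K + 1)
pmf = 1.0 / (ks**2 * np.log(ks))
pmf /= pmf.sum()                          # law of |X| on {2, ..., K}
cdf = np.cumsum(pmf)

def sample_X(size):
    """Draw iid copies of X: inverse-CDF sampling for |X|, then a random sign."""
    idx = np.searchsorted(cdf, rng.random(size))
    idx = np.minimum(idx, ks.size - 1)    # guard against float round-off at the top
    signs = rng.integers(0, 2, size=size) * 2 - 1
    return ks[idx] * signs

eps, reps = 2.0, 500
probs = {}
for n in (100, 1000, 10000):
    samples = sample_X((reps, n))
    means = samples.sum(axis=1) / n       # reps independent copies of S_n / n
    probs[n] = float(np.mean(np.abs(means) > eps))
    print(n, probs[n])
```

The estimated probabilities decrease with $n$, but only logarithmically fast, which matches the $1/\log n$ rate coming out of the union bound and $(*)$.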