Let $(X_n)_n$ be a sequence of i.i.d. random variables with distribution $\mathrm{Exp}(1)$. What can you say about the convergence in $L^p$ and a.s. of the sequence $(Y_n)_{n\ge 2}$, where $Y_n = X_n/\log(n)$?
Here is my attempt; is it correct? How would you have determined convergence?
It's straightforward to see that $(Y_n)_n$ converges in probability to $0$: for every $\varepsilon > 0$, $P(Y_n > \varepsilon) = P(X_n > \varepsilon \log n) = e^{-\varepsilon \log n} = n^{-\varepsilon} \to 0$. So if the sequence converges a.s., the limit must be $0$.
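As a quick numerical sanity check of the tail computation above (the function name `tail_prob` is my own, not from the problem):

```python
import math

# For X_n ~ Exp(1), P(X > t) = e^{-t}, so
# P(Y_n > eps) = P(X_n > eps*log n) = exp(-eps*log n) = n**(-eps).
def tail_prob(n: int, eps: float) -> float:
    """Exact value of P(Y_n > eps) for Y_n = X_n / log(n), n >= 2."""
    return math.exp(-eps * math.log(n))

for n in (10, 100, 10_000):
    print(n, tail_prob(n, 0.5))  # decays like n**(-1/2), hence -> 0
```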
Now, the events $\{Y_n > 1\}$ are independent and $P(Y_n > 1) = P(X_n > \log n) = e^{-\log n} = 1/n$, so $\sum_{n=2}^{\infty}P(Y_n>1) = \sum_{n=2}^{\infty}1/n = \infty$. Hence, by the second Borel–Cantelli lemma, $P(\limsup_n \{Y_n>1\})=1$. This means the event $\{Y_n \le 1 \text{ eventually}\}$ has probability $0$, so there is no a.s. convergence.
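A small Monte Carlo sketch of the Borel–Cantelli argument: since $\sum_n P(Y_n > 1)$ diverges, exceedances $\{Y_n > 1\}$ keep occurring in any simulated trajectory (the script and its variable names are my own illustration, with a fixed seed):

```python
import numpy as np

# Simulate one trajectory Y_n = X_n / log(n) for n = 2, ..., N-1
# and record the indices where Y_n > 1.
rng = np.random.default_rng(0)
N = 100_000
n = np.arange(2, N)
x = rng.exponential(scale=1.0, size=n.size)  # X_n ~ Exp(1)
y = x / np.log(n)
exceed = n[y > 1]

print("number of exceedances:", exceed.size)  # ~ sum 1/n ~ log(N) on average
print("last exceedance index:", exceed.max())
```

Note that $P(\text{no exceedance up to } N) = \prod_{n=2}^{N-1}(1 - 1/n) = 1/(N-1)$, which telescopes to $0$: the exceedances never stop, matching $P(\limsup_n \{Y_n > 1\}) = 1$.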
To determine the convergence in $L^p$ I computed the moment generating function of $X_n$, $M(t)=1/(1-t)$ for $t<1$, and its derivatives $M^{(k)}(t)=k!/(1-t)^{k+1}$, which give $E(X_n^k) = M^{(k)}(0) = k!$.
We have $E(Y_n^k)=E(X_n^k)/(\log n)^k = k!/(\log n)^k \to 0$ as $n \to \infty$, so $Y_n \to 0$ in $L^k$ for every integer $k \ge 1$. By the monotonicity of the $L^p$ norms on a probability space ($\|\cdot\|_p \le \|\cdot\|_q$ for $p \le q$), we obtain that $Y_n \to 0$ in $L^p$ for every $p\ge 1$. (Here I could also have used the characteristic function.)
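A Monte Carlo sanity check of the moment formula $E(X^k) = k!$ used above (seed and sample size are my own choices):

```python
import math
import numpy as np

# Estimate E[X^k] for X ~ Exp(1) and compare with k!.
# This is what drives E[Y_n^k] = k! / (log n)^k -> 0.
rng = np.random.default_rng(1)
x = rng.exponential(size=1_000_000)
estimates = {k: float(np.mean(x ** k)) for k in (1, 2, 3)}

for k, est in estimates.items():
    print(k, est, math.factorial(k))  # estimate vs exact value k!
```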
I would not have considered the case of integer $p$ separately. $E(Y_n^{p}) = \frac{E(X_1^{p})}{(\log n)^{p}} \to 0$, and you don't need the exact value of $E(X_1^{p})$ for this argument. If you just know that $\int_0^{\infty} e^{-x} x^{p}\,dx <\infty$ for every $p>0$, that is good enough for this proof.
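The finiteness claim is easy to check numerically: the integral equals $\Gamma(p+1)$ for every $p > 0$, integer or not. A minimal sketch using a midpoint rule (the function `abs_moment`, the truncation point, and the step size are my own assumptions):

```python
import math
import numpy as np

def abs_moment(p: float, upper: float = 50.0, dx: float = 1e-4) -> float:
    """Midpoint-rule approximation of int_0^upper x^p e^{-x} dx.

    The tail beyond `upper` is negligible because of the e^{-x} factor,
    so this approximates E[X^p] = Gamma(p+1) for X ~ Exp(1).
    """
    x = np.arange(dx / 2, upper, dx)
    return float(np.sum(x ** p * np.exp(-x)) * dx)

for p in (0.5, 1.5, math.pi):
    print(p, abs_moment(p), math.gamma(p + 1))  # the two columns agree
```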