Suppose you play a game where you start with capital $K_0 = 1$. In each turn $i = 1,2, \dots, n$ you throw a fair coin, independently of the history. If you get "heads", you get back $3/2$ of your capital; if you get "tails", you get back only $1/2$ of your capital.
Prove that $K_n \to 0$ almost surely.
My attempt:
First define $R_i$ by $$R_i(\omega) = \begin{cases} \frac{3}{2} & \omega_i = \text{heads} \\ \frac{1}{2} & \omega_i = \text{tails}\end{cases}$$
and note that $K_n = \prod_{i=1}^nR_i $, $\mathbb E[R_i] = 1 $ and by independence $\mathbb E[K_n] = 1.$ Now the only ways to prove almost sure convergence I know of are the Strong Law of Large Numbers and the Borel-Cantelli Lemma. A way to apply the former is to look at $$\log K_n = \sum \log(R_i).$$
By the Strong Law I get $$\frac{1}{n}\sum_{i=1}^n \log(R_i) \to \log \left(\frac{\sqrt3}{2}\right)$$
but that is not quite what I want. Also I find this pretty counterintuitive.
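To see why this is plausible despite $\mathbb E[K_n] = 1$, here is a quick Monte Carlo sketch (the seed, round count, and trial count are arbitrary illustrative choices): the *typical* outcome after $n$ rounds is astronomically small, even though the mean stays at $1$.

```python
import random

random.seed(0)  # illustrative seed, for reproducibility

def simulate_capital(n):
    """Play n rounds: heads multiplies capital by 3/2, tails by 1/2."""
    k = 1.0
    for _ in range(n):
        k *= 1.5 if random.random() < 0.5 else 0.5
    return k

n, trials = 100, 10000  # arbitrary choices
samples = sorted(simulate_capital(n) for _ in range(trials))
median = samples[trials // 2]
sample_mean = sum(samples) / trials
# The median is astronomically small (around (3/4)^(n/2) ~ 6e-7 for n = 100),
# even though E[K_n] = 1; the sample mean is an unreliable estimate of that
# expectation, because it is carried by extremely rare, extremely large payoffs.
print(f"median K_n = {median:.3e}, sample mean = {sample_mean:.3e}")
```

The gap between the median and $\mathbb E[K_n] = 1$ is exactly the counterintuitive point: the expectation is propped up by vanishingly rare runs of many heads.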
First I will explain the error in your approach via the Strong Law, and the sense in which the Strong Law does nevertheless suggest the result is true.
However, that approach is hard to make rigorous, so I will then present an argument via martingales.
Intuition via the strong law
Your intuition to take logarithms, show almost sure convergence of these, and then exponentiate again is reasonable. This implicitly relies on the Continuous Mapping Theorem, which says that for a continuous function $g$,
$$ X_n \xrightarrow{a.s} X \, \Rightarrow \, g(X_n) \xrightarrow{a.s.} g(X).$$
So in this context the sequence $X_n = \sum_{k=1}^n \log R_k$, and $g(x) = \exp(x)$.
The strong law requires us to divide $X_n$ by $n$ to be able to make a statement about the limit. That is, since $\mathbb E[\log R_1] = \frac12\left(\log\frac32 + \log\frac12\right) = \frac12\log\frac34 = \log\frac{\sqrt3}{2}$, the strong law states $$ \frac1n \sum_{k=1}^n \log R_k \xrightarrow{a.s.} \log \frac{\sqrt{3}}{2}.$$
But now when we apply the continuous mapping theorem this gives $$ \exp \left( \frac1n \sum_{k=1}^n \log R_k \right) \xrightarrow{a.s.}\frac{\sqrt{3}}{2}$$
At this point the rest of my proof becomes heuristic, and I think you need to do some serious thinking as to whether it can be justified. Raising both sides to the $n$-th power suggests that, for large $n$, $$ K_n = \exp \left( \sum_{k=1}^n \log R_k \right) \approx \left(\frac{\sqrt{3}}{2}\right)^n \rightarrow 0,$$ since $\sqrt{3} < 2$. However, this is not rigorous!
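The decay rate can at least be checked numerically: averaging $\log R_i$ along one long simulated path should land near $\log(\sqrt3/2) \approx -0.144$. A minimal sketch, where the seed and path length are arbitrary choices:

```python
import math
import random

random.seed(1)  # illustrative seed

n = 200_000  # arbitrary path length
log_k = sum(math.log(1.5 if random.random() < 0.5 else 0.5) for _ in range(n))

rate = log_k / n                      # empirical (1/n) * sum_i log R_i
target = math.log(math.sqrt(3) / 2)   # = (1/2) * log(3/4), approximately -0.1438
print(f"empirical rate = {rate:.4f}, target = {target:.4f}")
# K_n = exp(log_k) is then of order exp(n * target), far below floating-point
# range for this n -- the heuristic K_n ~ (sqrt(3)/2)^n -> 0 in action.
```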
Using Doob's Convergence Theorem
An alternative, rigorous approach is to first show that $K_n$ is a martingale, and then apply the martingale convergence theorem.
You have effectively already shown that $K_n$ is a martingale, since (without getting into the background measure theory):
$$\mathbf{E}[|K_n|] = 1 < \infty$$
\begin{align*} \mathbf{E}[K_n \mid K_1,\ldots, K_{n-1}] & = \mathbf{E}[R_{n}K_{n-1} \mid K_1,\ldots, K_{n-1}]\\ & = \mathbf{E}[R_n]\, K_{n-1} \\ & = K_{n-1}, \end{align*}
where the second equality uses that $R_n$ is independent of the history.
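As a numerical sanity check on the martingale property, Monte Carlo estimates of $\mathbb E[K_i]$ should stay near $1$ for every $i$, even though typical paths shrink. A sketch, with the seed, trial count, and horizon chosen arbitrarily (and $n$ kept small so the variance of $K_n$ stays modest):

```python
import random

random.seed(3)  # illustrative seed

trials, n = 200_000, 6  # arbitrary choices
totals = [0.0] * (n + 1)
for _ in range(trials):
    k = 1.0
    totals[0] += k
    for i in range(1, n + 1):
        k *= 1.5 if random.random() < 0.5 else 0.5  # R_i: fair coin
        totals[i] += k

means = [t / trials for t in totals]  # Monte Carlo estimates of E[K_i]
print([f"{m:.3f}" for m in means])    # each stays near 1, as a martingale demands
```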
A particularly simple version of the martingale convergence theorem asserts that if $K_n$ is a non-negative martingale, then it converges almost surely to some random variable $K$: $$K_n \xrightarrow{a.s.} K.$$ So we have established that $K_n$ converges almost surely to something; it remains to show that $K = 0$ almost surely.
In an analogy with real analysis: if an infinite product converges to a finite non-zero limit, then the individual factors must converge to $1$. Here the factors $R_n$ take only the values $3/2$ and $1/2$, so they certainly do not converge to $1$.
To make this precise: wherever $K_n$ converges we have $K_n - K_{n-1} \to 0$, but $$|K_n - K_{n-1}| = K_{n-1}\,|R_n - 1| = \tfrac12 K_{n-1},$$ so $K_{n-1} \to 0$ on that event. Hence $K = 0$ almost surely.
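The step-by-step behaviour behind this conclusion can be checked numerically: each round moves the capital by exactly half its current value, $|K_n - K_{n-1}| = K_{n-1}|R_n - 1| = \tfrac12 K_{n-1}$, so a convergent path is forced down to $0$. A sketch along one simulated path (seed, path length, and tolerance are arbitrary choices):

```python
import math
import random

random.seed(2)  # illustrative seed

k = 1.0
for step in range(2000):  # arbitrary path length
    r = 1.5 if random.random() < 0.5 else 0.5
    k_new = k * r
    # Each step moves the capital by half its current value:
    # |K_n - K_{n-1}| = K_{n-1} * |R_n - 1| = K_{n-1} / 2.
    assert math.isclose(abs(k_new - k), k / 2, rel_tol=1e-12)
    k = k_new

print(f"K_2000 = {k:.3e}")  # a typical path is already vanishingly small
```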