The process $(M_n)_{n \in \mathbb{N}}$ is defined as follows: $$ M_n = \begin{cases} 1 & \text{with probability } 1/2n \\ -1 & \text{with probability } 1/2n \\ 0 & \text{with probability } 1-1/n \end{cases} $$ if $M_{n-1}= 0$, and $$ M_n = \begin{cases} nM_{n-1} & \text{with probability } 1/n \\ 0 & \text{with probability } 1-1/n \end{cases} $$ if $M_{n-1} \neq 0$.
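To build some intuition I also simulated the process; here is a minimal sketch (note I'm assuming $M_0 = 0$, which the statement doesn't actually fix):

```python
import random

def simulate(n_steps, rng, m0=0):
    """Sample one path (M_0, ..., M_{n_steps}) of the process.

    m0 = 0 is an assumption: the problem statement does not fix M_0.
    """
    path = [m0]
    for n in range(1, n_steps + 1):
        prev, u = path[-1], rng.random()
        if prev == 0:
            # +1 w.p. 1/(2n), -1 w.p. 1/(2n), 0 w.p. 1 - 1/n
            if u < 1 / (2 * n):
                path.append(1)
            elif u < 1 / n:
                path.append(-1)
            else:
                path.append(0)
        else:
            # n * M_{n-1} w.p. 1/n, 0 w.p. 1 - 1/n
            path.append(n * prev if u < 1 / n else 0)
    return path

rng = random.Random(0)
print(simulate(5, rng))
```

Averaging $M_n$ over many sampled paths does come out close to $0$, consistent with $\mathbb{E}[M_n] = \mathbb{E}[M_0] = 0$.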
I have to prove that it is a martingale with respect to the natural filtration $(\mathcal{F}_n)_{n \in \mathbb{N}}$.
Below is what I've tried so far.
I couldn't prove the martingale equality $$ \mathbb{E}[M_{n+1} \mid \mathcal{F}_n] = M_n $$ in general, so I started thinking about the random variable $$ Z := \mathbb{E}[M_{n+1} \mid M_n]. $$ I know the explicit formula for this kind of conditional expectation, namely: $$ Z(y) = \sum_{x \in \operatorname{Im}(M_{n+1})} x\, \mathbb{P}[M_{n+1} = x \mid M_n = y]. $$ If $y = 0$ I get $Z(0) = 0$, and if $y \neq 0$, say $y = \alpha$, I get $Z(\alpha) = \alpha$. So $Z$ and $M_n$ coincide.
Anyway, I don't know how to go on and prove the equality for the bigger sigma-algebra $\mathcal{F}_n$: I can't simply use the tower property of conditional expectation, because the sigma-algebra generated by $M_n$ is smaller than $\mathcal{F}_n$.
Any hints would be appreciated.
What you need to do first is show that $E(M_{n}\mid M_{n-1})=M_{n-1}$.
If $M_{n-1}=0$, then $E(M_{n}\mid M_{n-1})=1\cdot \frac{1}{2n}-1\cdot\frac{1}{2n} +0\cdot\left(1-\frac{1}{n}\right)=0$.
If $M_{n-1}\neq 0$, then $E(M_{n}\mid M_{n-1})=n M_{n-1}\cdot\frac{1}{n}+0\cdot\left(1-\frac{1}{n}\right)=M_{n-1}$, and that's basically it.
So $E(M_{n}\mid M_{n-1})=M_{n-1}$.
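If you want to double-check this two-case computation numerically, here is a minimal sketch using exact rational arithmetic (the function name `cond_exp` is just for illustration):

```python
from fractions import Fraction

def cond_exp(prev, n):
    """E[M_n | M_{n-1} = prev], computed term by term from the
    transition probabilities (exact rational arithmetic)."""
    if prev == 0:
        # 1 * 1/(2n) + (-1) * 1/(2n) + 0 * (1 - 1/n)
        return 1 * Fraction(1, 2 * n) + (-1) * Fraction(1, 2 * n)
    # (n * prev) * 1/n + 0 * (1 - 1/n)
    return (n * prev) * Fraction(1, n)

# In both cases E[M_n | M_{n-1}] = M_{n-1}:
print(cond_exp(0, 4), cond_exp(7, 4), cond_exp(-12, 4))  # prints: 0 7 -12
```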
Now, about your question of how to show this for the natural filtration.
What you need to observe is that $(M_{n})$ is a discrete-time Markov chain. That is, given $M_{n-1}=i_{n-1},\dots,M_{0}=i_{0}$, the distribution of $M_{n}$ depends only on $M_{n-1}$, not on $M_{0},\dots,M_{n-2}$.
So $E\bigg(M_{n}|M_{n-1}=i_{n-1},...,M_{0}=i_{0}\bigg)=E(M_{n}|M_{n-1}=i_{n-1})$
Now, depending on whether $i_{n-1}=0$ or $i_{n-1}\neq 0$, we have already shown above that $E(M_{n}\mid M_{n-1}=i_{n-1})=i_{n-1}$.
Technically we are not doing anything new: the definition of $M_{n}$ itself specifies that we only need to check $E(M_{n}\mid M_{n-1})$. Even with the natural filtration, conditioning on $\sigma(M_{n-1},\dots,M_{0})$ means in particular that the values of $M_{n-1},\dots,M_{0}$ are known; so in particular the value of $M_{n-1}$ is known, and this completely determines the conditional expectation.
If you ask me, then together with integrability (each $M_{n}$ is bounded, $|M_{n}|\le n!$, since the process multiplies by at most $n$ at step $n$) and adaptedness, this is sufficient to show that $(M_{n})$ is a martingale.
But if you still insist on rigour, here is a completely rigorous justification (which, by the way, is not really necessary):
To put it more formally, the Markov property says that, conditionally on $M_{n-1}$, the variable $M_{n}$ is independent of $\sigma(M_{n-2},\dots,M_{0})$, and this is what gives $E(M_{n}\mid M_{n-1},\dots,M_{0})=E(M_{n}\mid M_{n-1})$.
What we show is that for cylinder sets of the form $S=\{M_{n-1}=i_{n-1},\dots,M_{0}=i_{0}\}$ with $P(S)>0$ we have
\begin{align}\int_{S}E(M_{n}\mid M_{n-1},\dots,M_{0})\,dP &=\int_{S}M_{n}\,dP\\ &=E(M_{n}\mid M_{n-1}=i_{n-1},\dots,M_{0}=i_{0})\,P(S)\\ &=E(M_{n}\mid M_{n-1}=i_{n-1})\,P(S)\quad(\text{Markov property})\\ &=\int_{S}E(M_{n}\mid M_{n-1})\,dP, \end{align}
where the last line uses that $E(M_{n}\mid M_{n-1})$ is constant, equal to $E(M_{n}\mid M_{n-1}=i_{n-1})$, on $S$. (If $P(S)=0$, both integrals vanish trivially.)
So $\int_{S}E(M_{n}\mid M_{n-1},\dots,M_{0})\,dP=\int_{S}E(M_{n}\mid M_{n-1})\,dP$ for all sets $S$ such that
$S\in \Pi:=\big\{\{M_{n-1}=i_{n-1},\dots,M_{0}=i_{0}\}\big\}\cup\{\emptyset\}$, where the $i_{k}$ range over the (countable) state space.
Now $\Pi$ is a $\pi$-system which generates $\sigma(M_{n-1},\dots,M_{0})$.
Now consider $\Lambda=\bigg\{A\in\sigma(M_{n-1},\dots,M_{0}):\int_{A}E(M_{n}\mid M_{n-1},\dots,M_{0})\,dP=\int_{A}E(M_{n}\mid M_{n-1})\,dP\bigg\}$.
By the Dominated Convergence Theorem, if $A_{k}\uparrow A$ with each $A_{k}\in \Lambda$, then
\begin{align}\int_{A}E(M_{n}\mid M_{n-1},\dots,M_{0})\,dP &=\lim_{k\to\infty}\int_{\Omega}E(M_{n}\mid M_{n-1},\dots,M_{0})\mathbf{1}_{A_{k}}\,dP\\ &=\lim_{k\to\infty}\int_{\Omega}E(M_{n}\mid M_{n-1})\mathbf{1}_{A_{k}}\,dP\\ &=\int_{A}E(M_{n}\mid M_{n-1})\,dP. \end{align}
Thus $A\in\Lambda$. Moreover $\Omega\in\Lambda$ (both integrals equal $E(M_{n})$), and if $B\subseteq C$ with $B,C\in\Lambda$, then $C\setminus B\in\Lambda$ by subtracting the two defining identities. So $\Lambda$ is a $\lambda$-system.
So invoking the Sierpinski–Dynkin $\pi$-$\lambda$ theorem, we get that for all $A\in\sigma(M_{n-1},\dots,M_{0})$ we have $\int_{A}E(M_{n}\mid M_{n-1})\,dP=\int_{A}E(M_{n}\mid M_{n-1},\dots,M_{0})\,dP$.
Thus $E(M_{n}\mid M_{n-1})\stackrel{a.s.}{=}E(M_{n}\mid M_{n-1},\dots,M_{0})$, which combined with $E(M_{n}\mid M_{n-1})=M_{n-1}$ gives $E(M_{n}\mid\mathcal{F}_{n-1})=M_{n-1}$.
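As an exact sanity check of the integral identity above, one can enumerate all paths up to a small horizon (assuming $M_0=0$, which the problem statement leaves unspecified) and verify $\int_S M_n\,dP=\int_S M_{n-1}\,dP$ on every cylinder set $S$:

```python
from fractions import Fraction

def step_dist(prev, n):
    """Exact distribution of M_n given M_{n-1} = prev, as {value: prob}."""
    if prev == 0:
        p = Fraction(1, 2 * n)
        return {1: p, -1: p, 0: 1 - Fraction(1, n)}
    return {n * prev: Fraction(1, n), 0: 1 - Fraction(1, n)}

def all_paths(n_steps, m0=0):
    """All paths (M_0, ..., M_{n_steps}) with their exact probabilities."""
    paths = [((m0,), Fraction(1))]
    for n in range(1, n_steps + 1):
        paths = [(path + (v,), p * q)
                 for path, p in paths
                 for v, q in step_dist(path[-1], n).items()
                 if q > 0]
    return paths

# Verify  integral of M_3 over S  ==  integral of M_2 over S  for every
# cylinder set S = {M_0 = i_0, M_1 = i_1, M_2 = i_2}:
lhs, rhs = {}, {}
for path, p in all_paths(3):
    key = path[:3]  # (i_0, i_1, i_2) determines the cylinder set S
    lhs[key] = lhs.get(key, Fraction(0)) + p * path[3]
    rhs[key] = rhs.get(key, Fraction(0)) + p * path[2]
print(all(lhs[k] == rhs[k] for k in rhs))  # True
```

Since the cylinder sets form a $\pi$-system generating $\sigma(M_0,M_1,M_2)$, this is exactly the finite computation that the $\pi$-$\lambda$ argument extends to the whole $\sigma$-algebra.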