Given the first-order moving average process
$$ x(n) = e(n) + ce(n-1) $$
where $e(n)$ is a sequence of independent Gaussian random variables with zero mean and unit variance, and $c$ is a weighting constant in the interval $0 < c \le 1$.
Under these conditions, is $x(n)$ a Markov process?
I tried starting from the preceding equations:
$$ \begin{array}{lclcl} x(n-2) & = & e(n-2) & + & c\,e(n-3) \\ x(n-1) & = & e(n-1) & + & c\,e(n-2) \\ x(n) & = & e(n) & + & c\,e(n-1) \end{array} $$
Looking at these equations I intuitively write
$$ f\Big(x(n) \, | \, x(n-1)\Big) = f\Big(x(n) \, | \, x(n-1), x(n-2)\Big) $$
because $x(n)$ is independent of $x(n-2)$ (they share no common $e$ terms). But I can't express this idea in mathematical language.
To compute the distribution of $x_n$ conditionally on $\mathcal G_{n-1}=\sigma(x_{n-1})$, one proceeds as follows. First, note that $x_n=e_n+cx_{n-1}-c^2e_{n-2}$, where $x_{n-1}$ is $\mathcal G_{n-1}$-measurable and $e_n$ is independent of $\mathcal G_{n-1}\vee\sigma(e_{n-2})$. To deal with the $e_{n-2}$ part, note that $(e_{n-2},x_{n-1})$ is Gaussian, hence $e_{n-2}=\alpha x_{n-1}+\beta y_{n-1}$ for some constants $\alpha$ and $\beta$ and some Gaussian random variable $y_{n-1}$ independent of $\mathcal G_{n-1}$. To identify $\alpha$, $\beta$ and $y_{n-1}$, note that $x_{n-1}=e_{n-1}+ce_{n-2}$ and $y_{n-1}=ce_{n-1}-e_{n-2}$ are independent (their covariance is $c-c=0$) and that $e_{n-2}=\alpha x_{n-1}+\beta y_{n-1}$ for $\alpha=c/(1+c^2)$ and $\beta=-1/(1+c^2)$. Thus, conditionally on $\mathcal G_{n-1}$, $e_{n-2}$ is Gaussian with mean $\alpha x_{n-1}$ and variance $\beta^2\mathrm{var}(y_{n-1})=1/(1+c^2)$.
Finally, the decomposition $x_n=(c-c^2\alpha)x_{n-1}+(e_n-c^2\beta y_{n-1})$ shows that $x_n=(c/(1+c^2))x_{n-1}+z_n$, where $z_n$ is Gaussian, independent of $x_{n-1}$, and centered with variance $\sigma^2=1+c^4\beta^2\mathrm{var}(y_{n-1})=(1+c^2+c^4)/(1+c^2)$. Thus, the distribution of $x_n$ conditionally on $\mathcal G_{n-1}$ is Gaussian with mean $(c/(1+c^2))x_{n-1}$ and variance $\sigma^2$.
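As a sanity check (not part of the proof), one can verify this conditional law by simulation. Since $(x_n,x_{n-1})$ is jointly Gaussian, the conditional mean is the linear least-squares predictor and the conditional variance is the residual variance of that regression; for $c=1/2$ the slope should be $c/(1+c^2)=0.4$ and the variance $(1+c^2+c^4)/(1+c^2)=1.05$. A minimal sketch (parameter choices are illustrative):

```python
import numpy as np

# Simulate the MA(1) process x(n) = e(n) + c e(n-1) with c = 0.5.
rng = np.random.default_rng(0)
c, N = 0.5, 200_000
e = rng.standard_normal(N + 1)
x = e[1:] + c * e[:-1]

# For jointly Gaussian variables, E[x_n | x_{n-1}] is the linear
# least-squares predictor; its slope and residual variance estimate
# the conditional mean slope and conditional variance.
slope = np.cov(x[1:], x[:-1])[0, 1] / np.var(x[:-1])
resid_var = np.var(x[1:] - slope * x[:-1])

print(slope)      # ≈ c/(1+c^2) = 0.4
print(resid_var)  # ≈ (1+c^2+c^4)/(1+c^2) = 1.05
```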
To compute the distribution of $x_n$ conditionally on $\mathcal H_{n-1}=\sigma(x_{n-1},x_{n-2})$, one proceeds as above. The result is that $x_n=\gamma x_{n-1}+\delta x_{n-2}+t_n$ for some Gaussian random variable $t_n$ independent of $\mathcal H_{n-1}$, where the normal equations (using $\mathrm{cov}(x_n,x_{n-1})=c$, $\mathrm{cov}(x_n,x_{n-2})=0$ and $\mathrm{var}(x_n)=1+c^2$) yield $\gamma=c(1+c^2)/(1+c^2+c^4)$ and $\delta=-c^2/(1+c^2+c^4)$. Thus, the distribution of $x_n$ conditionally on $\mathcal H_{n-1}$ is Gaussian with mean $\gamma x_{n-1}+\delta x_{n-2}$ and variance $\tau^2=1+c^2-\gamma c=(1+c^2)(1+c^4)/(1+c^2+c^4)$. Since $\delta\ne0$, the distributions of $x_n$ conditionally on $\mathcal G_{n-1}$ and on $\mathcal H_{n-1}$ differ, hence $(x_n)_n$ is not a Markov process.
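The key point, $\delta\ne0$, can also be seen numerically: regressing $x_n$ on $(x_{n-1},x_{n-2})$ recovers the conditional mean, and the coefficient on $x_{n-2}$ stays bounded away from zero. A quick illustrative check (the parameter choices are assumptions):

```python
import numpy as np

# Same MA(1) simulation as above, c = 0.5.
rng = np.random.default_rng(1)
c, N = 0.5, 200_000
e = rng.standard_normal(N + 1)
x = e[1:] + c * e[:-1]

# Least-squares fit x(n) ~ gamma x(n-1) + delta x(n-2); with Gaussian
# inputs this is the conditional mean given (x(n-1), x(n-2)).
A = np.column_stack([x[1:-1], x[:-2]])
gamma, delta = np.linalg.lstsq(A, x[2:], rcond=None)[0]

# Theoretical values from the normal equations:
# gamma = c(1+c^2)/(1+c^2+c^4) ≈ 0.476, delta = -c^2/(1+c^2+c^4) ≈ -0.190
print(gamma, delta)
```

The nonzero estimate of `delta` confirms that conditioning on $x_{n-2}$ changes the conditional mean of $x_n$.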
The argument can be adapted to show that, for each $k\geqslant1$, $(x_n)_n$ is not a Markov process with memory $k$ either.