Let $\{X_n , n\geq 1\}$ be a sequence of random variables, each uniformly distributed on $[0,2]$, and let $S_n=\sum_{k=1}^nX_k$.
$\mathbf a)$ Prove that $\left\{\frac{S_n}{n},n\geq 1 \right\}$ is a Markov chain.
$\mathbf b)$ Is $\left\{\Bigl[\frac{S_n}{n}\Bigr],n\geq 1 \right\}$ a Markov chain?
Definition: The random process $\{X_t, t\in T\}$ is called a Markov process if for any $n\geq 1$, any $t_1<t_2<\cdots<t_{n+1}\in T$, and any $x_1,x_2,\ldots,x_{n+1}$ from the set of all possible values, the following equality holds:
$\operatorname{Pr}(X_{t_{n+1}}<x_{n+1}\mid X_{t_{n}}=x_n,\ldots,X_{t_1}=x_1)=\operatorname{Pr}(X_{t_{n+1}}<x_{n+1}\mid X_{t_n}=x_n)$.
Since the process is observed at discrete time points, it is a Markov chain.
I'm not told whether the $X_n$ are independent of each other. We know the pdf and cdf of $X_n$:
$$f_{X_n}(x)=\begin{cases}\frac{1}{2}, & x\in[0,2] \\ 0, & \text{otherwise} \end{cases}$$
$$F_{X_n}(x)=\begin{cases} 0, & x <0 \\
x/2, & 0 \leq x \leq 2 \\
1, & x > 2
\end{cases}$$
I don't know how to start or what to use. I'm aware that $\frac{S_n}{n}=\overline{X}_n$, and by linearity of expectation $E[\overline{X}_n]=E[X_1]$ even without independence. I wanted to find the pdf of $\overline{X}_n$ using characteristic functions, but I can't use the neat product property for sums of random variables because I don't know whether the $X_n$ are independent.
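If we *assume* the $X_k$ are i.i.d. (which such problems usually intend, though it is not stated here), a quick Monte Carlo sketch confirms the one fact that holds regardless of independence, namely $E[S_n/n]=E[X_1]=1$. The function name and parameters below are my own, just for illustration:

```python
import random

# Assumption (not stated in the problem): X_1, ..., X_n are i.i.d. Uniform[0, 2].
# We check empirically that E[S_n / n] = E[X_1] = 1.

def sample_mean(n, rng):
    """Return one realization of S_n / n with X_k ~ Uniform[0, 2]."""
    s = sum(rng.uniform(0.0, 2.0) for _ in range(n))
    return s / n

rng = random.Random(42)  # fixed seed for reproducibility
trials = 20_000
n = 50
estimate = sum(sample_mean(n, rng) for _ in range(trials)) / trials
print(estimate)  # close to 1, as the law of large numbers suggests
```

This only illustrates the expectation; it does not resolve the Markov-chain question, which needs the joint distribution of the $X_k$.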