Why does the Gibbs phenomenon not occur when arithmetic means are used?
The arithmetic means of a series $\sum_{n=0}^{\infty} a_n$ are the sequence of numbers $\sigma_n$ defined by $$\sigma_n= \frac{1}{n} \sum_{k=0}^{n-1} s_k$$ for each $n \in \mathbb{N}$, where $s_k$ denotes the $k$-th partial sum of the series. If $\lim_{n\rightarrow \infty} \sigma_n$ exists, then we say that the series is summable by arithmetic means to this limit. It seems to me that this concept is also known as Cesàro summation.
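As a quick numeric sketch of this definition (the standard textbook example, not from the question itself): Grandi's series $1 - 1 + 1 - 1 + \cdots$ has partial sums that oscillate between $1$ and $0$, yet its arithmetic means converge to $1/2$.

```python
def cesaro_means(terms):
    """Return the arithmetic means sigma_n of the partial sums of `terms`."""
    sigmas = []
    partial = 0.0   # current partial sum s_{n-1}
    running = 0.0   # accumulated sum s_0 + s_1 + ... + s_{n-1}
    for n, a in enumerate(terms, start=1):
        partial += a
        running += partial
        sigmas.append(running / n)  # sigma_n = (1/n) * sum_{k=0}^{n-1} s_k
    return sigmas

grandi = [(-1) ** k for k in range(1000)]  # 1, -1, 1, -1, ...
print(cesaro_means(grandi)[-1])  # -> 0.5, even though s_k never converges
```

So averaging smooths out oscillations of the partial sums, which is exactly the behavior at issue below.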
Now, if we apply this definition to the Fourier series of a function $f$ with period $2 \pi$, then each arithmetic mean $\sigma_n$ is a function of $x$: $$\sigma_n(x)=\frac{1}{n} \sum_{k=0}^{n-1} S_k(x)$$ where for each $k$ $$S_k(x)=\frac{1}{2}A_0+ \sum_{m=1}^{k} (A_m \cos mx +B_m \sin mx)$$ is a partial sum of the Fourier series for $f$, and $A_m$ and $B_m$ are the usual Fourier coefficients of $f$.
On the other hand, we know that $$\sigma_n(x)=\frac{1}{2 \pi} \int_{-\pi}^{\pi} f(x+u)F_n(u)\,du $$ where $$F_n(u)=\frac{1}{n}\left ( \frac{\sin (nu/2)}{\sin(u/2)} \right )^{2}$$ is the Fejér kernel.
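To see the contrast numerically, here is a small sketch (my own illustration, assuming the standard square wave $f(x)=\operatorname{sign}(x)$, whose Fourier series is $\frac{4}{\pi}\sum_{m \text{ odd}} \frac{\sin mx}{m}$): the partial sums $S_n$ overshoot the jump by roughly 9% of its height, while the arithmetic means $\sigma_n$ stay between the jump values.

```python
import numpy as np

def partial_sum(x, k):
    # S_k(x) for the square wave sign(x): (4/pi) * sum over odd m <= k of sin(mx)/m
    s = np.zeros_like(x)
    for m in range(1, k + 1, 2):
        s += (4 / np.pi) * np.sin(m * x) / m
    return s

def fejer_mean(x, n):
    # sigma_n(x) = (1/n) * sum_{k=0}^{n-1} S_k(x)
    return sum(partial_sum(x, k) for k in range(n)) / n

x = np.linspace(0.001, np.pi / 2, 2000)  # sample just to the right of the jump at 0
n = 50
print(partial_sum(x, n).max())   # roughly 1.18: Gibbs overshoot above f = 1
print(fejer_mean(x, n).max())    # below 1: no overshoot
```

The absence of overshoot is visible in the kernel formula above: $F_n(u) \ge 0$ and $\frac{1}{2\pi}\int_{-\pi}^{\pi} F_n(u)\,du = 1$, so $\sigma_n(x)$ is a weighted average of values of $f$ and therefore satisfies $\inf f \le \sigma_n(x) \le \sup f$, which the Dirichlet kernel (which changes sign) does not guarantee for $S_n$.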
I have no attempt of my own because I don't know how to start. I considered using the fact that if $f$ is continuous and has period $2 \pi$, then $\lim_{n\rightarrow \infty} \sigma_n(x)=f(x)$ uniformly in $x$; but the Gibbs phenomenon occurs at points where $f$ has a jump discontinuity, so that fact does not apply directly. Can anyone help me? Thanks in advance.