I learned that the partial sums of the Fourier series of a function with a jump discontinuity always overshoot the function near the jump by about 9% of the size of the jump, and that this percentage does not die out as we increase the number of terms in the partial sum.
On the other hand, by Dirichlet's theorem the Fourier series converges at a jump discontinuity to the average of the left and right limits.
These two statements feel contradictory to me. The first says that the value of the Fourier series at a discontinuity always has a fixed overshoot, for every partial sum. The second says that the Fourier series gets closer and closer to the average of the two sides of the discontinuity, which implies that the overshoot decreases and approaches zero as we increase the number of terms in the partial sum.
So can someone clear up my confusion?
The Gibbs phenomenon doesn't affect the point at the jump discontinuity itself. It says that, if you go a bit to either side of the jump discontinuity (at a distance proportional to $\frac{1}{n}$, where $n$ is the number of terms of the Fourier series that you've summed), you will observe that the partial sum overshoots the target by about $9\%$ of the size of the jump.
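You can check this numerically. Here is a minimal sketch (using NumPy, with the series for $\frac{\pi}{4}\operatorname{sgn}(x)$ worked out below) that scans just to the right of $0$ for the peak of the $n$-term partial sum:

```python
import numpy as np

# Partial sum with the first n nonzero terms of the Fourier series of
# (pi/4) * sgn(x) on (-pi, pi):  S_n(x) = sum over odd k < 2n of sin(k x) / k.
def partial_sum(x, n):
    s = np.zeros_like(x)
    for k in range(1, 2 * n, 2):       # odd frequencies 1, 3, ..., 2n - 1
        s += np.sin(k * x) / k
    return s

jump = np.pi / 2                       # the function jumps from -pi/4 to +pi/4
x = np.linspace(1e-6, 0.5, 50000)      # scan a window just to the right of 0
for n in (10, 100, 1000):
    s = partial_sum(x, n)
    i = s.argmax()
    print(f"n={n:5d}  peak at x={x[i]:.5f}  "
          f"overshoot = {(s[i] - np.pi / 4) / jump:.2%} of the jump  "
          f"S_n(0) = {partial_sum(np.zeros(1), n)[0]:.1f}")
```

The overshoot stays pinned near $9\%$ of the jump while its location marches toward $0$ roughly like $\frac{1}{n}$, and the value at $0$ itself is exactly $0$ for every $n$, since each term $\sin(k \cdot 0)$ vanishes.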
The classic example is illustrated as follows, where I have plotted the 20th partial sum of the Fourier series of $\frac{\pi}{4}\operatorname{sgn}(x)$, where $\operatorname{sgn}$ is the sign function:
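A minimal matplotlib sketch along these lines (I'm taking the 20th partial sum to mean the first 20 nonzero terms, i.e. the odd harmonics up to $k = 39$):

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-np.pi, np.pi, 4000)
s = sum(np.sin(k * x) / k for k in range(1, 40, 2))  # first 20 nonzero terms

plt.plot(x, s, "b", label="partial Fourier sum (20 terms)")
plt.plot(x, np.pi / 4 * np.sign(x), "r", label=r"$\frac{\pi}{4}\,\mathrm{sgn}(x)$")
plt.legend()
plt.show()
```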
In blue is the partial Fourier sum and in red is the sign function. Note that, just to the right of $0$, where the jump discontinuity is, the partial sum exceeds the intended value, and similarly to the left. However, right at the jump discontinuity, the partial sum is exactly $0$, which is the average of the limits from either side. Adding more terms moves the place where the overshoot happens closer to $0$, but this issue is orthogonal to the question of what exactly the value at $0$ converges to.