Let $\{X_k\}_{k=1}^n$ be iid random variables that are symmetric about $0$, i.e. $X_1 \overset{d}{=} -X_1$. Define $S_n=\sum_{i=1}^n X_i$. Show that $P\left(|S_n|\geq\max_{1\leq i\leq n}|X_i|\right)\geq\dfrac{1}{2}$.
I believe we would need some nice inequality for this. The first thing that comes to mind is Kolmogorov's inequality, but I am not sure it applies here.
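Before looking for the right inequality, it can help to sanity-check the claim numerically. The following is a quick Monte Carlo sketch (not a proof), using standard normal $X_i$ as one example of a symmetric distribution; the function name `estimate` and the parameter choices are mine, not from the problem.

```python
import random

def estimate(n=5, trials=100_000, seed=0):
    """Monte Carlo estimate of P(|S_n| >= max_i |X_i|) for iid
    standard normal X_i (symmetric about 0)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        xs = [rng.gauss(0.0, 1.0) for _ in range(n)]
        # Event of interest: the sum dominates the largest summand in absolute value.
        if abs(sum(xs)) >= max(abs(x) for x in xs):
            hits += 1
    return hits / trials

print(estimate())  # consistently above 1/2, matching the claimed bound
```

Repeating this for other symmetric distributions (e.g. uniform on $[-1,1]$, or $\pm 1$ coin flips) gives estimates at or above $1/2$ as well.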
Let $Y_i = |X_i|$, and let $s_1,\ldots,s_n$ be independent uniformly random signs drawn from $\{\pm 1\}$, independent of the $Y_i$. The variables $Y_i$ are iid. By the symmetry of the $X_i$, one way of generating the $X_i$ is to generate the $Y_i$ and $s_i$ and set $X_i = s_i Y_i$.
We will show that for every $y_1,\ldots,y_n \geq 0$, $$ P\big(|S_n| \geq \max_i y_i \;\big|\; \vec{Y} = \vec{y}\big) \geq 1/2, $$ which implies your statement by averaging over $\vec{Y}$. We can assume without loss of generality that $y_1 = \max_i y_i$ (relabel the indices), and further condition on the value of $s_1$, which by symmetry we may take to be $s_1 = 1$. Under this conditioning, we prove an even stronger statement: $$ P\big(S_n \geq y_1 \;\big|\; \vec{Y} = \vec{y},\, s_1=1\big) \geq 1/2. $$ This is indeed stronger, since $S_n \geq y_1$ implies $|S_n| \geq \max_i y_i$.
In order to prove the latter statement, we consider a different way of generating the remaining signs $s_2,\ldots,s_n$. We first choose $n$ independent uniform random signs $t_2,\ldots,t_n,u$, and then take $s_2 = ut_2,\ldots,s_n = ut_n$; this yields the same (uniform) distribution on $(s_2,\ldots,s_n)$. Taking $Z = \sum_{i=2}^n t_i y_i$, we note that $$ S_n = y_1 + uZ. $$ Since $u$ is independent of $Z$: if $Z = 0$ then $P(S_n \geq y_1) = 1$; if $Z > 0$ then $P(S_n \geq y_1) = P(u=1) = 1/2$; and if $Z < 0$ then $P(S_n \geq y_1) = P(u=-1) = 1/2$. This completes the proof.