An oscillating random walk is a sequence of partial sums $S_n := X_1 + \dots + X_n$ with $S_0 := 0$, where $X_1, X_2, \dots$ are i.i.d. random variables, satisfying
\begin{align} \liminf_{n \to \infty} S_n = -\infty \quad \text{and} \quad \limsup_{n\to\infty} S_n = +\infty. \end{align}
First of all, is this the usual definition of an oscillating random walk?
Does this imply $E(X_1) = 0$?
Note that I did not make any assumptions about the law of the $X_i$; in particular, the variance could be infinite.
No, that's not the usual definition of "oscillating random walk", as described in J. H. B. Kemperman's "The Oscillating Random Walk" (Stochastic Processes and their Applications, 1974, pp. 1-29; Kemperman cites earlier work by B. R. Bhat). The general concept covers Markov chains in which the distribution of the next summand depends on whether (say) the sum so far is positive or negative.
In your simple random walk case, with the lim sup and lim inf hypotheses, if you also assume that $EX_1$ exists (that is, $E|X_1| < \infty$), then your conclusion $EX_1 = 0$ follows from the strong law of large numbers: if $EX_1 > 0$ then, with probability $1$, $S_n/n \to EX_1 > 0$, so $S_n > 0$ for all sufficiently large $n$, contradicting $\liminf_n S_n = -\infty$; the case $EX_1 < 0$ is symmetric. Without the integrability assumption the implication fails: for example, a random walk with symmetric Cauchy increments oscillates almost surely, yet $EX_1$ does not exist.
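As an illustrative sketch (my own addition, not part of the argument above), one can simulate a random walk with symmetric Cauchy increments and watch it cross zero repeatedly, even though $E|X_1| = \infty$; the variable names and sample size are arbitrary choices:

```python
import numpy as np

# Sketch: a symmetric Cauchy random walk oscillates between very negative
# and very positive values even though E[X_1] does not exist, so the
# liminf/limsup behavior alone cannot force E[X_1] = 0.
rng = np.random.default_rng(0)

n = 100_000
steps = rng.standard_cauchy(n)   # i.i.d. symmetric Cauchy increments X_1, ..., X_n
walk = np.cumsum(steps)          # partial sums S_1, ..., S_n

# Count how often the walk changes sign along the way; frequent sign
# changes are consistent with liminf S_n = -inf and limsup S_n = +inf a.s.
sign_changes = int(np.sum(np.diff(np.sign(walk)) != 0))
print("sign changes:", sign_changes)
print("min and max of S_n:", walk.min(), walk.max())
```

Of course, a finite simulation cannot prove oscillation; it only illustrates that heavy-tailed, mean-free increments are compatible with the lim sup/lim inf behavior in the question.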