When the mean of a non-negative, integer-valued random variable $X_n$, for example counting paths in a random graph on $n$ vertices between two fixed vertices, goes to zero,
$$\lim_{n \to \infty}\mathbb{E}[X_n] =0$$
can we have $\lim_{n \to \infty}\mathbb{E}[X_n^3] = \infty$, and if so, under what conditions does this occur? It seems plausible, since we just need two different sequences to have different limits, but it is also a bit counter-intuitive: we are counting objects, and the $k$-th raw moment gives the expected number of ordered $k$-tuples of them (with repetition), so the count of triples would have to blow up even as the objects themselves become rare.

Sure. Start with an example that has a finite first moment but an infinite third moment, such as $X$ with probability mass function proportional to $\frac{1}{x^3}$ for $x \geq 1$. Then modify it a bit: let $X_n$ have probability mass function proportional to $\frac{1}{x^3 \log(x)^n}$ (restricted to $x \geq 3$, so that $\log(x) > 1$).
The (unnormalized) first moment tends to zero, since $\sum_{x=3}^{\infty} \frac{x}{x^3 \log(x)^n} < \frac{1}{\log(3)^n} \sum_{x=1}^{\infty} \frac{1}{x^2} = \frac{1}{\log(3)^n} \cdot \frac{\pi^2}{6} \to 0$, while the third moment is infinite for every $n$, since $\sum_{x=3}^{\infty} \frac{x^3}{x^3\log(x)^n} = \sum_{x=3}^{\infty} \frac{1}{\log(x)^n}$ diverges ($\log(x)^n$ eventually grows slower than $x$, so the terms dominate a harmonic tail).
You can also adjust the moment at which this occurs by changing the power on $x$ in the denominator: in the log-free starting example, changing it from $3$ to $5$ guarantees existence of the third moment but not the fourth, for instance.
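Spelled out for the log-free example with $p(x) \propto 1/x^5$, the relevant (unnormalized) sums are
$$\mathbb{E}[X^3] \propto \sum_{x \geq 1} \frac{x^3}{x^5} = \sum_{x \geq 1} \frac{1}{x^2} < \infty, \qquad \mathbb{E}[X^4] \propto \sum_{x \geq 1} \frac{x^4}{x^5} = \sum_{x \geq 1} \frac{1}{x} = \infty.$$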
EDIT: I'd like to thank A rural reader for pointing out a flaw in my argument: by sweeping the normalization constant under the rug, I overlooked that it grows with $n$, and I believe it grows fast enough to prevent $\mathbb E[X_n] \to 0$, so the sequence above doesn't actually do what it's supposed to. Instead, let me choose a much simpler example, more in the spirit of William M's comment:
Let $X_n$ have the following distribution: $$X_n = \begin{cases} n^2, & \text{w/ prob. } \frac{1}{n^3} \\ 0, & \text{w/ prob. } 1 - \frac{1}{n^3} \end{cases}.$$ Then $\mathbb E[X_n] = \frac{n^2}{n^3} = \frac{1}{n} \to 0$, while $\mathbb E[X_n^3] = \frac{(n^2)^3}{n^3} = n^3 \to \infty$.
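Both points can be checked numerically; here is a quick sketch (the truncation cutoff is an arbitrary choice of mine, not part of the argument). It computes the properly normalized mean of the first construction, and the exact moments of the two-point example:

```python
import math

def normalized_mean(n, cutoff=10_000):
    """Mean of the first construction: pmf proportional to
    1/(x^3 * log(x)^n) on x >= 3, with sums truncated at `cutoff`
    (the terms decay at least as fast as 1/x^3, so the tail is tiny)."""
    xs = range(3, cutoff)
    weights = [1.0 / (x**3 * math.log(x) ** n) for x in xs]
    z = sum(weights)  # the normalization constant the edit refers to
    return sum(x * w for x, w in zip(xs, weights)) / z

def two_point_moments(n):
    """First and third raw moments of the final example:
    X_n = n^2 with probability 1/n^3, else 0."""
    p = 1.0 / n**3
    return n**2 * p, (n**2) ** 3 * p  # = 1/n and n^3

# As n grows, the normalized mean of the first construction approaches 3
# (the smallest point of the support), not 0 -- confirming the edit:
print(normalized_mean(50))

# The two-point example behaves as claimed: E[X_n] -> 0, E[X_n^3] -> infinity.
print(two_point_moments(10))
```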