We're given the wave equation $u_{tt} = c^2 u_{xx}$ for $t>0$, $x\in\mathbb{R}$, with $c>0$ and initial conditions $u(x,0) = f(x)$, $u_t(x,0) = g(x)$.
Assume that $\lim_{|x|\rightarrow\infty}\frac{f(x)}{x} = a\in\mathbb{R}$ and $\lim_{x\rightarrow\infty} g(x) = b\in\mathbb{R}$.
The problem asks to show that $\lim_{t\rightarrow\infty}\frac{u(x,t)}{t}$ exists, i.e. equals some $c_1\in\mathbb{R}$.
I think this problem is missing a hypothesis: the behaviour of $g$ at $-\infty$ seems to be needed to complete the solution. For the record, I applied d'Alembert's formula followed by L'Hôpital's rule and got stuck. My intuition also says information is missing, since d'Alembert reduces the problem to computing the average of $g$ over an interval whose left endpoint tends to $-\infty$.
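To make that reduction explicit (this is my own computation, not part of the original problem statement): d'Alembert gives

$$u(x,t)=\frac{f(x+ct)+f(x-ct)}{2}+\frac{1}{2c}\int_{x-ct}^{x+ct}g(s)\,ds.$$

For fixed $x$, as $t\to\infty$ we have $x+ct\to+\infty$ and $x-ct\to-\infty$, so

$$\frac{f(x+ct)}{2t}=\frac{c}{2}\cdot\frac{f(x+ct)}{x+ct}\cdot\frac{x+ct}{ct}\to\frac{ca}{2},\qquad
\frac{f(x-ct)}{2t}=\frac{c}{2}\cdot\frac{f(x-ct)}{x-ct}\cdot\frac{x-ct}{ct}\to-\frac{ca}{2},$$

and the $f$-terms cancel in $u/t$. What remains is

$$\frac{u(x,t)}{t}=\frac{f(x+ct)+f(x-ct)}{2t}+\frac{1}{2ct}\int_{x-ct}^{x+ct}g(s)\,ds,$$

where the second term is exactly the average of $g$ over $[x-ct,x+ct]$. If one additionally assumes $\lim_{x\to-\infty}g(x)=b'$ exists, this average tends to $(b+b')/2$, so $c_1=(b+b')/2$; but the statement as given controls $g$ only at $+\infty$.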
My question, therefore, is: is the problem missing information, or not?