Consider an equation like
$$y''(x)+y'(x)-xy(x)=0.\tag1$$
Using the method of dominant balance, we substitute $y(x)=e^{S(x)}$ and get
$$S''(x)+(S'(x))^2+S'(x)-x=0.\tag2$$
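(As a sanity check, the substitution $(1)\to(2)$ can be verified symbolically; here is a minimal sympy sketch, with variable names of my own choosing:)

```python
import sympy as sp

x = sp.symbols('x')
S = sp.Function('S')

# Substitute y = exp(S(x)) into y'' + y' - x*y = 0 and divide out exp(S)
y = sp.exp(S(x))
lhs = sp.expand((sp.diff(y, x, 2) + sp.diff(y, x) - x * y) / y)

# This should reproduce equation (2): S'' + (S')^2 + S' - x = 0
print(lhs)
```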
Now we assume that $S''(x)\ll (S'(x))^2$ as $x\to\infty$ and then get
$$(S'(x))^2+S'(x)\sim x,\tag3$$
which we can solve to get
$$S(x)\sim\frac12\left(-x\pm\frac16(4x+1)^{3/2}\right).\tag4$$
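(To spell out the intermediate step: solving the quadratic $(3)$ for $S'$ gives
$$S'(x)\sim\frac{-1\pm\sqrt{1+4x}}{2},$$
and integrating this yields $(4)$, up to an additive constant.)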
Then we check that the assumption is consistent with the result, and since it is, we conclude that $(4)$ gives the true asymptotic behavior of $\ln(y(x))$ as $x\to\infty$.
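(Explicitly, the check runs as follows: differentiating the solution above gives
$$S''(x)\sim\pm(4x+1)^{-1/2}\to0,\qquad (S'(x))^2\sim x\to\infty,$$
so indeed $S''(x)\ll(S'(x))^2$ as $x\to\infty$, as assumed.)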
Now my question is: why is it OK to first assume something and then only check that the results derived from this assumption are consistent with it? Can't such a procedure give us false results, even when the consistency check says everything is consistent?