Consider an NLP $\min\{f(x): g(x) \le 0\}$ with no equality constraints. I have to prove that the problem remains feasible for small steps: if $g(x) < 0$, then $g(x + td) \le 0$ for sufficiently small $t > 0$, where $t$ is the step length and $d$ is the line-search direction (gradient descent).
My thinking was that since $t$ is positive and the direction $d$ cannot be negative (I am not too sure about this), their product is positive, so the only way for $g(x + td)$ to be zero or negative is for $g(x)$ to be negative.
If I understood correctly, you are given $g(x) < 0$ and you have to show that for any $d$ there is some small enough $t > 0$ for which you still have $g(x+td) \le 0$.
So if $g(x) < 0$, you move along $d$. If $g(r) < 0$ for every point $r$ on the ray $R$ from $x$ in the direction of $d$, the condition is fulfilled trivially.
So assume there is some point $r$ on $R$ where $g(r) = 0$. (If there are many such points, pick the one closest to $x$; it exists because $g$ is continuous, so its zero set is closed.) Since $r$ is on $R$, there must exist a $T > 0$ such that $x + Td = r$, and so you have $g(x) < 0$, $g(x+Td) = 0$, and $g(x+td) < 0$ for all $0 \le t < T$. Hence any step length $t < T$ keeps the iterate feasible, as desired.
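A quick numerical sanity check of the argument, using a hypothetical problem of my own choosing (the objective $f(x) = \|x - 2\|^2$ and constraint $g(x) = \|x\|^2 - 1$ are assumptions for illustration, not part of the question): starting from a strictly feasible point, a gradient-descent step stays feasible as long as the step length is small enough.

```python
import numpy as np

# Hypothetical instance: minimize f(x) = ||x - 2||^2
# subject to g(x) = ||x||^2 - 1 <= 0 (the closed unit ball).
def f_grad(x):
    return 2.0 * (x - 2.0)

def g(x):
    return float(np.dot(x, x) - 1.0)

x = np.array([0.5, 0.3])   # strictly feasible: g(x) = -0.66 < 0
assert g(x) < 0

d = -f_grad(x)             # gradient-descent direction

# Since g is continuous and g(x) < 0, sufficiently small steps
# t > 0 keep the iterate feasible, i.e. g(x + t*d) <= 0.
for t in [1e-2, 1e-3, 1e-4]:
    print(t, g(x + t * d))
```

Note that feasibility is only guaranteed for small $t$: in this instance a larger step such as $t = 0.1$ already leaves the unit ball, which is exactly why the argument only produces some threshold $T > 0$ rather than feasibility for all step lengths.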