Consider a sequence $x_1,\dots,x_n\in [0,1]^d$. Suppose that $\mathcal{F}$ is a subset of the space of continuous functions from $[0,1]^d$ to $\mathbb{R}$.
Moreover, let $f\in\mathcal{F}$ be such that $$f(x_1)< f(x_2)< \dots < f(x_n).$$
I'm wondering whether there is any condition on $\mathcal{F}$ and some $\delta>0$ under which we can conclude that for all $g\in\mathcal{F}$ with $\|f-g\|_{\infty}<\delta$, $$g(x_1)< g(x_2)< \dots < g(x_n).$$
I think this may be true if we take $f,g$ Lipschitz and something like $\delta=\min_{i\neq j}\|x_i-x_j\|$, but I'm not sure.
Thank you very much.
Edit: I changed the condition
$$f(x_1)\leq f(x_2)\leq \dots \leq f(x_n),$$
to
$$f(x_1)< f(x_2)< \dots < f(x_n).$$
With strictly increasing values $f(x_1) < f(x_2) < \cdots < f(x_n)$ you can define $\epsilon = \frac 12 \min_{1 \le k < n}\bigl(f(x_{k+1})-f(x_k)\bigr)$. Then $\Vert f-g \Vert_\infty < \epsilon$ implies $$ g(x_k) < f(x_k) + \epsilon \le f(x_{k+1}) - \epsilon < g(x_{k+1}) $$ for $1 \le k < n$.
This works for all bounded functions $f, g$ from an arbitrary domain to $\Bbb R$; continuity is not relevant here.
Requiring Lipschitz continuity for $f$ does not help because that does not guarantee a minimum distance between the $f(x_k)$.
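The ε-argument above is easy to check numerically. Here is a minimal sketch (the particular $f$, the points $x_k$, and the perturbation scheme are illustrative choices, not from the question): pick $\epsilon$ as half the minimum gap between consecutive values $f(x_k)$, perturb $f$ by less than $\epsilon$ in sup-norm at the points $x_k$, and verify that the strict ordering survives.

```python
import random

def f(x):
    # Any function that is strictly increasing along the chosen points works.
    return x ** 2

xs = [0.1, 0.3, 0.6, 0.9]
fvals = [f(x) for x in xs]

# eps = (1/2) * min_k (f(x_{k+1}) - f(x_k)), as in the answer.
eps = 0.5 * min(b - a for a, b in zip(fvals, fvals[1:]))

random.seed(0)
for _ in range(1000):
    # g agrees with f up to a perturbation of sup-norm strictly below eps,
    # so ||f - g||_inf < eps at the sample points.
    gvals = [v + random.uniform(-0.99 * eps, 0.99 * eps) for v in fvals]
    # The strict ordering g(x_1) < ... < g(x_n) must be preserved.
    assert all(a < b for a, b in zip(gvals, gvals[1:]))
```

Note that the check only perturbs $g$ at the points $x_k$, which is all the argument uses: no continuity of $g$ is needed, matching the remark above.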