Question: Suppose we are given a function $f(x,t)$ that is uniformly Lipschitz continuous in $x$.
That is to say, for all $x, \hat{x} \in \mathbb{R}$ and all $t$ we have $|f(x,t) - f(\hat{x},t)| \leq L|x-\hat{x}|$.
I want to show that $|f(x,t)| \leq L(1+|x|)$.
Context: I am reading Evans's An Introduction to Stochastic Differential Equations, specifically the proof of existence and uniqueness of solutions to SDEs. The original problem is multidimensional, and I will post the statement below for completeness.
[Theorem]
Suppose that $\textbf{b}: \mathbb{R}^n \times [0,T] \to \mathbb{R}^n$ and $\textbf{B}: \mathbb{R}^n\times [0,T] \to \mathbb{R}^{m\times n}$ are continuous and satisfy the following conditions.
- (1) For some constant $L$ and all $0 \leq t \leq T$, $\textbf{x}, \hat{\textbf{x}} \in \mathbb{R}^n$ we have \begin{equation*} \begin{split} |\textbf{b}(\textbf{x}, t) - \textbf{b}(\hat{\textbf{x}}, t)| &\leq L|\textbf{x}-\hat{\textbf{x}}| \\ |\textbf{B}(\textbf{x}, t) - \textbf{B}(\hat{\textbf{x}}, t)| &\leq L|\textbf{x}-\hat{\textbf{x}}| \end{split} \end{equation*}
- (2) For the same constant $L$ (as above) and all $0 \leq t \leq T$, $\textbf{x} \in \mathbb{R}^n$ we have \begin{equation*} \begin{split} |\textbf{b}(\textbf{x}, t)| &\leq L(1+|\textbf{x}|) \\ |\textbf{B}(\textbf{x}, t)| &\leq L(1+|\textbf{x}|) \end{split} \end{equation*} (... the rest of the theorem statement is irrelevant to this question.)
After stating the theorem, the author remarks that it "is possible to show that $(1) \implies (2)$" directly. At first this sounds believable, but somehow I am getting stuck.
My attempt: WLOG let $n=1$, and consider only the first condition, for $\textbf{b}$; the argument for $\textbf{B}$ is identical. Setting $\hat{x} = 0$, the reverse triangle inequality gives $$|f(x,t)|-|f(0,t)| \leq |f(x,t) - f(0,t)| \leq L|x|,$$ which tells us that $$|f(x,t)| \leq |f(0,t)| + L|x|.$$
At this point I am forced to somehow show $|f(0,t)| \leq L$, but since no additional assumptions on $f$ are given, I find this hard to believe. Intuitively, the Lipschitz condition only bounds the derivative of $f$ with respect to $x$ by $L$; the function value itself can be anything it wants. However, if the statement I am trying to prove is true, then indeed $|f(0,t)| \leq L$. So some miraculous force is stopping $|f(0,t)|$ from exceeding $L$; any help in figuring out what that might be is very much appreciated!
Resolution: The comment from @Shalop was a great first step, but the details had to be ironed out; I am recording them here for my own reference, and in case someone else needs help with this. Let $T>1$. The example $f(x,t) = t^2 + x$ seems at first glance to break the implication, because $|f(x,t) - f(\hat{x},t)| \leq |x - \hat{x}|$, so we can take $L=1$, while $|f(0,t)| = t^2$ exceeds $1$ for $t$ close to $T$. However, $L$ does not have to be $1$: if we instead take $L = T^2$, then we still have
$$|f(x,t) - f(\hat{x},t)| \leq T^2|x-\hat{x}|$$ and at the same time $$|f(0,t)| = t^2 \leq T^2 = L,$$ giving us what we need.
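As a quick numerical sanity check (a minimal sketch; the choice $T = 2$ and the grids are illustrative, not part of the argument), we can verify on a grid that with $L = T^2$ the example satisfies both the Lipschitz condition and the bound at $x = 0$:

```python
# Sanity check for the example f(x, t) = t^2 + x with T > 1.
# Claim: with L = T^2, f is L-Lipschitz in x and |f(0, t)| <= L on [0, T].

T = 2.0          # any T > 1 will do
L = T ** 2

def f(x, t):
    return t ** 2 + x

xs = [i / 10 - 5 for i in range(101)]      # grid in [-5, 5]
ts = [T * i / 100 for i in range(101)]     # grid in [0, T]

# Lipschitz condition with constant L = T^2 (small tolerance for float error)
lip_ok = all(abs(f(x, t) - f(y, t)) <= L * abs(x - y) + 1e-12
             for t in ts for x in xs for y in xs)

# Bound at the origin: |f(0, t)| = t^2 <= T^2 = L on [0, T]
origin_ok = all(abs(f(0.0, t)) <= L + 1e-12 for t in ts)

print(lip_ok, origin_ok)  # → True True
```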
However, motivated by this first approach, I wrote up a counterexample showing that the implication cannot hold with the same constant $L$.
Let $f$ satisfy the given condition: we are given some $L$ such that $$|f(x,t) - f(\hat{x},t)| \leq L|x - \hat{x}| \quad \forall x, \hat{x} \in \mathbb{R},\ t \in [0,T].$$ Now consider $g(x,t) = f(x,t)+L+1-f(0,t)$. Since $g$ differs from $f$ by a term that is constant in $x$, $g$ satisfies the same condition with the same $L$: $$|g(x,t) - g(\hat{x},t)| \leq L|x - \hat{x}|.$$ If the hypothesis did in fact imply the conclusion with the same constant, then taking $x = 0$ would give $|g(0,t)| \leq L$; but by construction, $|g(0,t)| = |f(0,t)+L+1-f(0,t)| = L+1$, which is a contradiction!
So we have found a $g$ that satisfies the hypothesis with constant $L$ but fails to satisfy $|g(0,t)| \leq L$; the implication cannot hold with the same constant. (It does hold with a possibly larger one: since $f$ is continuous, $M := \max_{0 \leq t \leq T} |f(0,t)|$ is finite, and $|f(x,t)| \leq M + L|x| \leq (L+M)(1+|x|)$, so (2) holds with $L+M$ in place of $L$.)
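The counterexample can be checked numerically as well (again a sketch; the concrete $f$, $L = 1$, and the grids are illustrative stand-ins): $g$ inherits $f$'s Lipschitz constant, yet $|g(0,t)| = L + 1 > L$ for every $t$.

```python
# Counterexample: g(x, t) = f(x, t) + L + 1 - f(0, t) is Lipschitz in x with
# the SAME constant L, yet |g(0, t)| = L + 1 > L for all t.

L = 1.0

def f(x, t):
    return t ** 2 + x      # any 1-Lipschitz-in-x function works here

def g(x, t):
    return f(x, t) + L + 1 - f(0.0, t)

xs = [i / 10 - 5 for i in range(101)]   # grid in [-5, 5]
ts = [i / 100 for i in range(101)]      # grid in [0, 1]

# g inherits the Lipschitz constant of f: the added term is constant in x
lip_ok = all(abs(g(x, t) - g(y, t)) <= L * abs(x - y) + 1e-12
             for t in ts for x in xs for y in xs)

# ... but the bound |g(0, t)| <= L fails everywhere, since g(0, t) = L + 1
exceeds = all(abs(g(0.0, t)) > L for t in ts)

print(lip_ok, exceeds)  # → True True
```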