I'm working through an Introduction to Machine Learning course on my own, using an open-university coursebook, and it contains the following question, which I've tried to solve but to no avail:
Let $g:\mathbb{R}^n\rightarrow \mathbb{R}$ be a Lipschitz function, meaning there is a constant $L\geq 0$ such that $|g(x)-g(y)| \leq L \,\|x-y\|$ for all $x,y\in\mathbb{R}^n$.
Prove that the function $f:\mathbb{R}^n\rightarrow\mathbb{R}$ defined by
$f(x) = \|x\|^2 + g(x)$ has a global minimum.
OK, so I want to show that the function basically goes to infinity in every direction, but I'm having trouble proving it. I tried working directly from the definition, but I can't figure out what the Lipschitz condition gives me (I'm sure I need to use it somehow).
Any help is appreciated; thanks in advance!
Hint: set $y=0$ and see what the Lipschitz condition does for you.
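To spell the hint out a little (a sketch, with the remaining details left for you to verify): taking $y=0$ turns the Lipschitz condition into a linear lower bound on $g$, which makes $f$ coercive, and then continuity plus compactness of a large closed ball finishes the job.

```latex
% Setting y = 0 in the Lipschitz condition gives
%   |g(x) - g(0)| \le L\|x\|,
% and in particular the lower bound g(x) \ge g(0) - L\|x\|. Hence
\[
  f(x) \;=\; \|x\|^2 + g(x)
       \;\ge\; \|x\|^2 - L\,\|x\| + g(0)
       \;\xrightarrow[\;\|x\|\to\infty\;]{}\; \infty .
\]
% So f is coercive. Since g is Lipschitz it is continuous, so f is
% continuous as well. Pick R > 0 large enough that f(x) > f(0)
% whenever \|x\| > R (possible by coercivity); then the minimum of f
% over the compact ball \{ \|x\| \le R \}, which exists by the
% extreme value theorem, is a global minimum of f on all of \mathbb{R}^n.
```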