Assume that we are given some $f$, which we know is a convex function. We are trying to find the minimum of this function, so we take $-\nabla f$ to find the direction of steepest descent.
My question is: is this direction guaranteed to point towards the minimum? For simple functions like a paraboloid I imagine yes, but this seems too good to be true in general.
Yes, in the sense that the negative (sub)gradient has a nonnegative inner product with the direction from $x$ to a minimizer (though it will of course not always point in the exact direction of the minimizer). If $f:\mathcal{X}\rightarrow\mathbb{R}$ is a convex function, $x^*$ is a minimizer, and $x$ is any other point in the domain at which a subgradient $f'(x)$ exists, then we have by definition of subgradient: $$ f(y) \geq f(x) + f'(x)^T(y-x) \quad \forall y \in \mathcal{X} $$ and evaluating at $y=x^*$ gives $$ f(x^*) \geq f(x) + f'(x)^T(x^*-x)\geq f(x^*)+f'(x)^T(x^*-x)$$ where the final inequality uses $f(x)\geq f(x^*)$ by definition of minimizer. Subtracting $f(x^*)$ from both sides and rearranging, we get: $$ \boxed{(-f'(x))^T(x^*-x) \geq 0}$$
An example is $f:\mathbb{R}^2\rightarrow\mathbb{R}$ defined by $f(x,y) = ax^2 + by^2$ for $a>0, b>0$. Then $(x^*,y^*)=(0,0)$ is the unique minimizer, and at any nonzero vector $(x,y)$ we have: $$ -f'(x,y) = (-2ax, -2by) $$ The direction from $(x,y)$ to $(0,0)$ is $(-x,-y)$, and indeed $(-2ax, -2by)$ has a nonnegative inner product with $(-x,-y)$, namely $2ax^2 + 2by^2 \geq 0$, even though it does not point in exactly the same direction when $a\neq b$.
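For what it's worth, the boxed inequality is easy to check numerically. Here's a small sketch for the quadratic example above (the constants `a`, `b` and the random sample points are just illustrative choices):

```python
import numpy as np

# f(x, y) = a*x^2 + b*y^2 with a, b > 0; unique minimizer at the origin.
a, b = 1.0, 5.0

def grad_f(p):
    """Gradient of f at p = (x, y): (2*a*x, 2*b*y)."""
    x, y = p
    return np.array([2 * a * x, 2 * b * y])

minimizer = np.zeros(2)
rng = np.random.default_rng(0)

# At random points, the negative gradient should make a nonnegative
# inner product with the direction toward the minimizer:
# dot((-2ax, -2by), (-x, -y)) = 2*a*x^2 + 2*b*y^2 >= 0.
for _ in range(1000):
    p = rng.uniform(-10, 10, size=2)
    assert np.dot(-grad_f(p), minimizer - p) >= 0
```

When $a \neq b$ the two vectors are generally not parallel, so $-\nabla f$ makes an acute angle with the direction to the minimizer without pointing straight at it.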