I have a function which computes values on images of size 32x32 pixels. That is, the function is applied to every single pixel $x_i$ but also depends on all 32x32 input pixels. The input is constrained to $-1 \leq x_i \leq 1$.
More formally: a function $f_i : [-1,1]^{1024} \rightarrow \mathbb R$ and an input vector $\vec{x} \in [-1,1]^{1024}$.
Now I am looking, numerically, for the global maximum slope of $f_i$ over all inputs. A local maximum of the slope is easy to find by following the gradient. But how do I find the global maximum? I am not sure brute force is an option: when the pixel values have high precision (e.g. steps of $10^{-10}$), the total number of candidate input vectors is ridiculously large. Brute force might work at lower precision, but that doesn't make sense in this scenario.
Can someone give me a hint? Is this even possible?
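To make the brute-force objection concrete, here is a rough back-of-envelope count (a sketch; the step size $10^{-10}$ and the grid assumption are just the numbers from the question):

```python
import math

# Rough count of grid points when each of the 1024 pixels is discretized
# with step 1e-10 on [-1, 1]: about 2e10 distinct values per pixel.
values_per_pixel = 2e10
pixels = 1024

# Total candidates = values_per_pixel ** pixels; far too large to store,
# so we only compute its base-10 logarithm.
log10_points = pixels * math.log10(values_per_pixel)
print(f"~10^{log10_points:.0f} candidate inputs")  # ~10^10548
```

Even a crude 3-level quantization per pixel would still give $3^{1024} \approx 10^{488}$ candidates, so exhaustive search is hopeless at any useful precision.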
This can be a convex optimization problem. If both your domain and the function you are optimizing are convex, then any local minimum (for example, the one found by gradient descent) is the global minimum.
In your case, you are maximizing the slope, i.e. a function of the derivative of $f_i$ (most probably, as I don't know the exact form of the derivative), over the box $[-1,1]^{1024}$ coming from the $32\times32$ grid. The domain is convex, since it is a product of intervals in $\mathbb R$. So you only need to check whether the function is convex. (Note that for a maximization problem the corresponding condition is concavity; equivalently, minimize the negative of the function.)
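If convexity cannot be verified, a common practical fallback is multi-start local optimization: run bounded gradient-based ascent on the gradient norm from many random starting points and keep the best result. This gives a lower bound on the true global maximum slope, not a guarantee. A minimal sketch with SciPy, using a small toy function in place of the real $f_i$ (the toy $f$, the dimension $n=4$, and the number of restarts are all illustrative assumptions):

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in for one f_i; the real f_i is whatever your image model computes.
# n = 4 instead of 1024 just to keep the demo fast.
n = 4

def grad_f(x):
    # Analytic gradient of the toy f(x) = sum(sin(x_j) + x_j**2 / 2).
    # For a real f, use automatic differentiation or finite differences.
    return np.cos(x) + x

def neg_slope(x):
    # Minimize the negative gradient norm, i.e. maximize the slope.
    return -np.linalg.norm(grad_f(x))

rng = np.random.default_rng(0)
best = -np.inf
for _ in range(20):  # restarts reduce the risk of stopping at a poor local max
    x0 = rng.uniform(-1.0, 1.0, n)
    # Bounds enforce the constraint -1 <= x_i <= 1 (L-BFGS-B is used by default).
    res = minimize(neg_slope, x0, bounds=[(-1.0, 1.0)] * n)
    best = max(best, -res.fun)

print("estimated max slope:", best)
```

For this toy example the slope of each coordinate, $\cos(x_i) + x_i$, is largest at $x_i = 1$, so the restarts should recover the known maximum $2(\cos 1 + 1) \approx 3.08$. On the real 1024-dimensional problem, more restarts (and automatic differentiation for the gradient) would be needed.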
PS: I think you can also apply Newton's method or other faster algorithms, depending on the smoothness of the function.