Numerical optimization in function space


I'm new to calculus of variations. I'm curious about how to apply simple numerical optimization techniques in function space.

Consider the classical problem: finding the shortest path between two points on a plane. Let these points be $A=(x_1,y_1)$ and $B=(x_2,y_2)$ with $x_1 \neq x_2$. The path is parameterized by $(x,f(x))$ for $x \in [x_1,x_2]$, where $f(x_1) = y_1$ and $f(x_2) = y_2$.

It is easy to show that the length of such a path is given by:

$$I(f) = \int_{x_1}^{x_2} \sqrt{1 + f'(x)^2}\, dx.$$
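If I understand correctly, the gradient appearing in the update below should be the functional derivative of $I$. Writing the integrand as $L(f, f') = \sqrt{1 + f'^2}$, the Euler–Lagrange expression would give

$$\nabla_f I = \frac{\partial L}{\partial f} - \frac{d}{dx}\frac{\partial L}{\partial f'} = -\frac{d}{dx}\left(\frac{f'}{\sqrt{1 + f'^2}}\right),$$

which vanishes exactly when $f'$ is constant, i.e. on the straight line through $A$ and $B$.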

I want to find the $f$ that minimizes $I(f)$, i.e. the shortest path between $A$ and $B$. I'd like to start with an initial guess $f_0$ and update it via a gradient descent procedure:

for $t:1 \to T$

$f_t \leftarrow f_{t-1} - \eta \nabla_f I$

end

where $\eta$ is a suitably chosen step size. Perhaps the loop could also contain an if statement that checks whether $f_t$ satisfies the Euler–Lagrange equation and halts if it does.
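To make the loop concrete, here is a sketch of what I have in mind: discretize $f$ on a grid, so the functional becomes an ordinary function of the interior node values, and run plain gradient descent on that finite-dimensional problem. The function name, grid size, and step size below are my own illustrative choices, not from any particular library.

```python
import numpy as np

def shortest_path_descent(x1, y1, x2, y2, n=21, eta=0.02, steps=5000):
    """Gradient descent on a discretized version of
    I(f) = integral of sqrt(1 + f'(x)^2) dx.

    The path is sampled at n points; only the interior values are
    updated, so the boundary conditions f(x1)=y1, f(x2)=y2 hold at
    every iteration. eta must be small relative to the grid spacing
    for the iteration to be stable.
    """
    x = np.linspace(x1, x2, n)
    dx = x[1] - x[0]
    # initial guess f_0: the chord plus an arbitrary bump vanishing at the ends
    f = np.linspace(y1, y2, n) + 0.5 * np.sin(np.pi * (x - x1) / (x2 - x1))
    for _ in range(steps):
        seg = np.diff(f)                        # f_{i+1} - f_i
        slope = seg / np.sqrt(dx**2 + seg**2)   # d(segment length)/d(seg_i)
        grad = slope[:-1] - slope[1:]           # dI/df_i at interior nodes
        f[1:-1] -= eta * grad                   # endpoints never move
    return x, f

x, f = shortest_path_descent(0.0, 0.0, 1.0, 1.0)
# the iterates should approach the straight line f(x) = x
print(np.max(np.abs(f - x)))
```

Here "subtracting two functions" simply means subtracting their values node by node, which is why the update $f_t \leftarrow f_{t-1} - \eta \nabla_f I$ translates directly into a vector operation.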

My questions are:

1) Is $\nabla_f I$ calculated similarly to ordinary differentiation, i.e. by treating the function $f$ as a variable?

2) Is the above update step reasonable? If so, how does one subtract (or add) two functions in practice?