Shortest curve between two points using finite differences


Use the method of finite differences to find the shortest plane curve connecting two points $A $ and $B $

Typically, I would just use the Euler equation to do this type of problem, but using finite differences sounded intriguing.

If we let $$J [y] = \int_a^b \sqrt {1+y'^2}dx = \int_a^b \sqrt {dx^2+dy^2} $$

Assume that the point $A$ is at $(a,\alpha)$ and $B$ is at $(b, \beta)$.

We can approximate the curve from $(a, \alpha)$ to $(b,\beta)$ by a polygonal line with $n+1$ segments.

Let $$a = x_0 < x_1 < x_2 <... < x_n < x_{n+1} = b \space \text { and } y_k = y (x_k) $$

Then $$L(y_1,y_2,\dots,y_n) = \sum_{i=1}^{n+1} \sqrt{h^2+(y_{i}-y_{i-1})^2},$$ where $h = x_{i}-x_{i-1}$ (here we have assumed the points are evenly spaced w.r.t. $x$).
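As a quick sanity check on this discretization, here is a minimal Python sketch (the endpoint values and grid size below are arbitrary choices for illustration). For points that already lie on a straight line, the polygonal length should reproduce the exact chord length $\sqrt{(b-a)^2+(\beta-\alpha)^2}$:

```python
import math

def polyline_length(ys, a, b):
    """Discretized arc length L: sum of segment lengths over an even
    grid x_0 = a < x_1 < ... < x_{n+1} = b (ys includes both endpoints)."""
    h = (b - a) / (len(ys) - 1)
    return sum(math.hypot(h, ys[i] - ys[i - 1]) for i in range(1, len(ys)))

# Illustrative endpoints and grid size (not from the problem statement).
a, b, alpha, beta = 0.0, 3.0, 1.0, 2.0
n = 10
# Grid values on the straight line from (a, alpha) to (b, beta).
ys = [alpha + (beta - alpha) * k / (n + 1) for k in range(n + 2)]
print(polyline_length(ys, a, b))  # matches math.hypot(b - a, beta - alpha)
```

Since each segment of a straight chord has the same slope, the segment lengths add up to the chord length exactly, for any $n$.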

It is clear that for large $n$,

$L(y_1,\dots,y_n) \approx J[y]$, and the approximation becomes exact as $n \to \infty$.

We are trying to minimize, so we must first find a critical point by setting the gradient of $L$ to $\vec{0}$. This amounts to taking the partial derivative with respect to each $y_i$ and setting them all equal to $0$.

Exactly two terms of the sum depend on each interior $y_i$. Generally speaking,

$$\frac {\partial L}{\partial y_i} = \frac {(y_i-y_{i-1})}{\sqrt {h^2+(y_i-y_{i-1})^2}} + \frac {(y_{i}-y_{i+1})}{\sqrt {h^2+(y_{i}-y_{i+1})^2}} = 0$$
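This partial derivative can be checked against a numerical central difference; below is a small Python sketch (the grid values and step $h$ are arbitrary test data, not part of the problem):

```python
import math

def L(ys, h):
    """Polygonal length for interior/endpoint values ys on an even grid of step h."""
    return sum(math.hypot(h, ys[i] - ys[i - 1]) for i in range(1, len(ys)))

def dL_dyi(ys, h, i):
    """Analytic partial derivative of L with respect to the interior value y_i."""
    p = ys[i] - ys[i - 1]
    q = ys[i] - ys[i + 1]
    return p / math.hypot(h, p) + q / math.hypot(h, q)

# Arbitrary test values; compare the formula with a central difference.
h = 0.5
ys = [0.0, 0.3, 1.1, 0.7, 2.0]
i, eps = 2, 1e-6
yp = ys.copy(); yp[i] += eps
ym = ys.copy(); ym[i] -= eps
numeric = (L(yp, h) - L(ym, h)) / (2 * eps)
print(dL_dyi(ys, h, i), numeric)  # the two values agree
```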

For algebraic purposes, let $p = y_i-y_{i-1} $ and $q = y_i-y_{i+1}$

$$\frac {p}{\sqrt {h^2+p^2}}=- \frac {q}{\sqrt {h^2+q^2}} $$

$$p^2 (h^2+q^2) = q^2 (h^2+p^2) $$

$$p^2 h^2 = q^2 h^2$$

$$q = -p \implies y_{i+1} - y_i = y_i - y_{i-1} $$ (squaring introduced the spurious root $q = p$; since $t \mapsto t/\sqrt{h^2+t^2}$ is odd and strictly increasing, the unsquared equation forces $q = -p$). So that $$\frac {y_{i+1}+y_{i-1}}2 = y_i. $$ Thus for all $1 \le i \le n$, the point $y_i$ is the average of its neighboring points. Together with the fixed endpoints $y_0 = \alpha$ and $y_{n+1} = \beta$, this forces all the differences $y_i - y_{i-1}$ to be equal, so the $y_i$ form an arithmetic progression and the vertices of the polygonal line all lie on the chord from $A$ to $B$. If we take $n \to \infty$, these polygonal minimizers converge to the straight segment.
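The averaging condition can also be solved iteratively: repeatedly replacing each interior $y_i$ by the average of its neighbors (a Jacobi sweep for the discrete Laplace equation) drives any starting values toward the chord through the endpoints. A minimal Python sketch, with illustrative endpoints and an arbitrary interior start:

```python
# Illustrative endpoints (not from the problem statement).
a, b, alpha, beta = 0.0, 1.0, 0.0, 5.0
n = 9
ys = [alpha] + [3.0] * n + [beta]  # arbitrary interior starting values

# Jacobi iteration: enforce y_i = (y_{i-1} + y_{i+1}) / 2 at each interior node.
# The right-hand side reads the old values before the slice is assigned.
for _ in range(20000):
    ys[1:-1] = [(ys[i - 1] + ys[i + 1]) / 2 for i in range(1, n + 1)]

# The fixed point is linear interpolation between the endpoints.
line = [alpha + (beta - alpha) * k / (n + 1) for k in range(n + 2)]
print(max(abs(u - v) for u, v in zip(ys, line)))  # essentially zero
```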

This implies that $y(x) $ is a line with slope $\frac {\beta-\alpha}{b-a} $

This gives the desired conclusion, but is the approach correct?