Additional natural boundary conditions determined by $G(y(b))$ outside of the Lagrangian.


Here is a problem I am not quite sure how to approach:

Determine the natural boundary condition at $x=b$ for the variational problem defined by $$J(y) = \int^b_a L(x,y,y')dx + G(y(b)),$$ where $ y\in C^2[a,b]$ and $y(a) = y_0$. I know the solution, but I don't know how to arrive at it.

On BEST ANSWER

Suppose $G:\mathbb{R}\to\mathbb{R}$ is a differentiable function.

Repeating the derivation of the Euler-Lagrange equation, as is typically done in physics, you obtain \begin{align} 0=\delta J[y]&=\delta\int_a^bL(x,y,y'){\rm d}x+\delta G(y(b))\\ &=\int_a^b\left[\frac{\partial L}{\partial y}\delta y+\frac{\partial L}{\partial y'}\delta y'\right]{\rm d}x+G'(y(b))\delta y(b)\\ &=\int_a^b\frac{\partial L}{\partial y}\delta y{\rm d}x+\int_a^b\frac{\partial L}{\partial y'}{\rm d}\delta y+G'(y(b))\delta y(b)\\ &=\int_a^b\frac{\partial L}{\partial y}\delta y{\rm d}x+\frac{\partial L}{\partial y'}\delta y\Bigg|_{x=a}^{x=b}-\int_a^b\frac{\rm d}{{\rm d}x}\left(\frac{\partial L}{\partial y'}\right)\delta y{\rm d}x+G'(y(b))\delta y(b)\\ &=\int_a^b\left[\frac{\partial L}{\partial y}-\frac{\rm d}{{\rm d}x}\left(\frac{\partial L}{\partial y'}\right)\right]\delta y{\rm d}x+\left[\frac{\partial L}{\partial y'}(b,y(b),y'(b))+G'(y(b))\right]\delta y(b)-\frac{\partial L}{\partial y'}(a,y(a),y'(a))\delta y(a). \end{align}

In the above result, since $\delta y$ is arbitrary for all $x\in\left(a,b\right)$, the first term implies $$ \frac{\partial L}{\partial y}-\frac{\rm d}{{\rm d}x}\left(\frac{\partial L}{\partial y'}\right)=0, $$ namely the well-known Euler-Lagrange equation. Since $\delta y(b)$ is also arbitrary, the second term forces $$ \frac{\partial L}{\partial y'}(b,y(b),y'(b))+G'(y(b))=0. $$ This is the natural boundary condition for $y$ at $x=b$. Finally, the imposed boundary condition $y(a)=y_0$ requires $\delta y(a)=0$, so the last term vanishes.
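As a quick illustration (the specific $L$ and $G$ here are chosen for the example, not taken from the original problem), take $$ L=\frac{1}{2}(y')^2,\quad G(s)=\frac{1}{2}s^2,\quad [a,b]=[0,1],\quad y(0)=1. $$ The Euler-Lagrange equation reads $y''=0$, so $y=C_1+C_2x$. The fixed condition $y(0)=1$ gives $C_1=1$, while the natural boundary condition $$ \frac{\partial L}{\partial y'}(1,y(1),y'(1))+G'(y(1))=y'(1)+y(1)=C_2+(1+C_2)=0 $$ gives $C_2=-\frac{1}{2}$, so the extremal is $y(x)=1-\frac{x}{2}$.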

Mathematically, the above derivation can always be made rigorous. To this end, let $$ y=y(x,t) $$ be a family of feasible functions; that is, for each fixed $t\in\mathbb{R}$, $$ y(x,t)\in C^2\left[a,b\right],\quad y(a,t)=y_0. $$ Now suppose $y=y(x,0)$ is the optimal function that minimizes the functional $J$. Then necessarily $$ 0=\frac{\rm d}{{\rm d}t}\Bigg|_{t=0}J[y(\cdot,t)]=\frac{\rm d}{{\rm d}t}\Bigg|_{t=0}\left(\int_a^bL(x,y(x,t),y'(x,t)){\rm d}x+G(y(b,t))\right), $$ where we adopt the notation $$ y'(x,t)=\frac{\partial y}{\partial x}(x,t). $$ Carrying out this derivative, the arbitrariness of $$ \frac{\partial y}{\partial t}(\cdot,t) $$ yields exactly the same equation and boundary condition as above.
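If you want to verify the resulting boundary condition on a concrete instance, here is a small SymPy sketch; the choices $L=\frac{1}{2}(y')^2$, $G(s)=\frac{1}{2}s^2$, $[a,b]=[0,1]$, $y(0)=1$ are illustrative assumptions, not part of the original problem.

```python
import sympy as sp

# Illustrative choices (not from the original problem):
# L = (y')^2/2, G(s) = s^2/2, on [0, 1], with the fixed condition y(0) = 1.
x, C1, C2 = sp.symbols('x C1 C2')

# Euler-Lagrange equation: d/dx(L_{y'}) - L_y = y'' = 0, so y = C1 + C2*x.
y = C1 + C2 * x
dy = sp.diff(y, x)

# Impose y(0) = 1 and the natural condition
# L_{y'}(b, y(b), y'(b)) + G'(y(b)) = y'(1) + y(1) = 0 at b = 1.
consts = sp.solve([y.subs(x, 0) - 1, dy.subs(x, 1) + y.subs(x, 1)], [C1, C2])
y_star = y.subs(consts)
print(y_star)  # 1 - x/2

# Sanity check: perturbing the extremal can only increase the functional.
eps = sp.symbols('eps')
def J(f):
    return sp.integrate(sp.diff(f, x)**2 / 2, (x, 0, 1)) + f.subs(x, 1)**2 / 2
print(sp.simplify(J(y_star + eps * x) - J(y_star)))  # eps**2
```

The second `print` confirms first-order stationarity: the difference $J[y^*+\varepsilon x]-J[y^*]$ has no linear term in $\varepsilon$, precisely because $y^*$ satisfies both the Euler-Lagrange equation and the natural boundary condition.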