I have the following homework problem (from a Calculus of Variations course):
Show that if in
$$ \min \int_a^b f(x^2+y(x)^2)\sqrt{1+y'(x)^2}\;dx$$
polar coordinates are used, then the problem will be converted into one that contains no independent variable. Solve it to optimality.
I'm having a few problems converting the integral into polar coordinates. For example, I'm not quite sure how to write $y'(x)$ in polar coordinates. What confuses me is that if $y(x)=y(r\cos \theta)$, then should $y'$ be calculated with respect to $r$ or $\theta$? Could someone show how the transformation into polar coordinates would be done in this example?
Hope my question is clear =) thank you for the help! Please let me know if you need more information.
Suppose the graph of the given curve $x \mapsto y(x)$, $a \leq x \leq b$, can be written as a polar curve, say, as $\theta \mapsto r(\theta)$, $\alpha \leq \theta \leq \beta$.
Now, note that $$\sqrt{1 + y'(x)^2}\,dx$$ is just the arc length element (mnemonically, this is $\sqrt{\left[1 + \left(\frac{dy}{dx}\right)^2\right] \cdot dx^2} = \sqrt{dx^2 + dy^2}$), which, by some elementary geometry, becomes $$\sqrt{r(\theta)^2 + r'(\theta)^2}\,d\theta$$ in polar coordinates.
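Putting the pieces together (a sketch, using the reparametrization $\theta \mapsto r(\theta)$, $\alpha \leq \theta \leq \beta$, from above): since $x^2 + y(x)^2 = r^2$, the functional transforms as

```latex
% In polar coordinates the problem becomes
\min \int_\alpha^\beta f\!\left(r(\theta)^2\right)\,
     \sqrt{r(\theta)^2 + r'(\theta)^2}\;d\theta .
% The integrand F(r, r') = f(r^2)\sqrt{r^2 + r'^2} contains no
% independent variable \theta, so the Beltrami identity
%   F - r' \,\partial F / \partial r' = C
% yields a first integral:
f\!\left(r^2\right)\sqrt{r^2 + r'^2}
  \;-\; r' \cdot \frac{f\!\left(r^2\right)\, r'}{\sqrt{r^2 + r'^2}}
  \;=\; \frac{f\!\left(r^2\right)\, r^2}{\sqrt{r^2 + r'^2}}
  \;=\; C .
```

This first-order ODE for $r(\theta)$ can then be separated and integrated once a concrete $f$ is specified, which is what "solve it to optimality" is asking for.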
Remark: Note that if $f$ is a suitable power function and we regard $r(\theta)$ (or $y(x)$) as tracing out the shape of a wire with unit mass per length, then the integral is the corresponding moment of the wire about the origin.
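As a concrete instance of this remark (the choice $f(u) = u^{1/2}$ is my assumption, purely for illustration):

```latex
% With f(u) = u^{1/2}, the integrand is distance-to-origin times
% arc length, i.e. the first moment of a unit-density wire:
\int_a^b \sqrt{x^2 + y(x)^2}\,\sqrt{1 + y'(x)^2}\;dx
  \;=\; \int_{\text{wire}} r \; ds .
```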