Let $F:\mathbb{R}^{n}\rightarrow \mathbb{R}$ be defined as \begin{align} F(\textbf{v})&=\|\textbf{v}\|\\ &=\sqrt{\sum\limits_{i=1}^{n}v_{i}^{2}}, ~\textbf{v}=[v_{1},\ldots,v_{n}]^{T}\in \mathbb{R}^{n}. \end{align} Let $L$ and $L'$ be two subsets of $\mathbb{R}^{n}$, with $L'\subset L$, defined as follows: \begin{align} L&=\{\textbf{x}\in \mathbb{R}^{n}:A\textbf{x}=\textbf{b}\},\\ L'&=\{\textbf{x}\in \mathbb{R}^{n}:A'\textbf{x}=\textbf{b}'\}; \end{align} here, $A$ is an $m\times n$ matrix, $\textbf{b}\in \mathbb{R}^{m}$, $A'$ is a $k\times n$ matrix with $k>m$ whose first $m$ rows are those of $A$, and the first $m$ entries of $\textbf{b}'\in \mathbb{R}^{k}$ are those of $\textbf{b}$ (so any solution of $A'\textbf{x}=\textbf{b}'$ also solves $A\textbf{x}=\textbf{b}$, making $L'\subset L$). The matrices $A$ and $A'$ are of full rank.
Now, suppose that $\textbf{v}$ is the unique minimizer of $F(\textbf{x})$ subject to $\textbf{x}\in L$, i.e., $\textbf{v}$ is the unique point at which $F$ attains its minimum over $L$. Suppose further that $\textbf{v}\in L'$. I want to show that $\textbf{v}$ is the unique minimizer of $F(\textbf{x})$ subject to $\textbf{x}\in L'$.
I tried the method of Lagrange multipliers: I wrote down the necessary conditions that follow from Lagrange's theorem (given that $\textbf{v}$ is a minimizer of $F$ over $L$), but they have not led anywhere.
I'd appreciate any help or hints. Thanks.
Here's a hint: You have $L' \subset L$ and $\mathbf{v} \in L'$, so can there be an $\mathbf{x} \in L'$ with $F(\mathbf{x}) < F(\mathbf{v})$? And for uniqueness: if some $\mathbf{x} \in L'$ with $\mathbf{x} \neq \mathbf{v}$ satisfied $F(\mathbf{x}) = F(\mathbf{v})$, what would that say about the uniqueness of the minimizer over $L$? Note that the argument uses nothing about $F$, $A$, or $A'$ beyond the inclusion $L' \subset L$.
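If it helps to see the claim numerically, here is a small sketch (all matrices and sizes below are made-up examples, not from the question). It computes the minimum-norm point $\mathbf{v}$ of $L$ via the pseudoinverse, appends an extra constraint row that $\mathbf{v}$ already satisfies (so $\mathbf{v} \in L'$), and checks that the minimum-norm point of $L'$ is again $\mathbf{v}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 2  # hypothetical dimensions for illustration

A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# v: the minimum-norm solution of A x = b, i.e. the unique
# minimizer of ||x|| over L. The pseudoinverse returns exactly
# this point for a consistent system.
v = np.linalg.pinv(A) @ b

# Build A', b' by appending one extra row chosen so that v
# satisfies it; then v lies in L' as well.
c = rng.standard_normal(n)
A_prime = np.vstack([A, c])
b_prime = np.append(b, c @ v)

# Minimum-norm solution over the smaller set L'.
w = np.linalg.pinv(A_prime) @ b_prime

# Since L' ⊂ L and v ∈ L', the minimizer over L' must be v itself.
assert np.allclose(w, v)
```

The assertion passing is, of course, only a sanity check on one random instance; the hint above is the actual proof.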