Let $A \in \mathbb{R}^{m\,\times\,n}$, $b \in \mathbb{R}^m$, and $p \in \mathbb{R}^n$, with $m > n$. Show how the constrained least squares problem:
minimize $\Vert b - Ax \Vert_{2}$ subject to $x^{T}p = \delta$ (a fixed real scalar)
can be reduced to solving a related unconstrained least squares problem.
I'm not sure exactly how to approach this problem. My thought was that if I could somehow get $x^{T}p$ to appear inside the $\Vert b - Ax \Vert_{2}$ expression, that would do it, but I'm not sure how to get to that point. Also, this question set has generally been about Householder reflectors, so I'm not sure whether a Householder reflector would be the way to go.
I've thought about rewriting $\Vert b - Ax \Vert_{2}$ as $\Vert Q^{T}(Ax-b) \Vert_{2}$ and then as $\Vert Rx - Q^{T}b \Vert_{2}$ using the QR factorization, expressing the $Q$'s and $R$'s in terms of Householder reflectors, but I'm not sure that gets me anywhere, because I still won't have $x^{T}p$ appearing anywhere. If I could get some direction, that would be great.
If $p = 0$, the constraint is either vacuous (when $\delta = 0$, leaving an ordinary unconstrained problem) or infeasible (when $\delta \neq 0$). So, suppose that $p \neq 0$. Extend the set $\{p\}$ into an orthogonal basis $\{v_0,v_1,\dots,v_{n-1}\}$ of $\Bbb R^n$ with $v_0 = p$. Let $V$ denote the matrix whose columns are $v_1,\dots,v_{n-1}$.
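Since the surrounding questions involve Householder reflectors, one concrete way to build such a basis is worth noting: a single reflector $H$ that maps $p$ to a multiple of $e_1$ is symmetric and orthogonal, so its columns form an orthonormal basis of $\Bbb R^n$ whose first column is parallel to $p$ and whose remaining columns span $p^\perp$. A minimal NumPy sketch (the helper name `householder_basis` is my own, not from the problem):

```python
import numpy as np

def householder_basis(p):
    """Return a symmetric orthogonal matrix H whose first column is
    parallel to p; the remaining columns are an orthonormal basis of
    the subspace orthogonal to p."""
    n = p.size
    v = p.astype(float).copy()
    # Sign chosen to avoid cancellation when forming v = p +/- ||p|| e1.
    v[0] += np.copysign(np.linalg.norm(p), p[0])
    # H = I - 2 v v^T / (v^T v) reflects p onto the e1 axis.
    return np.eye(n) - 2.0 * np.outer(v, v) / (v @ v)
```

With this choice, $V$ in the argument below is simply `H[:, 1:]`.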
Argue that $x$ satisfies the constraint $x^Tp = \delta$ if and only if it can be expressed in the form $$ x = x_0 + Vy, $$ where $y \in \Bbb R^{n-1}$ and $x_0$ is any single vector satisfying the constraint (for instance, we could take $x_0 = (\delta/p^Tp) \cdot p$).
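A quick numerical sanity check of this parameterization, assuming only standard NumPy (here the basis for $p^\perp$ is taken from the trailing columns of a full QR factorization of $p$; any orthonormal basis works):

```python
import numpy as np

rng = np.random.default_rng(0)
n, delta = 5, 2.0
p = rng.standard_normal(n)

# Trailing columns of a full QR of p give an orthonormal basis of p-perp.
Q, _ = np.linalg.qr(p.reshape(-1, 1), mode='complete')
V = Q[:, 1:]

x0 = (delta / (p @ p)) * p        # one particular vector satisfying x^T p = delta

# Forward direction: x0 + V y satisfies the constraint for every y.
y = rng.standard_normal(n - 1)
x = x0 + V @ y

# Reverse direction: any feasible x is recovered as x0 + V (V^T (x - x0)),
# since x - x0 is orthogonal to p and hence lies in the column space of V.
```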
With that established, the constrained minimization problem may be rewritten as an unconstrained minimization over $y \in \Bbb R^{n-1}$ by substituting $x = x_0 + Vy$ into the objective function:
$$ \min_{y \in \Bbb R^{n-1}}\|b - A(x_0 + Vy)\|_2. $$ Notably, this expression can be rewritten in the usual form: $$ \|b - A(x_0 + Vy)\|_2 = \|(b - Ax_0) - (AV)y\|_2, $$ which is to say that we are looking for the least squares solution to $(AV)y = b - Ax_0$.
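The whole reduction can be sketched end to end in NumPy. As an added check not in the argument above, the result is compared against the solution of the KKT (Lagrange-multiplier) system $A^TAx + \lambda p = A^Tb$, $p^Tx = \delta$; the random sizes below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, delta = 8, 4, 3.0
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
p = rng.standard_normal(n)

# Orthonormal basis with first column parallel to p; V spans p-perp.
Q, _ = np.linalg.qr(p.reshape(-1, 1), mode='complete')
V = Q[:, 1:]

x0 = (delta / (p @ p)) * p                      # particular feasible point
# Unconstrained least squares problem (AV) y = b - A x0.
y, *_ = np.linalg.lstsq(A @ V, b - A @ x0, rcond=None)
x = x0 + V @ y                                  # constrained solution

# Cross-check: solve the KKT system for the same problem.
K = np.block([[A.T @ A, p.reshape(-1, 1)],
              [p.reshape(1, -1), np.zeros((1, 1))]])
rhs = np.concatenate([A.T @ b, [delta]])
x_kkt = np.linalg.solve(K, rhs)[:n]
```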