Minimization problem on $\mathbb{R}^n$


I would like to find the solution of this minimization problem:

$$ \min_{x\in\text{Ker}B} J(x) = \frac{1}{2}\langle Ax, x \rangle -\langle b,x\rangle $$

where $A$ is an $n\times n$ symmetric positive definite matrix, $b\in\mathbb{R}^{n}$, and $B\in\mathcal{M}_{m,n}(\mathbb{R})$ (with $m\le n$) is a full rank matrix.

First we notice that the derivative of $J$ is given by $J'(x)(h) = \langle Ax -b,h\rangle$.

Now we describe $\text{Ker}B$ as

$$ \{ x\in\mathbb{R}^{n} : f_i(x)= \langle b_i , x\rangle = 0,\quad i=1,\dots,m\} $$

where the $b_i$ are the rows of the matrix $B$. We have $f_{i}'(x)(h) = \langle b_i, h\rangle$.

Two observations: a minimizer exists, since $\text{Ker}B$ is a nonempty closed convex set and $J$ is continuous, convex and coercive (coercivity follows from $A$ being positive definite, and is needed because $\text{Ker}B$ is unbounded); and it is unique, since $J$ is strictly convex.

Now to find the solution we use Lagrange multipliers: there exist $\lambda_i\in\mathbb{R}$ ($i=1,\dots,m$) such that at the minimizer $\bar{x}$ we have

$$ J'(\bar{x})(h)+\sum_{i=1}^{m}\lambda_if_{i}'(\bar{x})(h) = h^{t}\Big(A\bar{x}-b + \sum_{i=1}^{m}\lambda_i b_i\Big) = 0\quad\forall h\in\mathbb{R}^{n} $$

Which gives

$$ A\bar{x} - b = \sum_{i=1}^{m}\tilde{\lambda}_i b_i = B^{t}\tilde{\lambda}\implies \bar{x} = A^{-1}(B^{t}\tilde{\lambda} + b) $$

where $\tilde{\lambda} = (-\lambda_1, \dots, -\lambda_m)$. The multiplier $\tilde{\lambda}$ is then determined by imposing the constraint $B\bar{x}=0$, which gives the linear system $BA^{-1}B^{t}\tilde{\lambda} = -BA^{-1}b$; this is uniquely solvable since $BA^{-1}B^{t}$ is positive definite ($A$ is symmetric positive definite and $B$ has full rank).
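As a sanity check on the derivation, here is a small numerical sketch (using NumPy, with randomly generated $A$, $B$, $b$ as illustrative data): it computes $\tilde{\lambda}$ from $BA^{-1}B^{t}\tilde{\lambda} = -BA^{-1}b$, forms $\bar{x} = A^{-1}(B^{t}\tilde{\lambda}+b)$, and verifies that $\bar{x}\in\text{Ker}B$ and that $J(\bar{x})\le J(x)$ for random $x\in\text{Ker}B$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 2

# Illustrative random data: A symmetric positive definite, B full rank.
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # SPD by construction
B = rng.standard_normal((m, n))    # full rank with probability 1
b = rng.standard_normal(n)

# Determine the multiplier from B x̄ = 0:  (B A⁻¹ Bᵗ) λ̃ = -B A⁻¹ b
Ainv_b = np.linalg.solve(A, b)
Ainv_Bt = np.linalg.solve(A, B.T)
lam = np.linalg.solve(B @ Ainv_Bt, -B @ Ainv_b)

# Candidate minimizer:  x̄ = A⁻¹(Bᵗ λ̃ + b)
x_bar = np.linalg.solve(A, B.T @ lam + b)

# Check 1: x̄ lies in Ker B.
assert np.allclose(B @ x_bar, 0)

# Check 2: the gradient A x̄ - b equals Bᵗ λ̃, i.e. it is orthogonal to Ker B.
assert np.allclose(A @ x_bar - b, B.T @ lam)

# Check 3: J(x̄) ≤ J(x) for random points x in Ker B.
J = lambda x: 0.5 * x @ A @ x - b @ x
_, _, Vt = np.linalg.svd(B)
N = Vt[m:].T                       # columns of N span Ker B
for _ in range(100):
    x = N @ rng.standard_normal(n - m)
    assert J(x_bar) <= J(x) + 1e-12
```

The three assertions are exactly the optimality conditions derived above: feasibility, stationarity of the Lagrangian, and minimality over the constraint set.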

Does this seem correct to you?