I am working on brushing up my linear algebra knowledge and have stumbled over the following problem.
Assume we have $$Wx + \lambda a = 0$$ where $W$ is a diagonal square matrix, $x$ a conformable column vector, $\lambda$ a scalar, and $a$ a column vector of dimensions conformable to $Wx$. Intuitively I wanted to solve for $\lambda$ as $$\lambda=(-Wx)a^{-1}$$ before realizing that I do not know any definition of the inverse of a vector. So perhaps my thinking goes wrong here. Can you help me find the solution to this linear system of equations?
The background of this question is a Lagrange multiplier optimization of the function $f(x)=x^T W x$ under the constraint $g(x)=a^Tx+a_0=0$ ($a_0$ a scalar), where after taking the gradient of the Lagrangian $L=f(x)+\lambda g(x)$ I am left with the equation in question.
You can't take the "inverse of a vector". Instead, you can multiply on the left by $x^T$, getting $$ x^TWx+\lambda(x^Ta)=0 $$ Now $x^TWx$ and $x^Ta$ are numbers (strictly speaking $1\times 1$ matrices, but that makes no difference). If $x^Ta\ne0$, the solution is $$ \lambda=-\frac{x^TWx}{x^Ta} $$
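As a quick numerical sanity check, here is a sketch in Python/NumPy with made-up example values for $W$, $a$, and $a_0$ (none of these numbers come from the original post). It solves the full Lagrange system $Wx+\lambda a=0$, $a^Tx+a_0=0$ directly, then verifies that the formula above recovers the same $\lambda$:

```python
import numpy as np

# Hypothetical example data (diagonal invertible W, arbitrary a, a0):
W = np.diag([2.0, 3.0, 5.0])
a = np.array([1.0, -2.0, 4.0])
a0 = 7.0

# From Wx + lambda*a = 0 we get x = -lambda * W^{-1} a.
# Substituting into the constraint a^T x + a0 = 0 gives
#   lambda = a0 / (a^T W^{-1} a)
lam = a0 / (a @ np.linalg.solve(W, a))
x = -lam * np.linalg.solve(W, a)

# Check the formula derived in the answer: lambda = -x^T W x / (x^T a)
lam_formula = -(x @ W @ x) / (x @ a)

print(np.isclose(lam, lam_formula))        # True
print(np.allclose(W @ x + lam * a, 0.0))   # True: stationarity holds
print(np.isclose(a @ x + a0, 0.0))         # True: constraint holds
```

Note that the formula only determines $\lambda$ when $x^Ta\ne0$; here $x^Ta=-a_0\ne0$, which follows directly from the constraint.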