Lagrange multipliers on a matrix function


The question states: "Let $A$ be a symmetric $n \times n$ matrix. Use Lagrange multipliers to find the maximum of $f: \Bbb R^n \to \Bbb R$ with $f(x) = (Ax) \cdot x$ under the constraint $g(x) = \lvert x \rvert ^2 - 1 = 0$. What does this prove?"

Here is my work so far.

So I brute-force expanded $f$ and took its partial derivative with respect to each variable $x_j$. The Lagrange condition $\partial_j f(x) = \lambda \, \partial_j g(x)$, together with the given constraint, yields a system of $n+1$ equations (after canceling the common factor of $2$ from both sides): $$x_1 a_{11} + x_2 a_{12} + \cdots + x_n a_{1n} = \lambda x_1$$ $$x_1 a_{12} + x_2 a_{22} + x_3 a_{23} + \cdots + x_n a_{2n} = \lambda x_2$$ $$x_1 a_{13} + x_2 a_{23} + x_3 a_{33} + x_4 a_{34} + \cdots + x_n a_{3n} = \lambda x_3$$ $$\vdots$$ $$x_1 a_{1n} + x_2 a_{2n} + x_3 a_{3n} + x_4 a_{4n} + \cdots + x_n a_{nn} = \lambda x_n$$ $$x_1^2 + \cdots + x_n^2 = 1$$ where $a_{ij}$ is the entry in the $i$th row and $j$th column (equivalently the $j$th row and $i$th column, since the matrix is symmetric).
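For concreteness, here is a small numerical sketch (assuming NumPy; the $2 \times 2$ matrix below is made up for illustration) showing that the system above is exactly the eigenvector equation $Ax = \lambda x$ together with the unit-norm constraint:

```python
import numpy as np

# A hypothetical 2x2 symmetric matrix, chosen only for illustration
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# eigh is NumPy's solver for symmetric matrices; it returns
# real eigenvalues in ascending order and orthonormal eigenvectors
eigvals, eigvecs = np.linalg.eigh(A)

# Each unit eigenvector x satisfies the n Lagrange equations
# (Ax)_j = lambda * x_j  as well as the constraint |x|^2 = 1
for lam, x in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ x, lam * x)    # the n simplified equations
    assert np.isclose(x @ x, 1.0)         # the constraint g(x) = 0
```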

I am really stuck after this. How do I solve this huge system of equations? Solving it by hand seems nearly impossible. Looking at other posts (like this one: Eigenvalues of a symmetric matrix with Lagrange multipliers), I see that finding the max and min would prove that all symmetric matrices have only real eigenvalues. The thing is, I know nothing about linear algebra beyond what eigenvectors and eigenvalues are. My instructor told me that I can find the maximum of $f$ as $f(a)$ without actually computing the vector $a$. How is this possible? Please explain without getting too deep into linear algebra.

BEST ANSWER

Don't solve the system coordinate by coordinate; work with the matrix form instead.

First note that $\nabla f(x) = 2 Ax$ and $\nabla g(x) = 2 x$. Then the Lagrange condition gives $\nabla f(x) + \mu \nabla g(x) = 0$, or $Ax = -\mu x$. So every constrained critical point $x$ is a unit eigenvector of $A$ with eigenvalue $\lambda = -\mu$, and at such a point $$f(x) = (Ax) \cdot x = \lambda (x \cdot x) = \lambda.$$ Since the unit sphere is compact and $f$ is continuous, the maximum exists and is attained at some critical point $a$; by the above, $f(a)$ equals the largest eigenvalue of $A$, with no need to compute $a$ itself. In particular, this proves that every real symmetric matrix has a real eigenvalue (and a real eigenvector), which answers the "What does this prove?" part of the question.
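As a sanity check, a short numerical experiment (NumPy assumed; the matrix is random and purely illustrative) confirms that the constrained maximum of $f$ is the largest eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random 5x5 symmetric matrix, for illustration only
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2

def f(x):
    return x @ A @ x  # f(x) = (Ax) . x

# Largest eigenvalue and a corresponding unit eigenvector
eigvals, eigvecs = np.linalg.eigh(A)
lam_max, v_max = eigvals[-1], eigvecs[:, -1]

# f attains lam_max at the eigenvector ...
assert np.isclose(f(v_max), lam_max)

# ... and no random unit vector does better (Rayleigh quotient bound)
for _ in range(1000):
    x = rng.standard_normal(5)
    x /= np.linalg.norm(x)
    assert f(x) <= lam_max + 1e-9
```

The design point is exactly the instructor's hint: the maximum value $f(a) = \lambda_{\max}$ is read off from the eigenvalue alone, without ever writing down the maximizer $a$.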