Proof of Lagrange Multiplier Method with multiple constraints (analytical not geometric)


I want to prove/substantiate the method of Lagrange Multipliers for a general multi-variable function $f: R^n \rightarrow R$ subject to $m$ constraints of the form $g_j ( \vec{x} ) = C_j$, where $g_j:R^n \rightarrow R$ and $C_j$ is a constant.
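
Before the general argument, it may help to see the system the method produces in the simplest case. The following is a minimal sketch (assuming SymPy; the choice $f(x,y) = xy$ with the single constraint $x^2 + y^2 = 1$ is purely illustrative, not part of the general claim):

```python
import sympy as sp

x, y, lam = sp.symbols('x y lambda', real=True)
f = x * y
g = x**2 + y**2 - 1          # constraint x^2 + y^2 = 1 written as g = 0

# Lagrange conditions: df/dxi - lambda * dg/dxi = 0, together with the constraint
eqs = [sp.diff(f, x) - lam * sp.diff(g, x),
       sp.diff(f, y) - lam * sp.diff(g, y),
       g]
print(sp.solve(eqs, [x, y, lam], dict=True))   # four critical points, lambda = ±1/2
```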

To do this, we first need to prove the Implicit Function Theorem.

Lemma 1: Implicit Function Theorem

Let $L_c (g_j)$ be a level set of $g_j$,

$L_c(g_j) = \left\{ \vec{x} \in R^n \mid g_j(\vec{x}) = C_j \right\}$

Along this level set it follows that

$dg_j = 0$

Expanding this differential at a point $\vec{a}$ on the level set,

$dg_j = \sum\limits_{i=1}^{n} \left( \frac{\partial g_j}{\partial x_i}\right)\left( x_i - a_i \right)= 0$

Separating the last term,

$dg_j = \left( \frac{\partial g_j}{\partial x_n}\right)\left( x_n - a_n \right) + \sum\limits_{i=1}^{n-1} \left( \frac{\partial g_j}{\partial x_i}\right)\left( x_i - a_i \right)= 0$

so that

$\left( \frac{\partial g_j}{\partial x_n}\right) \left( x_n - a_n \right) = - \sum \limits_{i=1}^{n-1} \left( \frac{\partial g_j}{\partial x_i}\right) \left( x_i - a_i \right)$
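
As a quick sanity check of this step, one can form $dg$ for a single made-up constraint and solve $dg = 0$ for the last increment. Here is a sketch assuming SymPy, writing $dx_i$ for the increments $x_i - a_i$:

```python
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3', real=True)
dx1, dx2, dx3 = sp.symbols('dx1 dx2 dx3', real=True)

g = x1**2 + x2**2 + x3**2                      # illustrative constraint g = C
grads = [sp.diff(g, v) for v in (x1, x2, x3)]
dg = sum(gi * dvi for gi, dvi in zip(grads, (dx1, dx2, dx3)))

# dg = 0 determines the last increment in terms of the others (where dg/dx3 != 0)
dx3_expr = sp.solve(sp.Eq(dg, 0), dx3)[0]
print(sp.simplify(dx3_expr))                   # -(x1*dx1 + x2*dx2)/x3
```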

We also know that

$\sum\limits_{j=1}^{m} dg_j = 0$

$\sum\limits_{j=1}^{m} dg_j = \sum\limits_{j=1}^{m} \left( \frac{\partial g_j}{\partial x_n}\right) \left( x_n - a_n \right) + \sum\limits_{j=1}^{m} \sum\limits_{i=1}^{n-1} \left( \frac{\partial g_j}{\partial x_i} \right) \left( x_i - a_i \right) = 0$

Therefore

$x_n = a_n - \sum\limits_{j=1}^{m} \sum\limits_{i=1}^{n-1} \frac{\left( \frac{\partial g_j}{\partial x_i}\right)}{\left(\frac{\partial g_j}{\partial x_n} \right)} \left( x_i - a_i \right)$

Linearizing $x_n$ about the point $\vec{a}$,

$x_n = a_n + \sum\limits_{i=1}^{n-1} \left( \frac{\partial x_n}{\partial x_i} \right) (x_i - a_i)$

Comparing the two previous expressions, we get

$\left( \frac{\partial x_n}{\partial x_i} \right) = - \sum\limits_{j=1}^{m} \frac{\left( \frac{\partial g_j}{\partial x_i} \right)}{\left( \frac{\partial g_j}{\partial x_n} \right)}$
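
For a single constraint ($m = 1$) this reduces to the familiar implicit-differentiation formula $\frac{\partial x_n}{\partial x_i} = -\frac{\partial g/\partial x_i}{\partial g/\partial x_n}$, which can be checked against differentiating an explicit solve. A sketch assuming SymPy and the illustrative level set $x_1^2 + x_2^2 = 2$:

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2', positive=True)
g = x1**2 + x2**2                              # level set g = 2, illustrative choice

# explicit branch x2(x1) on the level set and its derivative
x2_explicit = sp.sqrt(2 - x1**2)
explicit_deriv = sp.diff(x2_explicit, x1)

# implicit-function formula evaluated on the same branch
implicit_deriv = (-sp.diff(g, x1) / sp.diff(g, x2)).subs(x2, x2_explicit)

print(sp.simplify(explicit_deriv - implicit_deriv))   # 0
```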

Actual Theorem of Lagrange Multipliers

Edit: The statement of the theorem is as follows:

If $f(\vec{x})$ is constrained by $g_j (\vec{x}) = C_j$ (where $j$ ranges from $1$ to $m$), then extremal points are found by solving the system of equations $\left( \frac{\partial f}{\partial x_i} \right) - \sum_{j=1}^{m} \lambda_j \frac{\partial g_j}{\partial x_i} = 0$, for $i = 1, \dots, n$, together with the constraint equations.
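
As a sanity check of the statement (not of the proof), this system can be solved directly for a small made-up example with $m = 2$ constraints in $R^3$; a sketch assuming SymPy:

```python
import sympy as sp

x, y, z, l1, l2 = sp.symbols('x y z lambda1 lambda2', real=True)
f = x + y + z
g1 = x**2 + y**2 - 1       # C_1 = 1, written as g1 = 0
g2 = z - y                 # C_2 = 0

# df/dxi - lambda1 * dg1/dxi - lambda2 * dg2/dxi = 0 for each variable, plus constraints
eqs = [sp.diff(f, v) - l1 * sp.diff(g1, v) - l2 * sp.diff(g2, v) for v in (x, y, z)]
eqs += [g1, g2]
print(sp.solve(eqs, [x, y, z, l1, l2], dict=True))
```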

If $f$ is extremized, then $\left(\frac{\partial f}{\partial x_i} \right)_{total} =0$

$\left(\frac{\partial f}{\partial x_i} \right)_{total} = \left(\frac{\partial f}{\partial x_i} \right) + \left( \frac{\partial f}{\partial x_n}\right) \left( \frac{\partial x_n}{\partial x_i} \right) = 0$

Substituting the expression for $\left( \frac{\partial x_n}{\partial x_i}\right)$ from Lemma 1,

$\left( \frac{\partial f}{\partial x_i}\right) - \sum\limits_{j=1}^{m} \frac{\left(\frac{\partial f}{\partial x_n}\right)}{\left(\frac{\partial g_j}{\partial x_n}\right)} \left( \frac{\partial g_j}{\partial x_i}\right) = 0$

**The part I'm having trouble with**

I'm having a tough time substantiating that

$\frac{\left(\frac{\partial f}{\partial x_n}\right)}{\left(\frac{\partial g_j}{\partial x_n}\right)}$ is a pure scalar $\lambda_j$.
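
This is not a proof, but a numerical illustration of what the claim amounts to may help: for a single made-up constraint, the ratio $\frac{\partial f/\partial x_i}{\partial g/\partial x_i}$ evaluated at each constrained extremum comes out as the same number for every coordinate $i$, and that common number is the multiplier (sketch assuming SymPy):

```python
import sympy as sp

x, y, lam = sp.symbols('x y lambda', real=True)
f = x * y
g = x**2 + y**2 - 1

sols = sp.solve([sp.diff(f, x) - lam * sp.diff(g, x),
                 sp.diff(f, y) - lam * sp.diff(g, y),
                 g], [x, y, lam], dict=True)

for s in sols:
    # ratio (df/dxi)/(dg/dxi) at the critical point, for each coordinate i
    ratios = [sp.simplify((sp.diff(f, v) / sp.diff(g, v)).subs(s)) for v in (x, y)]
    print(ratios, s[lam])    # both ratios equal the multiplier lambda
```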