Prove multivariable function has no extrema if $|\alpha| > 1$


$$ G: \mathbb{R}_+^n \to \mathbb{R}, x \mapsto G(x) = \prod_{i=1}^n x_i^{\alpha_i} - \sum_{i=1}^n k_i x_i $$

Let's call the product $P(x) := \prod_{i=1}^n x_i^{\alpha_i}$. For a local extremum we require $$ 0 = \frac{\partial G}{\partial x_j}(x) = \frac{\alpha_j}{x_j} \underbrace{\prod_{i=1}^n x_i^{\alpha_i}}_{P(x)} - k_j $$

Thus $$ x_j = \frac{\alpha_j}{k_j} \prod_{i=1}^n x_i^{\alpha_i} \iff x = \left(\frac{\alpha_1}{k_1}, \dots, \frac{\alpha_n}{k_n}\right) P(x) $$

$$ \implies x_j^{\alpha_j} = \left(\frac{\alpha_j}{k_j}\right)^{\alpha_j} P(x)^{\alpha_j} \implies P(x) = \prod_{j=1}^n \left(\frac{\alpha_j}{k_j}\right)^{\alpha_j} P(x)^{|\alpha|} $$

where $|\alpha| = \sum_{i=1}^n \alpha_i > 1$.

My question is: Why can this equality not hold if $|\alpha| > 1$? I know (from the problem statement) that there is one maximum if $|\alpha| < 1$.

Continuing, we could divide by $P(x)^{|\alpha|}$, yielding $$ P(x)^{1-|\alpha|} = \prod_{j=1}^n \left(\frac{\alpha_j}{k_j}\right)^{\alpha_j} \implies P(x) = \prod_{j=1}^n \left(\frac{\alpha_j}{k_j}\right)^{\frac{\alpha_j}{1 - |\alpha|}} $$
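As a sanity check, one can verify numerically that for $|\alpha| < 1$ this closed form for $P(x)$ really does give a critical point of $G$. A minimal sketch; the values of $\alpha_i$ and $k_i$ below are arbitrary illustrative choices, not from the problem:

```python
import numpy as np

# Illustrative parameters with |alpha| = sum(alpha_i) < 1 (assumed values)
alpha = np.array([0.3, 0.4])   # |alpha| = 0.7 < 1
k = np.array([2.0, 5.0])

# Closed form derived above: P = prod_j (alpha_j/k_j)^(alpha_j / (1 - |alpha|))
S = alpha.sum()
P = np.prod((alpha / k) ** (alpha / (1 - S)))

# Candidate critical point: x_j = (alpha_j / k_j) * P
x = (alpha / k) * P

# Gradient of G: dG/dx_j = (alpha_j / x_j) * prod_i x_i^alpha_i - k_j
grad = (alpha / x) * np.prod(x ** alpha) - k
print(grad)  # should be numerically zero
```

The check works because at this point $\prod_i x_i^{\alpha_i} = P(x)^{1-|\alpha|} \cdot P(x)^{|\alpha|} = P(x)$, so each partial derivative collapses to $k_j - k_j = 0$.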

Answer:
Substitute $x_i = e^{y_i}, y_i\in\mathbb{R}.$ It is easy to convince yourself that the problem is equivalent to studying the function $$E(y) = \exp\left(\sum_{i=1}^n\alpha_iy_i\right) - \sum_{i=1}^nk_ie^{y_i}.$$ The critical points are found by setting the first-order partials to zero, and thanks to our substitution this leads to a linear system in the $y_i$: $$(\alpha_1-\delta_{i1})y_1+(\alpha_2-\delta_{i2})y_2+\dots+(\alpha_n-\delta_{in})y_n=\ln\dfrac{k_i}{\alpha_i},\quad i=1,2,\dots,n.$$
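This linear system can be solved directly. A numerical sketch, reusing the same illustrative $\alpha_i, k_i$ values assumed above (with $|\alpha| < 1$ so that the system is nonsingular):

```python
import numpy as np

alpha = np.array([0.3, 0.4])   # illustrative values, |alpha| < 1
k = np.array([2.0, 5.0])
n = len(alpha)

# Row i has coefficients alpha_j - delta_ij of y_j, i.e. A = (rows of alpha) - I
A = np.tile(alpha, (n, 1)) - np.eye(n)
c = np.log(k / alpha)

# Solve A y = c for the critical point in the y-variables
y = np.linalg.solve(A, c)

# Verify: dE/dy_i = alpha_i * exp(sum_j alpha_j y_j) - k_i * e^{y_i} = 0
grad = alpha * np.exp(alpha @ y) - k * np.exp(y)
print(grad)  # should be numerically zero
```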

Note that I omitted a line or two of computation very similar to what you already did in your question. This can be written as a matrix equation $$Ay = c,$$ where $A = (\alpha_j-\delta_{ij})_{i,j=1}^n$ and I am using the Kronecker delta to simplify writing.

Next, if you compute the Hessian and get rid of the large exponentials the same way you did in the previous step, you will find the Hessian as $$H = DA,$$ where $D = \operatorname{diag}(k_1e^{y_1}, k_2e^{y_2},\dots, k_ne^{y_n})$ and $A$ is as above. Since $D$ is symmetric and positive definite, the definiteness of $H$ is equivalent to that of $A$ (by Sylvester's law of inertia: $H = D^{1/2}\,(D^{1/2}AD^{-1/2})\,D^{1/2}$, and $D^{1/2}AD^{-1/2}$ is similar to $A$). However, $$A+I = (\alpha_j-\delta_{ij})_{i,j=1}^n + (\delta_{ij})_{i,j=1}^n=(\alpha_j)_{i,j=1}^n,$$ i.e., a matrix with identical rows $r = (\alpha_1,\alpha_2,\dots,\alpha_n)$: $$A+I = \begin{pmatrix} r\\ r\\ \vdots \\ r \end{pmatrix}.$$ Since this matrix has rank $1$, its eigenvalues are $\lambda_1=\lambda_2=\dots=\lambda_{n-1}=0$ and $\lambda_n = \operatorname{tr}(A+I)=\alpha_1+\alpha_2+\dots+\alpha_n=S.$

This is where we finish the problem, because it immediately follows that $A$ has eigenvalues $-1$ (with multiplicity $n-1$) and $S-1$. Therefore, if $S>1$ (and $n\ge 2$), then $A$ has both negative and positive eigenvalues, which means exactly that our critical point must be a saddle point.
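The eigenvalue claim is easy to confirm numerically. A short sketch with an illustrative $\alpha$ satisfying $S > 1$ (assumed values):

```python
import numpy as np

alpha = np.array([0.6, 0.8, 0.9])   # illustrative, S = 2.3 > 1
n = len(alpha)
S = alpha.sum()

# A + I has identical rows r = alpha, so A = (rows of alpha) - I
A = np.tile(alpha, (n, 1)) - np.eye(n)

# Expect eigenvalues -1 (multiplicity n-1) and S-1 > 0: mixed signs => saddle
eig = np.sort(np.linalg.eigvals(A).real)
print(eig)
```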

I will leave the computational details for you to fill in.