Differentiating the maximizer with respect to the coefficients


Consider the following maximization problem: $$ \begin{split} \max_{(x_1,\ldots,x_n) \in \mathbb{R^+}^n} &\sum_{i=1}^n \alpha_i \ln(x_i)\\ \text{such that} & \sum_{i=1}^n \pi_i x_i = M,\\ & x_i \ge 0 \quad \forall i \in [n] \end{split} $$

  1. Find all solutions to the above maximization problem. Make sure to prove that your solution must be a maximizer.
  2. Let $x^* = (x_1^*, \ldots, x_n^*)$ be a solution to the above problem. Define $$ V^*(\pi_1, \ldots, \pi_n, M, \alpha_1, \ldots, \alpha_n) = \sum_{i=1}^n \alpha_i \ln(x_i^*). $$ Find the derivative of $V^*$ with respect to $\pi_j$ for any j.

To begin, I set up the Lagrangian: the objective above minus $\lambda$ times the constraint. I then wrote the first-order conditions, taking the partial derivative with respect to each $x_i$ and setting it equal to zero, and likewise the partial with respect to $\lambda$. I am not sure how to proceed from there.

2 Answers

Best answer:

Notice that if any $x_i \rightarrow 0$, the objective goes to $-\infty$, so the inequality constraints cannot bind at the optimum.

Then the Lagrangian is $$ \mathcal{L}(x,\lambda;\pi,M) = \sum_{i=1}^n \alpha_i \ln(x_i) - \lambda \left( \sum_{i=1}^n \pi_i x_i - M \right) $$

Do part 2 first using the envelope theorem (if you don't know the envelope theorem, I would look it up, because there is a whole genre of questions like #2 based on it). Differentiate the Lagrangian with respect to $\pi_j$ and evaluate at the optimum to get $$ \dfrac{\partial V}{\partial \pi_j} = - \lambda^* x_{j}^*. $$

To do part 1, the FONCs are $$ \dfrac{\alpha_i}{x_i} - \lambda \pi_i = 0, \quad i = 1, \ldots, n $$ and $$ \sum_{i=1}^n \pi_i x_i - M = 0. $$ Take the FONCs for any $x_i$ and some $x_j$ and eliminate $\lambda$: $$ \dfrac{\alpha_i}{\pi_i x_i} = \dfrac{\alpha_j}{\pi_j x_j}, $$ so that $$ x_j = \dfrac{\alpha_j \pi_i}{\alpha_i \pi_j}x_i. $$ Substitute this into the constraint for each $j\neq i$ to get $$ \pi_i x_i + \sum_{j \neq i} \pi_j \dfrac{\alpha_j \pi_i}{\alpha_i \pi_j}x_i = M, $$ or $$ \alpha_i x_i + x_i \sum_{j \neq i} \alpha_j = \frac{\alpha_i}{\pi_i} M, $$ and therefore $$ x_i^* = \dfrac{\alpha_i M}{\pi_i \sum_{j=1}^n\alpha_j}. $$ Substituting back into the FONC gives $\lambda^* = \frac{\sum_{j=1}^n \alpha_j}{M}$, so the part 2 answer simplifies to $\dfrac{\partial V}{\partial \pi_j} = -\lambda^* x_j^* = -\dfrac{\alpha_j}{\pi_j}$.
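As a quick sanity check, the closed form for $x_i^*$ and the envelope-theorem derivative can be verified numerically. The values of $\alpha$, $\pi$, and $M$ below are made up for illustration; nothing in the problem pins them down.

```python
import math

# Hypothetical example values (not from the problem statement).
alpha = [0.5, 1.0, 1.5]
pi = [2.0, 1.0, 4.0]
M = 12.0

S = sum(alpha)

# Closed-form maximizer derived above: x_i* = alpha_i * M / (pi_i * sum_j alpha_j)
x_star = [a * M / (p * S) for a, p in zip(alpha, pi)]

def V(pi_vec):
    """Value function: objective evaluated at the maximizer for prices pi_vec."""
    x = [a * M / (p * S) for a, p in zip(alpha, pi_vec)]
    return sum(a * math.log(xi) for a, xi in zip(alpha, x))

# Envelope theorem prediction: dV/dpi_j = -lambda* x_j* = -alpha_j / pi_j,
# using lambda* = sum_j alpha_j / M. Compare with a central finite difference.
j = 1
h = 1e-6
pi_plus = pi.copy(); pi_plus[j] += h
pi_minus = pi.copy(); pi_minus[j] -= h
numeric = (V(pi_plus) - V(pi_minus)) / (2 * h)
predicted = -alpha[j] / pi[j]
print(numeric, predicted)  # the two values should agree closely
```

The budget constraint is also satisfied automatically, since $\sum_i \pi_i x_i^* = \sum_i \alpha_i M / S = M$.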


Building on your response, I would like to suggest an alternative approach that is simpler than Lagrange multipliers, which is the method you seem to have started with.

It is not a strong assumption that at least one $\pi_k$ is non-zero (otherwise the constraint is infeasible for $M \neq 0$). WLOG assume $\pi_n \ne 0$ (if not, just renumber). Then the constraint gives $$ x_n = \frac{M}{\pi_n} - \sum_{i=1}^{n-1} \frac{\pi_i}{\pi_n} x_i. $$ Now write $p_i = \pi_i/\pi_n$ for all $i \in [n-1]$ and $m = M/\pi_n$, and you are left to maximize $$ f(x_1, \ldots, x_{n-1}) = \sum_{i=1}^{n-1}\alpha_i \ln(x_i) + \alpha_n \ln\left(m - \sum_{i=1}^{n-1} p_i x_i\right), $$ which is an unconstrained problem, simpler than the constrained optimization.
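One can check numerically that this substituted objective has a vanishing gradient at the maximizer $x_i^* = \alpha_i M / (\pi_i \sum_j \alpha_j)$ from the Lagrangian approach. Again, the example values of $\alpha$, $\pi$, and $M$ are made up for illustration.

```python
import math

# Hypothetical example values (not from the problem statement).
alpha = [0.5, 1.0, 1.5]
pi = [2.0, 1.0, 4.0]
M = 12.0
n = len(alpha)

# Substitute the constraint: x_n = M/pi_n - sum_i (pi_i/pi_n) x_i,
# so with p_i = pi_i/pi_n and m = M/pi_n the problem becomes unconstrained.
p = [pi_i / pi[-1] for pi_i in pi[:-1]]
m = M / pi[-1]

def f(x):
    """Objective after eliminating x_n via the budget constraint."""
    x_n = m - sum(p_i * xi for p_i, xi in zip(p, x))
    return sum(a * math.log(xi) for a, xi in zip(alpha[:-1], x)) \
        + alpha[-1] * math.log(x_n)

# Closed-form maximizer from the Lagrangian approach.
S = sum(alpha)
x_star = [a * M / (pi_i * S) for a, pi_i in zip(alpha, pi)]

# The gradient of f should vanish at (x_1*, ..., x_{n-1}*).
h = 1e-6
grad = []
for k in range(n - 1):
    xp = x_star[:-1].copy(); xp[k] += h
    xm = x_star[:-1].copy(); xm[k] -= h
    grad.append((f(xp) - f(xm)) / (2 * h))
print(grad)  # each component should be close to zero
```

Since $f$ is strictly concave on its domain (a sum of concave functions, with strict concavity in each coordinate), this stationary point is the unique maximizer, which also settles the "prove it is a maximizer" part of question 1.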