Consider the following maximization problem: $$ \begin{split} \max_{(x_1,\ldots,x_n) \in \mathbb{R}_{+}^n} &\sum_{i=1}^n \alpha_i \ln(x_i)\\ \text{such that} & \sum_{i=1}^n \pi_i x_i = M,\\ & x_i \ge 0 \quad \forall i \in [n] \end{split} $$
- Find all solutions to the above maximization problem. Make sure to prove that your solution must be a maximizer.
- Let $x^* = (x_1^*, \ldots, x_n^*)$ be a solution to the above problem. Define $$ V^*(\pi_1, \ldots, \pi_n, M, \alpha_1, \ldots, \alpha_n) = \sum_{i=1}^n \alpha_i \ln(x_i^*). $$ Find the derivative of $V^*$ with respect to $\pi_j$ for any $j$.
To begin, I set up the Lagrangian: the objective above minus $\lambda$ times the constraint function. I then wrote the first-order conditions, setting the partial derivative with respect to each $x_i$ equal to zero, along with the partial derivative with respect to $\lambda$. I am not sure how to proceed after that.
Notice that if any $x_i \rightarrow 0^+$, the objective goes to $-\infty$, so the inequality constraints cannot bind at the optimum and we may work with the equality-constrained problem. Moreover, the objective is strictly concave (for $\alpha_i > 0$) and the budget constraint is linear, so any point satisfying the first-order conditions is the unique global maximizer; this observation is the proof of optimality that part 1 asks for.
Then the Lagrangian is $$ \mathcal{L}(x,\lambda;\pi,M) = \sum_{i=1}^n \alpha_i \ln(x_i) - \lambda \left( \sum_{i=1}^n \pi_i x_i - M \right) $$
Do part 2 first using the envelope theorem (if you don't know the envelope theorem, look it up, because there is a whole genre of questions like #2 built on it). Differentiate the Lagrangian with respect to $\pi_j$ and evaluate at the optimum to get $$ \dfrac{\partial V^*}{\partial \pi_j} = - \lambda^* x_{j}^*. $$
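In case it is useful, here is the envelope-theorem step spelled out (assuming the standard regularity conditions: an interior optimum and differentiable objective and constraint). The derivative of the value function in any parameter $\theta$ equals the partial derivative of the Lagrangian in $\theta$, with $(x, \lambda)$ held fixed at the optimum: $$ \frac{\partial V^*}{\partial \theta} = \left. \frac{\partial \mathcal{L}}{\partial \theta} \right|_{(x,\lambda) = (x^*, \lambda^*)}. $$ Taking $\theta = \pi_j$, the only term of the Lagrangian that depends on $\pi_j$ is $-\lambda \pi_j x_j$, whose partial is $-\lambda x_j$; evaluating at the optimum gives $-\lambda^* x_j^*$. The same device with $\theta = M$ gives $\partial V^*/\partial M = \lambda^*$, the usual "marginal utility of income" interpretation of the multiplier.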
To do part 1, the FONCs are $$ \dfrac{\alpha_i}{x_i} - \lambda \pi_i = 0, \quad i = 1, \ldots, n $$ and $$ \pi'x - M = 0. $$ Take the FONCs for any $x_i$ and $x_j$ and solve to eliminate $\lambda$: $$ \dfrac{\alpha_i}{\pi_i x_i} = \dfrac{\alpha_j}{\pi_j x_j} $$ so that $$ x_j = \dfrac{\alpha_j \pi_i}{\alpha_i \pi_j}x_i. $$ Substitute this into the budget constraint for each $j \neq i$ to get $$ \pi_i x_i + \sum_{j \neq i} \pi_j \dfrac{\alpha_j \pi_i}{\alpha_i \pi_j}x_i = M, $$ i.e. (multiplying through by $\alpha_i/\pi_i$) $$ \alpha_i x_i + x_i \sum_{j \neq i} \alpha_j = \frac{\alpha_i}{\pi_i} M, $$ and hence $$ x_i^* = \dfrac{\frac{\alpha_i}{\pi_i} M}{\sum_{j=1}^n \alpha_j}. $$
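As a numerical sanity check, here is a minimal NumPy sketch (with made-up values for $\alpha$, $\pi$, and $M$) that verifies both parts: perturbing $x^*$ along directions tangent to the budget hyperplane should never increase the objective, and a finite difference of $V^*$ in $\pi_j$ should match the envelope value $-\lambda^* x_j^* = -\alpha_j/\pi_j$ (using $\lambda^* = \sum_i \alpha_i / M$, which follows from plugging $x^*$ back into the FONC).

```python
import numpy as np

# Made-up parameters; any positive values work.
alpha = np.array([1.0, 2.0, 3.0])
pi = np.array([0.5, 1.0, 2.0])
M = 10.0

def x_opt(pi):
    # Closed form derived above: x_i* = (alpha_i / pi_i) * M / sum_j alpha_j
    return (alpha / pi) * M / alpha.sum()

def objective(x):
    return np.sum(alpha * np.log(x))

x_star = x_opt(pi)
assert np.isclose(pi @ x_star, M)  # budget constraint holds at x*

# 1. Local maximality: perturb x* along directions tangent to the budget
#    hyperplane (so pi @ d = 0); the objective must not increase.
rng = np.random.default_rng(0)
for _ in range(100):
    d = rng.normal(size=3)
    d -= (pi @ d) / (pi @ pi) * pi          # project out the normal direction
    assert objective(x_star + 1e-3 * d) <= objective(x_star) + 1e-12

# 2. Envelope theorem: dV*/dpi_j = -lambda* x_j* = -alpha_j / pi_j,
#    using lambda* = alpha.sum() / M from the FONC. Compare with a
#    forward finite difference of the value function.
j, h = 1, 1e-6
pi_h = pi.copy()
pi_h[j] += h
fd = (objective(x_opt(pi_h)) - objective(x_star)) / h
assert np.isclose(fd, -alpha[j] / pi[j], atol=1e-4)
```

Both checks are consistent with the derivation: the tangent-direction test exercises the strict concavity argument for part 1, and the finite-difference test confirms the envelope formula of part 2 without ever differentiating $x^*(\pi)$ itself.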