Maximising a quadratic function under a constraint


Let $$f:[0,1]^n\rightarrow\mathbb{R},~~(x_1,\ldots,x_n)\mapsto \sum_{i=1}^{n} x_i(1-x_i).$$

Now fix $\overline{x}\in[0,1]$. I would like to show that under the constraint $$\overline{x} = \frac{1}{n}\sum_{i=1}^n x_i,$$ $f$ is maximised by $$x_1 = \ldots = x_n = \overline{x}.$$ Background: The variance of a generalised Binomial distribution with given mean probability of success is maximised if all probabilities of success are equal.

I tried using a Lagrange multiplier and bordered Hessian, but I don't understand the result:

Consider $$L_{\overline{x}}(\lambda, x) = \sum_{i=1}^{n} x_i(1-x_i) - \lambda\left(\sum_{i=1}^n x_i-n\overline{x}\right).$$

Then the extended gradient and bordered Hessian of $L$ are given by (I think) $$\nabla L_{\overline{x}}(\lambda, x) = \left(n\overline{x}-\sum_{i=1}^n x_i,1-2x_1-\lambda,\ldots,1-2x_n-\lambda\right)$$ and $$HL_{\overline{x}}(\lambda, x) = \begin{pmatrix}0&-1&-1&\cdots&-1\\ -1&-2&0&\cdots&0\\ -1&0&-2&\ddots&\vdots\\ \vdots&\vdots&\ddots&-2&0\\ -1&0&\cdots&0&-2\end{pmatrix}.$$
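As a quick numerical sanity check of the stationarity condition (function and variable names here are mine, not from the question): setting $x_1=\ldots=x_n=\overline{x}$ and $\lambda = 1-2\overline{x}$ should make every component of the extended gradient vanish.

```python
# Check that (lambda, x) = (1 - 2*xbar, (xbar, ..., xbar)) is a
# stationary point of the Lagrangian, i.e. the extended gradient is 0.
def extended_gradient(lam, x, xbar):
    n = len(x)
    constraint = n * xbar - sum(x)          # dL/dlambda
    partials = [1 - 2 * xi - lam for xi in x]  # dL/dx_i = 1 - 2x_i - lambda
    return [constraint] + partials

n, xbar = 5, 0.3
grad = extended_gradient(1 - 2 * xbar, [xbar] * n, xbar)
assert all(abs(g) < 1e-12 for g in grad)
```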

Now, curiously, by Laplace expansion I find that the determinant of the bordered Hessian alternates in sign with $n$: $$\det(HL_{\overline{x}}(\lambda, x)) = -n\cdot(-2)^{n-1}.$$
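The conjectured determinant formula can be checked numerically for small $n$ (the helper functions below are mine; the determinant is computed by a pure-Python Laplace expansion, matching the approach in the question):

```python
def det(m):
    # Recursive Laplace expansion along the first row.
    if len(m) == 1:
        return m[0][0]
    total = 0
    for j, entry in enumerate(m[0]):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * entry * det(minor)
    return total

def bordered_hessian(n):
    # (n+1) x (n+1) matrix: zero corner, border of -1s, diagonal of -2s.
    h = [[0] + [-1] * n]
    for i in range(n):
        row = [-1] + [0] * n
        row[i + 1] = -2
        h.append(row)
    return h

for n in range(1, 7):
    assert det(bordered_hessian(n)) == -n * (-2) ** (n - 1)
```

So the sign of the full determinant is indeed $(-1)^n$ for every $n$.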

Are the conjecture and my approach reasonable? If so, is there a mistake?

Best answer:

You can show your claim in an elementary way, without Lagrange multipliers: \begin{eqnarray*} \sum_{i=1}^{n} x_i(1-x_i) & = & \sum_{i=1}^{n} x_i - \sum_{i=1}^{n} x_i^2 \\ & = & n\bar x - \sum_{i=1}^{n} (x_i-\bar x + \bar x)^2 \\ & = & n\bar x - \left( \sum_{i=1}^{n} (x_i-\bar x)^2 + 2\bar x\underbrace{\sum_{i=1}^{n} (x_i-\bar x)}_{=0} + n\bar x^2 \right)\\ & = & n\bar x -\sum_{i=1}^{n} (x_i-\bar x)^2 - n\bar x^2\\ & \leq & n(\bar x - \bar x^2), \end{eqnarray*} since $\sum_{i=1}^{n}(x_i-\bar x)^2 \geq 0$, with equality if and only if $x_i = \bar x$ for $i=1,\ldots,n$.
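The completing-the-square identity and the resulting bound can be verified numerically on random points (a minimal sketch; the variable names are mine):

```python
import random

# Check that sum x_i(1-x_i) = n*xbar - sum (x_i - xbar)^2 - n*xbar^2
# and that the value never exceeds n*(xbar - xbar^2).
random.seed(0)
for _ in range(1000):
    n = random.randint(1, 10)
    x = [random.random() for _ in range(n)]
    xbar = sum(x) / n
    f = sum(xi * (1 - xi) for xi in x)
    identity = n * xbar - sum((xi - xbar) ** 2 for xi in x) - n * xbar ** 2
    assert abs(f - identity) < 1e-9
    assert f <= n * (xbar - xbar ** 2) + 1e-9
```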