Maximising a function of $n$ variables


I am considering the following function $$f(x_1,\dots,x_n,y)=-\alpha \left(y-k_1\right)^2-\beta \sum_{i=1}^n\left(k_2-x_i\right)^2-\gamma \sum_{i=1}^n\left(y-x_i\right)^2 - \frac{\delta}{y-d} \sum_{i=1}^n (x_i-d)\, ,$$ where $(x_1,\dots,x_n,y)\in[d,1]^{n+1}$, $d>0$ and $\alpha$, $\beta$, $\gamma$, $\delta$, $k_1$, $k_2>0$. Moreover, $x_i\leq y$ for all $i=1,\dots,n$. I am trying to find the maximum of this function on that domain.

Before using a "brute force" approach (i.e. computing derivatives, the Hessian and so on), I wonder whether it is possible to obtain the absolute maximum in a simpler way. For example, I notice that $f\leq 0$ on the whole domain, since every term is non-positive there ($x_i\geq d$, $x_i\leq y$ and $y>d$)...
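The observation that $f\leq 0$ is easy to spot-check numerically. A minimal sketch (the parameter values below are illustrative assumptions, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameter values (assumptions, not given in the question).
alpha, beta, gamma, delta = 1.0, 2.0, 0.5, 0.3
k1, k2, d, n = 0.8, 0.6, 0.1, 5

def f(x, y):
    """Objective from the question; x is an array of length n with d <= x_i <= y <= 1."""
    return (-alpha * (y - k1) ** 2
            - beta * np.sum((k2 - x) ** 2)
            - gamma * np.sum((y - x) ** 2)
            - delta / (y - d) * np.sum(x - d))

# Every term is non-positive on the feasible set, so f <= 0 there.
for _ in range(1000):
    y = rng.uniform(d + 1e-6, 1.0)       # y in (d, 1]
    x = rng.uniform(d, y, size=n)        # x_i in [d, y]
    assert f(x, y) <= 0.0
```

This only confirms the sign bound; it says nothing yet about where the maximum is attained.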


Best answer:

Consider $y$ fixed, as Paul Sinclair suggested. Your objective function is then separable: you can optimize each $x_i$ independently. It is also concave in $x_i$ (the second derivative is $-2\beta-2\gamma<0$), so it is maximized where the derivative vanishes: $$2\beta (k_2-x_i) + 2\gamma(y-x_i) - \frac{\delta}{y-d} = 0,$$ which can be written as $$2(\beta + \gamma)x_i = 2\beta k_2 + 2\gamma y - \frac{\delta}{y-d},$$ so the solution is $$x_i = \frac{\beta k_2 + \gamma y}{\beta + \gamma} - \frac{\delta}{2(\beta + \gamma)(y-d)}.$$ (If this value falls outside $[d,y]$, concavity means the constrained maximizer is the nearest endpoint.) You can plug this in and maximize over just $y$. Since the last term in your objective is not squared, maximizing over $y$ is not simple: there may be multiple local optima. You could perform a grid search for $y$, or apply a gradient-based optimization algorithm with multiple starting points.
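The grid-search suggestion can be sketched in a few lines: for each $y$ on a grid, take the stationary value of $x_i$ clipped to the feasible interval $[d,y]$ (valid because the inner problem is concave), then evaluate $f$. Parameter values are illustrative assumptions:

```python
import numpy as np

# Illustrative parameter values (assumptions, not given in the question).
alpha, beta, gamma, delta = 1.0, 2.0, 0.5, 0.3
k1, k2, d, n = 0.8, 0.6, 0.1, 5

def f(x, y):
    return (-alpha * (y - k1) ** 2
            - beta * np.sum((k2 - x) ** 2)
            - gamma * np.sum((y - x) ** 2)
            - delta / (y - d) * np.sum(x - d))

def best_x(y):
    # Unconstrained maximizer of the concave inner problem in each x_i,
    # projected onto the feasible interval [d, y].
    x_star = (beta * k2 + gamma * y) / (beta + gamma) \
             - delta / (2 * (beta + gamma) * (y - d))
    return np.clip(x_star, d, y)

# Grid search over y in (d, 1]; the inner maximization is exact, so the
# only approximation error comes from the y-grid spacing.
ys = np.linspace(d + 1e-4, 1.0, 2001)
vals = [f(np.full(n, best_x(y)), y) for y in ys]
i = int(np.argmax(vals))
print(f"approx maximum f = {vals[i]:.4f} at y = {ys[i]:.4f}, x_i = {best_x(ys[i]):.4f}")
```

A refinement step (a second, finer grid around the best $y$, or a 1-D solver) would sharpen the estimate if needed.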

Another answer:

Is the derivative approach really brute force?

The derivatives of $$f(x_1,\dots,x_n,y)=-\alpha \left(y-k_1\right)^2-\beta \sum_{i=1}^n\left(k_2-x_i\right)^2-\gamma \sum_{i=1}^n\left(y-x_i\right)^2 - \frac{\delta}{y-d} \sum_{i=1}^n (x_i-d)$$ vanish at the stationary points: $$\begin{cases} f'_{x_j}(x_1,\dots,x_n,y)=2\beta \left(k_2-x_j\right)+2\gamma \left(y-x_j\right) - \dfrac{\delta}{y-d}=0, \quad j=1,\dots,n\\ f'_y(x_1,\dots,x_n,y)=-2\alpha \left(y-k_1\right)-2\gamma \sum\limits_{i=1}^n\left(y-x_i\right) + \dfrac{\delta}{(y-d)^2} \sum\limits_{i=1}^n (x_i-d)=0, \end{cases}$$ or, for $y\not=d$, multiplying the first equation by $y-d$ and the second by $(y-d)^2$, $$\begin{cases} 2\beta (y-d)\left(k_2-x_j\right)+2\gamma (y-d) \left(y-x_j\right) - \delta=0, \quad j=1,\dots,n\\ -2\alpha (y-d)^2\left(y-k_1\right)-2\gamma (y-d)^2\sum\limits_{i=1}^n\left(y-x_i\right) + \delta \sum\limits_{i=1}^n (x_i-d)=0. \end{cases}$$ The first group of equations is linear in $x_j$, so all the $x_j$ are equal: $$x_j = \frac{\beta k_2+\gamma y}{\beta+\gamma} - \frac{\delta}{2(\beta+\gamma)(y-d)}, \quad j=1,\dots,n,$$ and hence $\sum_{i=1}^n x_i = n x_j$. Substituting this into the second equation and clearing the denominator leads to a quartic equation for $y$, with explicit expressions for the $x_j$ at each real root in $(d,1]$; the resulting candidates must still be compared with the values of $f$ on the boundary of the domain.
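The degree of the equation for $y$ can be checked symbolically. A sketch with sympy: substitute the closed-form $x_j$ into the $y$-stationarity condition, clear denominators, and read off the degree of the numerator as a polynomial in $y$ (symbol names match the question; $n$ is kept symbolic):

```python
import sympy as sp

alpha, beta, gamma, delta, k1, k2, d, y = sp.symbols(
    'alpha beta gamma delta k1 k2 d y', positive=True)
n = sp.Symbol('n', positive=True)

# Closed-form stationary x_j (all equal), from f'_{x_j} = 0.
x = (beta*k2 + gamma*y)/(beta + gamma) - delta/(2*(beta + gamma)*(y - d))

# y-stationarity condition f'_y = 0 with sum(...) replaced by n*(...).
g = (-2*alpha*(y - k1) - 2*gamma*n*(y - x)
     + delta/(y - d)**2 * n*(x - d))

# Clear denominators and inspect the degree of the numerator in y.
num = sp.cancel(sp.together(g)).as_numer_denom()[0].expand()
poly = sp.Poly(num, y)
print(poly.degree())  # -> 4
```

The leading coefficient is a nonzero multiple of $\alpha\beta+\alpha\gamma+n\beta\gamma$, so the degree does not collapse for the positive parameters assumed here.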