Solving an optimization problem


I would like to find the vector Y that solves this maximization problem:

Max Y'C + Y'Br + αr0

subject to the two constraints sqrt(Y'ΣY) = k and Y'e + α = 1,

where Y, C and B are column vectors with n entries,

Σ is a symmetric matrix of order n,

e = (1, …, 1)' and α is a real parameter,

r and r0 are real scalars, and

C, B, r and r0 are known.

I worked through the Lagrangian by hand, but I fear I made an error somewhere. Could someone help me solve this for Y and α?



The constraint k = sqrt(Y'ΣY) is non-convex and therefore difficult to handle. It also deviates from the standard formulation, which limits the portfolio standard deviation rather than forcing it to equal a fixed value. I will therefore presume the constraint is actually $\sqrt{Y'\Sigma Y} \le k$, which is equivalent to $Y'\Sigma Y \le k^2$. In practice, as long as the input data are such that a positive return can be achieved, the optimal $Y$ will satisfy this constraint with equality; if that is infeasible, the equality-constrained version has no solution, whereas the inequality-constrained version still does.

With this modification, this is a convex Quadratically Constrained Quadratic Programming (QCQP) problem; in fact, since the objective is linear, it is a quadratically constrained linear program. A convex optimization system such as CVX will convert it to a Second Order Cone Problem (SOCP) and call a solver to solve it. As for the term αr0 in the objective: the budget constraint gives α = 1 − Y'e, so the term can be folded into the objective; I will dispense with it below for simplicity, but it can be included by adding alpha*r0 to the maximize expression.

Here is the CVX formulation:

cvx_begin
variable Y(n)
variable alpha
maximize(Y'*(C+B*r))
Y'*Sigma*Y <= k^2
sum(Y) + alpha == 1
cvx_end

CVX will call the solver and report the result.
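Since α appears only in the budget constraint, the quadratic constraint alone determines the optimal Y, and (with the αr0 term dropped, as above) the relaxed problem even has a closed form: Y* = k·Σ⁻¹c / sqrt(c'Σ⁻¹c) with c = C + rB, after which α = 1 − e'Y*. Here is a quick numerical sketch of that closed form in NumPy rather than CVX; all problem data below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, r = 5, 0.2, 0.05

# Made-up problem data: C, B known vectors, Sigma a positive definite
# covariance-like matrix.
C = rng.normal(size=n)
B = rng.normal(size=n)
A = rng.normal(size=(n, n))
Sigma = A @ A.T + n * np.eye(n)

# Closed form for: max c'Y  s.t.  Y' Sigma Y <= k^2   (alpha*r0 dropped)
#   Y* = k * Sigma^{-1} c / sqrt(c' Sigma^{-1} c)
c = C + r * B
w = np.linalg.solve(Sigma, c)      # Sigma^{-1} c, without forming the inverse
Y = k * w / np.sqrt(c @ w)
alpha = 1.0 - Y.sum()              # budget constraint Y'e + alpha = 1

print("objective      :", c @ Y)
print("risk sqrt(Y'SY):", np.sqrt(Y @ Sigma @ Y), "(binds at k =", k, ")")
```

Note that the risk constraint binds at the optimum, confirming the earlier remark that the inequality-constrained version attains equality whenever a positive return is achievable.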

Instead of Y'*Sigma*Y <= k^2, this constraint can be specified directly as the second order cone constraint norm(R*Y) <= k, where $R$ is the upper triangular Cholesky factor of $\Sigma$, such that $R^TR = \Sigma$ — which is what CVX would do under the hood anyway. This presumes $\Sigma$ is positive definite; in any event, $\Sigma$, being a covariance matrix, is presumably at least positive semidefinite.
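The equivalence norm(R*Y) = sqrt(Y'ΣY) is easy to check numerically. A small NumPy sketch with made-up data (note that np.linalg.cholesky returns the lower triangular factor, so its transpose is the upper triangular R used above):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.normal(size=(n, n))
Sigma = A @ A.T + n * np.eye(n)   # positive definite stand-in for a covariance

# np.linalg.cholesky gives lower-triangular L with L @ L.T == Sigma;
# its transpose is the upper-triangular factor R with R.T @ R == Sigma.
R = np.linalg.cholesky(Sigma).T

Y = rng.normal(size=n)
print(np.linalg.norm(R @ Y))      # equals sqrt(Y' Sigma Y)
print(np.sqrt(Y @ Sigma @ Y))
```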