How to Compute the Second Derivative of a Quadratic Maximization Problem with Respect to the Weight Parameter?


I am working on a problem involving the maximization of a function subject to a normalization constraint, specifically in the context of quadratic forms and eigenvalues. The objective function is defined as:

$$ F(\mathbf{x}) = \mathbf{x}^T(w\mathbf{Q_1} + (1-w)\mathbf{Q_2})\mathbf{x} $$

where $\mathbf{x}$ is the vector of decision variables subject to $\mathbf{x}^T\mathbf{x} = 1$, $\mathbf{Q_1}$ and $\mathbf{Q_2}$ are symmetric positive semi-definite matrices, and $w$ is a weight parameter varying between 0 and 1. The goal is to maximize $F(\mathbf{x})$ and analyze how the optimal solution changes with respect to $w$. $\mathbf{Q_1}$ and $\mathbf{Q_2}$ are also similar matrices.
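
For a fixed $w$, maximizing a quadratic form over unit vectors is solved by the eigenvector of $\mathbf{A}(w)$ associated with its largest eigenvalue. A minimal NumPy sketch of this setup (the random positive semi-definite matrices below are illustrative stand-ins for $\mathbf{Q_1}$ and $\mathbf{Q_2}$, not the matrices from my actual problem):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_psd(n):
    # random symmetric positive semi-definite matrix (stand-in assumption)
    B = rng.standard_normal((n, n))
    return B @ B.T

n = 5
Q1, Q2 = random_psd(n), random_psd(n)

def top_eigpair(w):
    # maximizer of x^T A(w) x over unit vectors = top eigenvector of A(w)
    A = w * Q1 + (1 - w) * Q2
    vals, vecs = np.linalg.eigh(A)   # eigenvalues in ascending order
    return vals[-1], vecs[:, -1]     # (lambda_0, x_0)

w = 0.3
lam0, x0 = top_eigpair(w)
# sanity check: the attained maximum F(x_0) equals the top eigenvalue
F = x0 @ (w * Q1 + (1 - w) * Q2) @ x0
assert np.isclose(F, lam0)
```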

I have successfully computed the first derivative of $f_1(\mathbf{x}_0) = \mathbf{x}_0^T\mathbf{Q_1}\mathbf{x}_0$ with respect to $w$, where $\mathbf{x}_0$ is the eigenvector corresponding to the maximum eigenvalue $\lambda_0$ of the weighted matrix $\mathbf{A}(w) = w\mathbf{Q_1} + (1-w)\mathbf{Q_2}$, and found that it is a monotonically increasing function of the weight parameter:

$$\frac{d}{dw}f_1(\mathbf{x_0})=\frac{2}{1-w}\sum_{i=1}^{N-1}\frac{\left(\mathbf{{x}_i}^T\mathbf{Q_1}\mathbf{x}_0\right)^2}{\lambda_0-\lambda_i}$$
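
As a numerical check, the formula can be compared against a central finite difference of $f_1$ (again using random PSD matrices as assumed stand-ins; note the sign ambiguity of eigenvectors is harmless here because $f_1$ is quadratic in $\mathbf{x}_0$, and the check assumes $\lambda_0$ is simple):

```python
import numpy as np

rng = np.random.default_rng(1)

def random_psd(n):
    # random symmetric positive semi-definite matrix (stand-in assumption)
    B = rng.standard_normal((n, n))
    return B @ B.T

n = 6
Q1, Q2 = random_psd(n), random_psd(n)

def f1(w):
    # f_1(x_0) = x_0^T Q1 x_0 evaluated at the top eigenvector of A(w)
    A = w * Q1 + (1 - w) * Q2
    _, vecs = np.linalg.eigh(A)
    x0 = vecs[:, -1]
    return x0 @ Q1 @ x0

def df1_formula(w):
    # the closed-form first derivative: (2/(1-w)) * sum_i (x_i^T Q1 x_0)^2 / (lam_0 - lam_i)
    A = w * Q1 + (1 - w) * Q2
    vals, vecs = np.linalg.eigh(A)
    lam0, x0 = vals[-1], vecs[:, -1]
    s = 0.0
    for i in range(n - 1):           # all eigenpairs except the top one
        xi, lami = vecs[:, i], vals[i]
        s += (xi @ Q1 @ x0) ** 2 / (lam0 - lami)
    return 2.0 / (1 - w) * s

w, h = 0.4, 1e-6
fd = (f1(w + h) - f1(w - h)) / (2 * h)   # central finite difference
assert np.isclose(fd, df1_formula(w), rtol=1e-4)
```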

However, I am struggling with how to approach the calculation of the second derivative, $\frac{d^2}{dw^2}f_1(\mathbf{x}_0)$, especially considering the implicit dependence of $\mathbf{x}_0$ on $w$.

Questions:

  1. What is a systematic approach to computing the second derivative $\frac{d^2}{dw^2}f_1(\mathbf{x}_0)$ for this optimization problem?
  2. How do we account for the dependence of $\mathbf{x}_0$ on $w$ when computing this second derivative?