Family of positive linear penalties in quadratic programming


I have the following family of quadratic programming problems: \begin{equation} x_t = \arg\min_{x\succeq 0} \frac{1}{2}\|Ax-b\|^2 + r(t)c_t^Tx, \quad t\in (0, +\infty), \end{equation} where $A$ is a square invertible matrix, and the scalar-valued function $r(t)$ and the family of vectors $c_t$ satisfy \begin{equation} r(t) \geq 0, \quad r(t) \rightarrow +\infty, \quad c_t\succeq 0, \quad c_t \rightarrow c_* \succeq 0. \end{equation} The notation $x\succeq 0$ means that $x$ belongs to the positive cone of some $R^p$, i.e., all coordinates of $x$ are nonnegative.

The question is: does $x_t$ have a limit, i.e., does $\lim\limits_{t\rightarrow +\infty} x_t$ exist? Intuitively I want to say yes, and the limit should be the following one: \begin{equation} \lim_{t\rightarrow +\infty} x_t = \arg\min_{x\succeq 0, \, x^Tc_* = 0} \|Ax-b\|^2. \end{equation} Intuitively, the linear penalty $c_t^Tx$ is nonnegative, so it does not push the solution to infinity, and since $r(t)\rightarrow +\infty$ the penalty term dominates, so in the limit it effectively becomes the constraint $x^Tc_* = 0$. I also think that if the limit exists, the convergence should be uniform in $b$ (in fact, for each $t$ these programming problems are projections onto convex sets, which are known to be Lipschitz in $b$).
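For concreteness, the conjectured limiting problem is easy to solve directly: since $c_*\succeq 0$ and $x\succeq 0$, the constraint $x^Tc_* = 0$ forces $x_i = 0$ wherever $c_{*i} > 0$, so it reduces to a nonnegative least squares problem over the remaining coordinates. A minimal sketch (the function name is my own; `scipy.optimize.nnls` performs the NNLS solve):

```python
import numpy as np
from scipy.optimize import nnls

def conjectured_limit(A, b, c_star):
    """Solve min ||Ax - b||^2  s.t.  x >= 0,  c_star^T x = 0.

    Since c_star >= 0 and x >= 0, the constraint c_star^T x = 0
    forces x_i = 0 wherever c_star_i > 0, so it suffices to run
    NNLS on the remaining columns of A.
    """
    free = c_star == 0                  # coordinates allowed to be nonzero
    x = np.zeros(len(c_star))
    if free.any():
        x[free], _ = nnls(A[:, free], b)
    return x

A = np.eye(2)
b = np.array([2.0, 2.0])
c_star = np.array([0.0, 1.0])
print(conjectured_limit(A, b, c_star))   # → [2. 0.]
```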

I managed to show that $x_t$ is necessarily bounded (which is quite obvious). Indeed, the point $x = 0$ is feasible, therefore \begin{equation} \frac{1}{2}\|Ax_t-b\|^2 + r(t)c_t^Tx_t \leq \frac{1}{2}\|b\|^2. \end{equation} From this inequality and the fact that $c_t^Tx_t\geq 0$ it follows that \begin{equation} \|Ax_t-b\|\leq \|b\| \Rightarrow \|Ax_t\| \leq 2\|b\| \Rightarrow \|x_t\| \leq 2\,\sigma_{\min}(A)^{-1}\|b\|, \end{equation} where $\sigma_{\min}(A)$ is the smallest singular value of $A$, which is positive since $A$ is invertible. Boundedness of $x_t$ is proved.
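As a sanity check, the bound can be verified numerically. One convenient reduction (my own, not from the question): completing the square turns $\frac{1}{2}\|Ax-b\|^2 + q^Tx$ with $q = r(t)c_t \succeq 0$ into an NNLS problem with the shifted right-hand side $b - A^{-T}q$:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

def solve_qp(A, b, q):
    """min_{x >= 0} 0.5*||Ax - b||^2 + q^T x.

    Completing the square: the objective equals
    0.5*||Ax - b'||^2 + const with b' = b - A^{-T} q,
    so the problem is NNLS with a shifted right-hand side.
    """
    b_shift = b - np.linalg.solve(A.T, q)
    x, _ = nnls(A, b_shift)
    return x

for _ in range(100):
    A = rng.standard_normal((4, 4))        # invertible almost surely
    b = rng.standard_normal(4)
    q = rng.uniform(0, 10, 4)              # plays the role of r(t) * c_t >= 0
    x = solve_qp(A, b, q)
    sigma_min = np.linalg.svd(A, compute_uv=False).min()
    assert np.linalg.norm(x) <= 2 * np.linalg.norm(b) / sigma_min + 1e-8
print("bound holds on 100 random instances")
```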

Accepted answer

I am closing this question since I have found the answer. In general, $x_t$ need not have a limit under the given assumptions.

Let us consider dimension $d=2$ and $A = I$. I will denote the vector $b$ by $x_0$ and choose $x_0 = (2,2)$.

Choose also $c_* = (0,1)$, and \begin{equation} c_t = c_* + \dfrac{u_t}{r(t)}, \, u_t = (|\sin(t)|, 0). \end{equation}

Then the expression for $x_t$ can be rewritten as follows: \begin{equation} x_t = \arg\min_{x\succeq 0} \dfrac{1}{2}\|x-(x_0 - r(t)c_t)\|^2 = \mathrm{Proj}(x_0 - r(t)c_t, R^2_+), \end{equation} where $\mathrm{Proj}(x,X)$ denotes the Euclidean projection of the point $x$ onto the set $X$, and $R^2_+$ is the positive cone of $R^2$.
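This reformulation is easy to verify numerically: with $A = I$ the projection onto $R^2_+$ is coordinate-wise clipping $\max(x_0 - r(t)c_t, 0)$, and it agrees with an NNLS solve of the shifted problem. A small sketch (taking $r(t) = t$, which is my own concrete choice of a function satisfying the assumptions):

```python
import numpy as np
from scipy.optimize import nnls

x0 = np.array([2.0, 2.0])          # the vector b of the example
c_star = np.array([0.0, 1.0])

def x_t(t):
    """x_t for the A = I example, with the concrete choice r(t) = t."""
    r = t
    c = c_star + np.array([abs(np.sin(t)), 0.0]) / r
    shifted = x0 - r * c
    proj = np.maximum(shifted, 0.0)          # projection onto R^2_+
    # Cross-check: solving the QP as NNLS gives the same point.
    via_nnls, _ = nnls(np.eye(2), shifted)
    assert np.allclose(via_nnls, proj)
    return proj

print(x_t(5.0))   # first coordinate equals 2 - |sin 5|
```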

It is easy to check (draw a simple picture) that for this choice of $c_t$, once $r(t) \geq x_{02}$ (which holds for all large $t$ since $r(t)\rightarrow+\infty$), the second coordinate of $x_0 - r(t)c_t$ is nonpositive and projects to zero, so the minimizer $x_t$ has the form \begin{equation} x_t = (x_{01} - r(t) c_{t1}, 0). \end{equation}

For the parameters chosen above we have \begin{equation} x_t = (2 - |\sin(t)|, 0). \end{equation} Clearly this does not converge as $t\rightarrow +\infty$.
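A quick numerical illustration (again with $r(t) = t$, my own concrete choice) exhibits two subsequences of $x_t$ with different limits, computed directly from the projection formula rather than the closed form:

```python
import numpy as np

x0 = np.array([2.0, 2.0])
c_star = np.array([0.0, 1.0])

def x_t(t):
    """x_t from the definition, with the concrete choice r(t) = t (t > 2)."""
    c = c_star + np.array([abs(np.sin(t)), 0.0]) / t
    return np.maximum(x0 - t * c, 0.0)       # projection onto R^2_+

# Along t_k = k*pi the first coordinate is 2; along t_k = pi/2 + k*pi it is 1,
# so x_t has two distinct limit points and no limit as t -> infinity.
peaks = [round(x_t(k * np.pi)[0], 6) for k in range(10, 15)]
troughs = [round(x_t(np.pi / 2 + k * np.pi)[0], 6) for k in range(10, 15)]
print(peaks)    # [2.0, 2.0, 2.0, 2.0, 2.0]
print(troughs)  # [1.0, 1.0, 1.0, 1.0, 1.0]
```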

Hypothesis: $\lim_{t\rightarrow +\infty}x_t$ would exist if we additionally assumed \begin{align} r(t)(c_t - c_*) \rightarrow 0 \text{ as } t\rightarrow +\infty. \end{align} But this question is not of interest to me, since I know I cannot guarantee such a rate for $c_t$.
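For what it's worth, in this same example the extra rate condition does restore convergence numerically: replacing $u_t/r(t)$ by $u_t/r(t)^2$ gives $r(t)(c_t - c_*) = u_t/r(t) \rightarrow 0$, and $x_t \rightarrow (2, 0)$, which is exactly the conjectured limit. This is only numerical evidence in one example under my concrete choice $r(t) = t$, not a proof:

```python
import numpy as np

x0 = np.array([2.0, 2.0])
c_star = np.array([0.0, 1.0])

def x_t(t):
    """Same example, but c_t = c_* + u_t / r(t)^2 with r(t) = t,
    so that r(t)(c_t - c_*) = u_t / t -> 0 as t -> infinity."""
    c = c_star + np.array([abs(np.sin(t)), 0.0]) / t**2
    return np.maximum(x0 - t * c, 0.0)       # projection onto R^2_+

for t in [10.0, 100.0, 1000.0]:
    print(t, x_t(t))   # first coordinate approaches 2 as t grows
```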