I need to solve the following problem:
\begin{equation} \min_{(y, a)\in \mathbb R^n \times \mathbb R}\quad a+\beta\sum_{j=1}^m\max(0,y^Tb_j-a)+\|x-y\|_2^2 \end{equation}
My efforts show that $y^* = x - \frac{\beta}{2}\sum_{j=1}^m \delta_j b_j$, where $\delta_j\in [0,1]$ for all $j\in [m]$, but I am not sure how to compute $a^*$, or even the $\delta_j$'s. Any help would be appreciated!
If $\beta \le 0$ then $\beta \sum_k \max(0, b_k^Ty - a) \le 0$, so the cost satisfies $c(y,a) = a + \beta \sum_k \max(0, b_k^Ty - a) + \|y-x\|_2^2 \le a + \|y-x\|_2^2$. Fixing $y = x$ and letting $a \to -\infty$ shows the program is unbounded below and has no solution.
If $\beta > 0$, split $a$ evenly across the $m$ terms: since $\frac{a}{m} + \beta\max(0, t-a) = \max\!\left(\frac{a}{m},\, \beta t + \frac{1-\beta m}{m}a\right)$, the cost can be written as $c(y,a) = \sum_k \max\!\left(\frac{a}{m},\, \beta b_k^T y + \frac{1-\beta m}{m}a\right) + \|y-x\|_2^2$. The sign of $1-\beta m$ governs the behaviour, so the cases split at $\beta m = 1$ rather than at $\beta = 1$.
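The per-term identity can be checked numerically; here is a brute-force sketch over a grid of toy values (the grid and parameter choices are my own, nothing from the question):

```python
# lhs: one term of the cost after splitting a as a sum of m copies of a/m
# rhs: the equivalent max form, with the coefficient (1 - beta*m)/m written out
def lhs(t, a, beta, m):
    return a / m + beta * max(0.0, t - a)

def rhs(t, a, beta, m):
    return max(a / m, beta * t + (1.0 - beta * m) / m * a)

# grid of (t, a, beta, m) combinations covering both branches of the max
grid = [(t / 4.0, a / 4.0, beta, m)
        for t in range(-8, 9) for a in range(-8, 9)
        for beta in (0.1, 1.0 / 3.0, 0.5, 2.0) for m in (1, 3, 5)]
assert all(abs(lhs(*p) - rhs(*p)) < 1e-12 for p in grid)
```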
If $\beta m < 1$ (i.e. $\beta \in (0, 1/m)$), then $\inf c \le c(0,a) = a + \beta m \max(0,-a) + \|x\|^2_2$, which for $a \le 0$ equals $(1-\beta m)\,a + \|x\|^2_2$. Since $1-\beta m > 0$, the right hand side is unbounded below as $a \to -\infty$, and so the program has no solution.
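To see the unboundedness concretely, here is a small numerical sketch (toy data of my own choosing, with $n=2$, $m=3$, $\beta=0.2$ so $\beta m = 0.6 < 1$) evaluating the cost at $y=0$ as $a \to -\infty$:

```python
# Evaluate c(y, a) = a + beta * sum_j max(0, b_j^T y - a) + ||x - y||^2
# at y = 0 for increasingly negative a; the values decrease without bound.
def cost(y, a, b, beta, x):
    penalty = sum(max(0.0, sum(bi * yi for bi, yi in zip(bk, y)) - a) for bk in b)
    return a + beta * penalty + sum((xi - yi) ** 2 for xi, yi in zip(x, y))

b = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]   # m = 3 vectors in R^2 (toy data)
x = [1.0, 2.0]
beta = 0.2                                  # beta * m = 0.6 < 1

vals = [cost([0.0, 0.0], a, b, beta, x) for a in (-10.0, -100.0, -1000.0)]
# for a <= 0 this equals (1 - beta*m) * a + ||x||^2, a line that is
# increasing in a, hence unbounded below as a -> -infinity
```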
If $\beta m = 1$ (i.e. $\beta = 1/m$), fix $y$ and minimise over $a$: for any $a \le \min_k b_k^T y$ the cost equals $a + \frac{1}{m}\sum_k (b_k^T y - a) + \|y-x\|_2^2 = \frac{1}{m}\sum_k b_k^T y + \|y-x\|_2^2$, independent of $a$, and the cost is nondecreasing in $a$ beyond that point. So we can just focus on minimising $f(y) = \frac{1}{m}\sum_k b_k^T y + \|y-x\|_2^2$. This is a convex quadratic, and setting the gradient to zero gives $y^* = x - \frac{1}{2m}\sum_k b_k$. You can then choose any $a^* \le \min_k b_k^T y^*$; note this matches your formula with $\delta_j = 1$ for all $j$.
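A quick numerical sanity check of this closed form for the $\beta m = 1$ case (toy data of my own; `cost`, `b`, and `x` are assumptions for the sketch, not from the question): local perturbations of $(y^*, a^*)$ never decrease the cost.

```python
import itertools

# c(y, a) = a + beta * sum_k max(0, b_k^T y - a) + ||x - y||^2
def cost(y, a, b, beta, x):
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    return (a + beta * sum(max(0.0, dot(bk, y) - a) for bk in b)
            + sum((xi - yi) ** 2 for xi, yi in zip(x, y)))

b = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]   # m = 3 vectors in R^2 (toy data)
x = [1.0, 2.0]
m = len(b)
beta = 1.0 / m                              # the beta * m = 1 case

# closed form: y* = x - (1/(2m)) * sum_k b_k, any a* <= min_k b_k^T y*
y_star = [xi - sum(bk[i] for bk in b) / (2 * m) for i, xi in enumerate(x)]
a_star = min(sum(bi * yi for bi, yi in zip(bk, y_star)) for bk in b)
best = cost(y_star, a_star, b, beta, x)

# no small perturbation of (y, a) beats the closed form
for dy1, dy2, da in itertools.product((-0.1, 0.0, 0.1), repeat=3):
    perturbed = cost([y_star[0] + dy1, y_star[1] + dy2], a_star + da, b, beta, x)
    assert perturbed >= best - 1e-12
```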
If $\beta m > 1$ then some work is needed to show that a solution exists (for each fixed $y$ the cost is now coercive in $a$, since its slope below all breakpoints is $1-\beta m < 0$), but it does not appear to have a simple closed form as in the $\beta m = 1$ case. The problem can then be written as a convex QP (with positive-semidefinite Hessian) by introducing slack variables: $\min_{(y,a,t)} \{a + \beta \sum_j t_j +\|y-x\|_2^2 \mid 0 \le t_j,\; b_j^T y -a \le t_j \}$. Stationarity in $a$ also tells you that the $\delta_j$'s in your formula satisfy $\sum_j \delta_j = 1/\beta$.
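The reformulation is exact: at the optimum each slack is tight, $t_j = \max(0, b_j^Ty - a)$, which recovers the original cost, while any looser feasible $t$ only increases the objective. A small check on toy data (all numbers my own):

```python
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

# original cost, and the QP objective with explicit slacks t
def cost(y, a, b, beta, x):
    return (a + beta * sum(max(0.0, dot(bk, y) - a) for bk in b)
            + sum((xi - yi) ** 2 for xi, yi in zip(x, y)))

def qp_objective(y, a, t, beta, x):
    return a + beta * sum(t) + sum((xi - yi) ** 2 for xi, yi in zip(x, y))

b = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]   # toy data, m = 3
x, beta = [1.0, 2.0], 0.7                   # beta * m = 2.1 > 1
y, a = [0.5, -0.3], 0.2                     # an arbitrary point

t_tight = [max(0.0, dot(bk, y) - a) for bk in b]   # tight slacks
gap = qp_objective(y, a, t_tight, beta, x) - cost(y, a, b, beta, x)

t_loose = [tj + 0.5 for tj in t_tight]             # feasible but loose slacks
```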