So I am studying robotics, and as an elective I took a topics course in industrial engineering on machine learning. We are covering convex analysis, which I have never seen before. If someone could point me in the right direction, that would be greatly appreciated!
Consider the sparse linear regression model:
$\min_{\beta_{0},\,\beta} \left\{ \frac{1}{2}\left\| \beta_{0}e + X\beta - y \right\|^{2} + \lambda \left\| \beta \right\|_{1} + \frac{\mu}{2}\left\| \beta \right\|^{2} \right\}$,
where $\mu \ge 0$ and $\lambda > 0$ are the parameters, and $e$ is the all-ones vector.
Let $\bar{\lambda} = \left\| X^{T}\left( \frac{e^{T}y}{n}e - y \right)\right\|_{\infty}$. Show that $\left( e^{T}y/n,\, 0 \right)$ is an optimal solution of the model for any $\lambda \ge \bar{\lambda}$.
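Here is where I've gotten so far, trying the first-order (subgradient) condition. I'm not sure this is a valid argument, so I'd appreciate someone checking the reasoning. Since the objective is convex, I believe the candidate point is optimal iff $0$ lies in the subdifferential of the objective there. At $(\beta_0, \beta) = (e^{T}y/n,\, 0)$, writing $\bar{y} = e^{T}y/n$:

$$\frac{\partial}{\partial \beta_0}: \quad e^{T}\left( \beta_0 e + X\beta - y \right) = n\beta_0 - e^{T}y = 0,$$

which the candidate satisfies exactly. For the $\beta$ block, the $\ell_1$ term is nondifferentiable at $0$, with $\partial \|\cdot\|_1(0) = [-1,1]^{p}$, so the condition $0 \in \partial_\beta$ becomes

$$0 \in X^{T}\left( \bar{y}\,e - y \right) + \mu \cdot 0 + \lambda\,[-1,1]^{p} \quad\Longleftrightarrow\quad \left\| X^{T}\left( \bar{y}\,e - y \right) \right\|_{\infty} \le \lambda,$$

which is exactly $\lambda \ge \bar{\lambda}$. Is this the right direction?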