$$\hat{\beta}(\lambda) = \arg \min_\beta (|| Y - X\beta ||_2^2 /n + \lambda ||\beta||_1)$$
It says the whole optimization above is convex. I follow the logic: both the sum of squares and the lasso penalty are convex, and hence so is the lasso loss function. But I don't see how to show that the lasso penalty itself is convex.
The Lasso penalty is, up to the positive scalar factor $\lambda$, the function $$ \beta\to \|\beta\|_{\ell^1}=\sum_i|\beta_i|. $$ This function is convex, being a sum of the convex functions $\beta\to|\beta_i|$ (each is the absolute value composed with a linear map, hence convex).

More generally, any norm is convex: for $\alpha\in[0,1]$, $$ \|\alpha x+(1-\alpha)y\|\le \|\alpha x\|+\|(1-\alpha)y\|=\alpha\|x\|+(1-\alpha)\|y\| $$ by the triangle inequality and the (positive) homogeneity of the norm.