$f(x,y) = 2x+y$ subject to $g(x,y)=x^2+y^2-1=0$. The Lagrangian function is given by
$$ \mathcal{L}(x,y,\lambda)=2x+y+\lambda(x^2+y^2-1), $$
with corresponding
$$ \nabla \mathcal{L}(x,y,\lambda)= \begin{bmatrix} 2 + 2\lambda x \\ 1+2\lambda y \\ x^2+y^2-1 \end{bmatrix}. $$
Setting $\nabla \mathcal{L} = 0$, the first two equations give $x=-\frac{1}{\lambda}$ and $y=-\frac{1}{2\lambda}$. Substituting into $x^2+y^2=1$ yields $\frac{1}{\lambda^2}+\frac{1}{4\lambda^2}=\frac{5}{4\lambda^2}=1$, so $\lambda = \pm\frac{\sqrt{5}}{2}$, and hence $x = \mp\dfrac{2}{\sqrt{5}}$ and $y= \mp\dfrac{1}{\sqrt{5}}$ (the signs of $x$ and $y$ are linked, opposite to that of $\lambda$). The critical points are therefore $\left(\frac{2}{\sqrt{5}}, \frac{1}{\sqrt{5}}\right)$, with $\lambda = -\frac{\sqrt{5}}{2}$, and $\left(-\frac{2}{\sqrt{5}}, -\frac{1}{\sqrt{5}}\right)$, with $\lambda = \frac{\sqrt{5}}{2}$.
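As a quick sanity check (not part of the original derivation), a SymPy sketch can solve the stationarity conditions of the Lagrangian directly:

```python
# Verify the critical points by solving grad(L) = 0 symbolically.
import sympy as sp

x, y, lam = sp.symbols('x y lam', real=True)
f = 2*x + y
g = x**2 + y**2 - 1
L = f + lam*g  # Lagrangian with the f + lambda*g convention used above

# Stationarity: all three partial derivatives of L vanish.
eqs = [sp.diff(L, v) for v in (x, y, lam)]
sols = sp.solve(eqs, [x, y, lam], dict=True)
for s in sols:
    print(s, '  f =', sp.simplify(f.subs(s)))
```

This recovers the two critical points found by hand, with objective values $\pm\sqrt{5}$.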
I am confused about how to proceed to check whether these points are minima or maxima. I know the Hessian is involved, but which one?
In general, when optimizing $f(x)$ subject to $g(x)=0$, you solve $\nabla f(x)+\lambda \nabla g(x)=0$ (the convention matching your Lagrangian $f+\lambda g$), and the critical points can be classified using the bordered Hessian matrix:
$$H=\begin{pmatrix} 0 & g_x & g_y\\ g_x & f_{xx}+\lambda g_{xx} & f_{xy}+\lambda g_{xy}\\ g_y & f_{yx}+\lambda g_{yx} & f_{yy}+\lambda g_{yy} \end{pmatrix}.$$
Define $h:=\det(H)$. For two variables and one constraint, if $h>0$ at the critical point you are at a maximum, and if $h<0$ you are at a minimum. Equivalently, you can replace all instances of $g$ in the above Hessian with $-g$, so that the derivatives in the bottom-right square become those of $f-\lambda g$; the determinant is unaffected because the border entries always enter it in pairs. Indeed, consider expanding $f-\lambda g$ to second order around each critical point, taking into account that you are moving along the constraint $g=0$.
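Applied to your problem, a small SymPy sketch (assuming the $f+\lambda g$ convention throughout) builds $H$, evaluates $h=\det(H)$ at each critical point, and reads off the classification:

```python
# Classify the critical points of f = 2x + y on the unit circle
# via the sign of the bordered Hessian determinant.
import sympy as sp

x, y, lam = sp.symbols('x y lam', real=True)
f = 2*x + y
g = x**2 + y**2 - 1
L = f + lam*g

# Bordered Hessian: zero corner, gradient of g as border,
# second derivatives of the Lagrangian in the bottom-right block.
H = sp.Matrix([
    [0,             sp.diff(g, x),    sp.diff(g, y)],
    [sp.diff(g, x), sp.diff(L, x, x), sp.diff(L, x, y)],
    [sp.diff(g, y), sp.diff(L, y, x), sp.diff(L, y, y)],
])
h = H.det()

crit = sp.solve([sp.diff(L, v) for v in (x, y, lam)], [x, y, lam], dict=True)
for s in crit:
    val = sp.simplify(h.subs(s))
    kind = 'maximum' if val > 0 else 'minimum'
    print(s, ' det =', val, '->', kind)
```

Here $h$ simplifies to $-8\lambda(x^2+y^2) = -8\lambda$ on the circle, so the point with $\lambda=-\frac{\sqrt{5}}{2}$ gives $h=4\sqrt{5}>0$ (the maximum) and the point with $\lambda=\frac{\sqrt{5}}{2}$ gives $h=-4\sqrt{5}<0$ (the minimum).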