I have read everywhere that the resolution of the linear system
$$Ax+b=0$$
where $A\in S_n(\mathbb R)$ (i.e. $A$ is symmetric) and $b\in \mathbb R^n$, is equivalent to minimizing the function
$$f(x)=\frac 12\langle Ax,x\rangle+\langle b,x\rangle.$$
Which means that if $x\in \mathbb R^n$ minimizes the function $f(x)=\frac 12\langle Ax,x\rangle+\langle b,x\rangle$, then $x$ is a solution of the linear system $Ax+b=0$, where $A\in S_n(\mathbb R)$.
I do not understand why this should be true... Can someone please explain it?
You can see this by computing the first variation of $f$ (which, for a function on $\mathbb R^n$, amounts to the same thing as taking the gradient). Consider the functional $$F[x]=\frac 12\langle Ax, x\rangle+\langle b, x\rangle.$$ Replace $x$ by $x+\delta x$ and keep only the terms that are first order in $\delta x$: $$\delta F[x]=\frac 12\langle A\,\delta x, x\rangle+\frac 12\langle Ax, \delta x\rangle+\langle b, \delta x\rangle=\langle Ax+b, \delta x\rangle,$$ where the symmetry of $A$ was used to combine the first two terms, since $\langle A\,\delta x, x\rangle=\langle \delta x, A^Tx\rangle=\langle Ax, \delta x\rangle$. At an extremum the functional must be stationary with respect to every variation: $$\delta F[x]=\langle Ax+b, \delta x\rangle=0\quad\text{for all }\delta x.$$ Since this holds for arbitrary $\delta x$ (in particular for $\delta x = Ax+b$), it forces $$Ax+b=0.$$ One caveat: stationarity only identifies a critical point. For the minimization problem to be genuinely equivalent to solving the linear system, you also want $A$ to be positive definite, so that $f$ is strictly convex and the critical point is its unique minimizer.
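A quick numerical sanity check of the claim, as a minimal sketch: the matrix $A$ and vector $b$ below are made-up illustrative values ($A$ symmetric positive definite), and plain gradient descent stands in for any minimization method. The minimizer found by descending $f$ should agree with the direct solution $x=-A^{-1}b$ of the linear system.

```python
import numpy as np

# A hypothetical 3x3 symmetric positive definite A and a vector b
# (illustrative values, not from the original post).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, -2.0, 0.5])

def f(x):
    """f(x) = 1/2 <Ax, x> + <b, x>."""
    return 0.5 * x @ A @ x + b @ x

def grad_f(x):
    """Gradient of f when A is symmetric: Ax + b."""
    return A @ x + b

# Plain gradient descent on f (step size chosen small enough
# for this particular A; 2000 steps converge to machine precision).
x = np.zeros(3)
for _ in range(2000):
    x -= 0.1 * grad_f(x)

# The minimizer of f solves Ax + b = 0, i.e. x = -A^{-1} b.
x_direct = np.linalg.solve(A, -b)
print(np.allclose(x, x_direct, atol=1e-8))   # True
```

The point of the experiment is just that the stationarity condition $\nabla f(x)=Ax+b=0$ is exactly the linear system, so any method that minimizes $f$ is implicitly solving it.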