why is SSE cost function convex?


In regards to Machine Learning, in the Adaline rule we say that

$$ J(w)=\frac{1}{2} \sum_{i} (\mbox{target}^{(i)} - \mbox{output}^{(i)})^2, \quad \mbox{output}^{(i)} \in \mathbb{R} $$

is convex. How can we say that? I would like to see a proof that it is always convex.

Thank you in advance


Check that the Hessian matrix is positive semidefinite.

For a more concrete answer, please state precisely what you mean by $w$.
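As a sketch, assuming the standard Adaline setup where the output is linear in the weights, i.e. $\mbox{output}^{(i)} = w^\top x^{(i)}$ for feature vectors $x^{(i)}$, the Hessian computation goes as follows:

$$ J(w) = \frac{1}{2} \sum_{i} \left(\mbox{target}^{(i)} - w^\top x^{(i)}\right)^2 $$

$$ \nabla J(w) = -\sum_{i} \left(\mbox{target}^{(i)} - w^\top x^{(i)}\right) x^{(i)} $$

$$ \nabla^2 J(w) = \sum_{i} x^{(i)} \left(x^{(i)}\right)^\top $$

For any vector $v$,

$$ v^\top \left( \nabla^2 J(w) \right) v = \sum_{i} \left(v^\top x^{(i)}\right)^2 \ge 0, $$

so the Hessian is positive semidefinite everywhere, which proves $J$ is convex. Note this relies on the output being linear in $w$; with a nonlinear activation the argument no longer goes through.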