Least squares, Approximation by a constant


I have this problem to solve for homework and I am a bit lost. I don't really understand what I have to do or how I should approach it.

As I understand it, I have to use the least-squares method to find a constant function that minimizes the sum of squared errors. For the first question, what is the function that I have to differentiate and set equal to $0$? Is it simply $f(x) = b$, since we have a constant function? For the second question, I am equally lost as to what I have to do.
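If it helps, here is the furthest I got (I am not even sure this is the right function to differentiate). Expanding the sum in the $f$ defined below:

$$ f(\alpha) = \sum_{i=1}^n (y_i-\alpha)^2 = \sum_{i=1}^n y_i^2 - 2\alpha \sum_{i=1}^n y_i + n\alpha^2, $$

which looks like a quadratic polynomial in $\alpha$, so I would guess it is differentiable everywhere. But I don't know if this is the intended approach.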

Here is the problem:

Question 1. Approximation by a constant Let us assume that we are given $n>0$ real-valued observations $y_1,\dots,y_n \in (-\infty,+\infty)$ and that we would like to find a simple approximation to them in terms of affine functions of some other real-valued points $x_1,\dots,x_n \in (-\infty,+\infty)$.

We first consider the case where the observations are approximated by a real-valued constant $\alpha$. Let us assume that the quality of this approximation is measured by the following function of $\alpha\in (-\infty,+\infty)$:

$$ f(\alpha)=\sum_{i=1}^n (y_i-\alpha)^2. $$

  1. The $y_i$ values being fixed, justify why $f$ possesses derivatives over $(-\infty,+\infty)$ and calculate its first and second derivatives $f'$ and $f''$ in closed form.
  2. Show that $f$ possesses a unique point of minimum $\alpha^*$ and that its value is actually given by the arithmetic average of the observations.
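I also tried a small numerical sanity check (with made-up data, not part of the assignment) to see whether the claim in part 2 is plausible, i.e. that the arithmetic mean makes the derivative of $f$ vanish and gives a smaller value of $f$ than nearby points:

```python
import numpy as np

# Made-up sample observations (any real values should work).
y = np.array([2.0, 3.5, 1.0, 4.5, 3.0])

def f(alpha):
    # f(alpha) = sum_i (y_i - alpha)^2, as in the problem statement.
    return np.sum((y - alpha) ** 2)

# Candidate minimizer: the arithmetic mean of the observations.
alpha_star = y.mean()

# If the mean is the minimizer, f'(alpha) = -2 * sum_i (y_i - alpha)
# should vanish at alpha_star.
fprime_at_star = -2 * np.sum(y - alpha_star)
print(fprime_at_star)

# And f at the mean should be no larger than f at nearby points.
print(f(alpha_star) <= f(alpha_star + 0.1))
print(f(alpha_star) <= f(alpha_star - 0.1))
```

On my test data this prints a derivative of (numerically) zero and `True` for both comparisons, which at least agrees with what the problem asks me to prove.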

Can you please help me? Thank you.