Thank you all in advance. I have been having some trouble figuring out the following problem. You are given a sample $\{y_i\}$, $i = 0, \dots, n$, from an unknown probability distribution $p(y)$. I want to show which value of $t$ minimizes the following:
$$ f(t) = \sum_{i=0}^n |y_i - t|^R $$
a) when $R = 1$, the minimizing $t$ is the median
b) when $R = 2$, the minimizing $t$ is the average
c) when $R = \infty$, the minimizing $t$ is the midpoint of the smallest and largest $y_i$
I know that to minimize we want to take the derivative and set it equal to zero, and for part a) I know that $$ {d\over dt} |y_i - t| = \begin{cases} 1, & \text{if } t > y_i \\ -1, & \text{if } t < y_i \end{cases} $$
However, I am not sure how to handle the summation, or the case $R = \infty$. Any tips would be helpful. Thank you!!
(b) is simple: squaring removes the absolute value, and this becomes an ordinary calculus optimization problem.
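For reference, the calculation for (b) runs as follows: setting the derivative of $f(t) = \sum_{i=0}^{n}(y_i - t)^2$ to zero,
$$ f'(t) = -2\sum_{i=0}^{n}(y_i - t) = 0 \quad\Longleftrightarrow\quad t = \frac{1}{n+1}\sum_{i=0}^{n} y_i, $$
which is the sample average, and $f''(t) = 2(n+1) > 0$ confirms it is a minimum.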
For (a), suppose the terms are in increasing order, so that $y_0 \leq y_1 \leq \dots \leq y_n$, and split the summation into two parts:
$$f(t) = \sum_{i=0}^{k}|y_i-t| + \sum_{i=k+1}^{n}|y_i-t|$$
where $k$ is the largest index $i$ such that $y_i < t$. You can then drop the absolute value signs, since every term in the first sum has $y_i < t$ and every term in the second has $y_i \geq t$. Note that you will also have to consider the case $t = y_i$, which isn't complicated.
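To spell out the step this hint is pointing at: for $t$ strictly between $y_k$ and $y_{k+1}$,
$$ f(t) = \sum_{i=0}^{k}(t - y_i) + \sum_{i=k+1}^{n}(y_i - t), \qquad f'(t) = (k+1) - (n-k), $$
so $f$ is decreasing while fewer than half the points lie below $t$ and increasing once more than half do; the minimum occurs where the two counts balance, i.e. at the median.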
For (c), split the summation into two parts as in (a). Then show that
$$\lim_{R \to\infty}\left(\sum_{i=0}^{k}(t-y_i)^R + \sum_{i=k+1}^{n}(y_i-t)^R\right) = \lim_{R \to\infty}\left((t-y_0)^R + (y_n-t)^R\right),$$
since for large $R$ the sum is dominated by its largest terms. You can then minimize the expression on the right-hand side over $t$; setting its derivative to zero gives $t - y_0 = y_n - t$, i.e. $t = (y_0 + y_n)/2$.
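If it helps to see all three claims at once, here is a minimal numeric sketch (not part of the proof): a grid search for the minimizer of $f(t)$ on a random sample, using the max-norm $\max_i |y_i - t|$ as the $R \to \infty$ limit.

```python
# Grid-search sanity check: the minimizer of f(t) = sum |y_i - t|^R
# should match the median (R=1), the mean (R=2), and the midrange
# (y_0 + y_n)/2 in the R -> infinity (max-norm) limit.
import numpy as np

rng = np.random.default_rng(0)
y = np.sort(rng.normal(size=15))

# Fine grid over the data range; the minimizer always lies in [y_0, y_n].
ts = np.linspace(y[0], y[-1], 20001)

def minimizer(cost):
    """Return the grid point minimizing the given cost function of t."""
    return ts[np.argmin([cost(t) for t in ts])]

t1 = minimizer(lambda t: np.sum(np.abs(y - t)))    # R = 1
t2 = minimizer(lambda t: np.sum((y - t) ** 2))     # R = 2
tinf = minimizer(lambda t: np.max(np.abs(y - t)))  # R -> infinity

print(abs(t1 - np.median(y)))            # close to 0: the median
print(abs(t2 - np.mean(y)))              # close to 0: the mean
print(abs(tinf - (y[0] + y[-1]) / 2))    # close to 0: the midrange
```

Each difference is bounded by the grid spacing, so up to discretization the three minimizers agree with the claimed statistics.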