Sample statistics can be estimated by solving an optimization problem.
- Is the optimization problem unique? As far as I know, the optimization problem for the expectile is not unique.
- If not, is there a best one, e.g. in terms of convergence speed or unbiasedness of the resulting estimator?
- Linear regression (e.g. quantile regression) is a minimization of such an error (e.g. the K-B error) with respect to the coefficients $(k_1, k_2, \dots, k_n)$. If the error function is not unique, which one should be used in the regression?
Example 1.
The sample quantile of $X$ can be estimated by solving the optimization problem
$$ \min_{\theta} \sum_{x \in X} f_{\alpha}(x-\theta), $$
where
$$ f_{\alpha}(x) = \begin{cases} \alpha |x|, & x > 0, \\ (1-\alpha)|x|, & x \leq 0. \end{cases} $$
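A minimal numerical sketch of this example (the data `x`, the sample size, and the use of `scipy.optimize.minimize_scalar` are my own choices for illustration): the pinball loss above is convex and piecewise linear, so a generic scalar minimizer recovers (approximately) the sample quantile.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
x = rng.normal(size=1000)  # illustrative sample
alpha = 0.9

def pinball(theta):
    # f_alpha(u) = alpha*|u| for u > 0, (1-alpha)*|u| for u <= 0
    u = x - theta
    return np.sum(np.where(u > 0, alpha * np.abs(u), (1 - alpha) * np.abs(u)))

theta_hat = minimize_scalar(pinball).x
# theta_hat should be close to the usual sample quantile
# (any difference comes from np.quantile's interpolation convention)
print(theta_hat, np.quantile(x, alpha))
```

Note that the minimizer need not be unique here either: between two adjacent order statistics the loss can be flat, which is exactly the ambiguity in defining a sample quantile.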
Example 2.
The sample expectile of $X$ can be estimated by solving the optimization problem
$$ \min_{\theta} \sum_{x \in X} f_{\alpha}(x-\theta), $$
where
$$ f_{\alpha}(x) = \begin{cases} \alpha x^2, & x > 0, \\ (1-\alpha)x^2, & x \leq 0. \end{cases} $$
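The same sketch for the expectile (again, the sample and the optimizer are illustrative choices, not part of any standard API): the asymmetric squared loss is strictly convex, so here the minimizer is unique, and at $\alpha = 1/2$ it reduces to the ordinary sum of squares, whose minimizer is the sample mean.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
x = rng.normal(size=1000)  # illustrative sample

def expectile_loss(theta, alpha):
    # f_alpha(u) = alpha*u^2 for u > 0, (1-alpha)*u^2 for u <= 0
    u = x - theta
    return np.sum(np.where(u > 0, alpha * u**2, (1 - alpha) * u**2))

# At alpha = 0.5 the loss is symmetric, so the expectile is the sample mean.
theta_half = minimize_scalar(lambda t: expectile_loss(t, 0.5)).x
print(theta_half, x.mean())

# An asymmetric case: for alpha > 0.5 the expectile lies above the mean.
theta_09 = minimize_scalar(lambda t: expectile_loss(t, 0.9)).x
print(theta_09)
```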