Minimize a sum of weighted squared distances


Let $f_1, f_2$ be given polynomials of degree $k$. We want to find a polynomial $f$ of degree $k+1$ that solves the following minimization problem on $[0,1]$: $$f=\operatorname*{argmin}_{\hat{f}\in P_{k+1}([0,1])}w_1 \lVert \hat{f}-f_1 \rVert^2_{[0,1]}+w_2 \lVert \hat{f}-f_2 \rVert^2_{[0,1]},$$ where $w_1,w_2$ are positive constants and $\lVert\cdot\rVert_{[0,1]}$ denotes the $L^2$ norm over $[0,1]$.

Of course we can solve for the undetermined coefficients directly, but this becomes too complicated when $k$ is large. I wonder whether there is any trick to quickly find the minimizer (and, if possible, the minimum) without excessive computation for a general degree $k$.
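For reference, the undetermined-coefficients route amounts to solving a linear system: in the monomial basis the objective is quadratic in the coefficients of $\hat f$, and the Gram matrix of monomials on $[0,1]$ is the Hilbert matrix $G_{ij}=1/(i+j+1)$. A sketch (function and variable names are mine, not from the question):

```python
import numpy as np

def direct_minimizer(c1, c2, w1, w2):
    """Coefficients (low degree first) of the degree-(k+1) polynomial
    minimizing w1*||f - f1||^2 + w2*||f - f2||^2 in L^2([0,1]),
    found by solving the normal equations in the monomial basis.
    c1, c2 are the monomial coefficients of f1, f2 (degree k)."""
    k = len(c1) - 1
    n = k + 2                                    # dim of P_{k+1}
    idx = np.arange(n)
    # Gram matrix of monomials on [0,1]: G[i,j] = 1/(i+j+1) (Hilbert matrix)
    G = 1.0 / (idx[:, None] + idx[None, :] + 1)
    m = np.zeros(n)
    m[:k + 1] = w1 * np.asarray(c1, float) + w2 * np.asarray(c2, float)
    # Normal equations: (w1 + w2) * G @ c = G @ m
    return np.linalg.solve((w1 + w2) * G, G @ m)
```

The Hilbert matrix is notoriously ill-conditioned for large $k$, which is one reason this brute-force route is unattractive.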

Accepted answer:

Just use the identity:

$$ w_1\lVert\hat{f} - f_1\rVert_2^2 + w_2\lVert\hat{f} - f_2\rVert_2^2 = (w_1 + w_2)\left\lVert\hat{f} - \frac{w_1 f_1 + w_2 f_2}{w_1 + w_2}\right\rVert_2^2 + R(w_1, w_2, f_1, f_2) $$
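For completeness, expanding the squares makes the identity explicit (with $\langle\cdot,\cdot\rangle$ the $L^2$ inner product on $[0,1]$):

$$
\begin{aligned}
w_1\lVert\hat f - f_1\rVert_2^2 + w_2\lVert\hat f - f_2\rVert_2^2
&= (w_1+w_2)\lVert\hat f\rVert_2^2 - 2\left\langle \hat f,\, w_1 f_1 + w_2 f_2\right\rangle + w_1\lVert f_1\rVert_2^2 + w_2\lVert f_2\rVert_2^2 \\
&= (w_1+w_2)\left\lVert \hat f - \frac{w_1 f_1 + w_2 f_2}{w_1+w_2}\right\rVert_2^2 + \frac{w_1 w_2}{w_1+w_2}\,\lVert f_1 - f_2\rVert_2^2,
\end{aligned}
$$

so in fact $R = \frac{w_1 w_2}{w_1+w_2}\lVert f_1 - f_2\rVert_2^2$.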

where $R$ does not depend on $\hat{f}$. Now you can just look for a polynomial of degree $(k+1)$ that minimizes the $L^2$ distance to a given polynomial of degree $k$, which you can do using standard techniques. (In fact, if $P_{k+1}([0,1])$ means polynomials of degree at most $k+1$, then it contains the weighted average $\frac{w_1 f_1 + w_2 f_2}{w_1 + w_2}$, which has degree $k$, so the minimizer is just that weighted average.)
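A quick numerical check of the identity (a sketch; the helper names are mine): evaluate the objective for two different candidates $\hat f$ and confirm that the leftover term $R$ is the same for both.

```python
import numpy as np
from numpy.polynomial import Polynomial as P

def l2sq(p):
    """Squared L^2 norm of a polynomial on [0, 1]."""
    q = (p * p).integ()
    return q(1.0) - q(0.0)

f1, f2 = P([1.0, -2.0, 3.0]), P([0.0, 1.0, 1.0])   # degree k = 2
w1, w2 = 2.0, 5.0
avg = (w1 * f1 + w2 * f2) / (w1 + w2)

def leftover(fhat):
    """Objective minus the recentered square: should equal R for any fhat."""
    lhs = w1 * l2sq(fhat - f1) + w2 * l2sq(fhat - f2)
    return lhs - (w1 + w2) * l2sq(fhat - avg)

R1 = leftover(P([0.5, 0.0, 1.0, 2.0]))   # two arbitrary degree-3 candidates
R2 = leftover(P([-1.0, 3.0, 0.0, 0.5]))
print(R1, R2)  # equal: R does not depend on fhat
```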

Edit: I just wanted to add that one can write an analogous identity for $n$ terms, so the analogous problem with given $f_1, f_2, \ldots, f_n$ and positive weights $w_1, w_2, \ldots, w_n$ can be solved the same way.
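The $n$-term version can be sketched in the same spirit (again just an illustration, not the author's code): the minimizer of $\sum_i w_i\lVert\hat f - f_i\rVert_2^2$ over $P_{k+1}([0,1])$ is the weighted average $\bar f = \sum_i w_i f_i \big/ \sum_i w_i$.

```python
import numpy as np
from numpy.polynomial import Polynomial as P

def l2sq(p):
    """Squared L^2 norm of a polynomial on [0, 1]."""
    q = (p * p).integ()
    return q(1.0) - q(0.0)

def weighted_objective(fhat, fs, ws):
    """sum_i w_i * ||fhat - f_i||^2 on [0, 1]."""
    return sum(w * l2sq(fhat - f) for f, w in zip(fs, ws))

fs = [P([1.0, 2.0]), P([0.0, -1.0]), P([3.0, 0.5])]   # n = 3, degree k = 1
ws = [1.0, 2.0, 4.0]
fbar = sum(w * f for f, w in zip(fs, ws)) / sum(ws)    # weighted average

# fbar beats any perturbation of itself within P_{k+1}
for bump in (P([0.1]), P([0.0, 0.0, 0.3])):
    assert weighted_objective(fbar, fs, ws) <= weighted_objective(fbar + bump, fs, ws)
```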