Find the orthogonal projection of a bounded set


Suppose that we have a bounded subset of $\mathbb{R}^n$ traced out by $f(x) = [f_1(x), f_2(x), \dots, f_n(x)]$, where $f: [a,b] \subset \mathbb{R} \rightarrow A \subset \mathbb{R}^n$ and $A$ is bounded. We are interested in finding a set of $k < n$ vectors $\{u_1, \dots, u_k\}$ in $\mathbb{R}^n$ such that we can approximately represent $f(x)$ by a linear combination of these vectors, i.e.

$\hat{f}(x) = \sum_{i=1}^k c_i(x)\, u_i.$

We want to approximate $f(x)$ as well as possible.

So far, I did the following:

(1) We want to minimize $\|f(x) - \hat{f}(x)\|_2^2$ over all $x \in [a,b]$. If $x$ ranged over a discrete set $\{x_1, \dots, x_m\}$, we would minimize $\frac{1}{m} \sum_{i=1}^m \|f(x_i) - \hat{f}(x_i)\|_2^2$. In the continuous case we should therefore minimize $\int_{a}^b \|f(x) - \hat{f}(x)\|_2^2 \, dx$.
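As a concrete check of the discretized objective, here is a minimal NumPy sketch; the curve $f$ and the candidate basis are placeholder choices, not part of the question:

```python
import numpy as np

# Placeholder curve f: [0, 1] -> R^3, sampled at m points.
a, b, m = 0.0, 1.0, 200
x = np.linspace(a, b, m)
F = np.stack([np.cos(x), np.sin(x), x**2], axis=1)  # shape (m, n), n = 3

# A candidate orthonormal basis of k = 2 vectors (here: the first two
# standard basis vectors, chosen only for illustration).
U = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]]).T                   # shape (n, k)

# f_hat(x_i) = sum_j <f(x_i), u_j> u_j, computed for all samples at once.
F_hat = F @ U @ U.T

# Discrete objective: (1/m) * sum_i ||f(x_i) - f_hat(x_i)||_2^2
err = np.mean(np.sum((F - F_hat) ** 2, axis=1))
```

With this particular basis, the projection simply drops the third coordinate, so the objective reduces to the mean of $x^4$ over the samples.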

(2) Suppose that $\{u_1, \dots, u_k\}$ is orthonormal. Then for fixed $u_i$ the optimal coefficients are $c_i(x) = \langle f(x), u_i\rangle$, so $\hat{f}(x) = \sum_{i=1}^k \langle f(x), u_i\rangle u_i$.
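The claim in (2) that the inner products give the optimal coefficients can be verified numerically: for an orthonormal basis, the least-squares solution coincides with $\langle f(x), u_i\rangle$. A small sketch (sizes and random vectors are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# A random orthonormal set of k = 2 vectors in R^4, obtained via QR.
n, k = 4, 2
U, _ = np.linalg.qr(rng.standard_normal((n, k)))    # U has orthonormal columns

# For a fixed x, f(x) is just some vector v in R^n.
v = rng.standard_normal(n)

# Least-squares optimal coefficients argmin_c ||v - U c||_2 ...
c_lstsq, *_ = np.linalg.lstsq(U, v, rcond=None)

# ... coincide with the inner products <v, u_i> because U.T @ U = I.
c_inner = U.T @ v
```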

How can I solve this optimization problem? In summary,

$\arg \min\limits_{u_1, \dots, u_k} \int_{a}^b ||f(x) - \sum_{i=1}^k \langle f(x), u_i\rangle u_i||_2^2 dx$

$\text{s.t. } \langle u_i, u_j \rangle = 0$ for $i \neq j$, and $\langle u_i, u_i \rangle = 1$.

I suspect it should be possible to first compute some $n \times n$ matrix $A$, take its eigenvalue decomposition, and select the eigenvectors corresponding to the $k$ largest eigenvalues. But I don't know how this matrix should be built from $f(x)$.
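If that conjecture is right, a natural candidate (by analogy with PCA) would be the second-moment matrix $A = \int_a^b f(x) f(x)^T \, dx$, with the integral approximated by a Riemann sum. This is only a sketch under that assumption, with a placeholder curve:

```python
import numpy as np

# Placeholder curve f: [0, 1] -> R^3, sampled on a fine grid.
a, b, m, k = 0.0, 1.0, 1000, 2
x = np.linspace(a, b, m)
F = np.stack([np.cos(x), np.sin(x), x**2], axis=1)   # shape (m, n)

# Riemann-sum approximation of A = \int_a^b f(x) f(x)^T dx.
dx = (b - a) / m
A = F.T @ F * dx                                      # shape (n, n), symmetric

# eigh handles symmetric matrices; eigenvalues come back in ascending order.
w, V = np.linalg.eigh(A)
U = V[:, ::-1][:, :k]          # eigenvectors of the k largest eigenvalues

# Projection error with this basis; it equals the sum of the discarded
# eigenvalues (for the discretized objective, up to floating-point error).
F_hat = F @ U @ U.T
err = np.sum((F - F_hat) ** 2) * dx
```

The identity behind the last comment is $\int \|f - \hat f\|^2\,dx = \operatorname{tr}(A) - \operatorname{tr}(U^T A U)$, which is exactly the sum of the eigenvalues not kept.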