Multiple Polynomial Regression / Dictionary Learning with unknown Vandermonde dictionary


The question is not technical; rather, I am looking for suggestions on the applicability and applications of a particular mathematical model.

Consider the observation model $y_i = w_0 + w_1 x_i + \ldots + w_{K-1} x_i^{K-1}$, where $y_i$ is a scalar value expressed as a polynomial regression of a quantity $x_i$. Collecting $N$ observations $y_i$ and stacking them in a vector $y \in \mathbb{R}^N$, the overall model can be expressed in matrix-vector form as

$$y= V(x)w$$

where $V(x)$ is the Vandermonde matrix $$V(x)= [1 \; x \; x^2 \cdots x^{K-1}]$$

and the powers of the vector $x$ are taken element-wise. In this case $w \in \mathbb{R}^K$, while $x \in \mathbb{R}^N$. Each row of this system expresses the observation model introduced at the beginning.
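As a concrete illustration, here is a minimal numpy sketch of this model with known nodes $x$: the Vandermonde dictionary is built column-by-column as $[1\; x\; x^2\; x^3]$ and $w$ is recovered by ordinary least squares. All sizes and values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 50, 4                         # N observations, polynomial of degree K-1
x = rng.uniform(-1.0, 1.0, size=N)   # known nodes x_i
w_true = np.array([1.0, -2.0, 0.5, 3.0])

# Vandermonde dictionary with columns [1, x, x^2, x^3] (element-wise powers)
V = np.vander(x, K, increasing=True)
y = V @ w_true                       # noiseless observations y = V(x) w

# Recover w by least squares
w_hat, *_ = np.linalg.lstsq(V, y, rcond=None)
print(np.allclose(w_hat, w_true))    # → True
```

With known $x$ the problem is an ordinary linear regression in the lifted features; the difficulty discussed below arises only once $x$ is unknown.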

Assume that I also collect $L$ measurement vectors such as $y$ and I stack them in the matrix $Y \in \mathbb{R}^{N \times L}$. I can express the model as

$$Y= V(x)W$$ with $W \in \mathbb{R}^{K \times L} $ collecting the parameters of the model.

My question is: which systems follow such a polynomial relationship? More importantly, in which contexts might it be of interest to learn not only the regression parameters $W$, but also the Vandermonde dictionary $V(x)$ itself? Clearly, a straightforward algorithm would learn $V(x)$ and $W$ only up to some ambiguity (for instance, a scaling of the columns of $V(x)$ can be absorbed into $W$). Are there applications in which we might be satisfied even with such ambiguities?
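To make the "straightforward algorithm" concrete, here is a hedged sketch of one natural approach, alternating minimization: with the nodes $x$ fixed, $W$ is the exact least-squares solution; with $W$ fixed, $x$ is updated by a gradient step on $\|Y - V(x)W\|_F^2$. The sizes, initialization range, step size, and iteration count are all illustrative assumptions, and the recovered $x$ is subject to the ambiguities discussed above.

```python
import numpy as np

rng = np.random.default_rng(1)
N, K, L = 40, 3, 5
x_true = rng.uniform(0.2, 1.0, size=N)
W_true = rng.normal(size=(K, L))
Y = np.vander(x_true, K, increasing=True) @ W_true   # Y = V(x) W

# Random initialization of the unknown nodes
x = rng.uniform(0.2, 1.0, size=N)
V0 = np.vander(x, K, increasing=True)
W0, *_ = np.linalg.lstsq(V0, Y, rcond=None)
res0 = np.linalg.norm(V0 @ W0 - Y)   # residual at initialization

for _ in range(2000):
    V = np.vander(x, K, increasing=True)
    # W-step: exact least squares with the current dictionary fixed
    W, *_ = np.linalg.lstsq(V, Y, rcond=None)
    # x-step: one gradient step on ||Y - V(x) W||_F^2.
    # Row i of V depends only on x_i, so the gradient decouples per row.
    R = V @ W - Y
    dV = np.zeros_like(V)
    dV[:, 1:] = V[:, :-1] * np.arange(1, K)   # d/dx of [1, x, ..., x^{K-1}]
    grad = 2.0 * np.sum((dV @ W) * R, axis=1)
    x -= 1e-3 * grad                          # illustrative step size

res = np.linalg.norm(np.vander(x, K, increasing=True) @ W - Y)
print(res0, res)   # the residual should shrink relative to initialization
```

Since the $W$-step is exact and the $x$-step is a small gradient step, the residual is driven down, but the iterates may converge to a local minimum or to a solution equivalent to the true one only up to the ambiguities in $(V(x), W)$.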

A similar situation arises in neural networks, where a local minimum is often satisfactory, especially when the learned parameters need not be interpretable.