How can I solve for a combined regression & regularized polynomial product identity?


It is a well-known fact in signal processing that two polynomials $$p_1(t) = \sum_k c_{1k}t^k = {\bf \Phi}{\bf c_1},\\ p_2(t) = \sum_k c_{2k}t^k = {\bf \Phi}{\bf c_2},$$ with coefficient vectors $${\bf c_i}=[c_{i0},c_{i1}, c_{i2}, \cdots, c_{in}]^t$$ and where $\bf \Phi$ is the kernel (Vandermonde) matrix evaluated at sample points $t_1,\dots,t_m$, $${\bf \Phi} = \begin{bmatrix}1&t_1&t_1^2&\cdots&t_1^n\\1&t_2&t_2^2&\cdots&t_2^n\\\vdots&\vdots&\vdots&\ddots&\vdots\\1&t_m&t_m^2&\cdots&t_m^n\end{bmatrix},$$ can be multiplied with each other, $p_3(t) = p_1(t)\cdot p_2(t)$, and the coefficients transform according to:

$${\bf c_3} = {\bf c_1} * {\bf c_2}$$

Where $*$ denotes non-circular zero-padded convolution.
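As a quick sanity check of this identity, here is a minimal NumPy sketch (the concrete coefficients are made up for illustration; coefficients are stored in ascending order of powers):

```python
import numpy as np

# p1(t) = 1 + 2t,  p2(t) = 3 + 4t + 5t^2  (coefficients in ascending order)
c1 = np.array([1.0, 2.0])
c2 = np.array([3.0, 4.0, 5.0])

# Coefficients of the product p3 = p1 * p2 via zero-padded linear convolution
c3 = np.convolve(c1, c2)   # -> [3, 10, 13, 10]

# Verify against direct pointwise evaluation (np.polyval wants descending order)
t = np.linspace(-1.0, 1.0, 7)
assert np.allclose(np.polyval(c1[::-1], t) * np.polyval(c2[::-1], t),
                   np.polyval(c3[::-1], t))
```

`np.convolve` with its default `mode='full'` is exactly the non-circular, zero-padded convolution meant here.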

Now to the question. Suppose I simultaneously want to (a) fit $f_1(t)$ by regression to known data pairs $(\hat t, \hat f_1(\hat t))$, and (b) make its product with a known $f_2(t)$ "as close as possible" to a target $p(t)$: $$f_1(t) \cdot f_2(t) \approx p(t).$$ In addition, we might want to regularize both $p$ and $f_1$.

How can I express this problem as one big optimization incorporating everything?
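For context, one common way to pose such a problem, assuming $f_1$ is a polynomial with unknown coefficient vector ${\bf c_1}$ and $f_2$, $p$ are known polynomials, is to exploit the convolution identity above: the product constraint becomes *linear* in ${\bf c_1}$ via the convolution (Toeplitz) matrix of ${\bf c_2}$, so all three terms stack into one linear least-squares problem. A minimal SciPy sketch, where every concrete number and name is a hypothetical placeholder:

```python
import numpy as np
from scipy.linalg import convolution_matrix, lstsq

# --- hypothetical demo data (all values below are assumptions) ---
t_hat = np.linspace(-1.0, 1.0, 20)        # sample locations \hat t
f1_hat = 1.0 + 2.0 * t_hat                # samples \hat f_1 (exact, for the demo)
c2 = np.array([3.0, 4.0, 5.0])            # known coefficients of f_2
c_p = np.convolve([1.0, 2.0], c2)         # coefficients of the target p

n1 = 2                                    # number of coefficients of f_1
A = np.vander(t_hat, n1, increasing=True) # kernel matrix Phi evaluated at \hat t
T2 = convolution_matrix(c2, n1)           # linear map: c1 -> c2 * c1
lam, mu = 1.0, 1e-3                       # product-fit weight, Tikhonov weight

# One stacked least-squares problem:
#   min_{c1}  ||A c1 - f1_hat||^2  +  lam ||T2 c1 - c_p||^2  +  mu ||c1||^2
M = np.vstack([A, np.sqrt(lam) * T2, np.sqrt(mu) * np.eye(n1)])
b = np.concatenate([f1_hat, np.sqrt(lam) * c_p, np.zeros(n1)])
c1, *_ = lstsq(M, b)                      # c1 comes out close to [1, 2]
```

If $p$ itself is also unknown, its coefficient vector ${\bf c_p}$ can be appended to the variables (with its own regularizer, e.g. $\gamma\|{\bf c_p}\|^2$); because $f_2$ is known, the joint problem remains a linear least-squares in $({\bf c_1}, {\bf c_p})$.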