Is Bilinear regression a form of Multivariate polynomial regression?


I came across a similarity/metric learning method that takes the form $x^TWy = z$, where $x$ and $y$ are real-valued vectors, for example, two images.

Breaking it into a more familiar form: $ x^TWy = \sum_{ij}w_{ij} x_{i}y_{j} = z $
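A quick numerical sanity check of that identity (a minimal sketch, with arbitrary random vectors standing in for real data):

```python
import numpy as np

# Verify that the bilinear form x^T W y equals the double sum
# over pairwise interaction terms w_ij * x_i * y_j.
rng = np.random.default_rng(0)
p = 4
x = rng.normal(size=p)
y = rng.normal(size=p)
W = rng.normal(size=(p, p))

bilinear = x @ W @ y
interactions = sum(W[i, j] * x[i] * y[j]
                   for i in range(p) for j in range(p))

assert np.isclose(bilinear, interactions)
```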

This looks very similar to polynomial regression with only interaction terms between features (without the higher-order powers), i.e. $z = f_w(x)= \sum_{i} w_ix_i + \sum _i\sum _{j=i+1} w_{ij}x_ix_j$

I was curious whether the optimization over the matrix $W$ is the same as the optimization in multivariate linear/polynomial regression, since $x$ and $y$ are fixed and the only variable is the weight matrix $W$.
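The key observation is that with $x$ and $y$ fixed, $z = x^TWy$ is linear in the entries of $W$: it equals the dot product of $\mathrm{vec}(W)$ with $\mathrm{vec}(xy^T)$. A minimal sketch of this (the vectorized form, not any particular library's API):

```python
import numpy as np

# With x and y fixed, z = x^T W y is linear in the entries of W:
# x^T W y = vec(W) . vec(x y^T), so fitting W from (x, y, z) triples
# is an ordinary linear regression in the p*p "features" x_i * y_j.
rng = np.random.default_rng(1)
p = 3
x = rng.normal(size=p)
y = rng.normal(size=p)
W = rng.normal(size=(p, p))

z_bilinear = x @ W @ y
z_linear = W.ravel() @ np.outer(x, y).ravel()  # linear in vec(W)

assert np.isclose(z_bilinear, z_linear)
```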

And if it is a form of linear regression, is the optimization then convex?

1 Answer


What does your data look like? If you have only one vector $(z, x_1, ..., x_p, y_1, ..., y_p)$ and you want to find a matrix $W$ such that $z = x'Wy$, then this is a single linear equation with an infinite set of solutions, i.e., infinitely many $W$s solve it. If you have $p(p+1)/2$ suitably distinct vectors of that kind, i.e., $$ \{(z_k, x_{1k}, ..., x_{pk}, y_{1k}, ..., y_{pk})\}_{k=1}^{p(p+1)/2}, $$ then (generically) you'll have a unique solution: no optimization is required, just invert matrices by whatever method you like. And if you have more than $p(p+1)/2$ such vectors, then this is a linear regression problem, i.e., $$ z_k = \sum_{i\ge j} \beta_{ij}x_{ik}y_{jk} + \epsilon_k, $$ which you can solve using ordinary least squares, and that is indeed a convex optimization problem.
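The overdetermined case can be sketched as a least-squares fit. This example uses a general (not necessarily symmetric) $W$, so there are $p^2$ free coefficients rather than the $p(p+1)/2$ of the symmetric parameterization above; the data are synthetic:

```python
import numpy as np

# Fit W by ordinary least squares. Each observation (x_k, y_k, z_k)
# contributes one row of features vec(x_k y_k^T); with enough generic
# observations the design matrix has full column rank and the
# noiseless W is recovered exactly.
rng = np.random.default_rng(2)
p, n = 3, 50
W_true = rng.normal(size=(p, p))

X = rng.normal(size=(n, p))
Y = rng.normal(size=(n, p))
Z = np.einsum("ki,ij,kj->k", X, W_true, Y)  # z_k = x_k^T W y_k

# Design matrix: row k is vec(x_k y_k^T), one feature per entry of W.
D = np.einsum("ki,kj->kij", X, Y).reshape(n, p * p)
w_hat, *_ = np.linalg.lstsq(D, Z, rcond=None)
W_hat = w_hat.reshape(p, p)

assert np.allclose(W_hat, W_true)
```

Because the objective is an ordinary least-squares problem in $\mathrm{vec}(W)$, it is convex, which answers the second question.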