"Leave-on-out Correlation" between Matrices


I'd like to enforce a special constraint in my optimization problem.
The solution to my problem is a set of matrices $Q_1, ..., Q_N \in \mathbb{R}^{G \times K}$ and I'd like to make sure that:

  • For every column vector $q_i$ of every matrix, there must be at least one other matrix, say $Q_j$, such that no column vector of $Q_j$ is correlated with $q_i$ (all of their dot products are close to zero).

My thought was to extend my objective $f$ with a Hadamard-product penalty, i.e. $$ \min -f + \bigl\Vert \, Q_1^T Q_2 \odot Q_1^T Q_3 \odot \dots \odot Q_1^T Q_N \, \bigr\Vert_F $$ Each entry of the Hadamard product is zero as soon as at least one of the corresponding correlations is zero. But I am not sure whether this really enforces my constraint, or whether it is easy to optimize.
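For concreteness, here is a minimal NumPy sketch of that penalty term (the function name `hadamard_penalty` is my own, and the example matrices are made up): each cross-Gram matrix $Q_1^T Q_n$ is $K \times K$, their elementwise product is taken over $n = 2, \dots, N$, and the Frobenius norm of the result is the penalty.

```python
import numpy as np

def hadamard_penalty(Qs):
    """Frobenius norm of the Hadamard (elementwise) product of the
    cross-Gram matrices Q_1^T Q_n for n = 2..N.

    Each Q_n is G x K, so each Q_1^T Q_n is K x K. Entry (i, k) of the
    Hadamard product vanishes as soon as column i of Q_1 is orthogonal
    to column k of at least one of the other matrices.
    """
    Q1 = Qs[0]
    prod = np.ones((Q1.shape[1], Qs[1].shape[1]))
    for Qn in Qs[1:]:
        prod *= Q1.T @ Qn  # elementwise accumulation of correlations
    return np.linalg.norm(prod, ord="fro")

# Toy example: the columns of Q2 are orthogonal to those of Q1,
# so the whole penalty collapses to zero regardless of Q3.
rng = np.random.default_rng(0)
Q1 = np.eye(4)[:, :2]
Q2 = np.eye(4)[:, 2:]
Q3 = rng.standard_normal((4, 2))
print(hadamard_penalty([Q1, Q2, Q3]))  # 0.0
```

Note that, as written, the penalty only involves the columns of $Q_1$; the analogous products for $Q_2, \dots, Q_N$ would have to be added as well.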

Are there other ways to approach this constraint?

Thanks in advance!

P.S.: I am solving a variational inference problem, i.e. my $f$ is the ELBO.