Parameterisation that enforces positivity of the projections of a linear combination.


I have a set of $n$ vectors $v_1, \dots, v_n$ that form a basis for a subspace. Suppose I take a linear combination $w = \sum_{i=1}^{n} \alpha_i v_i$ with $\alpha_i \in \mathbb{R}$, and I require that, for a given set of vectors $\sigma_1, \dots, \sigma_m$, we have $w^T \sigma_i > 0$ for all $i = 1, \dots, m$. I'm scratching my head trying to think of a parameterisation of the $\alpha_i$ (possibly non-linear, but at least differentiable, since the $\alpha_i$ are produced by a neural network) that enforces this constraint while still covering the entire set of $\alpha$ that satisfy it. I suspect the strict inequality might pose an issue as well, but I am completely happy to relax that constraint to $w^T \sigma_i \geq \epsilon$ for some small $\epsilon > 0$. Is there an obvious solution that I might be missing? Any help would be greatly appreciated.
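
For concreteness, here is a minimal sketch of the setup in Python/NumPy (the names `V`, `Sigma`, `alpha`, and `eps` are my own placeholders, not anything from an existing codebase). It only checks the constraint after the fact; what I am looking for is a parameterisation of `alpha` that makes this check pass by construction:

```python
import numpy as np

# Hypothetical dimensions: d-dimensional ambient space,
# n basis vectors, m constraint vectors.
d, n, m = 10, 4, 6
rng = np.random.default_rng(0)

V = rng.standard_normal((d, n))      # columns are the basis vectors v_1, ..., v_n
Sigma = rng.standard_normal((d, m))  # columns are the constraint vectors sigma_1, ..., sigma_m
alpha = rng.standard_normal(n)       # coefficients; in practice the output of a neural network

w = V @ alpha                        # w = sum_i alpha_i v_i
eps = 1e-6

# The relaxed constraint w^T sigma_i >= eps for all i,
# checked here rather than enforced by the parameterisation.
feasible = np.all(Sigma.T @ w >= eps)
print(feasible)
```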