Optimization with probability constraint


Is there a clever way (analytically or numerically) to minimize the following objective function

$$ L(W, W_{in}, W_1, W_2) = \sum_{t=0}^T (W + W_{in}W_{out})\vec{x}(t) \cdot (W_1+W_2 W_{out})\vec{x}(t) $$ with $W \in \mathbb{R}^{1 \times N}$, $W_{in} \in \mathbb{R}$, $W_{out} \in \mathbb{R}^{1 \times N}$, $W_1 \in \mathbb{R}^{1 \times N}$, $W_2 \in \mathbb{R}$, where $\vec{x}(t) \in \mathbb{R}^N$ is an arbitrary continuous function?
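For the numerical route, the objective is cheap to evaluate once $\vec{x}(t)$ is sampled at discrete times: each factor is a scalar, so $L$ is a sum of products of two linear forms. A minimal sketch (function and variable names are mine, and `X` stands for samples of $\vec{x}(t)$ stacked row-wise):

```python
import numpy as np

def loss(W, W_in, W_out, W1, W2, X):
    """Evaluate L for samples X of shape (T+1, N).

    W, W_out, W1 are length-N vectors; W_in, W2 are scalars.
    """
    a = X @ (W + W_in * W_out)   # (W + W_in W_out) x(t), shape (T+1,)
    b = X @ (W1 + W2 * W_out)    # (W1 + W2 W_out) x(t), shape (T+1,)
    return np.sum(a * b)         # sum over t of the scalar products

# Small check: with W_in = W2 = 0 the loss reduces to sum_t (W x)(W1 x).
X = np.eye(2)
print(loss(np.array([1.0, 1.0]), 0.0, np.zeros(2),
           np.array([2.0, 2.0]), 0.0, X))
```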

The matrices should all satisfy the constraint, though, that they "look" like they were drawn from a uniform (or Gaussian) distribution. (I don't know how to formalize this mathematically; maybe something like: as $N \rightarrow \infty$, the entries of the matrices "fill" the interval $[-1,1]$.) Otherwise they can be arbitrary, even in $N$.
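One crude way to make this tractable numerically is to replace the "looks uniform" requirement with box constraints $[-1,1]$ on every entry (a much weaker condition; a penalty on empirical moments or on the distance of the empirical CDF from the uniform CDF would be closer to the intent). A hedged sketch using `scipy.optimize.minimize` with L-BFGS-B, where `X` again stands for samples of $\vec{x}(t)$ and all names are mine:

```python
import numpy as np
from scipy.optimize import minimize

def loss_flat(p, X, N):
    # Unpack the flat parameter vector: W, W_out, W1 (length N each), then W_in, W2.
    W, W_out, W1 = p[:N], p[N:2*N], p[2*N:3*N]
    W_in, W2 = p[3*N], p[3*N + 1]
    a = X @ (W + W_in * W_out)
    b = X @ (W1 + W2 * W_out)
    return np.sum(a * b)

rng = np.random.default_rng(0)
N, T = 10, 100
X = rng.standard_normal((T, N))           # stand-in samples of x(t)
p0 = rng.uniform(-1, 1, size=3*N + 2)     # initialize "as if" drawn from U[-1,1]

# Box bounds [-1,1] on every entry, a weak stand-in for the distribution constraint.
res = minimize(loss_flat, p0, args=(X, N),
               bounds=[(-1.0, 1.0)] * (3*N + 2))
print(res.fun)  # best objective value found, never worse than the start
```

Note the objective is bilinear in the two factors, so without the bounds it would typically be unbounded below; the box constraints are what make the minimum finite here.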