I built a variational autoencoder (VAE) that takes an input $G \in \mathbb{R}^{K\times N \times N}$ (with no prior knowledge about its distribution) and tries to reconstruct it using a reconstruction loss and a KL-divergence loss.
The bottleneck of the VAE consists of two matrices:

- a matrix of means $\mu \in \mathbb{R}^{N\times N}$
- a matrix of standard deviations $\sigma \in \mathbb{R}^{N\times N}$
Therefore it is possible to sample $Z$ via the reparameterization trick: $$ z_{ij} \sim \mathcal{N}\left(\mu_{ij}, \sigma_{ij}^2\right) \quad\Longrightarrow\quad z_{ij}=\mu_{ij}+\sigma_{ij} \cdot \epsilon, \text{ where } \epsilon \sim \mathcal{N}(0,1). $$

Let's assume that I have successfully trained the model and found the optimal values of $\mu$ and $\sigma$. My goal now is to find a probabilistic and meaningful dimensionality reduction of $Z$ (or of its parameters) that is independent of the variable $N$. I am hoping to achieve a reduced representation of shape $N \times d$, where $d$ is an arbitrary number.
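For concreteness, the sampling step above can be sketched in NumPy as follows. The dimensions and the random stand-ins for the trained $\mu$ and $\sigma$ are my own illustrative assumptions, not values from the actual model:

```python
import numpy as np

# Assumed latent size for illustration only (N is not fixed in the question).
N = 8

rng = np.random.default_rng(0)

# Stand-ins for the trained parameters: mu (means) and sigma (standard
# deviations). In practice these come from the trained encoder.
mu = rng.normal(size=(N, N))
sigma = np.abs(rng.normal(size=(N, N)))

# Reparameterization trick: z_ij = mu_ij + sigma_ij * eps, with eps ~ N(0, 1).
eps = rng.standard_normal(size=(N, N))
Z = mu + sigma * eps

print(Z.shape)  # (8, 8)
```

Each entry of `Z` is an independent Gaussian sample, so the whole matrix is drawn elementwise rather than from a full joint covariance.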