Dimensionality Reduction


Let $X\in\mathbb{R}^{100\times 100}$ be a matrix and let its eigenvectors and eigenvalues be $X_{vec}$ and $X_{val}$ respectively. If the rank of $X$ is $5$, is it possible to approximate $X$ with a matrix $X_1\in\mathbb{R}^{5\times 5}$?

I have searched the internet for dimensionality reduction. Two methods come up: 1. singular value decomposition (SVD); 2. principal component analysis (PCA). The SVD method does not actually reduce the size of the matrix; it reduces the number of singular values kept, which is not what I need. I want, if possible, to reduce the size of the matrix itself. Similarly, from my understanding, PCA does not reduce the size. So, is there any way to form $X_1$ out of $X$?
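For intuition, a rank-5 $X$ can be stored losslessly as thin SVD factors, even though no single $5\times 5$ matrix can act on a 100-vector. A minimal NumPy sketch (the rank-5 test matrix here is made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# Build a random 100x100 matrix of rank 5 as a product of thin factors.
X = rng.normal(size=(100, 5)) @ rng.normal(size=(5, 100))

# Truncated SVD: keep only the 5 nonzero singular values.
U, s, Vt = np.linalg.svd(X)
U5, s5, Vt5 = U[:, :5], s[:5], Vt[:5, :]

# The thin factors reproduce X exactly (up to round-off), storing
# 100*5 + 5 + 5*100 numbers instead of 100*100.
X_rebuilt = (U5 * s5) @ Vt5
print(np.allclose(X, X_rebuilt))  # True
```

So the storage shrinks from $10000$ numbers to $1005$, but the object is a pair of thin factors, not a $5\times 5$ matrix.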


How can you apply a $5 \times 5$ matrix to a 100-long input vector and get a 100-long output vector?

100 in   -->  5   -->  5   -->  100 out
        compress   A       expand

SVD gives $X = U D^2 V$ (writing the diagonal matrix of singular values as $D^2$, so that one factor of $D$ can go with $U$ and one with $V$ below), with
$\qquad V: 100 \to 5$
$\qquad D: 5 \to 5$ diagonal
$\qquad U: 5 \to 100$
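The compress $\to$ diagonal $\to$ expand pipeline above can be traced step by step in NumPy (random stand-in factors, since the answer fixes no concrete values):

```python
import numpy as np

rng = np.random.default_rng(1)
U = rng.normal(size=(100, 5))   # expand:   5 -> 100
d = rng.normal(size=5)          # diagonal of D: 5 -> 5
V = rng.normal(size=(5, 100))   # compress: 100 -> 5
x = rng.normal(size=100)

compressed = V @ x        # shape (5,)   -- 100 in, 5 out
scaled = d * compressed   # shape (5,)   -- diagonal action
out = U @ scaled          # shape (100,) -- 5 in, 100 out
print(out.shape)  # (100,)
```

All the real work happens in the 5-dimensional middle stage; the two thin matrices only move vectors in and out of it.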

One possibility is to take
$\qquad V5: 100 \to 5 =$ the 5 columns of $D V$ with the biggest $\|\cdot\|_2$, the rest zeroed
$\qquad U5: 5 \to 100 =$ the 5 rows of $U D$ with the biggest $\|\cdot\|_2$, the rest zeroed
$\qquad A = U5\ V5: 100 \to 100$.

This amounts to setting 95 input components and 95 output components to 0 -- which sounds not so hot. Instead of zeros, try column / row averages?
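A sketch of that selection in NumPy (the zero-padding makes the "set 95 components to 0" reading explicit; the index choices and names are my own):

```python
import numpy as np

rng = np.random.default_rng(2)
U = rng.normal(size=(100, 5))            # 5 -> 100
d = np.abs(rng.normal(size=5))           # diagonal of D
V = rng.normal(size=(5, 100))            # 100 -> 5

DV = d[:, None] * V                      # D V, shape (5, 100)
UD = U * d                               # U D, shape (100, 5)

cols = np.argsort(np.linalg.norm(DV, axis=0))[-5:]  # 5 biggest-norm columns
rows = np.argsort(np.linalg.norm(UD, axis=1))[-5:]  # 5 biggest-norm rows

V5 = np.zeros_like(DV); V5[:, cols] = DV[:, cols]   # 100 -> 5, 95 inputs zeroed
U5 = np.zeros_like(UD); U5[rows, :] = UD[rows, :]   # 5 -> 100, 95 outputs zeroed
A = U5 @ V5                                          # 100 -> 100
print(A.shape)  # (100, 100)
```

Only the `rows` $\times$ `cols` block of $A$ is nonzero, so $A$ really carries just $5\times 5$ numbers, which is the sense in which the question's $X_1$ can exist.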

Fwiw, $U D^2 V x$ is fast in NumPy (BLAS under the hood) --

python -m timeit -s '
import numpy as np
np.random.seed( 0 )
U = np.random.normal( size=(100,5) )
D2 = np.random.normal( size=5 ) ** 2
V = np.random.normal( size=(5,100) )
x = np.random.normal( size=100 )
' \
'UD2Vx = U.dot( D2 * V.dot(x) )'
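For comparison, the factored product needs roughly $2\cdot 100\cdot 5$ multiplies against $100^2$ for a dense mat-vec; a self-contained timing sketch with Python's `timeit` module (numbers vary by machine, so none are quoted):

```python
import timeit

import numpy as np

rng = np.random.default_rng(0)
U = rng.normal(size=(100, 5))
d2 = rng.normal(size=5) ** 2        # diagonal of D**2
V = rng.normal(size=(5, 100))
x = rng.normal(size=100)
X = (U * d2) @ V                    # dense 100 x 100 equivalent

# Same product, two evaluation orders.
t_factored = timeit.timeit(lambda: U @ (d2 * (V @ x)), number=10_000)
t_dense = timeit.timeit(lambda: X @ x, number=10_000)
print(f"factored: {t_factored:.4f}s  dense: {t_dense:.4f}s")
```

At this small size Python call overhead can dominate, so the factored form mainly pays off as the dimensions grow.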

(TeXperts, how do I get labelled arrows please?)