I have a matrix $\mathbf{X} \in \mathbb{R}^{N \times d}$ with $N$ rows and its covariance matrix $\mathbf{S} = \mathbf{X}^T \mathbf{X}$. I divide $\mathbf{X}$ into $n$ portions $\mathbf{x_i} \in \mathbb{R}^{N_i \times d}$, each having $N_i$ rows, such that:
$$ \mathbf{X} = \begin{bmatrix} \mathbf{x_1} \\ \mathbf{x_2} \\ \vdots \\ \mathbf{x_n} \end{bmatrix} $$
with $N = \sum_{i=1}^{n} N_i$.
I compute covariance matrices on each portion so that in general we have:
$$ \mathbf{S_i} = \mathbf{x_i}^T \mathbf{x_i} $$
I noticed that:
$$ \mathbf{S} = \mathbf{S_1} + \mathbf{S_2} + \dots + \mathbf{S_n} $$
since:
$$ s_{ij} = \sum_{k=1}^{N} x_{ki} x_{kj} = \sum_{k=1}^{N_1} x^{(1)}_{ki} x^{(1)}_{kj} + \sum_{k=1}^{N_2} x^{(2)}_{ki} x^{(2)}_{kj} + \dots + \sum_{k=1}^{N_n} x^{(n)}_{ki} x^{(n)}_{kj} $$
where $x_{ij}$ are elements of $\mathbf{X}$ and $x_{ij}^{(p)}$ are elements of $\mathbf{x_p}$.
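The additivity of the component matrices is easy to check numerically. A minimal NumPy sketch (the random data and the particular split sizes are assumptions for illustration only):

```python
import numpy as np

rng = np.random.default_rng(0)
N, d = 100, 5
X = rng.standard_normal((N, d))

# Full matrix S = X^T X
S = X.T @ X

# Split the rows into portions of sizes N_1 = 30, N_2 = 50, N_3 = 20
portions = np.split(X, [30, 80], axis=0)

# The component matrices S_i = x_i^T x_i sum back to S
S_sum = sum(x.T @ x for x in portions)
print(np.allclose(S, S_sum))  # True
```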
I call each $\mathbf{S_i}$ a component covariance matrix.
Suppose I find the eigenvectors of each component covariance matrix $\mathbf{S_i}$. Are there known theorems that can help me find out a relationship between the eigenvectors of all $\mathbf{S_i}$ and the eigenvectors of $\mathbf{S}$?
For the moment, it only seems intuitive to me that such a relation should exist: the component matrices are computed on portions of $\mathbf{X}$, and the information about each portion $\mathbf{x_i}$ is also contained in $\mathbf{X}$. So perhaps the eigenvectors of $\mathbf{S}$ are likewise "informed" by the eigenvectors of all the $\mathbf{S_i}$?
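To make the question concrete, here is a small experiment (the random data and equal split are assumptions, not part of the question) that compares the leading eigenvector of each $\mathbf{S_i}$ with that of $\mathbf{S}$:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((60, 4))
S = X.T @ X

# Eigenvectors of the full matrix (eigh returns ascending eigenvalues)
w, V = np.linalg.eigh(S)

# Eigenvectors of each component matrix S_i = x_i^T x_i
parts = np.split(X, 3, axis=0)
for i, xi in enumerate(parts, 1):
    wi, Vi = np.linalg.eigh(xi.T @ xi)
    # |cosine| between leading eigenvectors; 1 would mean perfect alignment
    align = abs(V[:, -1] @ Vi[:, -1])
    print(f"portion {i}: alignment with leading eigenvector of S = {align:.3f}")
```

With unstructured random data the alignments are typically well below 1, which suggests that any general relationship would have to involve all the $\mathbf{S_i}$ jointly (e.g. through their sum) rather than the eigenvectors of each portion separately.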