Does multiplying by an orthogonal matrix leave the eigenvectors (directions) unchanged?
I think this is true in some cases.
An orthogonal matrix represents a rotation or a reflection. Eigenvectors do not change direction under stretching and shearing, but depending on the angle of rotation, they may or may not change direction, right?
This is the example I tried; it does not seem to change the eigenvectors:
import numpy as np
# Create a dataset
data = np.array([[1, 2], [2, 3], [3, 4], [4, 5]])
# Compute the covariance matrix
cov_matrix = np.cov(data, rowvar=False)
# Compute the eigenvalues and eigenvectors of the covariance matrix
eigenvalues, eigenvectors = np.linalg.eig(cov_matrix)
print("Original Eigenvectors:")
print(eigenvectors)
# Create an orthogonal matrix for transformation
orthogonal_matrix = np.array([[0, 1], [1, 0]])
# Multiply the data by the orthogonal matrix
transformed_data = np.dot(data, orthogonal_matrix)
# Recompute the covariance matrix for the transformed data
cov_matrix_transformed = np.cov(transformed_data, rowvar=False)
# Compute the eigenvalues and eigenvectors of the new covariance matrix
eigenvalues_transformed, eigenvectors_transformed = np.linalg.eig(cov_matrix_transformed)
print("\nTransformed Eigenvectors:")
print(eigenvectors_transformed)
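For comparison, here is a variant (my addition, not part of the original attempt) that uses a 45° rotation matrix instead of the swap matrix, so you can check directly what a genuine rotation does to the covariance eigenvalues and eigenvectors:

```python
import numpy as np

# Variant of the snippet above (my addition): instead of the swap
# matrix [[0, 1], [1, 0]], rotate the same data by 45 degrees and
# compare the covariance eigenvalues/eigenvectors before and after.
data = np.array([[1, 2], [2, 3], [3, 4], [4, 5]])
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # orthogonal rotation matrix

cov_before = np.cov(data, rowvar=False)
cov_after = np.cov(data @ R, rowvar=False)

# The rotated covariance is R^T (cov) R, a similar matrix, so the
# eigenvalues are unchanged -- but the eigenvector directions rotate
# with the data.
print(np.linalg.eigvalsh(cov_before))
print(np.linalg.eigvalsh(cov_after))
print(np.linalg.eigh(cov_before)[1])
print(np.linalg.eigh(cov_after)[1])
```

With this particular data (which lies on a line), the dominant eigenvector starts at roughly $(1,1)/\sqrt2$ and ends up aligned with a coordinate axis after the 45° rotation.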
Similarly, may the eigenvalues also change or stay the same?
Let $X$ be your $n\times p$ data matrix. You are computing the eigenvalues and eigenvectors of $$\Sigma := (n-1)^{-1}X^\top(I_n - n^{-1}1_n 1_n^\top)X,$$ i.e., you are looking for $\lambda\in\mathbb R$ and $v\in\mathbb R^p$ such that $$\Sigma v = \lambda v.$$

Now consider the transformed data $Y = TX$, where $T = cI_n$ for some $c\in\mathbb R$ (note that $cI_n$ is orthogonal only when $|c| = 1$; the argument works for any scalar $c$), and observe that $$\Omega := (n-1)^{-1}Y^\top(I_n - n^{-1}1_n 1_n^\top)Y = c^2(n-1)^{-1}X^\top(I_n - n^{-1}1_n 1_n^\top)X = c^2\Sigma.$$

Given the eigenvalues and eigenvectors of $\Sigma$, it is now easy to find $\mu\in\mathbb R$ and $u\in\mathbb R^p$ such that $\Omega u = \mu u$. Plugging in $\Omega = c^2\Sigma$ yields $c^2\Sigma u = \mu u$. Assuming $c\neq 0$, we can divide by $c^2$ and arrive at $\Sigma u = c^{-2}\mu u$. Since $\Sigma v = \lambda v$, we can take $u = v$ (that is, the eigenvectors are the same) with $\lambda = c^{-2}\mu$, i.e. $\mu = c^2\lambda$ (that is, the eigenvalues are scaled by $c^2$). So if you know the eigenvalues and eigenvectors of $\Sigma$, you immediately get those of $\Omega$ by the above reasoning.
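A quick numerical sketch of this argument, on illustrative random data of my own (not from the post): scaling the data by $c = 2$ multiplies the covariance eigenvalues by $c^2 = 4$ and leaves the eigenvector directions unchanged.

```python
import numpy as np

# Sketch of the claim above on illustrative random data: with Y = cX,
# the covariance eigenvalues are multiplied by c^2 while the
# eigenvectors stay the same (up to sign).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))          # n x p data matrix
c = 2.0

Sigma = np.cov(X, rowvar=False)
Omega = np.cov(c * X, rowvar=False)   # covariance of Y = cX, equals c^2 * Sigma

lam, V = np.linalg.eigh(Sigma)        # eigh, since Sigma is symmetric
mu, U = np.linalg.eigh(Omega)

print(mu / lam)                           # both ratios come out as c^2
print(np.allclose(np.abs(U), np.abs(V)))  # eigenvectors match up to sign
```

Using `eigh` rather than `eig` is deliberate here: covariance matrices are symmetric, so `eigh` returns real eigenvalues in a consistent (ascending) order, which makes the before/after comparison straightforward.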
Note that this is, of course, only true when the transformation is a scalar multiple of the identity matrix. If you choose a different scaling matrix, e.g. $$T = \begin{pmatrix} 2&0\\ 0&3\end{pmatrix}$$ (which, unlike the swap matrix in your example, is not orthogonal), both the eigenvalues and the eigenvectors of $\Omega$ will in general change.
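To see this numerically, here is a sketch on illustrative random data of my own: applying the non-orthogonal diagonal scaling $T = \mathrm{diag}(2, 3)$ to the columns, i.e. $Y = XT$, changes both the eigenvalues and the eigenvectors of the covariance.

```python
import numpy as np

# Sketch on illustrative random data: with the non-orthogonal scaling
# T = diag(2, 3) applied as Y = X @ T, both the eigenvalues and the
# eigenvectors of the covariance change.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
T = np.diag([2.0, 3.0])

Sigma = np.cov(X, rowvar=False)
Omega = np.cov(X @ T, rowvar=False)   # equals T @ Sigma @ T here

lam, V = np.linalg.eigh(Sigma)
mu, U = np.linalg.eigh(Omega)

print(mu / lam)                       # no longer a single common factor
print(V)
print(U)                              # different eigenvector directions
```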