What happens to eigenvalues and eigenvectors after a given matrix transformation


Assume I have a square matrix $A$ with the following properties:

  1. $A$ is positive definite
  2. $A$ is symmetric
  3. SVD of $A$ is $ A = U \Sigma U^{'} $

Now, I form a new matrix $B$ as follows:

$b_{ij} = \alpha \times a_{ij} $ if $(i-j)$ is divisible by 2 (i.e., $i - j$ is even).

$b_{ij} = 0 $ otherwise. (Note that $\alpha$ is a real scalar)
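As a concrete illustration of the construction (the matrix size, seed, and $\alpha$ below are arbitrary choices, not part of the question), one can build such a $B$ in NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
n, alpha = 5, 2.0

# Build a symmetric positive definite A (random symmetric + diagonal shift).
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)

# b_ij = alpha * a_ij when (i - j) is even, and 0 otherwise.
i, j = np.indices((n, n))
B = np.where((i - j) % 2 == 0, alpha * A, 0.0)
```

Note that the mask $(i-j) \bmod 2 = 0$ is symmetric in $i$ and $j$, so $B$ inherits the symmetry of $A$.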

My aim is to find the eigenvalues and eigenvectors of $B$. Are there any well-known theorems that we could use to find them? Can someone provide guidance on how to solve this problem?


Let $n$ denote the size of $A$. I'll focus on the case where $n$ is odd; the case where $n$ is even is essentially the same. Let $P$ denote the permutation matrix $$ P = \pmatrix{e_1 & e_3 & \cdots & e_n & e_2 & e_4 & \cdots & e_{n-1}}, $$ where $e_j$ denotes the $j$th column of the $n \times n$ identity matrix. Verify that $$ P^TBP = \alpha\pmatrix{A_{1} & 0 \\0 & A_2}, $$ where $A_1$ is the submatrix of $A$ obtained by deleting the even-indexed rows and columns, and $A_2$ is the submatrix obtained by deleting the odd-indexed rows and columns. (The off-diagonal blocks vanish because an entry with one odd and one even index has $i-j$ odd, hence $b_{ij} = 0$.) Since $P$ is orthogonal, $B$ is similar to this block-diagonal matrix: the eigenvalues of $B$ are the eigenvalues of $A_1$ and $A_2$, each multiplied by $\alpha$. For the eigenvectors, if $v$ is an eigenvector of $A_1$, then $P\pmatrix{v \\ 0}$ is an eigenvector of $B$, and likewise $P\pmatrix{0 \\ w}$ for an eigenvector $w$ of $A_2$.
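The argument above can be checked numerically. This sketch uses an arbitrary $5\times 5$ example (seed and $\alpha$ are hypothetical choices); note that with 0-based NumPy indexing, the "odd" positions of the answer correspond to indices $0, 2, 4, \dots$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, alpha = 5, 2.0
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # symmetric positive definite

i, j = np.indices((n, n))
B = np.where((i - j) % 2 == 0, alpha * A, 0.0)

# Permutation putting positions 0, 2, 4, ... first, then 1, 3, ...
perm = np.r_[np.arange(0, n, 2), np.arange(1, n, 2)]
P = np.eye(n)[:, perm]

A1 = A[0::2, 0::2]                   # keep odd rows/columns (1-based)
A2 = A[1::2, 1::2]                   # keep even rows/columns (1-based)

# P^T B P is block diagonal with blocks alpha*A1 and alpha*A2.
k = A1.shape[0]
block = np.zeros((n, n))
block[:k, :k] = alpha * A1
block[k:, k:] = alpha * A2
assert np.allclose(P.T @ B @ P, block)

# Eigenvalues of B are alpha times those of A1 and A2 combined.
eigs_B = np.sort(np.linalg.eigvalsh(B))
eigs_blocks = np.sort(alpha * np.concatenate(
    [np.linalg.eigvalsh(A1), np.linalg.eigvalsh(A2)]))
```

Comparing `eigs_B` with `eigs_blocks` confirms the claim for this example.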