Finding minimal projections in the subalgebra generated by a given set


Consider the set of complex matrices $\mathbb{C}^{n\times n}$ for some $n$. Suppose we have a set $\{A_1,\ldots, A_n\}$ of Hermitian matrices. We want to find the minimal projections in the subalgebra $\mathcal{A}$ generated by this set of matrices.

Let's suppose the given subalgebra is a factor; then the minimal projections all have the same rank (in fact, $\mathcal{A}$ is isomorphic to $\mathbb{C}^{m\times m}\otimes 1_{n/m}$ for some $m$ dividing $n$, but I do not really wish to construct this isomorphism explicitly, even though it would immediately give me all minimal projections).

An idea would be the following, which is also my question:

Choose a sequence $(\gamma_1,\ldots,\gamma_n)\in \mathbb{R}^{n}$ at random and consider the spectral projections of $\sum_i \gamma_i A_i$. They should be minimal with high probability. Is this correct, and how can it be proven?

In fact, I believe they should be minimal almost surely. My intuition is that the spectral projections of a random Hermitian matrix are minimal projections, since a random Hermitian matrix almost surely has distinct eigenvalues. However, matrices of the form $\sum_i \gamma_i A_i$ might form only a measure-zero subset of the whole algebra, so the argument doesn't work as stated. Can it be amended?
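For what it's worth, the proposed procedure is easy to test numerically. The sketch below is my own illustration, not part of the question: I build a toy factor of the assumed form $U(\mathbb{C}^{m\times m}\otimes 1_k)U^*$ for a fixed unitary $U$ (so every minimal projection has rank $k$), take a random real combination of Hermitian generators, and check that each spectral projection indeed has rank $k$. The generators, the unitary, and the grouping tolerance are all assumptions made for the experiment.

```python
# Numerical sketch of the randomized procedure. Assumed setup: the algebra
# is U (C^{m x m} tensor 1_k) U* for a fixed unitary U, so minimal
# projections have rank k. We check that the spectral projections of a
# random combination sum_i gamma_i A_i all have rank k.
import numpy as np

rng = np.random.default_rng(0)
m, k = 3, 2                # algebra ~ C^{3x3} tensor 1_2, so n = 6
n = m * k

# Fixed random unitary U (QR of a complex Gaussian matrix).
U, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))

def embed(B):
    """Embed an m x m Hermitian B as U (B tensor I_k) U*."""
    return U @ np.kron(B, np.eye(k)) @ U.conj().T

def rand_herm(d):
    """A random d x d Hermitian matrix."""
    X = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    return (X + X.conj().T) / 2

# Hermitian generators A_1, ..., A_3 of the algebra.
A = [embed(rand_herm(m)) for _ in range(3)]

# Random real combination: generically its spectrum consists of m distinct
# eigenvalues, each with multiplicity exactly k.
gamma = rng.normal(size=len(A))
H = sum(g * Ai for g, Ai in zip(gamma, A))
eigvals, eigvecs = np.linalg.eigh(H)   # eigenvalues sorted ascending

# Group numerically equal eigenvalues and form the spectral projections.
projections = []
i = 0
while i < n:
    j = i
    while j < n and abs(eigvals[j] - eigvals[i]) < 1e-8:
        j += 1
    V = eigvecs[:, i:j]                # orthonormal eigenbasis of the group
    projections.append(V @ V.conj().T)
    i = j

ranks = [int(round(np.trace(P).real)) for P in projections]
print(ranks)
```

In this toy example the printed ranks are all equal to `k`, matching the intuition that the spectral projections of a generic combination are minimal; of course this experiment proves nothing about the measure-zero concern raised above.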