In a lecture on operator theory we used the claim that the linear span of the projections in a von Neumann algebra $\mathcal M$ is dense in $\mathcal M$ with respect to the operator norm. Unfortunately, that claim was not proved, so I tried to prove it myself.
Let $\mathcal H$ be a (separable) Hilbert space and let $\mathcal M \subset \mathcal B(\mathcal H)$ be a von Neumann algebra; $\mathcal M'$ denotes the commutant of $\mathcal M$. Now let $T\in \mathcal M$ be self-adjoint. We can use the spectral theorem to approximate $T$ (in operator norm) by a linear combination of orthogonal projections. To find such orthogonal projections, we use the spectral measure $E$ corresponding to $T$. For any $\varepsilon >0$ we can find a step function $f=\sum_{j=1}^n a_j \chi_{A_j} \in L^\infty(\sigma(T))$ with $$\|T-f(T)\|_{op}=\Big\|\int_{\sigma(T)}x-f(x)\,dE(x)\Big\|_{op}\leq \|\operatorname{id}-f\|_\infty<\varepsilon,$$ where $f(T)=\sum_{j=1}^n a_j E(A_j)$ is a linear combination of orthogonal projections.
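As a sanity check in finite dimensions (an assumption not made above: $\mathcal H = \mathbb C^n$, where the spectral measure reduces to a sum of eigenprojections), the step-function approximation can be sketched as follows; the binning by left endpoints is one arbitrary choice of the $a_j$ and $A_j$:

```python
import numpy as np

# Finite-dimensional sketch: approximate a self-adjoint T by a linear
# combination of its spectral projections, with operator-norm error < eps.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
T = (A + A.T) / 2  # a self-adjoint matrix

eps = 0.3
evals, evecs = np.linalg.eigh(T)

# Partition the range of the spectrum into intervals A_j of length eps;
# take a_j to be the left endpoint of A_j.
lo, hi = evals.min(), evals.max() + 1e-12
edges = np.arange(lo, hi + eps, eps)

approx = np.zeros_like(T)
for a, b in zip(edges[:-1], edges[1:]):
    mask = (evals >= a) & (evals < b)
    if mask.any():
        # E(A_j): sum of the eigenprojections with eigenvalue in [a, b)
        P = evecs[:, mask] @ evecs[:, mask].T
        approx += a * P

err = np.linalg.norm(T - approx, ord=2)
print(err < eps)  # → True
```

Since $T$ and the approximant are simultaneously diagonalized, the error is $\max_i |\lambda_i - a_{j(i)}| < \varepsilon$, mirroring the bound $\|\operatorname{id}-f\|_\infty < \varepsilon$ above.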
So far so good. Every $B\in \mathcal M'$ commutes with $T$. My problem is that we also need $B$ to commute with $\chi_A(T)$ for every compact $A\subset \sigma(T)$, since then $\chi_A(T)\in\mathcal M''=\mathcal M$ by the bicommutant theorem. If we can prove this, we are done: linear combinations of projections in $\mathcal M$ are then dense in the set of self-adjoint operators of $\mathcal M$, and the set of self-adjoint operators is itself dense in $\mathcal M$.
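To spell out the last density step (nothing new, just the standard decomposition into real and imaginary parts): every $T\in\mathcal M$ can be written as $$T=\underbrace{\tfrac12\,(T+T^*)}_{\in\,\mathcal M,\ \text{self-adjoint}}+\,i\,\underbrace{\tfrac{1}{2i}\,(T-T^*)}_{\in\,\mathcal M,\ \text{self-adjoint}},$$ so approximating self-adjoint elements by linear combinations of projections already gives approximation of arbitrary elements of $\mathcal M$.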
Question: How can one prove, for a self-adjoint $T\in\mathcal M$, that $B\chi_A(T)=\chi_A(T)B$ for every $B\in \mathcal M'$ and every compact $A\subset \sigma(T)$?
I am grateful for any help.