Let $(L, D(L))$ be a self-adjoint operator in a Hilbert space $\mathfrak{H}$ (in particular, I am assuming $-L \geq 0$, so that $L$ is the generator of a $C_0$-semigroup of contractions on $\mathfrak{H}$), and let $E$ be the spectral resolution corresponding to $L$:
$-L = \int_{[0, \infty)} \lambda \textrm{ d} E(\lambda)$.
We know that given any $\phi \in L^{\infty}([0, \infty))$, we may define a bounded operator $\phi(-L) = \int_{[0, \infty)} \phi(\lambda) \textrm{ d} E(\lambda)$.
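As a sanity check, here is a minimal finite-dimensional sketch of this functional calculus (an assumption on my part, not the general Hilbert-space setting): when $-L$ is a symmetric positive semi-definite matrix, the spectral integral collapses to a finite sum over eigenprojections, so $\phi(-L) = U \operatorname{diag}(\phi(\lambda_i)) U^T$ where $-L = U \operatorname{diag}(\lambda_i) U^T$.

```python
import numpy as np

# Finite-dimensional stand-in for -L: a symmetric positive semi-definite
# matrix, so its spectrum lies in [0, infty).
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
negL = A @ A.T

# Spectral resolution: -L = U diag(lam) U^T with lam >= 0.
lam, U = np.linalg.eigh(negL)

def phi_of_negL(phi):
    """Bounded functional calculus: phi(-L) = U diag(phi(lam)) U^T."""
    return U @ np.diag(phi(lam)) @ U.T

# The identity function recovers -L itself, and phi(x) = e^{-tx} gives the
# contraction semigroup e^{tL}; here t = 1.
S = phi_of_negL(lambda x: np.exp(-x))
```

Since the $\lambda_i$ are nonnegative, the operator norm of `S` is $\max_i e^{-\lambda_i} \leq 1$, matching the contraction property.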
I'm interested in what conditions are required such that for any $f \in \mathfrak{H}$ and $v \in \mathfrak{H}$ there exists $\phi \in L^{\infty}([0, \infty))$ such that
$v = \phi(-L) f = \int_{[0, \infty)} \phi(\lambda) \textrm{ d} E(\lambda) f$.
For example, if $\mathfrak{H} = L^2(U)$ for some bounded open $U \subset \mathbb{R}^d$ and $L = \Delta$ with Dirichlet boundary conditions, then the eigenvectors $e_i$ form an orthonormal basis for $L^2(U)$. Writing $f = \sum_i f_i e_i$ and $v = \sum_i v_i e_i$, the equation $v = \phi(-L) f$ forces $\phi(\lambda_i) f_i = v_i$, so $\phi$ would be given by any mapping such that $\lambda_i \mapsto v_i / f_i$, where $\lambda_i$ is the eigenvalue corresponding to $e_i$ (this requires $f_i \neq 0$ for all $i$, the quotients $v_i / f_i$ to be bounded, and the eigenvalues to be simple).
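The eigenbasis computation can be sketched numerically in finite dimensions (an assumption standing in for the Dirichlet Laplacian): since $v = \phi(-L) f$ forces $\phi(\lambda_i) f_i = v_i$, one can read off $\phi(\lambda_i) = v_i / f_i$ and verify the reconstruction, provided every $f_i \neq 0$ and the eigenvalues are distinct.

```python
import numpy as np

# Symmetric positive semi-definite stand-in for -L; for a generic random
# matrix the eigenvalues are distinct, so lambda_i -> v_i / f_i is
# well-defined (an assumption of this sketch).
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
negL = A @ A.T
lam, U = np.linalg.eigh(negL)

# Choose f with eigenbasis coefficients bounded away from zero, and an
# arbitrary target v.
f = U @ rng.uniform(0.5, 1.5, 4)
v = U @ rng.standard_normal(4)

f_i = U.T @ f                  # coefficients of f in the eigenbasis
v_i = U.T @ v                  # coefficients of v in the eigenbasis
phi_vals = v_i / f_i           # phi(lambda_i) = v_i / f_i

# Apply phi(-L) to f in the eigenbasis and map back.
v_rec = U @ (phi_vals * f_i)
```

In infinite dimensions the extra difficulty is exactly that the quotients $v_i / f_i$ must stay bounded for $\phi$ to land in $L^\infty$.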
Are there any similar results in the general case? In particular, I would be interested in the case where $L$ is a second-order divergence-form elliptic operator acting on some $L^2(X, \textrm{ d} \mu)$.