I was reading through a research paper on compressive particle filtering for target tracking, and I came across the following:
Let $z \in \Re^{d_{z}}$ denote a vectorized image with $d_{z}$ pixels. Assume that $z$ is $K$-sparse in a basis given by the columns of $\Psi \in \Re^{d_{z} \times d_{z}}$, meaning that $z = \Psi\theta$ and at most $K$ of the coordinates of $\theta$ are nonzero. Compressive sensing theory ensures that $\theta$, and therefore $z$, can with high probability be exactly recovered from appropriate linear projections onto the rows of a measurement matrix $\Phi \in \Re^{d_{y} \times d_{z}}$, with $d_{y} < d_{z}$. Specifically, define the *coherence* $\mu$ between $\Phi$ and $\Psi$ as $\mu \equiv \sqrt{d_{z}}\, \max_{i,j} |\langle \phi_{i}, \psi_{j} \rangle| \in [1, \sqrt{d_{z}}]$ for all rows $\phi_{i}$ and columns $\psi_{j}$ of $\Phi$ and $\Psi$, respectively;
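To make the quoted definition concrete, here is a small numerical sketch of the coherence $\mu$ (my own illustration, not from the paper): take a random orthonormal basis $\Psi$ and a $\Phi$ with unit-norm random Gaussian rows, and compute $\mu = \sqrt{d_z}\,\max_{i,j}|\langle\phi_i,\psi_j\rangle|$. The dimensions and the choice of matrices are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
d_z = 64

# Psi: an orthonormal sparsifying basis (a random orthogonal matrix via QR;
# any orthonormal basis, e.g. a DCT or wavelet basis, works the same way).
Psi, _ = np.linalg.qr(rng.standard_normal((d_z, d_z)))

# Phi: d_y < d_z measurement rows, normalized to unit norm. Random Gaussian
# rows tend to be incoherent (small mu) with any fixed basis.
d_y = 16
Phi = rng.standard_normal((d_y, d_z))
Phi /= np.linalg.norm(Phi, axis=1, keepdims=True)

# mu = sqrt(d_z) * max_{i,j} |<phi_i, psi_j>|, which lies in [1, sqrt(d_z)]
mu = np.sqrt(d_z) * np.abs(Phi @ Psi).max()
print(mu)
```

With unit-norm rows $\phi_i$ expanded in the orthonormal basis $\Psi$, the largest coefficient is at least $1/\sqrt{d_z}$ and at most $1$, which is where the stated range $[1,\sqrt{d_z}]$ comes from.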
What is the meaning of the line "$z$, can with high probability be exactly recovered from appropriate linear projections onto the rows of a measurement matrix $\Phi \in \Re^{d_{y} \times d_{z}}$, with $d_{y} < d_{z}$"?
What is the meaning of "linear projections onto rows of measurement matrix" in this context? Please explain philosophically as well as mathematically.
This basically means that if you acquire $d_y$ measurements of the form $$y = \Phi z = \Phi \Psi \theta$$ you can reconstruct $z$ with high probability from the measurements stored in the vector $y$, even though the above system of linear equations is underdetermined: you have fewer measurements than unknowns. To achieve this, the matrix $\Phi$ should satisfy a condition called the restricted isometry property (RIP), which essentially means that the maximum and minimum singular values of every submatrix formed by a small number of its columns (on the order of $K$, the number of nonzero entries of $\theta$) lie with high probability in the interval $(\sqrt{1-\delta}, \sqrt{1+\delta})$ for some $\delta \in (0,1)$ that may depend on $K$. This concept is strongly related to the Johnson–Lindenstrauss lemma, where the linear map $f$ in the Wikipedia article is just pre-multiplication by the matrix $\Phi$.
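Here is a minimal sketch of this recovery in action (my own toy example, not the paper's algorithm): a $K$-sparse $\theta$ is measured through a random Gaussian $\Phi$ with $d_y < d_z$, and then recovered by basis pursuit, i.e. minimizing $\|\theta\|_1$ subject to $\Phi\Psi\theta = y$, written as a linear program via the standard split $\theta = u - v$ with $u, v \ge 0$. All dimensions are arbitrary choices for illustration; $\Psi$ is taken as the identity for simplicity.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
d_z, d_y, K = 50, 25, 3

Psi = np.eye(d_z)                                      # sparsifying basis (identity here)
Phi = rng.standard_normal((d_y, d_z)) / np.sqrt(d_y)   # random Gaussian measurement matrix

# Ground-truth K-sparse coefficient vector and its measurements
theta = np.zeros(d_z)
theta[rng.choice(d_z, K, replace=False)] = rng.standard_normal(K)
z = Psi @ theta
y = Phi @ z                                            # d_y < d_z: underdetermined in z

# Basis pursuit: min ||theta||_1  s.t.  Phi Psi theta = y,
# as an LP over theta = u - v with u, v >= 0.
A = Phi @ Psi
c = np.ones(2 * d_z)                                   # sum(u) + sum(v) = ||theta||_1 at optimum
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y, bounds=(0, None))
theta_hat = res.x[:d_z] - res.x[d_z:]
z_hat = Psi @ theta_hat

print(np.max(np.abs(z_hat - z)))                       # recovery error
```

Even though only 25 linear measurements of a 50-dimensional vector are taken, the $\ell_1$ program recovers the sparse solution, which is exactly the "with high probability, exactly recovered" claim in the quoted passage.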