How can one find the $n \times n$ covariance matrix $K$ (in some basis), given the projected variance of $K$ along $V$ for every $n$-dimensional vector $V$?
In other words, if we know the variance along every direction, how can the covariance matrix be recovered?
Your question is equivalent to "if I know the value of $v^\top K v$ for every vector $v$, how can I find $K$?" (Check that you understand that this is equivalent to what you are asking.)
First, the $(i,i)$th entry of $K$ is $e_i^\top K e_i$, where $e_i$ is the $i$th standard basis vector, so the diagonal entries are determined immediately.
For the off-diagonal entries, note that $(e_i + e_j)^\top K (e_i + e_j) = e_i^\top K e_i + e_j^\top K e_j + 2\, e_i^\top K e_j$, where the two cross terms combine because a covariance matrix is symmetric ($e_j^\top K e_i = e_i^\top K e_j$). This lets you solve for $e_i^\top K e_j$, which is the $(i,j)$ entry of $K$.
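The procedure above can be sketched in code. Here is a minimal NumPy illustration (the function name `recover_covariance` and the example matrix are my own choices, not from the question): the matrix is exposed only through its quadratic form $v \mapsto v^\top K v$, and the diagonal and polarization steps recover it entry by entry.

```python
import numpy as np

def recover_covariance(variance_along, n):
    """Recover a symmetric n x n matrix K given only the map v -> v^T K v."""
    K = np.zeros((n, n))
    e = np.eye(n)
    # Diagonal: K[i, i] = e_i^T K e_i
    for i in range(n):
        K[i, i] = variance_along(e[i])
    # Off-diagonal via (e_i + e_j)^T K (e_i + e_j) = K_ii + K_jj + 2 K_ij
    for i in range(n):
        for j in range(i + 1, n):
            s = variance_along(e[i] + e[j])
            K[i, j] = K[j, i] = (s - K[i, i] - K[j, j]) / 2.0
    return K

# Example: a known symmetric matrix, accessed only through its quadratic form
K_true = np.array([[2.0, 0.5, 0.0],
                   [0.5, 1.0, 0.3],
                   [0.0, 0.3, 1.5]])
variance = lambda v: v @ K_true @ v
K_rec = recover_covariance(variance, 3)
print(np.allclose(K_rec, K_true))  # True
```

Note that only $n(n+1)/2$ evaluations of the quadratic form are needed, matching the number of free entries in a symmetric matrix.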