I have a matrix:
$$ v = \begin{pmatrix} r_1 & r_2 \\ s_1 & s_2 \\ \end{pmatrix} $$
and I wish to find a numerically efficient way to compute:
$$ O = \begin{pmatrix} r_1 r_1 & r_1 r_2 & 0 & 0 \\ r_2 r_1 & r_2 r_2 & 0 & 0 \\ 0 & 0 & s_1 s_1 & s_1 s_2 \\ 0 & 0 & s_2 s_1 & s_2 s_2 \\ \end{pmatrix} $$
But I can't find a clean way to compute this.
I could compute the full outer product matrix and then zero out the off-diagonal blocks, but once I start moving to large $v$ matrices this becomes infeasible. Another approach would be to compute the outer product of $r$ first, then the outer product of $s$, and stitch them together into a block-diagonal matrix, but is that really my best option?
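For what it's worth, the "stitch together" approach can be fully vectorized. Here is a minimal sketch in Python/NumPy (as a stand-in for the MATLAB version; the function name `blockdiag_outer` is my own): an `einsum` builds the stack of per-row outer products in one shot, and `scipy.linalg.block_diag` assembles them.

```python
import numpy as np
from scipy.linalg import block_diag

def blockdiag_outer(v):
    """Block-diagonal matrix whose i-th diagonal block is the outer
    product of row i of v with itself (no explicit for loop)."""
    blocks = np.einsum('ij,il->ijl', v, v)  # shape (k, n, n): one outer product per row
    return block_diag(*blocks)              # place each n x n block on the diagonal

v = np.array([[1., 2.],
              [3., 4.]])
O = blockdiag_outer(v)
```

For many rows, `scipy.sparse.block_diag` gives the same layout without storing the zero blocks, which matters once the matrix gets large.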
To put it all into context: I'm trying to compute the Hessian of the softmax function, which is given as
$$ H = \operatorname{diag}(p) - p p^{\mathsf{T}} $$
The problem is that I'm doing this in MATLAB, where for loops are horribly inefficient, and furthermore I need to wrap the computation in an anonymous function to use as a preconditioner.
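For reference, the formula $H = \operatorname{diag}(p) - p p^{\mathsf{T}}$ itself needs no loop at all. A minimal Python/NumPy sketch (the MATLAB equivalent would be something like `H = @(p) diag(p) - p*p'`, which already fits in an anonymous function):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())   # subtract max for numerical stability
    return e / e.sum()

def softmax_hessian(z):
    p = softmax(z)
    # diag(p) - p p^T, computed without any explicit loop
    return np.diag(p) - np.outer(p, p)

H = softmax_hessian(np.array([0.5, -1.0, 2.0]))
```

If the preconditioner only needs matrix-vector products, note that $Hx = p \odot x - p\,(p^{\mathsf{T}}x)$ can be applied without ever forming $H$ explicitly.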