I'm trying to work through the process talked about in the article "Fixing a broken correlation matrix" by Phil Joubert and Stephen Langdell (found here https://www.nag.co.uk/IndustryArticles/fixing-a-broken-correlation-matrix.pdf), but I'm having trouble replicating the results in the 3x3 example they give. I'm attempting the "quick and dirty" method of patching up the broken matrix $C$ given by
$$\begin{bmatrix}1&0.95&0\\0.95&1&0.95\\0&0.95&1\end{bmatrix}$$
The article says $C$ can be decomposed as $C = Q\Lambda Q^T$, where $Q$ is the matrix whose columns are eigenvectors of $C$ and $\Lambda$ is the matrix with the corresponding eigenvalues on the diagonal and zeros elsewhere. It then claims that we can get the "patched up" matrix $C'= Q\Lambda' Q^T$, where $\Lambda'$ is $\Lambda$ with its negative entries replaced by zero. However, I do not get the $C'$ they get, namely
$$\begin{bmatrix}1&0.73454&0.07908\\0.73454&1&0.73454\\0.07908&0.73454&1\end{bmatrix}$$
The eigenvalues I get for $C$ are $\left\{-\frac{19\sqrt{2}-20}{20},\; 1,\; \frac{19\sqrt{2}+20}{20}\right\}$ (these match the paper), and the corresponding eigenvectors (in matrix form, i.e. $Q$) are
$$\begin{bmatrix}1&1&1\\-\sqrt{2}&0&\sqrt{2}\\1&-1&1\end{bmatrix}$$
As I understand the article, I now need to replace the first (negative) eigenvalue with 0 and form $C'$ as described above, where $\Lambda'$ is given by
$$\begin{bmatrix}0&0&0\\0&1&0\\0&0&\frac{19\sqrt{2}+20}{20}\end{bmatrix}$$
However, $C'= Q\Lambda' Q^T$ with these matrices does not give the $C'$ found in the article. Did I calculate my eigenvalues and eigenvectors incorrectly? At first I thought I needed to calculate a different eigenvector with $\Lambda_1 = 0$, but using that eigenvector did not give me their $C'$ either. Any help would be greatly appreciated!
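To make the setup concrete, here is a short NumPy sketch (my own, not code from the article) of the decompose-and-clip step I described. Note that even with a properly orthonormal $Q$, as returned by `np.linalg.eigh`, the reconstruction $Q\Lambda' Q^T$ does not reproduce the article's $C'$:

```python
import numpy as np

C = np.array([[1.0, 0.95, 0.0],
              [0.95, 1.0, 0.95],
              [0.0, 0.95, 1.0]])

# eigh returns eigenvalues in ascending order and ORTHONORMAL eigenvectors
eigvals, Q = np.linalg.eigh(C)
print(eigvals)  # approximately [-0.34350, 1.0, 2.34350]

# clip negative eigenvalues to zero and reconstruct
Lam_clipped = np.diag(np.clip(eigvals, 0.0, None))
S = Q @ Lam_clipped @ Q.T
print(np.round(S, 2))
# the diagonal of S is no longer all ones, so S is not the article's C'
```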
I do not yet see exactly what happened here, but I am able to reproduce the given matrix $C'$. First, I get the same eigenvalues by similarity rotations on $C$; because of those rotations, the matrix $Q$ in my setting is a column-scaled version of yours. If you compute $Q \cdot Q^T$ with your matrices, you do not get the identity matrix - but you should, if $Q$ is a rotation matrix, i.e. orthonormal. So the matrix $R = \operatorname{norm\_columns}(Q)$, in which the sum of squares over each column equals $1$, does the job better. (My earlier remark that $R = Q/2$ was wrong and based on too quick a look at the two matrices.)
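To see the orthonormality issue concretely, here is a small NumPy check (my own sketch): your $Q$ has orthogonal but unnormalized columns, and dividing each column by its length fixes that.

```python
import numpy as np

# the unnormalized eigenvector matrix Q from the question
Q = np.array([[1.0, 1.0, 1.0],
              [-np.sqrt(2), 0.0, np.sqrt(2)],
              [1.0, -1.0, 1.0]])

# the columns are orthogonal but not unit length: Q^T Q = diag(4, 2, 4)
print(Q.T @ Q)

# dividing each column by its Euclidean norm makes the matrix orthonormal
R = Q / np.linalg.norm(Q, axis=0)
print(np.round(R @ R.T, 10))  # identity matrix
```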
The next step is still not sufficient on its own: computing $S = R \cdot \Lambda' \cdot R^T$ now gives a covariance matrix: $$ S = \left[ \begin{array} {rrr} 1.09& 0.83& 0.09\\ 0.83& 1.17& 0.83\\ 0.09& 0.83& 1.09 \end{array} \right]$$ The final step is the standard normalization of a covariance matrix to a correlation matrix: put the diagonal elements $s_{k,k}$ of $S$ into the diagonal matrix $D = \operatorname{diag}(\{1/\sqrt{s_{k,k}} \}) $ and multiply $C' = D \cdot S \cdot D $, getting $$ C' = \left[ \begin{array} {rrr} 1.000& 0.735& 0.079\\ 0.735& 1.000& 0.735\\ 0.079& 0.735& 1.000 \end{array} \right]$$ which looks trustworthy. So the final step of the described method was possibly just not written down...
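Putting the whole repair together in NumPy (again just my own sketch of the steps above, starting from the question's unnormalized $Q$):

```python
import numpy as np

# the asker's unnormalized eigenvector matrix (columns are eigenvectors)
Q = np.array([[1.0, 1.0, 1.0],
              [-np.sqrt(2), 0.0, np.sqrt(2)],
              [1.0, -1.0, 1.0]])
lam = np.array([1 - 0.95 * np.sqrt(2), 1.0, 1 + 0.95 * np.sqrt(2)])

# Step 1: normalize the columns so the matrix becomes orthonormal
R = Q / np.linalg.norm(Q, axis=0)

# Step 2: clip negative eigenvalues and rebuild -- this gives the covariance matrix S
S = R @ np.diag(np.clip(lam, 0.0, None)) @ R.T

# Step 3: rescale S to a correlation matrix: C' = D S D with D = diag(1/sqrt(s_kk))
D = np.diag(1.0 / np.sqrt(np.diag(S)))
C_prime = D @ S @ D
print(np.round(C_prime, 5))
# off-diagonals come out as 0.73454 and 0.07908 (to 5 d.p.), matching the article
```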
[update] Well, I just looked into the article. There they write (on page 3)
Well, they do mention it - but only rather cursorily...