I'm looking for the general form of a symmetric $3\times3$ matrix $\mathbf{A}$ with only two distinct eigenvalues, i.e. of a matrix with the diagonalized form $\mathbf{D}=\begin{pmatrix}a&0&0\\0&b&0\\0&0&b\end{pmatrix}$.
In general, such a matrix can be described by 4 parameters, e.g. the two eigenvalues $a, b$ and the direction of the eigenvector belonging to $a$, given by the spherical angles $\theta, \phi$. The other eigenvectors lie in the plane perpendicular to this direction.
With these four parameters, $(a, b, \theta, \phi)$, I can construct arbitrary matrices with the eigenvalues $(a,b,b)$ by conjugating $\mathbf{D}$ with an appropriate rotation matrix $\mathbf{R}$ (whose rows are the eigenvectors): $\mathbf{A} = \mathbf{R}^T \mathbf{D} \mathbf{R}$.
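This construction can be sketched in a few lines of NumPy (the function names are my own, chosen for illustration); the rotation matrix is assembled from the spherical unit vector and its two tangent vectors $\mathbf{e}_\theta, \mathbf{e}_\phi$:

```python
import numpy as np

def rotation_from_direction(theta, phi):
    """Rotation matrix whose first row is the unit vector in direction
    (theta, phi); the other two rows span the perpendicular plane."""
    v1 = np.array([np.sin(theta) * np.cos(phi),
                   np.sin(theta) * np.sin(phi),
                   np.cos(theta)])
    # Spherical-coordinate tangent vectors e_theta and e_phi.
    v2 = np.array([np.cos(theta) * np.cos(phi),
                   np.cos(theta) * np.sin(phi),
                   -np.sin(theta)])
    v3 = np.array([-np.sin(phi), np.cos(phi), 0.0])
    return np.vstack([v1, v2, v3])

def matrix_from_parameters(a, b, theta, phi):
    """Symmetric matrix A = R^T diag(a, b, b) R with eigenvalues (a, b, b)."""
    R = rotation_from_direction(theta, phi)
    return R.T @ np.diag([a, b, b]) @ R

A = matrix_from_parameters(2.0, -1.0, 0.7, 1.3)
print(np.linalg.eigvalsh(A))  # ascending order: approximately [-1, -1, 2]
```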
Is there any other (well-known?) parametrization (with 4 independent parameters) of such matrices $\mathbf{A}$? Ideally, a parametrization without (spherical) angles, but closely related to the matrix entries in $\mathbf{A}$?
The background of this question is that I want to find such a matrix $\mathbf{A}$ by least-squares fitting to measured data that depends on $\mathbf{A}$, and up to now it seems more efficient to vary 6 independent parameters $(d_1, \ldots, d_6)$ defining a general symmetric matrix $\mathbf{S}=\begin{pmatrix}d_1&d_2&d_3\\d_2&d_4&d_5\\d_3&d_5&d_6\end{pmatrix}$ than to vary e.g. the 4 parameters $(a, b, \theta, \phi)$ from above. (More efficient means that the fit converges faster - in spite of having more degrees of freedom - and typically does not run into wrong local minima, which sometimes happens depending on the initial values of the angles $\theta, \phi$.) It might help if I could express 2 of the 6 parameters $(d_1, \ldots, d_6)$ through the other 4 and use those 4 parameters for fitting, so I could perhaps rephrase my question: Are there two "simple" dependencies $d_1=d_1(d_3,d_4,d_5,d_6)$ and $d_2=d_2(d_3,d_4,d_5,d_6)$ that describe the matrix $\mathbf{A}$?
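To make the 6-parameter fitting setup concrete, here is a minimal sketch with `scipy.optimize.least_squares`. The measurement model is an assumption for illustration only (the actual data dependence on $\mathbf{A}$ is not specified above): scalar observations $y_i = \mathbf{x}_i^T \mathbf{A}\, \mathbf{x}_i$ for known directions $\mathbf{x}_i$.

```python
import numpy as np
from scipy.optimize import least_squares

def sym(d):
    """General symmetric matrix S built from the six parameters (d1, ..., d6)."""
    d1, d2, d3, d4, d5, d6 = d
    return np.array([[d1, d2, d3],
                     [d2, d4, d5],
                     [d3, d5, d6]])

# Ground truth with eigenvalues (a, b, b) = (2, -1, -1), built as
# b*I + (a-b) v v^T with unit eigenvector v.
v = np.array([1.0, 2.0, 2.0]) / 3.0
A_true = -1.0 * np.eye(3) + 3.0 * np.outer(v, v)

# Hypothetical noise-free measurements y_i = x_i^T A x_i (an assumption).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = np.einsum('ij,jk,ik->i', X, A_true, X)

def residuals(d):
    return np.einsum('ij,jk,ik->i', X, sym(d), X) - y

fit = least_squares(residuals, x0=np.array([1.0, 0.0, 0.0, 1.0, 0.0, 1.0]))
print(np.round(sym(fit.x), 6))  # recovers A_true
```

The residual is linear in $(d_1,\ldots,d_6)$ here, so the 6-parameter fit converges in essentially one step; with the angle parametrization the same residual becomes nonlinear in $\theta, \phi$, which matches the observation that the angle fit is more sensitive to starting values.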
(If this is a known problem, I'd also be grateful for pointers to any textbooks or articles dealing with it - I wasn't able to find any.)
Two such dependencies are $$d_1 = d_6 + d_3\left(\frac{d_2}{d_5} - \frac{d_5}{d_2}\right)$$ and $$d_4 = d_6 + d_5\left(\frac{d_2}{d_3} - \frac{d_3}{d_2}\right)$$ for the entries of the symmetric matrix $\mathbf{S}$ as defined above (provided the off-diagonal elements $d_2, d_3, d_5$ are nonzero).
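A quick numerical sanity check of the two dependencies: build a matrix with eigenvalues $(a, b, b)$ from a random unit eigenvector (using the rank-one form $b\mathbf{1} + (a-b)\,\mathbf{v}_1\mathbf{v}_1^T$ derived below) and compare the entries.

```python
import numpy as np

rng = np.random.default_rng(1)
a, b = 3.0, 0.5
v = rng.normal(size=3)
v /= np.linalg.norm(v)                        # unit eigenvector of a
A = b * np.eye(3) + (a - b) * np.outer(v, v)  # eigenvalues (a, b, b)

d1, d2, d3 = A[0, 0], A[0, 1], A[0, 2]
d4, d5, d6 = A[1, 1], A[1, 2], A[2, 2]

print(np.isclose(d1, d6 + d3 * (d2 / d5 - d5 / d2)))  # True
print(np.isclose(d4, d6 + d5 * (d2 / d3 - d3 / d2)))  # True
```

(For a randomly drawn eigenvector the off-diagonal entries $d_2, d_3, d_5$ are nonzero, so the divisions are safe.)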
This result can be checked by writing $\mathbf{A} = \mathbf{R}^T\mathrm{diag}(a,b,b)\mathbf{R}$ as (thanks to Orodruin at physicsforums.com!) $$\mathbf{A} = \mathbf{R}^T \left( b\mathbf{1} + (a-b) \begin{pmatrix}1&0&0\\0&0&0\\0&0&0\end{pmatrix} \right) \mathbf{R} = b\mathbf{1} + (a-b)(\mathbf{v}_1\otimes\mathbf{v}_1) = b\mathbf{1} + (a-b)\begin{pmatrix}r^2&rs&rt\\rs&s^2&st\\rt&st&t^2\end{pmatrix} $$ where $\mathbf{v}_1=(r,s,t)^T$ is the eigenvector corresponding to the eigenvalue $a$, which satisfies $r^2+s^2+t^2=1$. Substituting $t=\sqrt{1-r^2-s^2}$ (the sign of $\mathbf{v}_1$ can be chosen such that $t\geq 0$, since $\mathbf{v}_1\otimes\mathbf{v}_1$ is unchanged under $\mathbf{v}_1\to-\mathbf{v}_1$), one finds $$ \mathbf{A}=\begin{pmatrix}d_1&d_2&d_3\\d_2&d_4&d_5\\d_3&d_5&d_6\end{pmatrix} = \begin{pmatrix} b+(a-b)r^2 & (a-b)rs & (a-b)r\sqrt{1-r^2-s^2} \\ (a-b)rs & b+(a-b)s^2 & (a-b)s\sqrt{1-r^2-s^2} \\ (a-b)r\sqrt{1-r^2-s^2} & (a-b)s\sqrt{1-r^2-s^2} & b+(a-b)(1-r^2-s^2) \end{pmatrix} $$ which can be used to verify the two dependencies given above.
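This derivation also yields an angle-free 4-parameter construction $(a, b, r, s)$ directly in terms of the matrix entries, as asked for in the question. A minimal sketch (the function name is mine; it takes the $t \geq 0$ branch and requires $r^2 + s^2 \leq 1$):

```python
import numpy as np

def matrix_from_abrs(a, b, r, s):
    """Symmetric matrix with eigenvalues (a, b, b) whose eigenvector for a
    is v1 = (r, s, t) with t = sqrt(1 - r^2 - s^2) >= 0."""
    t = np.sqrt(1.0 - r**2 - s**2)   # requires r^2 + s^2 <= 1
    v1 = np.array([r, s, t])
    return b * np.eye(3) + (a - b) * np.outer(v1, v1)

A = matrix_from_abrs(2.0, -1.0, 0.6, 0.3)
print(np.linalg.eigvalsh(A))  # ascending order: approximately [-1, -1, 2]
```

Note that the $t \geq 0$ branch covers all matrices of this form, but the square root makes the parametrization non-smooth at $r^2 + s^2 = 1$, which may matter for gradient-based fitting.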