$\newcommand{\ba}{\mathbf{a}}$$\newcommand{\bb}{\mathbf{b}}$$\newcommand{\bc}{\mathbf{c}}$ Given two vectors, $\mathbf{a}\in\mathbb{R}^{9}, \mathbf{b}\in\mathbb{R}^{6}$, each with a norm of $\lVert\mathbf{a}\rVert=\lVert\mathbf{b}\rVert=\sqrt{2}~$, along with nonzero scalars $c, \alpha_i,\beta_i,\gamma_i\in\mathbb{R}$, where $i\in\{0,1,2\}$, consider the following block vectors: $$ \mathbf{v}_0 := \begin{bmatrix} \alpha_0\ba\\ \alpha_1\bb\\ \alpha_2c\\ \end{bmatrix}~,\qquad % \mathbf{v}_1 := \begin{bmatrix} \beta_0\ba\\ \beta_1\bb\\ \beta_2c\\ \end{bmatrix}~,\qquad % \mathbf{v}_2 := \begin{bmatrix} \gamma_0\ba\\ \gamma_1\bb\\ \gamma_2c\\ \end{bmatrix} $$
I'm struggling to demonstrate the following:
- The existence of nonzero values of $\alpha_i,\beta_i,\gamma_i$, with $i\in\{0,1,2\}$, that:
  1. make the set of vectors $\{\mathbf{v}_0, \mathbf{v}_1, \mathbf{v}_2\}$ mutually orthogonal;
  2. satisfy the condition $\alpha_i^2+\beta_i^2+\gamma_i^2=1$ for each $i$.
- That any other vector constructed similarly, such as $$ \mathbf{v}_3 :=\begin{bmatrix} \nu_0\ba\\ \nu_1\bb\\ \nu_2c\\ \end{bmatrix} $$ is linearly dependent on the set $\{\mathbf{v}_0, \mathbf{v}_1, \mathbf{v}_2\}$.
Is there a way to show that these conditions can be satisfied?
To clarify, in case I am posing an XY problem, my end goal is to demonstrate that $\mathbf{v}_0, \mathbf{v}_1, \mathbf{v}_2$ can be the sole eigenvectors (equivalently, singular vectors) with nonzero eigenvalues of a symmetric matrix formed as a linear combination of the outer products $\mathbf{v}_0\mathbf{v}_0^\top$, $\mathbf{v}_1\mathbf{v}_1^\top$, $\mathbf{v}_2\mathbf{v}_2^\top$.
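That end goal holds for *any* mutually orthogonal triple: if the $\mathbf{v}_i$ are orthogonal, then $S=\sum_i \lambda_i \mathbf{v}_i\mathbf{v}_i^\top$ satisfies $S\mathbf{v}_j = \lambda_j\lVert\mathbf{v}_j\rVert^2\mathbf{v}_j$ and has rank $3$. A quick numerical check of this spectral fact (a numpy sketch with arbitrary placeholder vectors, not the specific block vectors above):

```python
import numpy as np

rng = np.random.default_rng(2)

# Any three mutually orthogonal vectors in R^16 (from a QR factorization,
# rescaled to non-unit lengths to keep the check general).
Q, _ = np.linalg.qr(rng.standard_normal((16, 3)))
V = Q * np.array([1.5, 0.7, 2.0])        # columns: v_0, v_1, v_2
lam = np.array([3.0, -1.0, 0.5])         # arbitrary nonzero combination coefficients

# Symmetric matrix built from the outer products v_i v_i^T.
S = sum(lam[i] * np.outer(V[:, i], V[:, i]) for i in range(3))

# Each v_j is an eigenvector: S v_j = lam_j * ||v_j||^2 * v_j,
# and rank(S) = 3, so no other eigenvector has a nonzero eigenvalue.
for j in range(3):
    vj = V[:, j]
    assert np.allclose(S @ vj, lam[j] * (vj @ vj) * vj)
print(np.linalg.matrix_rank(S))  # 3
```

So the real content of the question is the existence of the orthogonal triple itself.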
Expanding the standard inner products leads to an equivalent statement: does there exist a $3\times 3$ real matrix $M = \begin{pmatrix}\alpha_0\sqrt{2}&\beta_0 \sqrt{2} & \gamma_0 \sqrt{2} \\ \alpha_1\sqrt{2}&\beta_1 \sqrt{2} & \gamma_1 \sqrt{2} \\ \alpha_2&\beta_2 & \gamma_2 \end{pmatrix}$ with nonzero elements such that the diagonal elements of $M M^T$ are $2,2,1$ and $M^TM$ is a diagonal matrix?
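This reformulation is easy to sanity-check numerically. Below is a numpy sketch (taking $c=1$, as this $M$ implicitly does, with random placeholder vectors and coefficients): the Gram matrix of $\mathbf{v}_0,\mathbf{v}_1,\mathbf{v}_2$ coincides with $M^TM$, so orthogonality of the $\mathbf{v}_i$ is exactly the statement that $M^TM$ is diagonal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random vectors with the prescribed norms ||a|| = ||b|| = sqrt(2); c = 1 here.
a = rng.standard_normal(9); a *= np.sqrt(2) / np.linalg.norm(a)
b = rng.standard_normal(6); b *= np.sqrt(2) / np.linalg.norm(b)
c = 1.0

# Random nonzero coefficients: row i of P is (alpha_i, beta_i, gamma_i).
P = rng.uniform(0.1, 1.0, size=(3, 3))

def block_vector(coeffs):
    """Stack coeffs[0]*a, coeffs[1]*b and the scalar coeffs[2]*c."""
    return np.concatenate([coeffs[0] * a, coeffs[1] * b, [coeffs[2] * c]])

# Columns of V are v_0, v_1, v_2 (each in R^16).
V = np.column_stack([block_vector(P[:, j]) for j in range(3)])

# M as defined above (rows of P scaled by sqrt(2), sqrt(2), 1).
M = np.diag([np.sqrt(2), np.sqrt(2), 1.0]) @ P

# The Gram matrix of the v's equals M^T M.
print(np.allclose(V.T @ V, M.T @ M))  # True
```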
Suppose $F$ is a symmetric matrix with nonzero elements such that $F^2$ is the identity. Such matrices exist: e.g. $$ \begin{pmatrix} \frac{1}{4}-\frac{1}{\sqrt{2}} & \frac{1}{4} \left(\sqrt{2}+2\right) & \frac{1}{4} \\ \frac{1}{4} \left(\sqrt{2}+2\right) & \frac{1}{2} & \frac{1}{4} \left(\sqrt{2}-2\right) \\ \frac{1}{4} & \frac{1}{4} \left(\sqrt{2}-2\right) & \frac{1}{4}+\frac{1}{\sqrt{2}} \\ \end{pmatrix}$$
(I constructed this one by conjugating $\text{diag}(1,-1,1)$ with $\pi/4$ rotations in three planes.)
Then, if we take $D = \text{diag}(x,y,z)$ and $M = FD$, we get $M^T M = DF^TFD = DF^2D = D^2$, which is still diagonal, but with $x,y,z$ we can now vary the diagonal elements of $M M^T$, since $(MM^T)_{ii} = \sum_j F_{ij}^2\,d_j^2$ with $(d_1,d_2,d_3) = (x,y,z)$. I didn't think much about which diagonal values can be reached in this manner (they are linear combinations of $x^2,y^2,z^2$ with positive coefficients), or whether we can always get $2,2,1$, but at least for this specific $F$ Mathematica gives $x^2 = 2$, $y^2 = (9+4\sqrt2)/7$, $z^2 = 4(3-\sqrt2)/7$. So the claim is true.
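The construction can be verified numerically; a numpy sketch using the $F$ above and the reported values of $x,y,z$:

```python
import numpy as np

s = np.sqrt(2)
# The F above: symmetric, all entries nonzero, and F @ F = I.
F = np.array([
    [1/4 - 1/s, (s + 2)/4, 1/4      ],
    [(s + 2)/4, 1/2,       (s - 2)/4],
    [1/4,       (s - 2)/4, 1/4 + 1/s],
])
assert np.allclose(F @ F, np.eye(3))

# D = diag(x, y, z) with the values reported by Mathematica.
x2, y2, z2 = 2, (9 + 4*s)/7, 4*(3 - s)/7
D = np.diag(np.sqrt([x2, y2, z2]))
M = F @ D

# M^T M = D F^2 D = D^2 is diagonal, diag(M M^T) = (2, 2, 1),
# and every entry of M is nonzero.
print(np.allclose(M.T @ M, D**2))                # True
print(np.allclose(np.diag(M @ M.T), [2, 2, 1]))  # True
print(np.all(M != 0))                            # True
```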
Note: this problem might have a substantially easier solution.
Edit. 1) Yes, it indeed assumes $c = 1$, sorry... For some reason I missed $c$ completely, so the last row of $M$ should be $\alpha_2 c, \beta_2 c, \gamma_2 c$.
2), 3) With the corrected $M$, we have $M M^T = \begin{pmatrix}2(\alpha_0^2+\beta_0^2+\gamma_0^2) & \ldots & \ldots\\ \ldots & 2(\alpha_1^2+\beta_1^2+\gamma_1^2) & \ldots\\ \ldots&\ldots & c^2(\alpha_2^2+\beta_2^2+\gamma_2^2) \end{pmatrix}$ and $M^T M = \begin{pmatrix} \ldots & \alpha_2 \beta_2 c^2+2 \alpha_0 \beta_0+2 \alpha_1 \beta_1 & \alpha_2 \gamma_2 c^2+2 \alpha_0 \gamma_0+2 \alpha_1 \gamma_1 \\ \alpha_2 \beta_2 c^2+2 \alpha_0 \beta_0+2 \alpha_1 \beta_1 & \ldots & \beta_2 \gamma_2 c^2+2 \beta_0 \gamma_0+2 \beta_1 \gamma_1 \\ \alpha_2 \gamma_2 c^2+2 \alpha_0 \gamma_0+2 \alpha_1 \gamma_1 & \beta_2 \gamma_2 c^2+2 \beta_0 \gamma_0+2 \beta_1 \gamma_1 & \ldots \\ \end{pmatrix}$
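These entry formulas can be confirmed numerically; a numpy sketch with arbitrary placeholder values for $c$ and the coefficients:

```python
import numpy as np

rng = np.random.default_rng(1)
c = 1.7                                  # an arbitrary nonzero value for c
P = rng.uniform(0.1, 1.0, (3, 3))        # row i of P is (alpha_i, beta_i, gamma_i)
al, be, ga = P[:, 0], P[:, 1], P[:, 2]   # (alpha_0, alpha_1, alpha_2), etc.

# Corrected M: rows scaled by sqrt(2), sqrt(2), c.
M = np.diag([np.sqrt(2), np.sqrt(2), c]) @ P

# Diagonal of M M^T: 2(alpha_i^2 + beta_i^2 + gamma_i^2) for i = 0, 1
# and c^2 (alpha_2^2 + beta_2^2 + gamma_2^2) for i = 2.
assert np.allclose(np.diag(M @ M.T),
                   [2*(P[0] @ P[0]), 2*(P[1] @ P[1]), c**2*(P[2] @ P[2])])

# Off-diagonal (0,1) entry of M^T M: alpha_2 beta_2 c^2 + 2 alpha_0 beta_0 + 2 alpha_1 beta_1.
assert np.isclose((M.T @ M)[0, 1],
                  al[2]*be[2]*c**2 + 2*al[0]*be[0] + 2*al[1]*be[1])
print("formulas check out")
```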
The diagonal elements of $MM^T$ are scalar multiples of the left-hand sides of the conditions $\alpha_i^2 + \beta_i^2 + \gamma_i^2 = 1$ (so the target diagonal is now $2,2,c^2$), and the off-diagonal elements of $M^T M$ are the left-hand sides of the orthogonality relations for $\mathbf{v}_0, \mathbf{v}_1, \mathbf{v}_2$.
The problem is that after I reintroduce $c$, the equations for $x,y,z$ (which were $a,b,c$ in my older revision; I changed notation to avoid clashing with the OP's symbols) need not have real roots. For my $F$, the roots are all real and nonzero iff $c^2 \in (3-2\sqrt2,\; 12+8\sqrt2)$, i.e. $|c| \in (\sqrt2-1,\; 2+2\sqrt2) \approx (0.41, 4.83)$. This interval is specific to this $F$, and I'm not sure whether we can get arbitrarily close to $0$ or to $+\infty$ when varying $F$.
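To see where this range comes from: with $M = FD$ we need $A\mathbf{u} = (2,2,c^2)$, where $A_{ij} = F_{ij}^2$ and $\mathbf{u} = (x^2,y^2,z^2)$, and the roots are real and nonzero exactly when every component of $\mathbf{u}$ is positive. Since $\mathbf{u}$ is affine in $t = c^2$, the admissible range can be computed directly; a numpy sketch:

```python
import numpy as np

s = np.sqrt(2)
F = np.array([
    [1/4 - 1/s, (s + 2)/4, 1/4      ],
    [(s + 2)/4, 1/2,       (s - 2)/4],
    [1/4,       (s - 2)/4, 1/4 + 1/s],
])

# diag(M M^T) = A @ (x^2, y^2, z^2) with A_ij = F_ij^2; target (2, 2, c^2).
A = F**2

# The solution is affine in t = c^2:  u(t) = u0 + t * w.
u0 = np.linalg.solve(A, [2.0, 2.0, 0.0])
w = np.linalg.solve(A, [0.0, 0.0, 1.0])
w[np.abs(w) < 1e-12] = 0.0     # the first component of w vanishes (up to rounding)

# Each component of u(t) is positive on one side of its zero crossing;
# intersect those half-lines to get the admissible range of t.
with np.errstate(divide="ignore"):
    roots = -u0 / w            # t where each component crosses zero
t_lo = max(roots[w > 0], default=0.0)
t_hi = min(roots[w < 0], default=np.inf)
print(np.sqrt(t_lo), np.sqrt(t_hi))  # the endpoints of the range of |c|
```

For this $F$ the printed endpoints agree with $\sqrt2-1 \approx 0.414$ and $2+2\sqrt2 \approx 4.828$ (note the endpoints multiply to $2$).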