I am dealing with a vector space where I can calculate the expansion of a vector in any basis of my choice:
$\vec{v} = c_1 \vec{b}_1 + c_2 \vec{b}_2 + c_3 \vec{b}_3 + \ldots$
How I calculate the coefficients $c_i$ is not so important (essentially by generating a system of equivalence relations and then solving for the vector in terms of the basis). What is important is that I cannot compute scalar products between the vectors. Scalar products can be defined on this vector space, but they are computationally too expensive to use.
My goal, however, is to determine an orthogonal basis. (Of course an orthonormal basis would be even better; then I could simply define a scalar product myself.)
I guess it is hard to define what orthogonality means without a scalar product.
My idea is the following: if I change e.g. $\vec{b}_2$ to another vector $\vec{b}^\prime_2$ in the expansion above, then $c_1$ will change as well, because $\vec{b}_2$ and $\vec{b}^\prime_2$ have different components in the direction of $\vec{b}_1$.
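To make this quantitative, suppose (a decomposition I am introducing just for illustration) that the new vector can be written in the original basis as $\vec{b}^\prime_2 = \alpha \vec{b}_1 + \beta \vec{b}_2 + \gamma \vec{b}_3$ with $\beta \neq 0$. Re-expanding the same vector and matching coefficients gives
$$\vec{v} = c_1^\prime \vec{b}_1 + c_2^\prime \vec{b}^\prime_2 + c_3^\prime \vec{b}_3, \qquad c_2^\prime = \frac{c_2}{\beta}, \quad c_1^\prime = c_1 - \frac{\alpha}{\beta}\, c_2, \quad c_3^\prime = c_3 - \frac{\gamma}{\beta}\, c_2,$$
so the shift in $c_1$ is controlled by the ratio $\alpha/\beta$.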
This is also the reason why I want an orthogonal basis: I would like to examine the influence of changing some basis vectors without influencing my choice of the other basis vectors.
My ansatz so far has been to use exactly this influence of a change of basis to find an orthogonal basis: say I find two basis vectors $\vec{b}^\prime_2$ and $\vec{b}^{\prime\prime}_2$ that have exactly the same influence on $c_1$:
$c_1 \longrightarrow c_1^\prime=c_1^{\prime\prime}$
If so, their components in the direction of $\vec{b}_1$ should be equal, and therefore $\vec{b}^{\prime}_2-\vec{b}^{\prime\prime}_2$ should be orthogonal to $\vec{b}_1$.
So far this has unfortunately not worked...
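For concreteness, here is a small numerical experiment in $\mathbb{R}^3$ that reproduces the failure. Everything in it is my own toy setup (NumPy, a deliberately non-orthogonal starting basis, and the standard dot product standing in for the expensive scalar product): it exhibits two replacements for $\vec{b}_2$ that have the same influence on $c_1$, yet whose difference is not orthogonal to $\vec{b}_1$.

```python
import numpy as np

def coeffs(v, basis):
    """Expansion coefficients of v in `basis`: solve B c = v,
    where the columns of B are the basis vectors."""
    return np.linalg.solve(np.column_stack(basis), v)

# A deliberately non-orthogonal starting basis (my own toy choice).
b1 = np.array([1.0, 0.0, 0.0])
b2 = np.array([1.0, 1.0, 0.0])
b3 = np.array([0.0, 0.0, 1.0])
v  = np.array([2.0, 3.0, 4.0])

# Two replacements for b2, chosen so that c_1 stays the same:
b2p  = np.array([3.0, 3.0, 0.0])
b2pp = np.array([2.0, 2.0, 5.0])

c1   = coeffs(v, [b1, b2,   b3])[0]
c1p  = coeffs(v, [b1, b2p,  b3])[0]
c1pp = coeffs(v, [b1, b2pp, b3])[0]
print(c1, c1p, c1pp)             # same value (-1.0) in all three bases
print(np.dot(b2p - b2pp, b1))    # nonzero: the difference is NOT orthogonal to b1
```

So at least with the standard dot product, equal influence on $c_1$ does not force the difference vector to be orthogonal to $\vec{b}_1$, which seems consistent with what I am observing.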