v1 = (2, 3, t), v2 = (t, t^2, t^2). For each t for which v1 and v2 are not orthogonal, find an orthogonal basis of the subspace spanned by v1 and v2.
In this case, does finding an orthogonal basis of the subspace spanned by v1 and v2 mean the same thing as transforming {v1, v2} into an orthogonal basis?
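A worked sketch, assuming the intended method is Gram–Schmidt (this is an illustration, not necessarily the method the question expects): the dot product is v1·v2 = 2t + 3t^2 + t^3 = t(t+1)(t+2), so the vectors are orthogonal exactly when t is 0, -1, or -2. For every other t, keeping v1 and subtracting from v2 its projection onto v1 gives an orthogonal pair spanning the same subspace:

```latex
u_1 = v_1, \qquad
u_2 = v_2 - \frac{\langle v_2, v_1 \rangle}{\langle v_1, v_1 \rangle}\, v_1
    = v_2 - \frac{t(t+1)(t+2)}{13 + t^2}\, v_1,
```

using ⟨v1, v1⟩ = 4 + 9 + t^2 = 13 + t^2. In that sense, yes: {u1, u2} is an orthogonal basis of span{v1, v2} obtained by transforming the original pair, which is what "finding an orthogonal basis of the subspace" usually amounts to here.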