Vector similar to, but not quite equal to least squares approximation of another vector


Suppose we have some vector $V$, and we want to find the scalar multiple $kV$ that best approximates some other vector $J$. The answer $T$, in the least-squares sense, is the vector you get by projecting $J$ orthogonally onto the subspace spanned by $V$. We can compute this using the pseudoinverse (taking $T$, $V$, and $J$ to all be row vectors):

$$ T = JV^+V = V \frac{\langle V, J \rangle}{\langle V, V \rangle} $$
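As a quick numerical sanity check (the vectors here are arbitrary examples, and `numpy` is assumed), the pseudoinverse form and the inner-product form of $T$ should agree:

```python
import numpy as np

# Arbitrary example row vectors (1 x 3), chosen only for illustration.
V = np.array([[3.0, 1.0, 2.0]])
J = np.array([[1.0, 4.0, 0.5]])

# Pseudoinverse form: T = J V^+ V.
T_pinv = J @ np.linalg.pinv(V) @ V

# Inner-product form: T = V <V,J>/<V,V>.
T_inner = V * (V @ J.T) / (V @ V.T)

print(np.allclose(T_pinv, T_inner))  # True
```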

There is, however, another vector that I've sometimes run into when deriving things, which is very close to, but not exactly equal to, the least-squares vector above. That vector $W$ is the unique multiple of $V$ whose orthogonal projection onto the subspace spanned by $J$ is equal to $J$ itself. This can also be solved in closed form using the pseudoinverse:

$$ W = \frac{V}{V J^+} = V \frac{\langle J, J \rangle}{\langle V, J \rangle} $$
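Again as a sanity check (same illustrative vectors as before, `numpy` assumed): both closed forms of $W$ should agree, and projecting $W$ orthogonally onto the span of $J$ should recover $J$ exactly:

```python
import numpy as np

# Arbitrary example row vectors (1 x 3), chosen only for illustration.
V = np.array([[3.0, 1.0, 2.0]])
J = np.array([[1.0, 4.0, 0.5]])

# Pseudoinverse form: W = V / (V J^+); V @ pinv(J) is a 1x1 scalar factor.
W = V / (V @ np.linalg.pinv(J))

# Inner-product form: W = V <J,J>/<V,J>.
W_inner = V * (J @ J.T) / (V @ J.T)

# Orthogonal projection of W onto span(J) should equal J itself.
proj_W_onto_J = J * (J @ W.T) / (J @ J.T)

print(np.allclose(W, W_inner), np.allclose(proj_W_onto_J, J))  # True True
```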

Here is a quick drawing of what I am talking about:

[Picture of vectors $V$, $J$, $T$, and $W$]

If the angle $\theta$ between $V$ and $J$ is relatively small, then $T$ and $W$ will be close. In particular, if $\|J\| = 1$, then we have that $\|T-J\| = \sin(\theta)$ and $\|W-J\| = \tan(\theta)$, which are approximately equal whenever $\theta$ is small. So, whenever $W$ appears, I typically interpret it as "approximately $T$".
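The two error norms can be checked numerically in 2D (the angle value here is an arbitrary assumption, `numpy` assumed): with $\|J\| = 1$ and $V$ a unit vector at angle $\theta$ to $J$, the errors come out to $\sin(\theta)$ and $\tan(\theta)$:

```python
import numpy as np

theta = 0.3                                   # assumed angle, arbitrary
J = np.array([1.0, 0.0])                      # unit vector, ||J|| = 1
V = np.array([np.cos(theta), np.sin(theta)])  # unit vector at angle theta to J

T = V * (V @ J) / (V @ V)  # least-squares projection of J onto span(V)
W = V * (J @ J) / (V @ J)  # multiple of V whose projection onto span(J) is J

print(np.isclose(np.linalg.norm(T - J), np.sin(theta)))  # True
print(np.isclose(np.linalg.norm(W - J), np.tan(theta)))  # True
```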

But I am curious whether there is some standard way to interpret $W$ as a useful approximation in its own right, with some rigorous interpretation perhaps in statistics. Does $W$ minimize some other kind of error than the least-squares error? Does it have a name? One way to interpret it is that it's the unique multiple of $V$ such that $J$ approximates $W$ better than any other scalar multiple of $J$ does. But it's a bit strange to think of it this way, as $J$ is typically predetermined and the degree of freedom we have is in scaling $V$, where the least-squares solution would be $T$.

Not sure if this is best here or on the stats StackExchange, but thought I'd ask here first.