Orthogonal Basis Vectors


I have a question about expressing some coplanar vectors in terms of a smaller number of orthogonal basis vectors.

I have a small number $P$ of vectors in an $N$-dimensional space, something like $5$ vectors in a $100$-dimensional space. I'm trying to figure out how to find

  1. the vector from the origin that is orthogonal to the $4$-dimensional plane containing all $5$ points,
  2. $4$ orthogonal vectors parallel to that plane, and
  3. the $P\times P$ matrix to express the $5$ original vectors in terms of these new $5$ basis vectors.

This is a problem I've been trying to crack for a work project and implement in Python, but I'm not very good at matrix algebra. I've been working through the 3Blue1Brown linear algebra series and trying to figure it out myself.

Any thoughts?


If I understand what you are trying to do, I think this Python code will accomplish it.

import numpy as np

# dimension of linear space
N = 100

# 5 column vectors (position vectors)
x = np.random.normal(0, 1, size=(N,5))

# matrix of displacement vectors
# the affine space containing the 5 points is
# all points of the form Ac + x[:,4]
A = x[:,:4] - x[:,4:5]

# the closest point in that affine space to the origin:
# minimize ||A c + x[:, 4:5]|| by least squares
c0, *_ = np.linalg.lstsq(A, -x[:, 4:5], rcond=None)
y = A @ c0 + x[:, 4:5]

# this should be zero - but isn't exactly, due to floating-point error
A.transpose() @ y

# QR factorization
# the columns of q are an orthonormal basis for the column space of A
q, r = np.linalg.qr(A)

# append y as a fifth basis column (orthogonal to the others, though not unit length)
p = np.concatenate((q,y), axis=1)

# the original vectors satisfy p @ c = x, so c is the 5x5 coordinate matrix
c = np.linalg.lstsq(p, x, rcond=None)[0]

# p c - x should be zero, but here we see the norms of the column vectors are
# all around 10^-15
np.linalg.norm(p @ c - x, axis=0)
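For what it's worth, the same construction can be sketched with an SVD in place of the QR factorization; the projection $A(A^\top A)^{-1}A^\top$ becomes $qq^\top$ once you have an orthonormal basis $q$. This is a self-contained cross-check, not the answer's exact method (the `rng` seed and the choice of `np.linalg.svd` are mine):

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5
x = rng.normal(size=(N, P))

# displacement vectors spanning the (P-1)-dimensional plane
A = x[:, :P - 1] - x[:, P - 1:]

# orthonormal basis for the plane via SVD (equivalent to QR here)
u, s, vt = np.linalg.svd(A, full_matrices=False)
q = u[:, :P - 1]

# closest point of the affine plane to the origin:
# subtract from x[:, -1] its projection onto the plane's direction space
y = x[:, P - 1:] - q @ (q.T @ x[:, P - 1:])

# y is orthogonal to the plane (up to floating-point error)
assert np.allclose(q.T @ y, 0)

# coordinates of the original vectors in the new basis
p = np.concatenate((q, y), axis=1)
c = np.linalg.lstsq(p, x, rcond=None)[0]
assert np.allclose(p @ c, x)
```

Either factorization gives the same subspace; SVD is a little more robust if the $5$ points happen to be nearly degenerate (some singular values near zero tell you the plane is effectively lower-dimensional).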