Rank Nullity for Vector Space over Finite Field


We know that the standard inner product on a vector space over a finite field need not be positive definite.

e.g. $$x = (1, 1) \in \mathbb{Z}_2\times\mathbb{Z}_2$$ $$x \cdot x = 1 + 1 = 2 = 0$$
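A minimal sketch of this example (the helper name `dot_mod` is my own, not standard):

```python
def dot_mod(u, v, p):
    """Standard dot product of u and v with arithmetic done mod p.

    Over Z_p this pairing is not positive definite: a nonzero vector
    can pair to 0 with itself.
    """
    return sum(a * b for a, b in zip(u, v)) % p

x = (1, 1)
print(dot_mod(x, x, 2))  # 1*1 + 1*1 = 2 ≡ 0 (mod 2), so x is self-orthogonal
```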

We also define two vectors $x, y$ to be orthogonal if $x \cdot y = 0$.

So my question is,

Let $V$ be an $n$ dimensional vector space over a finite field $F$,

Let $B = \{x_1, x_2,...x_n\}$ be a basis such that

$$x_i \cdot x_j = 0 \text{ for } i \neq j \ \ (1)$$ $$x_1 \cdot x_1 = 0 \ \ (2)$$

Then the $1 \times n$ matrix with $x_1$ as its only row would seem to fail the rank-nullity theorem: its rank is $1$ (since $x_1 \neq 0$), yet by $(1)$ and $(2)$ every vector in $B$ lies in its kernel, so its nullity would be $n$, and $1 + n \neq n$.

I know that this is impossible because you cannot have a non-zero vector orthogonal to everything in the space: pairing a vector against the standard basis vectors recovers its coordinates, so a vector orthogonal to all of them is zero. Therefore the construction of $B$ is impossible.
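For the $2$-dimensional example above, one can brute-force the kernel of the matrix with row $x_1 = (1,1)$ over $\mathbb{Z}_2$ and check that rank-nullity still holds (a sketch; the variable names are mine):

```python
from itertools import product

# The 1x2 matrix over Z_2 whose single row is x1 = (1, 1).
row = (1, 1)
n = len(row)

# Brute-force the kernel: all v in (Z_2)^2 with row . v = 0 (mod 2).
kernel = [v for v in product(range(2), repeat=n)
          if sum(a * b for a, b in zip(row, v)) % 2 == 0]

rank = 1                   # the row is nonzero, so the row space is 1-dimensional
nullity = len(kernel).bit_length() - 1   # kernel has 2^nullity elements
print(rank + nullity == n)               # True: rank-nullity holds
```

The kernel is $\{(0,0), (1,1)\}$: it contains $x_1$ itself (that is the self-orthogonality), but it is still only $1$-dimensional, so $1 + 1 = 2 = n$ as the theorem requires.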

But if you take a basis $C = \{x_1, v_2, v_3, \dots, v_n\}$ and apply Gram-Schmidt to it without normalizing the vectors, won't you get an orthogonal basis satisfying conditions $(1)$ and $(2)$?

So I'm just wondering: what is wrong here?

Never mind, I figured it out: because $x_1 \cdot x_1 = 0$, Gram-Schmidt blows up as soon as I try to compute the second vector, since the projection coefficient $\frac{v_2 \cdot x_1}{x_1 \cdot x_1}$ requires dividing by $x_1 \cdot x_1 = 0$.
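A quick sketch of exactly where it breaks, assuming $x_1 = (1,1)$ and a hypothetical second basis vector $v_2 = (1,0)$ over $\mathbb{Z}_2$:

```python
def dot2(u, v):
    """Dot product over Z_2."""
    return sum(a * b for a, b in zip(u, v)) % 2

x1 = (1, 1)
v2 = (1, 0)

# Gram-Schmidt wants coeff = (v2 . x1) / (x1 . x1), i.e. multiplication
# by the inverse of x1 . x1 in Z_2 -- but that denominator is 0.
denom = dot2(x1, x1)  # = 0
try:
    coeff = dot2(v2, x1) * pow(denom, -1, 2)  # 0 has no inverse mod 2
except ValueError as e:
    print("Gram-Schmidt breaks:", e)
```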

On BEST ANSWER

The rank-nullity theorem just says something about the dimensions of a pair of related subspaces (here, the row space and the null space). Whether or not you can find orthogonal bases of those subspaces using a given inner product is a different question, and your example shows the answer may be "no".