Define all points in the affine integral lattice.

Define all points in the affine integral lattice $\mathcal{L}=\{(x,y,z,t) : x+y+z+t=5 \text{ and } x-z \equiv 0 \pmod{12}\} \subset \mathbb{Z}^4$.

This is a question from a practice exam for which I have no solutions. I'm not certain what to do, but this is what I have so far; it could be completely wrong, so take it with a pinch of salt. I will be using an algorithm that our lecturer described in one of our lectures.

First define $H =\{(x,y,z,t) : x+y+z+t=0 \text{ and } x-z \equiv 0 \pmod{12}\}$.

We will be considering the coordinate projections $\pi_i:\mathbb{Z}^4\rightarrow \mathbb{Z}$, starting with $\pi_4:\mathbb{Z}^4\rightarrow \mathbb{Z}$.

Then, before passing to $\ker(\pi_4)$, we choose a vector satisfying the conditions defining $H$ with nonzero last coordinate, i.e. $(1,9,1,1) \in H$.

Next we consider $\ker(\pi_4)\cap H=\{(x,y,z,0) : x+y+z=0 \text{ and } x-z \equiv 0 \pmod{12}\}$.

This time I choose the vector $(1,10,1,0) \in \ker(\pi_4)\cap H$.

I then go to do this again and find that the next two vectors must be $0$: in the first case I consider $\{(x,y,0,0) : x+y=0 \text{ and } x \equiv 0 \pmod{12}\}$, so $x$ must be zero and hence so must $y$, and similarly for the next case.

I therefore think that I have a basis for $H$, i.e.

$$ H= \begin{bmatrix} 1 \\ 9 \\ 1 \\ 1 \\ \end{bmatrix} \cdot \mathbb{Z} + \begin{bmatrix} 1 \\ 10 \\ 1 \\ 0 \\ \end{bmatrix}\cdot \mathbb{Z} $$

Then, to account for the $5$, I write: $$ \mathcal{L}= \begin{bmatrix} 0 \\ 5 \\ 0 \\ 0 \\ \end{bmatrix} + \begin{bmatrix} 1 \\ 9 \\ 1 \\ 1 \\ \end{bmatrix} \cdot \mathbb{Z} + \begin{bmatrix} 1 \\ 10 \\ 1 \\ 0 \\ \end{bmatrix}\cdot \mathbb{Z} $$

So basically my question is: have I answered the question and/or have I found a basis for $\mathcal{L}$?

EDIT: I changed the

$$ \begin{bmatrix} 0 \\ 5 \\ 0 \\ 0 \\ \end{bmatrix} $$

from a $$ \begin{bmatrix} 5 \\ 0 \\ 0 \\ 0 \\ \end{bmatrix} $$ so that it wouldn't affect the $x-z$ condition.
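One quick way to sanity-check that edit is to test both candidate base points against the two defining conditions of $\mathcal{L}$. A minimal Python sketch (the helper `in_L` is just for illustration, not part of the lecture algorithm):

```python
# Membership test for L = {(x,y,z,t) : x+y+z+t = 5 and x - z ≡ 0 (mod 12)}.
def in_L(v):
    x, y, z, t = v
    return x + y + z + t == 5 and (x - z) % 12 == 0

print(in_L((5, 0, 0, 0)))  # False: x - z = 5, which is not divisible by 12
print(in_L((0, 5, 0, 0)))  # True: the sum is 5 and x - z = 0
```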

There are 2 best solutions below

Answer 1:

I realised that I made a mistake in my answer.

The third and fourth vectors should be, respectively:

$ \begin{bmatrix} 12 \\ -12 \\ 0 \\ 0 \\ \end{bmatrix} $ and $ \begin{bmatrix} 0 \\ 0 \\ 0 \\ 0 \\ \end{bmatrix} $

Therefore the basis is as follows:

$$ \mathcal{L}= \begin{bmatrix} 0 \\ 5 \\ 0 \\ 0 \\ \end{bmatrix} + \begin{bmatrix} 1 \\ 9 \\ 1 \\ 1 \\ \end{bmatrix} \cdot \mathbb{Z} + \begin{bmatrix} 1 \\ 10 \\ 1 \\ 0 \\ \end{bmatrix}\cdot \mathbb{Z} + \begin{bmatrix} 12 \\ -12 \\ 0 \\ 0 \\ \end{bmatrix}\cdot \mathbb{Z} $$

Answer 2:

Your second vector, $$ \begin{bmatrix} 1 \\ 10 \\ 1 \\ 0 \end{bmatrix}, $$ is not a member of $H$.

Specifically, $$1+10+1 = 12 \ne 0.$$ Notice that the sum condition is a genuine equality, not a congruence modulo $12$, so $12$ does not count as $0$ here.

This means that your basis is not a basis for $H$.
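The failing condition can also be checked mechanically. A small Python sketch (the helper `in_H` simply mirrors the definition of $H$ above):

```python
# Membership test for H = {(x,y,z,t) : x+y+z+t = 0 and x - z ≡ 0 (mod 12)}.
def in_H(v):
    x, y, z, t = v
    return x + y + z + t == 0 and (x - z) % 12 == 0

v2 = (1, 10, 1, 0)
print(sum(v2))   # 12, so the condition x+y+z+t = 0 fails
print(in_H(v2))  # False
```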