Any vector shorter than $\lambda_1^*$ must belong to a sublattice.


Let $R = \mathbb{Z}[x] / \langle x^n -1 \rangle$ and $f, g, F,$ and $G$ be polynomials in $R$.

Let $\Lambda_h = \{ (f, g)u + (F, G)v : u, v \in R \}$ be the $R$-module (lattice) generated by the vectors $(f, g)$ and $(F, G)$. Thus, $\Lambda_h \subset R^2$.

Let $\Lambda = \{ (f, g)u : u \in R \}$ and $\Lambda_2 = \{ (F, G)v : v \in R \}$. Note that $\Lambda_h = \Lambda + \Lambda_2$.
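As a concrete sanity check (my own illustration; the polynomials below are arbitrary examples, not taken from the article), elements of $R = \mathbb{Z}[x]/\langle x^n - 1\rangle$ can be represented as coefficient vectors of length $n$, with multiplication given by cyclic convolution, and a vector of $\Lambda_h$ is then built directly from the definition:

```python
# Elements of R = Z[x]/(x^n - 1) as coefficient lists of length n;
# multiplication in R is cyclic convolution of the coefficients.

def mul(a, b):
    """Multiply two elements of R (cyclic convolution mod x^n - 1)."""
    n = len(a)
    c = [0] * n
    for i in range(n):
        for j in range(n):
            c[(i + j) % n] += a[i] * b[j]
    return c

def add(a, b):
    """Add two elements of R coefficient-wise."""
    return [x + y for x, y in zip(a, b)]

# Sample polynomials (arbitrary, for illustration only).
f, g = [1, 1, 0, 0], [0, 1, 0, 1]
F, G = [1, 0, 1, 0], [1, 1, 1, 0]
u, v = [2, 0, 0, 0], [0, 1, 0, 0]   # u = 2, v = x

# w = (f,g)u + (F,G)v is an element of Lambda_h = Lambda + Lambda_2.
w = (add(mul(f, u), mul(F, v)), add(mul(g, u), mul(G, v)))
print(w)
```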

Let $\Lambda^*$ be the image of $\Lambda_2$ under orthogonal projection onto the orthogonal complement of the span of $\Lambda$, and let $\lambda_1^*$ be the length of a shortest non-zero vector of $\Lambda^*$.

How can one prove the following claim?

$$\forall w \in \Lambda_h, \quad \|w\| < \lambda_1^* \Rightarrow w \in \Lambda$$

In words, any vector of $\Lambda_h$ that is shorter than $\lambda_1^*$ must lie in the sublattice $\Lambda$.
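For what it's worth, the natural approach I tried is a projection argument (this sketch is my own attempt and may be missing a step). Write $w = (f,g)u + (F,G)v$ and let $\pi$ denote orthogonal projection onto the orthogonal complement of the span of $\Lambda$. Since $(f,g)u \in \Lambda$ is killed by $\pi$,

$$\pi(w) = \pi\big((F,G)v\big) \in \Lambda^*, \qquad \|\pi(w)\| \le \|w\| < \lambda_1^*,$$

so $\pi(w) = 0$, i.e. $(F,G)v$ lies in the span of $\Lambda$. What I cannot justify is why this forces $w \in \Lambda$ rather than merely $w \in \operatorname{span}(\Lambda) \cap \Lambda_h$.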

More context

I found this claim in Theorem 2 of this article. There, $R$ is replaced by the ring of integers of a number field, but I do not think this difference is essential.

Moreover, $fh \equiv g \pmod{q}$ for some polynomial $h \in R$ and some integer $q$.