Given a lattice $\mathcal{L}$ with minimum $\lambda_1(\mathcal{L})$, how can we describe the minimum of a translated lattice $t + \mathcal{L}$ for some $t \in \text{span}(\mathcal{L})$, $t \notin \mathcal{L}$? (As $t + \mathcal{L}$ does not contain $0$, it is technically not a group, hence not a lattice, but $\lambda_1(t + \mathcal{L}) := \min_{w \in t + \mathcal{L}} ||w||_2$ is still well defined.)
My first impression is $\lambda_1(t + \mathcal{L}) \leq ||t||_2 + \lambda_1(\mathcal{L})$ but I can't seem to prove or disprove this.
I'm interested in any relationship between the minima of $\mathcal{L}$ and $t + \mathcal{L}$.
Unless I am missing something, it seems clear to me that since $t \in (t + \mathcal{L})$, then $$\lambda_1(t + \mathcal{L}) \leq ||t||_2 .$$ More generally, $\lambda_1(t + \mathcal{L}) = \min_{v \in \mathcal{L}} ||t + v||_2$, and taking $v$ to be a shortest vector of $\mathcal{L}$ proves your conjectured bound $\lambda_1(t + \mathcal{L}) \leq ||t||_2 + \lambda_1(\mathcal{L})$ by the triangle inequality. However, the bound $||t||_2$ is not particularly useful on its own, because $t$ could be very large and still generate a translated "lattice" that is not very far from the original (e.g. when $t$ lies close to a lattice point).
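As a sanity check, here is a small brute-force computation on a toy 2-dimensional lattice (the basis and $t$ are arbitrary illustrative choices), verifying both bounds numerically:

```python
import itertools
import math

# Toy 2-D lattice basis (vectors b1, b2) and a shift t not in L.
b1, b2 = (2.0, 0.0), (1.0, 2.0)
t = (0.5, 0.7)

def norm(v):
    return math.sqrt(sum(x * x for x in v))

# Brute-force the minima over a bounded window of coefficients.
R = 5
lattice_norms = []
coset_norms = []
for a, b in itertools.product(range(-R, R + 1), repeat=2):
    v = (a * b1[0] + b * b2[0], a * b1[1] + b * b2[1])
    if (a, b) != (0, 0):
        lattice_norms.append(norm(v))           # nonzero vectors of L
    coset_norms.append(norm((v[0] + t[0], v[1] + t[1])))  # vectors of t + L

lam1_L = min(lattice_norms)      # lambda_1(L)
lam1_coset = min(coset_norms)    # lambda_1(t + L)
print(lam1_coset, norm(t), lam1_L)
assert lam1_coset <= norm(t) + 1e-12                 # bound from this answer
assert lam1_coset <= norm(t) + lam1_L + 1e-12        # the conjectured bound
```

For this particular choice the minimum of the coset is attained at $t$ itself, so the bound is tight here.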
I am not sure exactly what you are trying to achieve, but here is an approach that might be of some use. First, remember that you can write a basis of your lattice in matrix form, with the basis vectors as the columns of a matrix $B$. Let's assume the lattice has rank $n$, so $B$ is $n \times n$.
If you have more than $n$ generating vectors, you can put them all as columns of a wider matrix and extract a basis by reducing that matrix to Hermite Normal Form.
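As an illustration, here is a minimal, unoptimized sketch of a column-style HNF computed by integer column operations (the example matrix is an arbitrary choice; in practice you would use a library routine for this):

```python
def column_hnf(rows):
    """Column-style Hermite Normal Form via integer column operations.

    `rows` is an integer matrix given as a list of rows; its columns are
    the generating vectors. The nonzero columns of the result form a
    basis of the same lattice. Minimal sketch, not optimized.
    """
    A = [row[:] for row in rows]
    m, n = len(A), len(A[0])
    c = 0  # next pivot column
    for r in range(m):
        if c >= n:
            break
        piv = next((j for j in range(c, n) if A[r][j] != 0), None)
        if piv is None:
            continue  # no pivot in this row
        for i in range(m):  # move the pivot column into position c
            A[i][c], A[i][piv] = A[i][piv], A[i][c]
        for j in range(c + 1, n):  # Euclidean elimination across row r
            while A[r][j] != 0:
                q = A[r][c] // A[r][j]
                for i in range(m):
                    A[i][c], A[i][j] = A[i][j], A[i][c] - q * A[i][j]
        if A[r][c] < 0:  # normalize the pivot sign
            for i in range(m):
                A[i][c] = -A[i][c]
        for j in range(c):  # reduce earlier columns modulo the pivot
            q = A[r][j] // A[r][c]
            for i in range(m):
                A[i][j] -= q * A[i][c]
        c += 1
    return A

# Three generators of a 2-D lattice: columns (2,0), (1,2), (3,2).
# The third is redundant; the HNF exposes a 2-column basis.
H = column_hnf([[2, 1, 3],
                [0, 2, 2]])
print(H)  # [[1, 0, 0], [2, 4, 0]] -> basis columns (1,2) and (0,4)
```

Note that the resulting basis $(1,2), (0,4)$ generates the same lattice as $(2,0), (1,2)$ and has the same determinant ($4$), as it should.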
With that introduction, here is what you can do: extend the basis matrix $B$ by one row and one column, appending $t$ as an extra column together with an embedding factor $M > 0$:
$$B' = \begin{pmatrix} B & t \\ 0 & M \end{pmatrix}.$$
The columns of $B'$ generate an $(n+1)$-dimensional lattice $\mathcal{L}'$ whose vectors are $(Bx + kt,\ kM)$ for $x \in \mathbb{Z}^n$, $k \in \mathbb{Z}$. The vectors with $k = \pm 1$ are exactly $\pm(w, M)$ for $w \in t + \mathcal{L}$, and the shortest of these has norm $\sqrt{\lambda_1(t + \mathcal{L})^2 + M^2}$. So if you choose $M$ large enough that the shortest vector of $\mathcal{L}'$ involving $t$ indeed has $k = \pm 1$, an SVP computation on $\mathcal{L}'$ recovers $\lambda_1(t + \mathcal{L})$.
This effectively gives you a way to calculate $\lambda_1(t + \mathcal{L})$ through a different object that is a lattice (this is the embedding technique commonly used to reduce CVP to SVP).
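Here is a brute-force sketch of the embedding on a toy 2-D lattice (basis, $t$, and $M$ are illustrative choices; a real implementation would run an SVP solver on $\mathcal{L}'$ instead of enumerating, and would not need to filter on $k$):

```python
import itertools
import math

# Basis columns b1 = (2,0), b2 = (1,2), shift t, embedding factor M.
B = [[2.0, 1.0],
     [0.0, 2.0]]
t = [0.5, 0.7]
M = 1.0  # assumed large enough for this toy example

def norm(v):
    return math.sqrt(sum(x * x for x in v))

R = 5  # brute-force coefficient window

# Direct computation: lambda_1(t + L) = min over v in L of ||t + v||.
direct = min(
    norm([t[0] + a * B[0][0] + b * B[0][1],
          t[1] + a * B[1][0] + b * B[1][1]])
    for a, b in itertools.product(range(-R, R + 1), repeat=2))

# Embedded lattice L' with basis columns (b1,0), (b2,0), (t,M).
# A shortest vector with last coordinate +-M has norm
# sqrt(lambda_1(t+L)^2 + M^2); we restrict to |k| = 1 in this sketch.
embedded = min(
    norm([a * B[0][0] + b * B[0][1] + k * t[0],
          a * B[1][0] + b * B[1][1] + k * t[1],
          k * M])
    for a, b, k in itertools.product(range(-R, R + 1), repeat=3)
    if abs(k) == 1)

recovered = math.sqrt(embedded ** 2 - M ** 2)
print(direct, recovered)  # the two values agree
```

The point of the construction is exactly this agreement: the minimum of the non-lattice object $t + \mathcal{L}$ is read off from a shortest-vector computation in the genuine lattice $\mathcal{L}'$.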
Hope this helps.