I am trying to prove that $\det(A)\neq 0$ implies that the $y_j$ are linearly independent. See the lemma below for details.
Source: The excerpt is taken from "Algebraic Number Theory and Fermat's Last Theorem" by Ian Stewart and David Tall (3rd ed.), page 28.
My Proof:
It is given, $y_i =\sum_ja_{ij}x_j$.
If we write $0=\sum_i b_i y_i =\sum_{i} b_i \left(\sum_j a_{ij}x_j\right)= \sum_{i,j} b_i a_{ij}x_j = \sum_{j}\left(\sum_{i} a_{ij}b_i\right) x_j$, then, since the $x_j$ form a basis and are in particular linearly independent, each coefficient must vanish: $$\sum_{i}a_{ij} b_i = 0$$
for each column index $j=1,\dots,n$.
That is, for a fixed column index $j$ we sum $a_{ij}b_i$ over all row indices $i$, so the associated matrix is
$$D_{1\times n}= \begin{bmatrix} a_{11}b_1+a_{21}b_2+\cdots + a_{n1}b_n & a_{12}b_1+a_{22}b_2+\cdots + a_{n2}b_n & \cdots & a_{1n} b_1+ a_{2n}b_2 +\cdots +a_{nn}b_n \end{bmatrix} $$ $$= \begin{bmatrix} \sum_{i}a_{i1} b_i & \sum_{i}a_{i2} b_i & \cdots& \cdots& \sum_{i}a_{in} b_i \end{bmatrix}, $$ where $$ A_{n \times n} = (a_{ij})= \begin{bmatrix} a_{11}&a_{12}&\cdots & a_{1n}\\ a_{21}&a_{22}&\cdots&a_{2n}\\ \vdots & \vdots& \ddots & \vdots\\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{bmatrix} $$ and $$ b_{n \times 1}=(b_i)= \begin{bmatrix} b_{1} \\ b_{2} \\ b_{3} \\ \vdots \\b_{n} \end{bmatrix},$$
$$ b^T_{1 \times n}=\begin{bmatrix} b_{1} & b_{2} & b_{3} &\cdots &b_{n} \end{bmatrix}.$$
The vector $b$ is defined as the column vector $\begin{bmatrix}b_{1} & b_{2} & b_{3} &\cdots &b_{n} \end{bmatrix}^T$. This is a standard convention; for instance, to write a system of linear equations in concise form, we write $Ax=b$.
So, the $(1,j)^{th}$ entry (the $j^{th}$ column of the $1^{st}$ row) of the row vector $b^TA=C_{1 \times n}$ is $$b_{1}a_{1j} + b_{2}a_{2j} + \cdots +b_{n} a_{nj}$$ $$=\sum_{k=1}^n b_{k}a_{kj}.$$
$$b^TA=C_{1 \times n}=\begin{bmatrix} \sum_{k=1}^n b_{k}a_{k1} & \sum_{k=1}^n b_{k}a_{k2} & \cdots& \cdots& \sum_{k=1}^n b_{k}a_{kn} \end{bmatrix} $$ $$=\begin{bmatrix} \sum_{k} b_{k}a_{k1} & \sum_{k} b_{k}a_{k2} & \cdots& \cdots& \sum_{k} b_{k}a_{kn} \end{bmatrix} $$ $$=\begin{bmatrix} \sum_{k} a_{k1}b_{k}& \sum_{k} a_{k2}b_{k} & \cdots& \cdots& \sum_{k} a_{kn}b_{k} \end{bmatrix} .$$
Both indices $i$ in $\sum_{i}a_{ij} b_i $ and $k$ in $\sum_{k} a_{kj}b_{k}$ range over $1$ to $n$; since they are dummy summation indices, we may rename $k$ as $i$, thus $$b^TA =\begin{bmatrix} \sum_{k} a_{k1}b_{k}& \sum_{k} a_{k2}b_{k} & \cdots& \cdots& \sum_{k} a_{kn}b_{k} \end{bmatrix} $$ $$=\begin{bmatrix} \sum_{i} a_{i1}b_{i}& \sum_{i} a_{i2}b_{i} & \cdots& \cdots& \sum_{i} a_{in}b_{i} \end{bmatrix} $$ $$ = D_{1\times n} .$$
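As a quick numerical sanity check of the identity $b^TA = D$, here is a small sketch using NumPy; the particular $3\times 3$ matrix and vector are arbitrary illustrations, not from the book:

```python
import numpy as np

# Arbitrary 3x3 matrix A = (a_ij) and vector b = (b_i), for illustration only
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, -2.0, 3.0])

# D has j-th entry sum_i a_ij * b_i, computed directly from the definition
D = np.array([sum(A[i, j] * b[i] for i in range(3)) for j in range(3)])

# The row vector b^T A agrees with D entry by entry
assert np.allclose(b @ A, D)
```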
$\therefore \sum_ia_{ij}b_i=0$ for all $j$ $\implies b^TA=0$. If $\det A \neq 0$, then $A$ has an inverse $A^{-1}$; thus $b^TA=0\implies b^T A A^{-1}= 0\, A^{-1} \implies b^T= 0$, i.e. $b_i=0$ for all $i$. This implies the $y_i$ are linearly independent.
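The final step can be illustrated numerically as well: $b^TA=0$ is the same as $A^Tb=0$, and when $\det A\neq 0$ the only solution is $b=0$. A sketch with NumPy (again with an arbitrary invertible matrix, chosen for illustration):

```python
import numpy as np

# Arbitrary invertible matrix, for illustration only
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
assert np.linalg.det(A) != 0  # A is invertible

# b^T A = 0 is equivalent to A^T b = 0; since A^T is invertible,
# solving the homogeneous system yields only the trivial solution b = 0
b = np.linalg.solve(A.T, np.zeros(3))
assert np.allclose(b, np.zeros(3))
```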
QUESTION:
I am new to the topic and trying to learn it thoroughly, so my question is: is my proof correct? Is there any misconception or error? Please comment if it is right, or post an answer if you find any error. Thanks.

The simplest way to prove linear independence is to think of the vectors as elements of $\mathbb{Q}^n$ (identifying $G$ with $\mathbb{Z}^n$ via the given basis).
Since $\det(A)\ne0$, the vectors are linearly independent as elements of $\mathbb{Q}^n$ and, a fortiori, in $\mathbb{Z}^n$.
With your approach, you get the equations $$ \sum_{i}a_{ij}b_i=0\qquad (j=1,2,\dots,n), $$ but the linear system $$ \sum_{i}a_{ij}x_i=0\qquad (j=1,2,\dots,n) $$ has only the trivial solution when considered over the rationals (or the reals, if you prefer) because the matrix is invertible.
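The point about working over the rationals can be checked with exact arithmetic; a sketch using SymPy (the matrix entries are arbitrary integers chosen for illustration):

```python
from sympy import Matrix

# Arbitrary integer matrix with nonzero determinant, for illustration only;
# SymPy works with exact rational arithmetic, so this is a computation over Q
A = Matrix([[2, 1, 0],
            [1, 3, 1],
            [0, 1, 2]])
assert A.det() != 0

# The homogeneous system A^T x = 0 has trivial kernel over Q,
# i.e. only the solution x = 0
assert A.T.nullspace() == []
```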