Let $\{a,b,c\}$ be a linearly independent set of vectors in $\mathbb{R}^3$. Determine the parameter $\lambda\in\mathbb{R}$ such that the dimension of the subspace generated by the vectors $2a-3b,\ (\lambda-1)b-2c,\ 3c-a,\ \lambda c-b$ is equal to $2$.
Attempt:
Writing each vector in coordinates with respect to the basis $\{a,b,c\}$, we have the subspace $S=\operatorname{span}\left\{ \begin{bmatrix} 2 \\ -3 \\ 0 \\ \end{bmatrix},\begin{bmatrix} 0 \\ \lambda-1 \\ -2 \\ \end{bmatrix},\begin{bmatrix} -1 \\ 0 \\ 3 \\ \end{bmatrix},\begin{bmatrix} 0 \\ -1 \\ \lambda \\ \end{bmatrix}\right\}.$
The row echelon form of the matrix $\begin{bmatrix} 2 & 0 & -1 & 0\\ -3 & \lambda-1 & 0 & -1 \\ 0 & -2 & 3 & \lambda\\ \end{bmatrix}$ is, assuming $\lambda\neq 1$, $\begin{bmatrix} 2 & 0 & -1 & 0\\ 0 & \lambda-1 & -3/2 & -1 \\ 0 & 0 & 3(\lambda-2)/(\lambda-1) & (\lambda-2)(\lambda+1)/(\lambda-1)\\ \end{bmatrix}.$
The rank of this matrix equals $2$ exactly when $\lambda=2$, since that is the only value for which the entire last row vanishes. (The excluded case $\lambda=1$ must be checked separately: there the matrix is $\begin{bmatrix} 2 & 0 & -1 & 0\\ -3 & 0 & 0 & -1 \\ 0 & -2 & 3 & 1\\ \end{bmatrix}$, which has rank $3$.)
This means that for $\lambda=2$ the dimension of the subspace $S$ is equal to $2$.
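As a sanity check, the rank of the coordinate matrix can be evaluated at a few values of $\lambda$ with a computer algebra system. A minimal sketch using sympy (the matrix is the one from the attempt above):

```python
import sympy as sp

lam = sp.symbols('lambda')

# Coordinate matrix of the four vectors w.r.t. the basis {a, b, c}
M = sp.Matrix([
    [2,        0, -1,   0],
    [-3, lam - 1,  0,  -1],
    [0,       -2,  3, lam],
])

# Rank for a few representative values of lambda
for val in [0, 1, 2, 3]:
    print(val, M.subs(lam, val).rank())
```

Only $\lambda=2$ should give rank $2$; in particular the separately-checked case $\lambda=1$ gives rank $3$.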
Question: Is this correct?
Also, what does it mean that the vectors $a,b,c$ are linearly independent if we don't know the dimension of the space they live in?
A set of vectors $\{ v_{1}, \ldots, v_{k} \}$ is linearly independent if $\sum_{i=1}^{k} a_{i}v_{i} = 0 \implies a_{i} = 0$ for all $i \in \{1, \ldots, k\}$. Equivalently, no vector in the set is a linear combination of the others. Observe that this definition refers only to the vector space operations, not to the dimension of the space or to any coordinates. Of course, a maximal linearly independent set is called a basis, and its cardinality equals the dimension of the vector space.
Now consider: $$\{ (0, \lambda-1, -2), (0, -1, \lambda) \}$$
If we set $\lambda = 2$, then we have that these vectors are multiples of each other:
$$(0, 1, -2) = -1 \cdot (0, -1, 2)$$
This implies the original set of vectors in your problem is linearly dependent, which is faster to see than carrying out the full row reduction. To conclude that the dimension is exactly $2$ (and not $3$), note additionally that $(2,-3,0) = -2\,(-1,0,3) - 3\,(0,1,-2)$, so the first vector also lies in the span of the remaining two.
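These dependency relations at $\lambda = 2$ can be verified directly. A quick sketch with sympy, using the coordinate vectors from the attempt:

```python
import sympy as sp

# The four coordinate vectors at lambda = 2
vecs = [sp.Matrix([2, -3, 0]),
        sp.Matrix([0, 1, -2]),
        sp.Matrix([-1, 0, 3]),
        sp.Matrix([0, -1, 2])]

# The second and fourth vectors are negatives of each other...
print(vecs[1] == -vecs[3])                     # True
# ...and the first is a combination of the other two distinct ones:
print(vecs[0] == -2 * vecs[2] - 3 * vecs[1])   # True
# Hence the span is 2-dimensional:
print(sp.Matrix.hstack(*vecs).rank())          # 2
```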