I haven't taken a course on Linear Algebra, but I have used the concepts of row/column vectors and matrices a lot, though somehow only with respect to the $\mathbb{R}^n$ vector space. When working with other vector spaces, like a function space, do matrices (and, implicitly, column/row vectors) still play such a large role?
My intuition says yes, because:
In $\mathbb{R}^3$:
Let $B = \{e_1=(1,0,0),\,e_2=(0,1,0),\,e_3=(0,0,1)\}$ be the standard basis of $\mathbb{R}^3$, where $(a,b,c)$ denotes a 3-tuple
Then: $[(1,2,3)]_B = \begin{bmatrix}1\\2\\3\end{bmatrix}$
In the function space of polynomials of degree at most $2$:
Let $B = \{1,x,x^2\}$ be a basis
Then: $[1+2x+3x^2]_B = \begin{bmatrix}1\\2\\3\end{bmatrix}$.
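To make the analogy concrete, here is a small sketch in Python with NumPy (the differentiation example is my own illustration, not something from a course): once a basis is fixed, a linear map on this polynomial space, such as $d/dx$, really does become a matrix acting on coordinate vectors.

```python
import numpy as np

# Coordinate vector of p(x) = 1 + 2x + 3x^2 in the basis B = {1, x, x^2}
p = np.array([1, 2, 3])

# Differentiation is linear on this space; in the basis B it becomes the
# matrix D below, since d/dx(1) = 0, d/dx(x) = 1, d/dx(x^2) = 2x:
D = np.array([[0, 1, 0],
              [0, 0, 2],
              [0, 0, 0]])

# D @ p gives the coordinates of p'(x) = 2 + 6x in the same basis
print(D @ p)  # [2 6 0]
```

So the matrix machinery from $\mathbb{R}^n$ carries over unchanged once coordinates are chosen.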
Are these considerations useful or good to keep in mind when working with linear algebra? I mean, is it important not to forget about matrices when moving from $\mathbb{R}^n$ to another vector space?