Column vectors as entries of a column vector


I'm having a hard time understanding notation. Because of notation, I've lost hours and hours trying to understand a simple concept. I'm going to post a picture of the PDF I have, so that you can see exactly what I'm seeing (and, I hope, not end up with the same doubts I have).

[image: excerpt from the PDF]

As far as I understand from the explanation $b_1, b_2, \cdots, b_n$ (in bold) of the basis $B$ are column vectors.

I understand everything until the "or" (which comes after the system of equations). I don't understand this

[image: the equation after the "or"]

I have a few questions that might help you help me.

  1. What is the meaning of putting column vectors as entries of a column vector? This notation completely baffles me.

  2. How is the linear system of equations

[image: the linear system of equations]

(before the "or") equivalent to this? A step-by-step explanation would clarify things.

Note that I understand the linear system of equations and everything else well; what I don't see is, again, why the system is equivalent to the equation that comes immediately after it, which is apparently the case.


There are 3 answers below.

Answer 1 (score 2):

$S$ is just the $n$-by-$n$ matrix $S = (s_{ij})$. So if you multiply the matrix $S$ by the column vector $(b_1, \ldots, b_n)^T$ (apologies, I wrote it as a row vector) you get exactly that system of equations.

Answer 2 (score 4):

The representation in terms of vectors is just a more compact way to express a linear system like the one you are given. $S$ in particular is the coefficient matrix, which contains in its rows and columns all the coefficients of the linear system, namely $s_{11}, s_{12}, \ldots$:

$$\begin{bmatrix} s_{11} & s_{12} & s_{13} & \cdots & s_{1n}\\ \vdots & \vdots & \vdots & \ddots & \vdots\\ s_{n1} & s_{n2} & s_{n3} & \cdots & s_{nn} \end{bmatrix}$$

in such a way that if you multiply the matrix $S$ by the vector $\mathbf{B}$ you obtain the RHS of your linear system.
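As a concrete check, here is a minimal NumPy sketch with made-up $2 \times 2$ numbers (the values of $S$ and the $b_i$ are illustrative, not from the PDF). Stacking the basis vectors $b_1, \ldots, b_n$ as the rows of a matrix $B$, the "column vector whose entries are column vectors" equation is just the matrix product $SB$: its $i$-th row is $\sum_j s_{ij} b_j$, i.e. the $i$-th equation of the system.

```python
import numpy as np

# Hypothetical 2x2 example: the s_ij are made-up coefficients.
S = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Stack the basis vectors b_1, b_2 as the ROWS of B.
b1 = np.array([1.0, 0.0])
b2 = np.array([1.0, 1.0])
B = np.vstack([b1, b2])

# The compact equation: each row of S @ B is one new basis vector.
B_tilde = S @ B

# Row i of B_tilde equals sum_j s_ij * b_j, i.e. the i-th equation
# of the linear system written out vector by vector.
row0 = S[0, 0] * b1 + S[0, 1] * b2
row1 = S[1, 0] * b1 + S[1, 1] * b2

print(np.allclose(B_tilde[0], row0))  # True
print(np.allclose(B_tilde[1], row1))  # True
```

So "putting column vectors in a column vector" is really just an $n \times n$ matrix whose rows are those vectors, and the system of equations is one matrix product.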

Answer 3 (score 0):

The source of my confusion was that I was studying the "change of basis" topic also from another resource:

http://www.math.ku.edu/~lerner/LAnotes/Chapter14.pdf

In that resource, a change of basis matrix $P$ from a basis $e$ to a basis $f$ relates two matrices $F$ and $E$ as follows

$$F = E \cdot P$$

where $F = \begin{pmatrix} f_1, \cdots , f_n \end{pmatrix}$ (and $f_i$ are column vectors) and $E = \begin{pmatrix} e_1, \cdots , e_n \end{pmatrix}$ (and $e_i$ are column vectors).

As you can see, this is a little bit different from the expression that I have in my question above.

[image: the expression from the question]

But now just take the transpose of both sides:

$$\begin{pmatrix} \widetilde{b}_1 \\ \vdots \\ \widetilde{b}_n \end{pmatrix}^T = \left( S \begin{pmatrix} b_1 \\ \vdots \\ b_n \end{pmatrix} \right)^T$$

$$\begin{pmatrix} \widetilde{b}_1 & \cdots & \widetilde{b}_n \end{pmatrix} = \begin{pmatrix} b_1 \\ \vdots \\ b_n \end{pmatrix}^T S^T$$

$$\begin{pmatrix} \widetilde{b}_1 & \cdots & \widetilde{b}_n \end{pmatrix} = \begin{pmatrix} b_1 & \cdots & b_n \end{pmatrix} S^T$$

In that article they also note that $S$ actually has its rows and columns swapped relative to what we would expect, but now everything is clear: we have simply taken the transpose.

So, $S$ (in that article) looks like $$\begin{pmatrix} s_{11} & s_{21} & \cdots & s_{n1} \\ s_{12} & s_{22} & \cdots & s_{n2} \\ \vdots & \vdots & \ddots & \vdots \\ s_{1n} & s_{2n} & \cdots & s_{nn}\end{pmatrix}$$
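To see numerically that the two conventions agree, here is a small NumPy sketch (the $2 \times 2$ values of $E$ and $P$ are made up for illustration). With the basis vectors stored as the columns of $E$ and $F$ and $F = EP$, transposing both sides gives $F^T = P^T E^T$: the stacked-rows relation from the question, with coefficient matrix $S = P^T$, i.e. rows and columns swapped.

```python
import numpy as np

# Made-up old basis: e_1, e_2 are the COLUMNS of E.
E = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Made-up (invertible) change-of-basis matrix P.
P = np.array([[2.0, 0.0],
              [1.0, 3.0]])

# Convention of the linked notes: new basis vectors as columns of F.
F = E @ P

# Transposing both sides of F = E P gives F^T = P^T E^T.
# With basis vectors stacked as ROWS, the coefficient matrix is
# S = P^T -- rows and columns swapped, exactly as observed above.
S = P.T
F_rows = S @ E.T  # rows of F_rows are the new basis vectors

print(np.allclose(F.T, F_rows))  # True: both conventions agree
```

The two formulations describe the same change of basis; only the bookkeeping (vectors as columns vs. vectors as rows) differs, and the transpose converts between them.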