What is the difference between a matrix and a tuple of vectors?


Why do we need the term matrix? Why can't we just use vectors to define everything we need?

I understand we need the terms object, set, group, field, vector, and vector space.

I don't understand why we need the term "matrix".

Is it just shorthand, similar to the term "ket" used in Dirac notation for quantum mechanics? There, a "bra" is a covector and a "ket" is a vector.

BTW, an $n \times 1$ matrix is a vector.


3 Answers

Best Answer

A vector is an element of a vector space. Associated with each vector space is a field, and elements of the field are called scalars. If $V$ is a vector space, and $F$ is the associated field of scalars, then we say that "$V$ is a vector space over $F$".

For example, $\mathbb{R}^3$ is a vector space over the field $\mathbb{R}$. In the context of this example, elements of $\mathbb{R}^3$ are called vectors, and elements of $\mathbb{R}$ are called scalars.

An $m\times n$ matrix is an $m\times n$ array of scalars. Let $F$ be a field, and let $F^{m\times n}$ be the set of all $m\times n$ matrices with coefficients in $F$. Since there is a nice way of adding matrices together and a nice way of multiplying a matrix by a scalar, we notice that $F^{m\times n}$ is actually a vector space over $F$. So we can think of matrices as vectors. Since we can think of matrices as vectors, why have a separate name for matrices? Why not just think of matrices as vectors?
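To make the vector-space structure on $F^{m\times n}$ concrete, here is a small numerical sketch using numpy with $F=\mathbb{R}$ (the specific matrices are arbitrary examples):

```python
import numpy as np

# Two elements of R^(2x3), i.e. 2x3 matrices over the reals.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
B = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0]])

# The vector-space operations: entrywise addition and scalar multiplication.
S = A + B          # another 2x3 matrix
P = 2.5 * A        # another 2x3 matrix

# The axioms hold entrywise, e.g. distributivity of scalars over addition:
assert np.allclose(2.0 * (A + B), 2.0 * A + 2.0 * B)
```

Nothing here used matrix multiplication; as far as these operations are concerned, a $2\times 3$ matrix behaves exactly like a vector with six entries.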

The answer is that there are special things that you can do with matrices that you can't always do with vectors. In addition to being able to add matrices, and multiply matrices by scalars, we can also multiply matrices by matrices.

Matrix multiplication (i.e. multiplying matrices by matrices) is important and is used to do a variety of things. One of the main things we can do with matrices is the following:

If we have two finite dimensional vector spaces $U$, $V$ over a field $F$, and we have a linear transformation $T:U\to V$, then once we pick a basis for $U$ and a basis for $V$, there is a nice way to represent $T$ as a matrix. If $S:V\to W$ is another linear transformation (where $W$ is another finite dimensional vector space over $F$), then once we pick a basis for $W$ we can get a matrix for $S$ as well. If $A$ is the matrix for $S$ and $B$ is the matrix for $T$, then $AB$ will be the matrix for the composition $S\circ T:U\to W$.
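In coordinates, this says that applying $T$ and then $S$ agrees with multiplying by the single matrix $AB$. A quick numpy check, with arbitrary made-up matrices in the standard bases:

```python
import numpy as np

# B is the 3x2 matrix of a map T: R^2 -> R^3,
# A is the 2x3 matrix of a map S: R^3 -> R^2 (both chosen arbitrarily).
B = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0]])
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 2.0, 1.0]])

u = np.array([1.0, -1.0])    # a vector in R^2

# Applying T and then S, step by step ...
step_by_step = A @ (B @ u)
# ... agrees with applying the single matrix AB of the composition S∘T.
composed = (A @ B) @ u
assert np.allclose(step_by_step, composed)
```

This is exactly why matrix multiplication is defined the way it is: the $(i,j)$ entry of $AB$ is rigged so that matrix-vector products compose correctly.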

The above example is probably the most important use of matrices in linear algebra, and it is the reason why matrix multiplication has the peculiar definition that it has. Once we have this definition, though, it turns out that there are other things we can use matrices for. I close by giving one such example:

If $V$ is a finite dimensional vector space over $F$, then we can define a bilinear form to be a map $V\times V\to F$ that is linear in each variable. It turns out that once we pick a basis for $V$, there is a nice way of representing each bilinear form $V\times V\to F$ as a matrix.
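Concretely, in a chosen basis a bilinear form is given by $\mathrm{form}(x,y)=x^{\mathsf T}My$ for some matrix $M$. A small numpy sketch with a made-up $M$ on $\mathbb{R}^3$:

```python
import numpy as np

# An arbitrary matrix M representing a bilinear form on R^3
# in the standard basis: form(x, y) = x^T M y.
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 0.0, 1.0]])

def form(x, y):
    return x @ M @ y

x = np.array([1.0, 0.0, 2.0])
y = np.array([0.0, 1.0, 1.0])
a, b = 3.0, -2.0

# Linearity in the first argument (linearity in the second is analogous):
assert np.isclose(form(a * x + b * y, y), a * form(x, y) + b * form(y, y))
```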

Another Answer

In many ways they are the same, and you can treat matrices like vectors in every respect. The main distinction is matrix multiplication, a powerful computational device that we use to model linear maps between vector spaces in a chosen basis. It is useful to keep the maps separate from the vectors, and multiplication gives matrices additional structure that a vector may not have.

Another Answer

I don't mean to revive this question (it was answered, with an accepted answer, a year and three months ago), but I just wanted to illustrate how to picture a matrix as a vector, and to mention that multiplication gives matrices a ring structure, though you may or may not be interested in rings.

Just like vectors in $\mathbb{R}^n$ can be pictured as linear combinations of the vectors in the standard basis

$\mathcal{S}=\underbrace{\left\{\begin{pmatrix}1\\ 0\\ \vdots\\ 0\end{pmatrix},\begin{pmatrix}0\\ 1\\ \vdots\\ 0\end{pmatrix},\cdots,\begin{pmatrix}0\\ 0\\ \vdots\\ 1\end{pmatrix}\right\}}_{\text{$n$ - vectors}}$

$\vec{v}=\begin{pmatrix}a_1\\ a_2\\ \vdots\\ a_n\end{pmatrix}=a_1\begin{pmatrix}1\\ 0\\ \vdots\\ 0\end{pmatrix}+a_2\begin{pmatrix}0\\ 1\\ \vdots\\ 0\end{pmatrix}+\cdots+a_n\begin{pmatrix}0\\ 0\\ \vdots\\ 1\end{pmatrix}$

we can write a matrix similarly as a linear combination of the standard basis of $n\times m$ matrices

$\mathcal{S}_{n\times m}= \underbrace{\left\{\begin{pmatrix}1&0&\cdots&0\\ 0&0&\cdots&0\\ \vdots&\vdots&\ddots&\vdots\\ 0&0&\cdots&0\end{pmatrix},\begin{pmatrix}0 & 1&\cdots &0\\ 0&0&\cdots&0\\ \vdots&\vdots&\ddots&\vdots\\ 0&0&\cdots&0\end{pmatrix},\cdots,\begin{pmatrix}0&0&\cdots&0\\ 0&0&\cdots&0\\ \vdots&\vdots&\ddots&\vdots\\0&0&\cdots& 1\end{pmatrix}\right\}}_{\text{$n\times m$ - vectors/matrices}}$

$A=(a_{ij})=\begin{pmatrix} a_{11}&a_{12}&\cdots&a_{1m}\\a_{21}&a_{22}&\cdots&a_{2m}\\\vdots&\vdots&\ddots&\vdots\\a_{n1}&a_{n2}&\cdots&a_{nm}\end{pmatrix}= a_{11}\begin{pmatrix}1&0&\cdots&0\\ 0&0&\cdots&0\\ \vdots&\vdots&\ddots&\vdots\\ 0&0&\cdots&0\end{pmatrix}+a_{12}\begin{pmatrix}0 & 1&\cdots &0\\ 0&0&\cdots&0\\ \vdots&\vdots&\ddots&\vdots\\ 0&0&\cdots&0\end{pmatrix}+\cdots+a_{nm}\begin{pmatrix}0&0&\cdots&0\\ 0&0&\cdots&0\\ \vdots&\vdots&\ddots&\vdots\\0&0&\cdots& 1\end{pmatrix}$

which can be represented as the column vector $\begin{pmatrix}a_{11}\\ a_{12}\\ \vdots\\ a_{nm}\end{pmatrix}$.
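The correspondence between a matrix and its column of coefficients is exactly what "flattening" does in practice. A quick numpy sketch (the matrix is an arbitrary example, and entries are listed row by row as $a_{11}, a_{12}, \dots, a_{nm}$):

```python
import numpy as np

# An arbitrary 2x3 matrix ...
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# ... corresponds to a vector in R^6 by listing its entries row by row,
# and the correspondence is invertible:
v = A.reshape(-1)        # the flattened vector (a_11, a_12, ..., a_23)
back = v.reshape(2, 3)   # reshaping recovers the original matrix
assert np.allclose(back, A)
```

This makes explicit that, as vector spaces, $F^{n\times m}$ and $F^{nm}$ are the same thing; only the extra multiplication distinguishes matrices.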