Can I use set builder notation to define indices of a matrix?


If I have two sets $A = \{a_1,a_2,\dots,a_n\}$ and $B = \{b_1,b_2,\dots,b_m\}$, can I define a matrix in the following way?

$C = \{c_{ij} \mid (i,j) \in A \times B\}$

or do the $i$ and $j$ refer to the values of $a_1, a_2$, etc., rather than to their indices?



As you say, $(i,j)\in A \times B$ refers to the elements, not to the indices; sets have no notion of indices. So if you want to be really strict and rigorous with your notation, you have to use ordered tuples $A=\langle a_1, a_2, \dots \rangle$ and $B=\langle b_1, b_2, \dots \rangle$ (or some similar construction). However, what does $c_{ij}$ mean here? If you already have a way of specifying $c_{ij}$, then using $A$ and $B$ is not necessary.
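One way to make "indexing by elements rather than positions" concrete is to store the matrix as a mapping keyed by pairs of labels. This is a minimal sketch, not anything from the answer; the sets, the labels, and the rule defining $c_{ab}$ are all illustrative assumptions:

```python
# Hypothetical sketch: a "matrix" indexed by set elements, not positions.
# The entries are keyed by (row_label, col_label) pairs drawn from A x B.
A = {"x", "y"}          # row index set (labels, no intrinsic order)
B = {"p", "q", "rr"}    # column index set

# c_{ab} must be specified somehow; here, an arbitrary illustrative rule.
C = {(a, b): len(a) + len(b) for a in A for b in B}

print(C[("x", "q")])    # entries are looked up by label, never by position
```

Note that no ordering of $A$ or $B$ is ever used: the dict plays the role of a function $A \times B \to \mathbb{R}$, which is exactly what a set-indexed matrix is.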


It is indeed reasonable to use arbitrary sets as the index sets for matrices, rather than specifically sets of the form $\{ 1, 2, \ldots, n \}$.

Note that even infinite sets are reasonable here, giving you infinite-dimensional matrices! This introduces some extra complications that you have to deal with; e.g. matrix multiplication would involve an infinite sum, so if you want to multiply such matrices you have to impose some restrictions.
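For finite index sets, multiplication of set-indexed matrices is the usual finite sum over the shared index set. The following is a sketch under assumed conventions (the dict representation and the name `matmul` are illustrative, not standard):

```python
def matmul(M, N, A, B, C):
    """Product of set-indexed matrices: entry (a, c) is the sum over b in B
    of M[(a, b)] * N[(b, c)].  The sum makes sense here only because B is
    finite; for an infinite B one would need a restriction, e.g. finitely
    many nonzero terms per row/column, or a convergence condition."""
    return {(a, c): sum(M[(a, b)] * N[(b, c)] for b in B)
            for a in A for c in C}

# Index sets need not be {1, ..., n}; any finite sets work as labels.
A, B, C = {"r1", "r2"}, {"x", "y"}, {"c1"}
M = {("r1", "x"): 1, ("r1", "y"): 0, ("r2", "x"): 0, ("r2", "y"): 1}
N = {("x", "c1"): 3, ("y", "c1"): 4}
P = matmul(M, N, A, B, C)
print(P[("r1", "c1")], P[("r2", "c1")])  # 3 4  (M is an identity here)
```

The restriction mentioned above shows up precisely in the `sum(... for b in B)`: that generator terminates only because `B` is finite.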

Also note that the empty set is reasonable here too. This also introduces complications, but these are mainly problems of being unfamiliar with how linear algebra degenerates, rather than technical obstructions. For example, every $0 \times 0$ matrix is both a zero matrix and an identity matrix, and the choice of index sets needs to be included in the definition of the matrix, so that you can tell the difference between $0\times 0$ and $0 \times 1$ matrices.
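Using NumPy shapes as a stand-in for the recorded index sets, a small sketch of why the index sets must be part of the matrix's data:

```python
import numpy as np

# Two "empty" matrices with no entries at all, yet different index sets:
Z00 = np.zeros((0, 0))   # 0 x 0
Z01 = np.zeros((0, 1))   # 0 x 1
print(Z00.shape == Z01.shape)  # False: only the stored shape tells them apart

# The 0 x 0 matrix is simultaneously a zero matrix and an identity matrix:
print(np.array_equal(np.zeros((0, 0)), np.eye(0)))  # True
```

Both arrays contain zero entries, so no entry can distinguish them; the shape (i.e. the pair of index sets) is the only distinguishing data, which is exactly the point made above.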