Is the rank of a Tensor different from the rank of a Matrix?


As far as I know, the rank of a matrix is the dimension of the vector space spanned by its columns.

In NumPy notation, x = np.array([[1, 2], [2, 4]]) has a rank of one.

np.linalg.matrix_rank(x) confirms that it is one.

While studying the TensorFlow getting-started page (https://www.tensorflow.org/get_started/get_started), I saw the following remarks:

A tensor's rank is its number of dimensions. Here are some examples of tensors:

3 # a rank 0 tensor; this is a scalar with shape []
[1. ,2., 3.] # a rank 1 tensor; this is a vector with shape [3]
[[1., 2., 3.], [4., 5., 6.]] # a rank 2 tensor; a matrix with shape [2, 3]
[[[1., 2., 3.]], [[7., 8., 9.]]] # a rank 3 tensor with shape [2, 1, 3]

I'm totally confused.

Q1. What is the relationship between the rank of a Matrix and the rank of a Tensor? Is it a completely different thing?

Q2. In the case of the matrix above, x = np.array([[1, 2], [2, 4]]), is the rank two if we regard it as a tensor rather than a matrix?

Thank you!


BEST ANSWER

"number of dimensions" is a pretty terrible description. Looking at the website, their interpretation of "tensor" is not the same as the strict mathematical definition (which is to do with multilinear maps with certain properties).

As far as that program is concerned, a tensor is a vector of vectors of vectors of... of vectors, where the rank is the number of nestings of "of vectors", so

  • A tensor of rank $1$ is a vector, which is a one-dimensional array, [a,b].
  • A tensor of rank $2$ is a vector of vectors, or a matrix, or a two-dimensional array, [[a,b],[c,d]].
  • A tensor of rank $3$ is a vector of vectors of vectors, so something with three nestings, [[[a,b],[c,d]],[[e,f],[g,h]]] sort of thing.
  • &c.

The "dimension" here is the (tensorial) rank: the number of indices you need to locate an entry. This is of course not the same as "dimension" in the sense that something has "dimension $n$" if it is an ordered list of length $n$.
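A quick way to check this nesting count in NumPy is `ndim` (a sketch; `ndim` is NumPy's name for what the TensorFlow page calls rank):

```python
import numpy as np

# Tensorial rank = number of nestings, i.e. ndim in NumPy.
v = np.array([1, 2])                                # rank 1: a vector
m = np.array([[1, 2], [3, 4]])                      # rank 2: a vector of vectors (a matrix)
t = np.array([[[1, 2], [3, 4]],
              [[5, 6], [7, 8]]])                    # rank 3: a vector of matrices

print(v.ndim, m.ndim, t.ndim)  # 1 2 3
```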

ANSWER

The two ranks you mention are defined differently in the two contexts.

In TensorFlow, the (tensorial) rank is simply the number of indices needed to specify an entry. E.g. for a = np.array([[[a,b],[c,d]],[[e,f],[g,h]]]), the tensor rank is 3. This is the same as len(a.shape). Furthermore, you can always flatten this tensor, which reduces its rank. The entries of the a.shape tuple, here (2, 2, 2), give the vector dimension for each index, i.e. the range of each index.
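A small sketch of these three facts (using concrete numbers in place of the symbolic entries):

```python
import numpy as np

a = np.array([[[1, 2], [3, 4]],
              [[5, 6], [7, 8]]])

print(a.ndim)        # 3 -- tensorial rank: three indices locate an entry
print(len(a.shape))  # 3 -- the same count
print(a.shape)       # (2, 2, 2) -- the range of each of the three indices
print(a[1, 0, 1])    # 6 -- an entry located by three indices

flat = a.ravel()     # flattening reduces the tensorial rank
print(flat.ndim)     # 1
```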

In linear algebra, the (matrix) rank is the number of linearly independent rows or columns (i.e., non-coplanar vectors, from a geometric point of view). This rank equals the dimension of the spanned space. Note that this dimension measures linear independence; it is not the "dimension" above, which is simple index counting.
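Applied to the matrix from the question, the two notions give different numbers, which is exactly the distinction being asked about:

```python
import numpy as np

x = np.array([[1, 2], [2, 4]])

# Tensorial rank (number of indices needed to locate an entry): 2
print(x.ndim)                    # 2

# Linear-algebra rank (second row is twice the first, so only one
# linearly independent row/column): 1
print(np.linalg.matrix_rank(x))  # 1
```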

Furthermore, in linear algebra, a basis of a vector space can consist of matrix rows, matrix columns, even matrix entries, or elements of any F-module. The number of basis elements is the vector-space dimension.