As far as I know, the rank of a matrix is the dimension of the vector space generated by its columns.
In NumPy notation,
x = np.array([[1, 2], [2, 4]]) has a rank of one, and np.linalg.matrix_rank(x) confirms it.
While studying the TensorFlow getting-started page (https://www.tensorflow.org/get_started/get_started), I saw the following remarks:
A tensor's rank is its number of dimensions. Here are some examples of tensors:
3 # a rank 0 tensor; this is a scalar with shape []
[1. ,2., 3.] # a rank 1 tensor; this is a vector with shape [3]
[[1., 2., 3.], [4., 5., 6.]] # a rank 2 tensor; a matrix with shape [2, 3]
[[[1., 2., 3.]], [[7., 8., 9.]]] # a rank 3 tensor with shape [2, 1, 3]
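For reference, NumPy reports the same numbers for these examples via np.ndim, which counts axes — the quantity the TensorFlow docs call "rank" (not the linear-algebra rank):

```python
import numpy as np

# np.ndim counts the number of axes of an array-like — the same
# quantity the TensorFlow docs call "rank".
print(np.ndim(3))                                 # 0 — scalar
print(np.ndim([1., 2., 3.]))                      # 1 — vector
print(np.ndim([[1., 2., 3.], [4., 5., 6.]]))      # 2 — matrix
print(np.ndim([[[1., 2., 3.]], [[7., 8., 9.]]]))  # 3
```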
I'm totally confused.
Q1. What is the relationship between the rank of a matrix and the rank of a tensor? Are they completely different things?
Q2. In the case of the matrix above, x = np.array([[1, 2], [2, 4]]), is the rank two if we treat it as a tensor rather than a matrix?
Thank you!
"Number of dimensions" is a pretty terrible description. Looking at the website, their interpretation of "tensor" is not the same as the strict mathematical definition (which has to do with multilinear maps with certain properties).
As far as that program is concerned, a tensor is a vector of vectors of vectors of ... of vectors, where the rank is the number of nestings of "of vectors". So

[a, b]
[[a, b], [c, d]]
[[[a, b], [c, d]], [[e, f], [g, h]]]

have rank 1, 2, and 3 respectively. The "dimension" here is the (tensorial) rank, i.e. the number of indices you need to locate an entry. This is of course not the same as "dimension" in the sense that something has "dimension $n$" if it is an ordered list of length $n$.
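To make the distinction concrete, here is a small sketch (using NumPy rather than TensorFlow) applying both notions of rank to the matrix from the question:

```python
import numpy as np

x = np.array([[1, 2], [2, 4]])

# Linear-algebra rank: dimension of the column space.
# The second column is twice the first, so the rank is 1.
print(np.linalg.matrix_rank(x))  # 1

# Tensor rank in the TensorFlow sense: number of axes, i.e. the
# number of indices needed to address one entry. Any matrix has 2.
print(x.ndim)  # 2
```

So the answer to Q2 is yes: viewed as a tensor, x has rank two, even though its matrix rank is one.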