So, I was studying quantum mechanics, and a question came up. In quantum mechanics we work in a Hilbert space, and the basis that we choose can represent all the information that we can have in order to describe a system. When we need more information to describe the problem, we just increase the dimension of our space. So, for a given basis, we can represent a vector in this space as a column matrix $\begin{bmatrix} \alpha_{1} \\ \alpha_{2} \\ \vdots \\ \alpha_{m} \end{bmatrix}$, which is a rank 1 tensor. In this case, it seems that I just have to increase the number of ordered values $\alpha$ in order to describe a more complex problem in QM. But there are other problems in physics, like deformations, where we use a rank 2 tensor (see here).

So there are problems where, as they get more complicated, I only need to increase the dimension of a rank 1 tensor, but there are other problems for which just increasing the dimension of a rank 1 tensor does not seem to be enough. It seems that when I increase the rank of a tensor, I have more structure, and so I can represent more information.

So my question is: what is the difference in the information that a rank 2 tensor can hold that a rank 1 tensor can't? Can I represent the information of a rank 2 tensor with rank 1 tensors? Can I extend this idea to a rank $n$ tensor?
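For comparison, in the same basis a rank 2 tensor would be written as a square array of components (I write $T_{ij}$ for its components here; the symbols are just my own notation):

$$T = \begin{bmatrix} T_{11} & T_{12} & \cdots & T_{1m} \\ T_{21} & T_{22} & \cdots & T_{2m} \\ \vdots & \vdots & \ddots & \vdots \\ T_{m1} & T_{m2} & \cdots & T_{mm} \end{bmatrix},$$

so it has $m^2$ components instead of the $m$ components of the column above.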
By way of example, can a column matrix with 4 values describe the same problem as a $2 \times 2$ matrix?
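To spell out what I have in mind, I can always flatten the four entries of a $2 \times 2$ matrix into a column with 4 values, for example

$$\begin{bmatrix} T_{11} & T_{12} \\ T_{21} & T_{22} \end{bmatrix} \longmapsto \begin{bmatrix} T_{11} \\ T_{12} \\ T_{21} \\ T_{22} \end{bmatrix},$$

so the raw numbers are the same in both cases; my doubt is whether this column of 4 values really describes the same problem as the $2 \times 2$ matrix does.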