Is there an algebraic treatment of ragged tensors?


A ragged tensor in certain machine learning libraries is a multidimensional array with a non-rectangular shape. However, in that context a "tensor" is simply a multidimensional array. In pure mathematics, a tensor is a multilinear object with algebraic properties. Has there been any pure-mathematical work that takes ragged tensors from the ML context and identifies a similar algebraic structure for them?

Addendum: Ragged tensors in ML are useful for features that do not have a consistent shape. From the TensorFlow documentation:

Your data comes in many shapes; your tensors should too. Ragged tensors are the TensorFlow equivalent of nested variable-length lists. They make it easy to store and process data with non-uniform shapes, including:

  • Variable-length features, such as the set of actors in a movie.
  • Batches of variable-length sequential inputs, such as sentences or video clips.
  • Hierarchical inputs, such as text documents that are subdivided into sections, paragraphs, sentences, and words.
  • Individual fields in structured inputs, such as protocol buffers.
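To make the question concrete, here is a minimal sketch in plain Python (no TensorFlow dependency) of what a ragged tensor is: rows of differing lengths, so no single rectangular shape exists. The flat-values-plus-row-splits encoding shown is how TensorFlow's documentation describes `tf.RaggedTensor` storage; the code itself is only an illustration of that layout.

```python
# A ragged "tensor": a nested list whose rows have varying lengths,
# so there is no single rectangular shape.
ragged = [[3, 1, 4, 1], [], [5, 9, 2], [6]]

row_lengths = [len(row) for row in ragged]   # [4, 0, 3, 1] - non-uniform

# TensorFlow stores this as a flat values array plus row_splits
# offsets marking where each row begins and ends.
values = [x for row in ragged for x in row]  # [3, 1, 4, 1, 5, 9, 2, 6]
row_splits = [0]
for n in row_lengths:
    row_splits.append(row_splits[-1] + n)    # [0, 4, 4, 7, 8]

# Row i is recovered as values[row_splits[i]:row_splits[i + 1]].
recovered = [values[row_splits[i]:row_splits[i + 1]]
             for i in range(len(ragged))]
assert recovered == ragged
```

The algebraic question, then, is what structure (if any) plays the role that tensor products and multilinear maps play for ordinary rectangular tensors once the shape is allowed to vary per row like this.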