Calculating sum of matrix elements squared $\sum_{j} \sum_{k} X_{j, k}^2$ using dot product


Given a 1-D vector $v$, if asked to calculate $\sum_{i} v_i^2$, one can use a dot product trick: $\sum_{i} v_i^2 = v^T v$.

I have a 2-D matrix $X$, and similarly want to calculate $\sum_{j} \sum_{k} X_{j, k}^2$.

How can one use a dot product in this case? Does a similar dot product trick exist?

3 Answers

Best answer

This is the square of the Frobenius norm of $X$, and can be expressed as:

$$\|X\|_F^2 = \text{Tr}(X^\top X) = \text{Tr}(XX^\top)$$

where Tr is the trace function.

Incidentally, this norm is induced from the inner product $\langle A, B \rangle := \text{Tr}(A^\top B)$ on matrices of a given shape (similar to how $\|v\|^2$ is a norm induced by the inner product $\langle v, w\rangle := v^\top w$ on $\mathbb{R}^n$).
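The trace identity is easy to check numerically. Here is a minimal NumPy sketch (the matrix `X` is just an example):

```python
import numpy as np

X = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Squared Frobenius norm via the trace identity: Tr(X^T X)
frob_sq = np.trace(X.T @ X)

# Direct sum of squared entries: 1 + 4 + 9 + 16 = 30
assert np.isclose(frob_sq, (X**2).sum())
print(frob_sq)  # 30.0
```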

Answer

Note that $$ \sum\limits_{j} \sum\limits_{k} \ |a_{jk}|^2 = \mbox{Trace}(A^T A) $$

This is often used to define the Frobenius norm of a real matrix $A$.

Note that $$ \Vert A \Vert_F = \sqrt{\sum\limits_{j} \sum\limits_{k} \ |a_{jk}|^2} = \sqrt{\mbox{Trace}(A^T A)} $$
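For what it's worth, NumPy exposes this norm directly as `np.linalg.norm(A, 'fro')`; a quick sanity check against the trace formula:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Frobenius norm computed by NumPy
fro = np.linalg.norm(A, 'fro')

# Should agree with sqrt(Trace(A^T A))
assert np.isclose(fro, np.sqrt(np.trace(A.T @ A)))
```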

Answer

One other solution would be to reshape the matrix $X$ into a 1-D vector, and then one can leverage the suggested 1-D vector formula.

Here's how it can be done in Python:

import numpy as np

x = X.reshape(-1, 1)  # -1 means infer this dimension from the others
out: float = np.dot(x.T, x).item()  # .item() extracts the Python scalar
print(out)

I am new to linear algebra, so I am not sure of the correct mathematical notation for this.
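Regarding notation: the flattening step is usually written with the vectorization operator $\mathrm{vec}(X)$, so the identity reads $\|X\|_F^2 = \mathrm{vec}(X)^\top \mathrm{vec}(X)$. A slightly simpler variant of the code above uses a flat 1-D array, which avoids the transpose and `.item()` entirely:

```python
import numpy as np

X = np.array([[1.0, 2.0],
              [3.0, 4.0]])

x = X.reshape(-1)   # flatten to a 1-D vector, i.e. vec(X)
out = np.dot(x, x)  # the 1-D trick v^T v applied to the flattened matrix

assert np.isclose(out, (X**2).sum())
```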