Norm or metric that takes into account the proximity of entries of two matrices


I was wondering whether there exists a crisp norm or metric on $\mathbb R^{n \times n}$ that has the following behaviour:

  • Let $A, B \in \mathbb R^{n \times n}$ be two matrices. I would like them to have a small distance if entries from roughly the same region of the two matrices are close to each other.

Example. Let $A, B, C \in \mathbb R^{n \times n}$ and let $n \gg 2$:

$$ A_{ij} = \begin{cases} 1, & \text{if } i = j = 1 \\ 0, & \text{otherwise} \end{cases} $$

$$ B_{ij} = \begin{cases} 1, & \text{if } i = j = 2 \\ 0, & \text{otherwise} \end{cases} $$

$$ C_{ij} = \begin{cases} 1, & \text{if } i = j = n \\ 0, & \text{otherwise} \end{cases} $$

Then I would like $\operatorname{dist}(A,B)$ to be much smaller than $\operatorname{dist}(A,C)$ or $\operatorname{dist}(B,C)$.
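For concreteness, here is a quick numerical check (using NumPy, which is just my choice for illustration) showing that the plain Frobenius distance cannot make this distinction:

```python
import numpy as np

n = 10  # stands in for n >> 2
A = np.zeros((n, n)); A[0, 0] = 1          # spike at position (1,1)
B = np.zeros((n, n)); B[1, 1] = 1          # spike at position (2,2)
C = np.zeros((n, n)); C[n - 1, n - 1] = 1  # spike at position (n,n)

# The Frobenius norm only sees which entries differ, not where they sit:
print(np.linalg.norm(A - B))  # sqrt(2) ≈ 1.4142
print(np.linalg.norm(A - C))  # sqrt(2) ≈ 1.4142, identical to dist(A,B)
```

So any entrywise norm of $A-B$ treats all three matrices as equidistant, which is exactly the behaviour I want to avoid.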

My Thoughts. I had two ideas:

  1. One could compute averages over small submatrices (e.g. $3 \times 3$ blocks if $n = 9$) and then apply an ordinary norm, like the Frobenius norm, to the resulting matrix of block averages.
  2. One could look at a sum like $$ \sum_{(i,j) \in [n]\times[n]} \; \sum_{(k,l) \in [n]\times[n]} (A_{ij}-B_{kl})^2 \cdot (i-k+1)^2 \cdot (j-l+1)^2, $$ which weighs pairs of entries that are far apart more strongly than pairs that are close together.

However, I am not sure whether these would yield metrics (I suspect neither the first nor the second idea does). Furthermore, I think there must be something nicer that I am not able to spot.
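Here is a sketch of idea 1 in NumPy (my own illustration, assuming $n$ is divisible by the block size $b$):

```python
import numpy as np

def block_average_dist(A, B, b=3):
    """Idea 1: average over b-by-b blocks, then take the Frobenius
    norm of the difference of the coarsened matrices.
    Assumes the matrices are n-by-n with n divisible by b."""
    n = A.shape[0]
    # Reshape so that axes 1 and 3 run over the entries inside each block,
    # then average those axes away to get an (n/b)-by-(n/b) matrix.
    coarse = lambda M: M.reshape(n // b, b, n // b, b).mean(axis=(1, 3))
    return np.linalg.norm(coarse(A) - coarse(B))
```

Note that block averaging is a linear map $L$, so $\operatorname{dist}(A,B) = \|L(A-B)\|_F$ satisfies symmetry and the triangle inequality, but it is only a pseudometric: two matrices that differ by a mean-zero perturbation inside a single block get distance $0$ (e.g. the matrices $A$ and $B$ from the example above, whose spikes fall into the same $3\times3$ block when $n=9$).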

Question.

  1. Are the above appropriate ideas?
  2. Are there better ways of doing that? If so how do they look?
  3. Are there computationally efficient ways of doing that?

P.S. Sorry for my bad MathJax, this is literally the first time doing anything related to tex. Please just correct me if something is off :)

Best Answer

$ \def\LR#1{\left(#1\right)} \def\op#1{\operatorname{#1}} \def\dist#1{\op{dist}\LR{#1}} \def\frob#1{\big\| #1 \big\|_F} \def\cd{\circledast} \def\qiq{\quad\implies\quad} $Use a Gaussian kernel to blur each matrix, then calculate the Frobenius norm of the resulting matrices.

Let $\cd$ denote convolution; then $$\eqalign{ \dist{A,B} &= \frob{\,G\cd\LR{A-B}} \\ }$$ The characteristics of the Gaussian ($\sigma$, radius, etc.) will affect the numerical behavior of this function.

This is a natural extension of your idea of partitioning the matrices and calculating a weighted sum of the norms of the partitions.
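As a minimal NumPy sketch of this distance (the boundary handling is my assumption; the answer does not fix one, so I use zero padding and a truncated kernel):

```python
import numpy as np

def gaussian_kernel(radius, sigma):
    """Truncated 2-D Gaussian, normalized to sum to 1."""
    ax = np.arange(-radius, radius + 1)
    g = np.exp(-ax**2 / (2 * sigma**2))
    K = np.outer(g, g)
    return K / K.sum()

def blur_dist(A, B, sigma=1.5, radius=4):
    """dist(A,B) = || G ⊛ (A - B) ||_F, with a zero-padded 'same'
    convolution. Since the Gaussian kernel is symmetric, correlation
    and convolution coincide here."""
    D = A - B
    n = D.shape[0]
    K = gaussian_kernel(radius, sigma)
    P = np.pad(D, radius)  # zero padding around the border
    out = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            out[i, j] = np.sum(K * P[i:i + 2*radius + 1, j:j + 2*radius + 1])
    return np.linalg.norm(out)
```

On the example matrices, the blurred spikes of $A$ and $B$ overlap and largely cancel in $A-B$, while the spikes of $A$ and $C$ do not, so $\operatorname{dist}(A,B) \ll \operatorname{dist}(A,C)$ as desired. Since blurring is linear, this is at least a pseudometric (a seminorm applied to $A-B$); with a full, nondegenerate kernel the blur is injective and the distance is a genuine metric. In practice one would use an FFT or a library convolution rather than the explicit loops above.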