Proof with mapping which preserves norms


Suppose that $\phi:\mathbb{R}^d\rightarrow \mathbb{R}^k$ is a linear mapping that preserves all norms up to a multiplicative factor. That is, for some $0<\epsilon<1$ and for all $i\in\{1,\dots,n\}$ $$(1-\epsilon)\|x_i\|^2\leq \|\phi(x_i)\|^2\leq (1+\epsilon)\|x_i\|^2$$ and for all $i,j\in\{1,\dots,n\}$ $$(1-\epsilon)\|x_i-x_j \|^2\leq \|\phi(x_i)-\phi(x_j)\|^2\leq (1+\epsilon)\|x_i-x_j\|^2.$$ If $\|x_i\|\leq 1$, is it true that for all $i,j\in\{1,\dots,n\}$ $$|\phi(x_i)^\top \phi(x_j)-x_i^\top x_j|\leq K\epsilon,$$ where $K$ is some constant?

My thoughts: I was thinking that the Cauchy-Schwarz inequality might be of use here. After researching this online, I also came across the Johnson-Lindenstrauss lemma, which looks relevant, but I am not sure how to make use of it here.
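For intuition, here is a small numerical sketch of the setup, assuming NumPy; the Gaussian random projection is an illustrative choice in the spirit of the Johnson-Lindenstrauss lemma, and the dimensions and sample size are arbitrary. It measures the empirical distortion $\epsilon$ over the points and their pairwise differences, then the worst inner-product deviation:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, n = 50, 40, 20  # illustrative dimensions and sample size

# Points in the unit ball of R^d (Euclidean norm at most 1).
X = rng.normal(size=(n, d))
X /= np.maximum(1.0, np.linalg.norm(X, axis=1, keepdims=True))

# A Johnson-Lindenstrauss-style map: Gaussian random projection,
# scaled so squared norms are preserved in expectation.
A = rng.normal(size=(k, d)) / np.sqrt(k)

def phi(x):
    return A @ x

def rel_distortion(u):
    """Relative squared-norm distortion | ||phi(u)||^2 - ||u||^2 | / ||u||^2."""
    nu = u @ u
    return abs(phi(u) @ phi(u) - nu) / nu if nu > 0 else 0.0

# Empirical epsilon: worst distortion over all points and pairwise differences.
eps = max(max(rel_distortion(x) for x in X),
          max(rel_distortion(X[i] - X[j]) for i in range(n) for j in range(i)))

# Worst inner-product deviation, which the claim says is at most K * eps.
dev = max(abs(phi(X[i]) @ phi(X[j]) - X[i] @ X[j])
          for i in range(n) for j in range(n))

print(f"empirical eps = {eps:.3f}, max inner-product deviation = {dev:.3f}")
```

For the Euclidean norm, $K = 3$ suffices: expanding $\|\phi(x_i)-\phi(x_j)\|^2$ shows the deviation is at most $\tfrac{\epsilon}{2}(\|x_i-x_j\|^2 + \|x_i\|^2 + \|x_j\|^2) \leq 3\epsilon$ when $\|x_i\| \leq 1$.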

1 Answer

Inequalities for the Euclidean 2-Norm

To begin with, assume that $\langle x, y \rangle = x^T y$ is the inner product being used on $\mathbb{R}^N$ and that $\| x \|_2 = \sqrt{\langle x, x \rangle}$ is the induced norm.

Expanding the following two expressions gives, $$ \begin{align} \|\phi(x_i) - \phi(x_j)\|^2_2 =& \|\phi(x_i) \|^2_2 - 2 \langle \phi(x_i),\phi(x_j) \rangle + \|\phi(x_j)\|^2_2 \\ \|x_i - x_j\|^2_2 =& \|x_i\|^2_2 - 2\langle x_i, x_j \rangle+ \|x_j\|^2_2 \end{align} $$

Subtracting the second from the first, we have, \begin{align} \|\phi(x_i) - \phi(x_j) \|^2_2 - \|x_i - x_j\|^2_2 =& \\ (\|\phi(x_i) \|^2_2 - \|x_i\|^2_2) &+ (\|\phi(x_j) \|^2_2 - \|x_j\|^2_2) +2(\langle x_i, x_j \rangle - \langle \phi(x_i), \phi(x_j)\rangle) \tag{*} \end{align}
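Identity (*) holds for any linear $\phi$, which a quick numerical check confirms to machine precision (a sketch assuming NumPy; the map and vectors are random illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)
d, k = 7, 5
A = rng.normal(size=(k, d))  # any linear map phi(x) = A x
xi, xj = rng.normal(size=d), rng.normal(size=d)

# Left side of (*): distortion of the squared distance.
lhs = np.sum((A @ xi - A @ xj) ** 2) - np.sum((xi - xj) ** 2)

# Right side of (*): distortion of the two squared norms,
# plus the inner-product term.
rhs = (np.sum((A @ xi) ** 2) - np.sum(xi ** 2)) \
    + (np.sum((A @ xj) ** 2) - np.sum(xj ** 2)) \
    + 2 * (xi @ xj - (A @ xi) @ (A @ xj))

print(abs(lhs - rhs))  # essentially zero (floating-point roundoff)
```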

Rearranging the assumed inequalities from the question yields, $$ \begin{align} -\varepsilon\|x_i\|^2_2 \leq \|\phi(x_i)\|^2_2 - \|x_i\|^2_2 \leq& \varepsilon\|x_i\|^2_2 \tag{1} \\ -\varepsilon \|x_i - x_j\|^2_2 \leq \|\phi(x_i) - \phi(x_j) \|^2_2 - \|x_i -x_j\|^2_2 \leq& \varepsilon \|x_i - x_j\|^2_2 \tag{2} \end{align} $$ Using (1) and (2) in (*) gives, $$ -\varepsilon \|x_i - x_j\|^2_2 \leq \varepsilon\|x_i\|^2_2 + \varepsilon\|x_j\|^2_2 + 2(\langle x_i, x_j \rangle - \langle \phi(x_i), \phi(x_j) \rangle) \tag{3} $$ and, $$ \varepsilon \|x_i - x_j\|^2_2 \geq -\varepsilon\|x_i\|^2_2 - \varepsilon\|x_j\|^2_2 + 2(\langle x_i, x_j \rangle - \langle \phi(x_i), \phi(x_j) \rangle) \tag{4} $$

If we now rearrange inequality (3) we have that, $$ \begin{align} \langle \phi(x_i), \phi(x_j) \rangle- \langle x_i, x_j \rangle \leq& \frac{\varepsilon}{2}(\|x_i - x_j\|^2_2 + \|x_i\|^2_2 + \|x_j\|^2_2)\tag{6} \end{align} $$

Similarly, rearranging inequality (4) gives, $$ \begin{align} \langle \phi(x_i), \phi(x_j) \rangle - \langle x_i, x_j \rangle \geq& -\frac{\varepsilon}{2}(\|x_i - x_j\|^2_2 + \|x_i\|^2_2 + \|x_j\|^2_2) \tag{7} \end{align} $$

For an arbitrary norm

Since we are working in $\mathbb{R}^N$, all norms are equivalent. This means that there exist $\alpha, \beta > 0$ such that, for all $x\in\mathbb{R}^N$,

$$ \alpha \|x \| \leq \|x\|_2 \leq \beta \| x\| \tag{8} $$
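As a concrete instance of (8), take $\|\cdot\|$ to be the 1-norm on $\mathbb{R}^N$: then $\frac{1}{\sqrt{N}}\|x\|_1 \leq \|x\|_2 \leq \|x\|_1$, i.e. $\alpha = 1/\sqrt{N}$ and $\beta = 1$. A quick check, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 10
alpha, beta = 1 / np.sqrt(N), 1.0  # equivalence constants for the 1-norm

for _ in range(100):
    x = rng.normal(size=N)
    n1 = np.abs(x).sum()   # ||x||_1
    n2 = np.sqrt(x @ x)    # ||x||_2
    # (8): alpha * ||x|| <= ||x||_2 <= beta * ||x||
    assert alpha * n1 <= n2 + 1e-12
    assert n2 <= beta * n1 + 1e-12
```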

Combining inequalities (6) and (7), we have that, $$ -\frac{\varepsilon}{2} (\|x_i - x_j\|^2_2 + \|x_i\|^2_2 + \|x_j\|^2_2) \leq \langle \phi(x_i), \phi(x_j) \rangle - \langle x_i, x_j \rangle \leq \frac{\varepsilon}{2} (\|x_i - x_j\|^2_2 + \|x_i\|^2_2 + \|x_j\|^2_2) $$

Using inequality (8) together with the fact that $\|x_i\| \leq 1$ gives $\|x_i\|_2 \leq \beta$, and by the triangle inequality $\|x_i - x_j\|_2 \leq 2\beta$, so that, $$ -3\beta^2\varepsilon \leq \langle \phi(x_i), \phi(x_j) \rangle - \langle x_i, x_j \rangle \leq 3\beta^2\varepsilon $$

and so, $$ |\langle \phi(x_i), \phi(x_j) \rangle - \langle x_i, x_j \rangle | \leq 3\beta^2\varepsilon $$

If we set $K = 3\beta^2$ and use matrix notation for the inner product, we have that,

$$ |\phi(x_i)^T\phi(x_j) - x^T_i x_j| \leq K\varepsilon $$ as required.
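To see the arbitrary-norm constant in action, here is a numerical sketch (assuming NumPy) that takes $\|\cdot\|$ to be the $\infty$-norm on $\mathbb{R}^N$, for which $\|x\|_2 \leq \sqrt{N}\,\|x\|_\infty$, i.e. $\beta = \sqrt{N}$. The constant $K = 3\beta^2$ bounds $\tfrac12(\|x_i-x_j\|^2_2 + \|x_i\|^2_2 + \|x_j\|^2_2)$ whenever $\|x_i\|_\infty \leq 1$, since each $\|x_i\|_2 \leq \beta$ and $\|x_i - x_j\|_2 \leq 2\beta$. The map and dimensions are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
d, k, n = 30, 25, 15  # illustrative choices; here N = d

# Points with infinity-norm at most 1 (the "arbitrary norm" constraint).
X = rng.uniform(-1.0, 1.0, size=(n, d))

beta = np.sqrt(d)   # ||x||_2 <= sqrt(N) * ||x||_inf on R^N
K = 3 * beta ** 2   # since ||x_i||_2 <= beta and ||x_i - x_j||_2 <= 2 * beta

# A near-isometry on these points: scaled Gaussian random projection.
A = rng.normal(size=(k, d)) / np.sqrt(k)

def rel_distortion(u):
    """Relative squared-norm distortion of u under x -> A x."""
    nu = u @ u
    return abs((A @ u) @ (A @ u) - nu) / nu if nu > 0 else 0.0

# Empirical epsilon over all points and pairwise differences.
eps = max(max(rel_distortion(x) for x in X),
          max(rel_distortion(X[i] - X[j]) for i in range(n) for j in range(i)))

# Worst inner-product deviation, bounded by K * eps for any linear map.
dev = max(abs((A @ X[i]) @ (A @ X[j]) - X[i] @ X[j])
          for i in range(n) for j in range(n))

print(f"eps = {eps:.3f}, deviation = {dev:.3f}, K * eps = {K * eps:.3f}")
```

With $\varepsilon$ defined as the worst empirical distortion, the bound $|\langle \phi(x_i), \phi(x_j) \rangle - \langle x_i, x_j \rangle| \leq 3\beta^2\varepsilon$ holds for any linear map by the polarization identity, so the check passes by construction.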