Operator norm $(\ell_2 \to \ell_1)$


Let $X$ be a finite-dimensional normed vector space, $Y$ an arbitrary normed vector space, and $T : X \to Y$ a linear operator.

I want to calculate $\|T\|$ in the case where $X = K^n$ is equipped with the Euclidean norm $\|\cdot\|_2$, $Y := \ell_1(\mathbb{N})$, and $Tx := (x_1,\ldots,x_n,0,0,\ldots) \in \ell_1(\mathbb{N})$ for all $x = (x_1,\ldots,x_n) \in K^n$.

I do not know how to continue: $$ \|T\| = \sup_{x \neq 0} \frac{\|Tx\|_1}{\|x\|_2} = \sup_{x \neq 0} \frac{\|(x_1,\ldots,x_n,0,0,\ldots)\|_1}{\|(x_1,\ldots,x_n)\|_2} = \sup_{x \neq 0} \frac{|x_1|+\cdots+|x_n|}{\left(|x_1|^2+\cdots+|x_n|^2\right)^{\frac{1}{2}}} = \;? $$
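Not part of the question as asked, but a quick numerical experiment can hint at the value of this supremum before solving it analytically. A plain-Python sketch (the dimension `n = 5` and the sample count are arbitrary choices):

```python
import math
import random

n = 5  # arbitrary dimension for the experiment

def ratio(x):
    # ||Tx||_1 / ||x||_2: T only pads x with zeros, so ||Tx||_1 = |x_1| + ... + |x_n|
    l1 = sum(abs(xi) for xi in x)
    l2 = math.sqrt(sum(abs(xi) ** 2 for xi in x))
    return l1 / l2

random.seed(0)
best = max(ratio([random.gauss(0, 1) for _ in range(n)]) for _ in range(10_000))
print(best, math.sqrt(n))  # best never exceeds sqrt(5) ~ 2.236
```

The sampled maximum stays just below $\sqrt{n}$, and the all-ones vector attains it, which suggests what the answer should be.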


I will elaborate on my comment above.

Given an operator $T: X \rightarrow Y$ where $$X = \mathbb{K}^{n}, \qquad Y = \ell_1(\mathbb{N}),$$ its norm is given by $$\|T\|_{op} = \sup_{x \neq 0}{\frac{\|Tx\|_{1}}{\|x\|_{2}}} = \sup_{\|x\|_{2} \leq 1}{\|Tx\|_{1}} = \sup_{\|x\|_{2} = 1}{\|Tx\|_{1}}.$$

So we have to maximize $$\|Tx\|_{1} = |x_{1}| + \ldots + |x_{n}|$$ subject to $$\|x\|_{2} := \left(|x_{1}|^{2} + \ldots + |x_{n}|^{2}\right)^{\frac{1}{2}} = 1.$$

Let $t_{i} := |x_{i}|$, then our problem reformulates as follows: $$t_{1} + t_{2} + \ldots + t_{n} \rightarrow \text{max}$$ $$t_{1}^{2} + t_{2}^{2} + \ldots + t_{n}^{2} = 1$$ $$t_{i} \geq 0, \ \forall i = 1, \ldots, n$$

The Lagrangian for the problem is $$L = (t_{1} + \ldots + t_{n}) - \lambda (t_{1}^{2} + \ldots + t_{n}^{2} - 1),$$ where $\lambda$ is the Lagrange multiplier.

The necessary extremum condition implies $$\frac{\partial L}{\partial t_{i}} = 1 - 2\lambda t_{i} = 0,$$ thus $t_{i} = \frac{1}{2 \lambda}$. Since $t_{i} \geq 0$, it follows that $\lambda > 0$.

The stationary point is $t = \left(\frac{1}{2 \lambda}, \ldots, \frac{1}{2 \lambda}\right)$, and we must choose $\lambda$ so that this point satisfies the constraint in the problem above.

$\|t\|_{2} = 1$ is equivalent to $$ n \cdot \frac{1}{4 \lambda^{2}} = 1,$$ from which we get $\lambda = \frac{\sqrt{n}}{2}$ (we take the positive root because $\lambda > 0$, as noted above).

Thus $t_{i} = \frac{1}{\sqrt{n}}$ and the maximum equals $$\text{max} = n \cdot \frac{1}{\sqrt{n}} = \sqrt{n},$$ so $\|T\| = \sqrt{n}$.
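As a sanity check, the stationary point can be compared numerically against random feasible points. A minimal sketch (`n = 4` is an arbitrary choice, and the random sampling is only evidence, not a proof of maximality):

```python
import math
import random

n = 4
t_star = [1 / math.sqrt(n)] * n       # stationary point: t_i = 1/sqrt(n)
print(sum(t_star))                    # objective value sqrt(n), here 2.0

random.seed(1)
for _ in range(1_000):
    t = [abs(random.gauss(0, 1)) for _ in range(n)]
    norm = math.sqrt(sum(ti * ti for ti in t))
    t = [ti / norm for ti in t]       # feasible: t_i >= 0 and sum t_i^2 = 1
    assert sum(t) <= sum(t_star) + 1e-9   # no sampled point beats the candidate
```

Every sampled feasible point has objective value at most $\sqrt{n}$, consistent with the Lagrange computation.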


Using the Cauchy–Schwarz inequality, we have $$ \sum_{i=1}^n\left\lvert x_i\right\rvert=\sum_{i=1}^n\left\lvert x_i\right\rvert\cdot 1\leqslant \left(\sum_{i=1}^n\left\lvert x_i\right\rvert^2\right)^{1/2}\left(\sum_{i=1}^n1\right)^{1/2}=\sqrt n\left(\sum_{i=1}^n\left\lvert x_i\right\rvert^2\right)^{1/2}, $$ hence $\left\lVert T\right\rVert\leqslant \sqrt n$. For the opposite inequality, consider $x_i=1$ for all $i\in\{1,\dots,n\}$: then $\lVert Tx\rVert_1=n$ and $\lVert x\rVert_2=\sqrt n$, so $\lVert T\rVert\geqslant\sqrt n$, and therefore $\lVert T\rVert=\sqrt n$.
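Both halves of this argument, the Cauchy–Schwarz upper bound and the all-ones equality case, can be checked numerically. A minimal sketch, with `n = 6` chosen arbitrarily:

```python
import math
import random

n = 6
x = [1.0] * n                                   # the extremal vector from above
l1 = sum(abs(xi) for xi in x)                   # ||Tx||_1 = n
l2 = math.sqrt(sum(abs(xi) ** 2 for xi in x))   # ||x||_2 = sqrt(n)
print(l1 / l2, math.sqrt(n))                    # ratio matches sqrt(n)

# Cauchy-Schwarz bound holds for random vectors too:
random.seed(2)
y = [random.gauss(0, 1) for _ in range(n)]
assert sum(abs(yi) for yi in y) <= math.sqrt(n) * math.sqrt(sum(yi * yi for yi in y)) + 1e-9
```

The all-ones vector attains the bound, confirming that $\sqrt{n}$ is not just an upper bound but the exact operator norm.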