If the matrix $A \in \mathbb{R}^{n \times n}$ is symmetric and positive definite, then we can define the inner product $$(x,y)_A = x^T A y$$ and the related $A$-norm $$\left \| x \right \|_A = \sqrt{(x,x)_A} = \sqrt{x^T A x}$$ I guess we can also define the norm of a matrix $X \in \mathbb{R}^{n \times n}$ as $$\left \| X \right \|_A = \sup \left\{{\frac {\|Xx\|_A}{\|x\|_A}}:x\in \mathbb{R}^{n}{\text{ with }}x\neq 0\right\}$$ On the Wikipedia page https://en.wikipedia.org/wiki/Matrix_norm it is written that in some cases it holds that $$\|AB\| \le \|A\|\|B\| $$ My question: Does this also hold for the $A$-norm that I defined?
Kind regards, Koen
It holds in even higher generality: if $A \in \mathbb R^{m \times k}$ and $B \in \mathbb R^{k \times n}$, then $AB \in \mathbb R^{m \times n}$ and we have the inequality $$ \|AB\| \leq \|A\| \cdot \|B\|,$$ where each matrix norm is the operator norm induced by fixed vector norms on $\mathbb R^m$, $\mathbb R^k$ and $\mathbb R^n$ (your $A$-norm is the special case $m = k = n$ with $\|x\| = \sqrt{x^T A x}$ on every space). To see this, it suffices to show that for any vector $x \in \mathbb R^n$ with $x \neq 0$ the inequality $$\frac {\|ABx\|}{\|x\|} \leq \|A\| \cdot \|B\|$$ holds. If $Bx = 0 \in \mathbb R^k$, then the left-hand side is zero and the inequality certainly holds. If $Bx \neq 0$, then we can do the following trick: $$\frac {\|ABx\|}{\|x\|} = \frac {\|ABx\|}{\|Bx\|} \cdot \frac{\|Bx\|}{\|x\|} \leq \|A\| \cdot \|B\|.$$
In fact, the same trick works for bounded linear operators between not necessarily finite-dimensional Banach spaces.
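As a quick numerical sanity check, here is a NumPy sketch of the $A$-norm of a matrix. It uses the fact that with the Cholesky factorization $A = LL^T$ we have $\|x\|_A = \|L^T x\|_2$, so $\|X\|_A = \|L^T X L^{-T}\|_2$, the largest singular value of $L^T X L^{-T}$. The function name `a_norm_matrix` is my own, not from any library:

```python
import numpy as np

def a_norm_matrix(X, A):
    """A-induced operator norm of X, where A is symmetric positive definite.

    With the Cholesky factor A = L L^T we have ||x||_A = ||L^T x||_2,
    hence ||X||_A = ||L^T X L^{-T}||_2 (the spectral norm).
    """
    L = np.linalg.cholesky(A)
    M = L.T @ X @ np.linalg.inv(L.T)
    return np.linalg.norm(M, 2)

rng = np.random.default_rng(0)
n = 5
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)          # symmetric positive definite
X = rng.standard_normal((n, n))
Y = rng.standard_normal((n, n))

# Submultiplicativity (up to floating-point tolerance):
lhs = a_norm_matrix(X @ Y, A)
rhs = a_norm_matrix(X, A) * a_norm_matrix(Y, A)
print(lhs <= rhs + 1e-10)            # prints True
```

Of course this only tests random instances; the proof above is what guarantees the inequality in general.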