I have the following problem:
Let's have the scalar product in space $ℝ^2$ given by the expression:
$$ \langle x, y\rangle = 2x_1 y_1 + x_2 y_2 + x_1 y_2 + x_2 y_1 $$
For this inner product, state the precise form of the Cauchy-Schwarz inequality.
For this inner product, find the projection matrix onto the line $\operatorname{span}\{v\}$ spanned by the vector $v = (1, 0)^T$.
For task 1, I was not really sure how to express the norm induced by the inner product; therefore, I am struggling with the RHS of the inequality:
$$ |\langle x, y\rangle| \le \lVert x \rVert \, \lVert y \rVert $$
More precisely:
$$ |2x_1 y_1 + x_2 y_2 + x_1 y_2 + x_2 y_1| \le \lVert x \rVert \, \lVert y \rVert $$
Should I express the induced norm as $ \sqrt{\langle x, x\rangle} $, then get rid of the root and express it in some more suitable way?
For task 2, I recalled the formula for the projection onto the line $\operatorname{span}\{a\}$ (for the standard dot product):
$$ x_u = \frac{a a^T}{a^T a}\, x $$
from that, we get after substituting into the defined dot product the following equation:
$$ \frac{\langle x, v\rangle}{\lVert v \rVert}\, v = \frac{2x_1 + x_2}{\sqrt{2}} \cdot (1, 0)^T $$
I believe this should be it, but now I need to express it using the projection matrix. By multiplying the vectors out, I got the following matrix:
$$ \begin{bmatrix} 2 & 0 \\ 1 & 0 \end{bmatrix} $$
But somehow, it does not seem right.
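To make my doubt concrete, I ran a quick numerical sanity check (numpy assumed, names are mine): a projection matrix should be idempotent ($P^2 = P$) and should fix the spanning vector, and this matrix does neither.

```python
import numpy as np

# the matrix I obtained
P = np.array([[2.0, 0.0],
              [1.0, 0.0]])
v = np.array([1.0, 0.0])

# a projection matrix must satisfy P @ P == P and P @ v == v
print(np.allclose(P @ P, P))  # False
print(np.allclose(P @ v, v))  # False
```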
Could anyone help me out?
Thanks in advance.
For part 1.
Any inner product $\langle \cdot,\cdot\rangle$ will induce the norm
$$ \Vert \cdot \Vert : x \mapsto \langle x,x\rangle^{\frac{1}2}$$
If the inner product is the usual dot product $ \cdot :(x,y) \mapsto \sum_i x_iy_i$, then you find the usual Euclidean norm. If the inner product is different, then you usually find a different norm. In our case the norm is given by
$$ \Vert(x_1,x_2)\Vert^2 = 2x_1x_1 + x_2x_2 + x_1x_2 + x_2x_1 = 2x_1^2 + x_2^2 + 2x_1x_2 = x_1^2 + (x_1+x_2)^2$$
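If it helps, both the completed-square form of this norm and the Cauchy-Schwarz inequality can be checked numerically. A small sketch (numpy assumed, variable names mine) that encodes the inner product through its Gram matrix:

```python
import numpy as np

# Gram matrix of <x, y> = 2*x1*y1 + x2*y2 + x1*y2 + x2*y1
G = np.array([[2.0, 1.0],
              [1.0, 1.0]])

def inner(x, y):
    return x @ G @ y

def norm(x):
    return np.sqrt(inner(x, x))

rng = np.random.default_rng(0)
for _ in range(1000):
    x, y = rng.normal(size=2), rng.normal(size=2)
    # induced norm agrees with the completed square x1^2 + (x1 + x2)^2
    assert np.isclose(norm(x)**2, x[0]**2 + (x[0] + x[1])**2)
    # Cauchy-Schwarz: |<x, y>| <= ||x|| * ||y||
    assert abs(inner(x, y)) <= norm(x) * norm(y) + 1e-12
```

(A random check is of course not a proof, but it catches a wrong norm formula immediately.)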
For part 2.
To find the projection of a vector $x$ onto the span of $a$, you need to find $\lambda \neq 0$ such that
$$ \langle \lambda a, x- \lambda a\rangle = 0 $$
Solving for $\lambda$ gives
$$ \lambda\langle a, x\rangle - \lambda^2\langle a,a\rangle = 0 \iff \lambda = \frac{\langle a, x\rangle}{\langle a, a\rangle} $$
So our projection is the following map
$$P : x \mapsto \frac{\langle a, x\rangle}{\langle a, a\rangle}\, a $$
Finding the matrix from here should not be too difficult. Indeed we have $$ \langle a, a\rangle = 2$$ and $$ \langle a, x\rangle = 2x_1 + x_2$$ so our projection is the map
$$ P : (x_1,x_2) \mapsto \left(\frac{2x_1+x_2}{2},0\right) = \left(x_1 + \frac{1}{2}x_2,0 \right)$$
In matrix form we have
$$ \begin{bmatrix} 1 & 1/2 \\ 0 & 0\end{bmatrix} \begin{bmatrix} x_1 \\ x_2\end{bmatrix} = \begin{bmatrix} x_1 + \frac{1}{2}x_2 \\ 0\end{bmatrix}.$$
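You can verify that this matrix really is the projection with respect to the *given* inner product, not the standard one: it is idempotent, it fixes $v = (1,0)^T$, and the residual $x - Px$ is orthogonal to $v$ under $\langle \cdot,\cdot\rangle$. A quick numerical check (numpy assumed):

```python
import numpy as np

G = np.array([[2.0, 1.0],
              [1.0, 1.0]])   # Gram matrix of the given inner product
P = np.array([[1.0, 0.5],
              [0.0, 0.0]])   # the projection matrix derived above
v = np.array([1.0, 0.0])

assert np.allclose(P @ P, P)        # idempotent
assert np.allclose(P @ v, v)        # fixes the spanning vector

x = np.array([3.0, -1.0])           # an arbitrary test vector
# residual is orthogonal to v in the given inner product: <v, x - Px> = 0
assert np.isclose(v @ G @ (x - P @ x), 0.0)
```

Note that $P$ is not symmetric, and it need not be: orthogonal projections are self-adjoint with respect to the inner product that defines them, which here means $P^T G = G P$ rather than $P^T = P$.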