Bilinear maps and functions of the form $(x,y) \mapsto ux^2 + 2vxy + wy^2$


I was recently reading from the book "Vectors, Pure and Applied", $\S 16.1$ on bilinear forms. It begins

In section 8.3 we discussed functions of the form $$(x,y) \mapsto ux^2 + 2vxy + wy^2$$ These are special cases of the idea of a bilinear function.

Definition 16.1.1 If $U,V$ and $W$ are vector spaces over $\mathbb{F}$, then $\alpha:U\times V \to W$ is a bilinear function if the map $$\alpha_{,v}:U\to W \text{ given by } \alpha_{,v}(u) = \alpha(u,v)$$ is linear for each fixed $v \in V$ and the map $$\alpha_{u,}:V\to W \text{ given by } \alpha_{u,}(v) = \alpha(u,v)$$ is linear for each fixed $u \in U.$

It's my understanding that this definition is simply saying that a bilinear function is linear in each argument with the other held fixed. However, the initial "special case" appears to not be a bilinear function at all, for fixing $y$ we have

$$(cx, y) \mapsto uc^2 x^2 + 2vcxy + wy^2 \neq c(ux^2 + 2vxy + wy^2)$$

for some scalar $c$.

What am I missing? Or have I completely forgotten the definition of linearity?
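For what it's worth, the failure of linearity above is easy to check numerically. A minimal sketch in Python, with arbitrary (hypothetical) values for $u, v, w$ and the scalar $c$:

```python
# Numeric check that Q(x, y) = u*x^2 + 2*v*x*y + w*y^2 is NOT linear in its
# first argument. The values of u, v, w, x, y, c are arbitrary choices.
u, v, w = 1.0, 2.0, 3.0

def Q(x, y):
    return u * x**2 + 2 * v * x * y + w * y**2

x, y, c = 1.0, 1.0, 2.0
print(Q(c * x, y))      # scaling the first argument first -> 15.0
print(c * Q(x, y))      # what linearity in x would demand  -> 16.0
```

Since $15 \neq 16$, the map is indeed not linear in $x$ with $y$ held fixed.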


Best answer:

The wording "...are special cases..." seems rather misleading here, but there is a sensible explanation for what's going on:

Given any vector space $\Bbb V\newcommand{\bfx}{{\bf x}}\newcommand{\bfy}{{\bf y}}$ over a field $\Bbb F$ and a bilinear function $B: \Bbb V \times \Bbb V \to \Bbb F$, we can build a quadratic form $Q : \Bbb V \to \Bbb F$ by defining $$Q(\bfx) := B(\bfx, \bfx).$$

This map is not linear: Indeed, as you've observed (in a special case, see below), we have for $a \in \Bbb F$ that $$Q(a \bfx) = B(a \bfx, a \bfx) = a^2 B(\bfx, \bfx) = a^2 Q(\bfx) .$$ Furthermore, one cannot in general determine $Q(\bfx + \bfy)$ in terms of $Q(\bfx)$ and $Q(\bfy)$ alone.
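Both failures can be made concrete. The sketch below builds $B$ from an arbitrary (hypothetical) symmetric matrix on $\Bbb R^2$ and checks the quadratic scaling, and then exhibits two vectors with the same $Q$-value that give different values of $Q(\bfx + \bfy)$:

```python
# Sketch: a bilinear form B on R^2 built from an arbitrary (hypothetical)
# matrix A, with associated quadratic form Q(x) = B(x, x).
A = [[1.0, 2.0],
     [2.0, 3.0]]

def B(x, xp):
    # B(x, x') = (x')^T A x, written out componentwise
    return sum(xp[i] * A[i][j] * x[j] for i in range(2) for j in range(2))

def Q(x):
    return B(x, x)

# Q scales quadratically, not linearly:
x, a = [1.0, -2.0], 3.0
ax = [a * xi for xi in x]
assert Q(ax) == a**2 * Q(x)

# Q(x + y) is not determined by Q(x) and Q(y) alone: two vectors with the
# same Q-value can give different values of Q(x + y).
y1, y2 = [0.0, 1.0], [0.0, -1.0]
assert Q(y1) == Q(y2)
s1 = [xi + yi for xi, yi in zip([1.0, 0.0], y1)]
s2 = [xi + yi for xi, yi in zip([1.0, 0.0], y2)]
assert Q(s1) != Q(s2)
```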

Now, for the "special cases":

Example Consider the bilinear form $B: \Bbb F^2 \times \Bbb F^2 \to \Bbb F$ given by $$B(\bfx, \bfx') := (\bfx')^T \pmatrix{u&v\\v&w} \bfx . $$ (Indeed, on any finite-dimensional vector space over $\Bbb F$, once a basis is chosen, any bilinear form has the form $B(\bfx, \bfy) = \bfy^T A \bfx$ for some unique matrix $A$.) Its associated quadratic form is $$Q(\bfx) = \bfx^T \pmatrix{u&v\\v&w} \bfx ,$$ and if we write the components of $\bfx$ as $$\bfx = \pmatrix{x\\y} ,$$ then multiplying out the matrices gives $$Q\pmatrix{x\\y} = u x^2 + 2 v xy + w y^2 .$$ In other words, the maps $$(x, y) \mapsto u x^2 + 2 v xy + w y^2$$ are quadratic forms induced by bilinear forms on $\Bbb F^2$.
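To see the expansion concretely, here is a small Python sketch (with hypothetical values for $u, v, w$) checking that $B(\bfx, \bfx)$ agrees with $u x^2 + 2 v x y + w y^2$:

```python
# Sketch of the example: the symmetric matrix [[u, v], [v, w]] (with
# hypothetical values for u, v, w) induces a bilinear form B, and B(x, x)
# expands to u*x^2 + 2*v*x*y + w*y^2.
u, v, w = 1.0, 2.0, 3.0
M = [[u, v],
     [v, w]]

def B(x, xp):
    # (x')^T M x, written out componentwise
    return sum(xp[i] * M[i][j] * x[j] for i in range(2) for j in range(2))

x, y = 1.5, -0.5
vec = [x, y]
assert B(vec, vec) == u * x**2 + 2 * v * x * y + w * y**2
```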

If a bilinear form $B$ is symmetric, that is, if $$B(\bfx, \bfy) = B(\bfy, \bfx)$$ for all $\bfx, \bfy \in \Bbb V$, we can recover $B$ from the quadratic form it defines by $$B(\bfx, \bfy) = \frac{1}{2}[Q(\bfx + \bfy) - Q(\bfx) - Q(\bfy)] ,$$ at least provided that $\operatorname{char} \Bbb F \neq 2$.
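A quick numerical check of this polarization identity over $\Bbb R$ (where $\operatorname{char} \Bbb F \neq 2$ certainly holds), using a hypothetical symmetric matrix:

```python
# Sketch of the polarization identity: recovering a symmetric bilinear form
# from its quadratic form. The symmetric matrix M is a hypothetical example.
M = [[1.0, 2.0],
     [2.0, 3.0]]

def B(x, y):
    return sum(y[i] * M[i][j] * x[j] for i in range(2) for j in range(2))

def Q(x):
    return B(x, x)

def B_from_Q(x, y):
    # B(x, y) = (1/2) * [Q(x + y) - Q(x) - Q(y)]
    s = [x[0] + y[0], x[1] + y[1]]
    return 0.5 * (Q(s) - Q(x) - Q(y))

x, y = [1.0, -2.0], [0.5, 3.0]
assert B_from_Q(x, y) == B(x, y)
```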

Example (continued) Using this formula, the bilinear form recovered from our $Q$ is $$B(\bfx, \bfx') = (\bfx')^T \pmatrix{u&v\\v&w} \bfx ,$$ and in terms of the components of $\bfx$ and $\bfx'$ this is (in analogy with the presentation in which the quadratic form was given) $$((x, y), (x', y')) \mapsto u x x' + v(x y' + y x') + w y y' .$$

(If $B$ is not symmetric, this construction instead recovers the symmetrization of $B$, namely, the map $(\operatorname{Sym} B)(\bfx, \bfy) := \frac{1}{2}[B(\bfx, \bfy) + B(\bfy, \bfx)]$, and it's easy to check that this map is a symmetric bilinear form.)
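This, too, is easy to check numerically. The sketch below uses a hypothetical non-symmetric matrix and confirms that polarization produces $\operatorname{Sym} B$ rather than $B$:

```python
# Sketch: for a NON-symmetric bilinear form (hypothetical matrix N below),
# polarizing the quadratic form recovers Sym B, not B itself.
N = [[0.0, 5.0],
     [1.0, 2.0]]

def B(x, y):
    return sum(y[i] * N[i][j] * x[j] for i in range(2) for j in range(2))

def Q(x):
    return B(x, x)

def sym_B(x, y):
    # (Sym B)(x, y) = (1/2) * [B(x, y) + B(y, x)]
    return 0.5 * (B(x, y) + B(y, x))

def polarized(x, y):
    s = [x[0] + y[0], x[1] + y[1]]
    return 0.5 * (Q(s) - Q(x) - Q(y))

x, y = [1.0, 0.0], [0.0, 1.0]
assert B(x, y) != B(y, x)                 # B is not symmetric
assert polarized(x, y) == sym_B(x, y)     # polarization gives Sym B
```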

A bilinear form is symmetric if and only if its matrix $A$ with respect to some (equivalently, any) basis is symmetric. Since the matrices of the form $$\pmatrix{u&v\\v&w}$$ are precisely the symmetric $2 \times 2$ matrices over $\Bbb F$, the "special cases" in fact correspond precisely to the symmetric bilinear forms on $\Bbb F^2$.

Another answer:

People seem to have trouble making this concrete. Over the reals, in finite dimensions, it works like this: suppose we have a quadratic form on $\mathbb R^n$, $$ Q(x_1, x_2, \ldots, x_n) = A x_1^2 + B x_1 x_2 + C x_2^2 + \cdots, $$ where every term is a constant multiple of some $x_k^2$ or of some $x_i x_j.$

Alright, calculate the Hessian matrix of second partial derivatives of $Q,$ and call this matrix $H.$ I was surprised to learn it was named after a person (https://en.wikipedia.org/wiki/Otto_Hesse); for decades I assumed it had something to do with the Hessian soldiers of the American Revolutionary War, 1775-1783.

Then make a column vector $$ v = \left( \begin{array}{c} x_1 \\ x_2 \\ \vdots \\ x_n \end{array} \right) $$ with row vector $v^T,$ its transpose.

We get $$ Q(v) = \frac{1}{2} \, v^T H v $$ and $$ B(v,w) = \frac{1}{2} \, v^T H w = \frac{1}{2} \, w^T H v . $$ In both cases these are $1 \times 1$ matrices, that is, numbers. In particular, the transpose of a $1 \times 1$ matrix is itself, which (together with the symmetry of $H$) is why $v^T H w = w^T H v.$
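Here is a minimal sketch of that recipe in two variables. For $Q(x_1, x_2) = a x_1^2 + b x_1 x_2 + c x_2^2$ (coefficients renamed to lowercase, with hypothetical values), the Hessian of second partials is $H = \pmatrix{2a & b \\ b & 2c},$ and $\frac{1}{2} v^T H v$ recovers $Q$:

```python
# Sketch over R^2: for Q(x1, x2) = a*x1^2 + b*x1*x2 + c*x2^2 (hypothetical
# coefficients), the Hessian of second partials is H = [[2a, b], [b, 2c]].
a, b, c = 1.0, 4.0, 3.0
H = [[2 * a, b],
     [b, 2 * c]]

def Q(x1, x2):
    return a * x1**2 + b * x1 * x2 + c * x2**2

def half_vT_H_w(v, w):
    # (1/2) v^T H w, written out componentwise
    return 0.5 * sum(v[i] * H[i][j] * w[j] for i in range(2) for j in range(2))

v, w = [1.0, 2.0], [-3.0, 0.5]
assert half_vT_H_w(v, v) == Q(*v)               # (1/2) v^T H v recovers Q
assert half_vT_H_w(v, w) == half_vT_H_w(w, v)   # B is symmetric in v, w
```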