Let V be a finite-dimensional real vector space with a positive definite inner product, and let A be a symmetric operator on this space


I want to know how I can generalize this exercise. Let $V$ be a finite-dimensional real vector space with a positive definite inner product, and let $A$ be a symmetric operator on this space.

a) With $g(v,w)=\langle Av,w\rangle$, show that $g$ is a symmetric bilinear form.

I suppose that $\langle Av,w\rangle=\sum_{i=1}^{n}Av_{i}w_{i}$, and then:

i) $g(v+v',w)=\sum_{i=1}^{n}A(v_{i}+v_{i}')w_{i}=\sum_{i=1}^{n}Av_{i}w_{i}+Av_{i}'w_{i}=\sum_{i=1}^{n}Av_{i}w_{i}+\sum_{i=1}^{n}Av_{i}'w_{i}=g(v,w)+g(v',w)$

ii) $g(\lambda v ,w)=\sum_{i=1}^{n}\lambda Av_{i}w_{i}=\lambda \sum_{i=1}^{n}Av_{i}w_{i}=\lambda g(v,w)$

iii) $g(v,w+w')=\sum_{i=1}^{n}Av_{i}(w_{i}+w_{i}')=\sum_{i=1}^{n}Av_{i}w_{i}+Av_{i}w_{i}'=\sum_{i=1}^{n}Av_{i}w_{i}+\sum_{i=1}^{n}Av_{i}w_{i}'=g(v,w)+g(v,w')$

iv) $g(v,\lambda w)=\sum_{i=1}^{n} Av_{i}\lambda w_{i}=\lambda \sum_{i=1}^{n}Av_{i}w_{i}=\lambda g(v,w)$

v) $g(v,w)=\sum_{i=1}^{n}Av_{i}w_{i}=\sum_{i=1}^{n}Aw_{i}v_{i}=g(w,v)$
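Just to check my reasoning numerically (this is not part of the exercise), here is a small sketch assuming $V=\mathbb{R}^n$ with the standard dot product and a randomly chosen symmetric matrix $A$; it only confirms that $g(v,w)=\langle Av,w\rangle$ behaves like a symmetric bilinear form in that special case.

```python
import numpy as np

# Sanity check only (not part of the exercise): take V = R^n with the
# standard dot product and a random symmetric matrix A, and verify that
# g(v, w) = <Av, w> behaves like a symmetric bilinear form.
rng = np.random.default_rng(0)
n = 4
M = rng.standard_normal((n, n))
A = (M + M.T) / 2                    # a symmetric operator on R^n

def g(v, w):
    return (A @ v) @ w               # <Av, w> with the standard dot product

v, v2, w = rng.standard_normal((3, n))
lam = 2.5

assert np.isclose(g(lam * v + v2, w), lam * g(v, w) + g(v2, w))  # linear in the first slot
assert np.isclose(g(v, lam * w + v2), lam * g(v, w) + g(v, v2))  # linear in the second slot
assert np.isclose(g(v, w), g(w, v))                              # symmetric
```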

b) If $g$ is a symmetric bilinear form, show that there exists a symmetric operator $A$ such that $g(v,w)=\langle Av,w\rangle$. In this part I don't know what I need to do, so if someone can tell me what I'm supposed to do here, I'll be grateful.

For a), this was what I did, but I was never given $\langle v,w\rangle$ explicitly; I just assumed a form for it. I want to generalize this process.




For part a) you don't need to (and shouldn't) suppose that the inner product is any more specific than being a positive definite symmetric bilinear form on $\ V\ $, which it must be by definition. You can, if you wish (although it's not necessary), express it in terms of the coordinates of the vectors in $\ V\ $ with respect to any basis, $\ u_1, u_2,\dots, u_n\ $, of $\ V\ $. This is what you appear to be trying to do when you write $$ \langle Av,w\rangle=\sum_{i=1}^nAv_iw_i\ , $$ but the expression on the right of this equation is incorrect (in fact, I can't see how to make any sense of it). If $\ v_i\ $ and $\ w_i\ $ are the coordinates of $\ v\ $ and $\ w\ $ with respect to the basis $\ u_1, u_2,\dots, u_n\ $ $\big($that is, $\ v=\sum\limits_{i=1}^nv_iu_i\ $ and $\ w=\sum\limits_{i=1}^nw_iu_i\big)$, then you will have $$ \langle Av,w\rangle=\sum_{i=1}^n\sum_{j=1}^n\big\langle Au_i,u_j\big\rangle v_iw_j\ . $$ The matrix which has the real number $\ a_{ij}=\langle Au_i,u_j\rangle\ $ as the entry in its $\ i^{\,\text{th}}\ $ row and $\ j^{\,\text{th}}\ $ column is the matrix of the bilinear form $\ v,w\mapsto \langle Av,w\rangle\ $ with respect to this basis (when the basis is orthonormal, it is also the matrix of the symmetric operator $\ A\ $ itself). It must be symmetric if the bilinear form is, and by Sylvester's law of inertia it must have the same inertia as $\ A\ $ (the same numbers of positive, negative, and zero eigenvalues), but it can otherwise be arbitrary: if $\ A\ $ is a symmetric operator, and $\ B\ $ is any symmetric matrix with entries $\ b_{ij}\ $ and the same inertia as $\ A\ $, then it's always possible to choose a basis $\ u_1, u_2,\dots, u_n\ $ such that $\ \langle Au_i,u_j\rangle=b_{ij}\ $.
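Here is a quick numerical sketch of the coordinate formula above, assuming $V=\mathbb{R}^n$ with the standard dot product, a random symmetric matrix $A$, and a random (generally non-orthonormal) basis stored as the columns of a matrix `U`; all of these choices are just illustrative.

```python
import numpy as np

# Illustrative check of <Av, w> = sum_{i,j} <A u_i, u_j> v_i w_j, assuming
# V = R^n with the standard dot product; u_1,...,u_n is an arbitrary
# (not necessarily orthonormal) basis, stored as the columns of U.
rng = np.random.default_rng(1)
n = 4
M = rng.standard_normal((n, n))
A = (M + M.T) / 2                   # symmetric operator
U = rng.standard_normal((n, n))     # columns u_1,...,u_n (almost surely a basis)

B = U.T @ A @ U                     # B[i, j] = <A u_i, u_j>

v_coords = rng.standard_normal(n)   # coordinates of v in the basis u_1,...,u_n
w_coords = rng.standard_normal(n)
v = U @ v_coords                    # v = sum_i v_i u_i
w = U @ w_coords                    # w = sum_j w_j u_j

lhs = (A @ v) @ w                   # <Av, w>
rhs = v_coords @ B @ w_coords       # the double sum over i and j
assert np.isclose(lhs, rhs)
```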

It's not necessary, however, to use a coordinate representation to prove either part a) or part b). In fact, it's simpler to prove part a) just by appealing to the bilinearity and symmetry of the inner product and the linearity and symmetry of $\ A\ $: \begin{align} \big\langle A\big(\lambda_1v_1+\lambda_2v_2\big),w\big\rangle&=\big\langle\lambda_1Av_1+\lambda_2Av_2,w\big\rangle\\ &=\lambda_1\big\langle Av_1, w\big\rangle+\lambda_2\big\langle Av_2, w\big\rangle \end{align} by the linearity of $\ A\ $ and the linearity of the inner product in its first argument, $$ \big\langle Av,\lambda_1w_1+\lambda_2w_2\big\rangle=\lambda_1\big\langle Av, w_1\big\rangle+\lambda_2\big\langle Av, w_2\big\rangle $$ by the linearity of the inner product in its second argument, and \begin{align} \langle Av,w\rangle&=\langle v,Aw\rangle\\ &=\langle Aw,v\rangle\ , \end{align} by the symmetry of $\ A\ $ and the symmetry of the inner product.
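To emphasise that this argument uses only the abstract properties of the inner product and of $A$, here is a small sketch with a non-standard inner product $\langle x,y\rangle=x^\mathsf{T}Gy$ on $\mathbb{R}^n$ ($G$ positive definite) and an operator that is symmetric with respect to it; the particular construction $A=G^{-1}S$ with $S$ symmetric is just one convenient way to produce such an operator.

```python
import numpy as np

# Illustrative check that the proof only needs abstract properties: use a
# non-standard inner product <x, y> = x^T G y on R^n (G positive definite)
# and an operator A that is symmetric with respect to it, i.e. <Ax, y> = <x, Ay>.
rng = np.random.default_rng(2)
n = 4
P = rng.standard_normal((n, n))
G = P @ P.T + n * np.eye(n)          # positive definite Gram matrix
S = rng.standard_normal((n, n))
S = (S + S.T) / 2
A = np.linalg.solve(G, S)            # A = G^{-1} S is symmetric w.r.t. <.,.>

def ip(x, y):                        # the inner product <x, y>
    return x @ G @ y

v, w = rng.standard_normal((2, n))
assert np.isclose(ip(A @ v, w), ip(v, A @ w))   # symmetry of A
assert np.isclose(ip(A @ v, w), ip(A @ w, v))   # hence g(v, w) = g(w, v)
```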

Although you can also prove part b) without using a coordinate representation, I think it's probably more straightforward to prove it by using one. Let $\ e_1,e_2,\dots,e_n\ $ be an orthonormal basis for $\ V\ $ (that is, $\ \big\langle e_i,e_i\big\rangle=1\ $ and $\ \big\langle e_i,e_j\big\rangle=0\ $ if $\ j\ne i\ $). Such a basis can always be constructed from any arbitrary basis with the Gram-Schmidt procedure. Given a symmetric bilinear form $\ g:V\times V\rightarrow\mathbb{R}\ $, for any vector $\ v=\sum\limits_{i=1}^nv_ie_i\in V\ $ define $$ Av=\sum_{k=1}^n \sum_{i=1}^nv_ig\big(e_i,e_k\big)e_k\ . $$ This defines a linear operator $\ A\ $ on $\ V\ $, and if $\ v=\sum\limits_{i=1}^nv_ie_i\ $ and $\ w=\sum\limits_{j=1}^nw_je_j\ $ are arbitrary vectors in $\ V\ $, then \begin{align} \langle Av,w\rangle&=\left\langle \sum_{k=1}^n \sum_{i=1}^nv_ig\big(e_i,e_k\big)e_k,\sum_{j=1}^nw_je_j\right\rangle\\ &=\sum_{j=1}^n\sum_{k=1}^n \sum_{i=1}^nv_ig\big(e_i,e_k\big)w_j\big\langle e_k,e_j\big\rangle\\ &=\sum_{j=1}^n\sum_{i=1}^nv_iw_jg\big(e_i,e_j \big)\\ &=g\left(\sum_{i=1}^nv_ie_i, \sum_{j=1}^nw_je_j\right)\\ &=g(v,w)\ . \end{align} I'll leave it for you to prove that $\ A\ $ is symmetric whenever $\ g\ $ is. All you need to do is replace $\ g\big(e_i,e_k \big)\ $ with $\ g\big(e_k,e_i \big)\ $ on the right side of the first equation in the above proof and rearrange the terms of the resulting expression to show that it's equal to $\ \langle v,Aw\rangle\ $.
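Here is a minimal sketch of this construction, assuming $V=\mathbb{R}^n$ with the standard dot product, the standard basis as the orthonormal basis, and a symmetric bilinear form $g$ represented by a symmetric matrix `C` (all of these are illustrative choices, not part of the statement); it builds $A$ from the numbers $g(e_i,e_k)$ and checks that $\langle Av,w\rangle=g(v,w)$ and that $A$ is symmetric.

```python
import numpy as np

# Illustrative construction for part b), assuming V = R^n with the standard
# dot product and the standard basis e_1,...,e_n as the orthonormal basis.
# Start from a symmetric bilinear form g (here g(x, y) = x^T C y with C
# symmetric) and build A as in the answer:
#   A v = sum_k sum_i v_i g(e_i, e_k) e_k,  i.e.  (Av)_k = sum_i g(e_i, e_k) v_i.
rng = np.random.default_rng(3)
n = 4
C = rng.standard_normal((n, n))
C = (C + C.T) / 2                    # a symmetric bilinear form g

def g(x, y):
    return x @ C @ y

# Matrix of A: its (k, i) entry is g(e_i, e_k).
E = np.eye(n)
A = np.array([[g(E[:, i], E[:, k]) for i in range(n)] for k in range(n)])

v, w = rng.standard_normal((2, n))
assert np.isclose((A @ v) @ w, g(v, w))      # <Av, w> = g(v, w)
assert np.isclose((A @ v) @ w, v @ (A @ w))  # A is symmetric
```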