I know that a Gram matrix is defined as $G = V^*V$, where $G = (\langle x_i , x_j \rangle)_{i,j=1}^n$ with $x_1,\ldots,x_n \in H$ for some inner product space $H$, say a Hilbert space. I'm confused about how to think of this definition when we talk about infinite-dimensional Hilbert spaces, say $L^2[a,b]$. If I take a finite subset $(f_i)_{i=1}^n \subset L^2$, then constructing a matrix of inner products between the elements is straightforward, but what does $V^*V$ actually mean now? Does this definition still hold, and how can I decompose $G$?
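For concreteness, the "straightforward" finite construction can be sketched numerically. This is a minimal illustration, not part of the original question: the functions $\sin$, $\cos$, and $t\mapsto t$ on $[0,\pi]$ are arbitrary sample choices, and the $L^2$ inner products are approximated by a trapezoid rule.

```python
import numpy as np

# A finite subset of L^2[0, pi]: f1 = sin, f2 = cos, f3 = identity
# (illustrative choices), sampled on a fine quadrature grid.
x = np.linspace(0.0, np.pi, 10001)
dx = x[1] - x[0]
fs = [np.sin(x), np.cos(x), x]

def inner(u, v):
    # Trapezoid-rule approximation of <u, v> = integral of conj(u) * v.
    w = np.conj(u) * v
    return np.sum((w[:-1] + w[1:]) / 2) * dx

# Gram matrix built entry-by-entry from inner products.
G = np.array([[inner(fi, fj) for fj in fs] for fi in fs])

# Any Gram matrix is Hermitian positive semidefinite:
print(np.linalg.eigvalsh(G))  # all eigenvalues >= 0 (up to quadrature error)
```

Here, e.g., $\langle\sin,\sin\rangle = \pi/2$ and $\langle\sin,\cos\rangle = 0$ on $[0,\pi]$, which the computed `G` reproduces to quadrature accuracy.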
Gram matrix for elements of an infinite-dimensional Hilbert space
699 views, asked by Bumbble Comm
For a separable infinite-dimensional Hilbert space $(\mathcal H,\langle\,\cdot,\cdot\,\rangle_{\mathcal H})$ (such as $L^2[a,b]$), this matrix decomposition still works if one allows matrices of infinite size. Every such Hilbert space $\mathcal H$ has a countable orthonormal basis $(g_n)_{n\in\mathbb N}$, so it is isometrically isomorphic to the Hilbert space $\ell_2$ of square-summable sequences via the linear map
$$
\phi:\mathcal H\to\ell_2,\qquad x\mapsto\begin{pmatrix}\langle g_1,x\rangle_{\mathcal H}\\\langle g_2,x\rangle_{\mathcal H}\\\vdots\end{pmatrix}
$$
(this can be seen via the basis expansion and Parseval's identity).

Now $\phi(x)$ is a vector of infinite length, so we can form the "$\infty\times n$" matrix $V$ generated by $f_1,\ldots,f_n\in\mathcal H$ via
$$
V=\begin{pmatrix}\phi(f_1)&\cdots&\phi(f_n)\end{pmatrix}=\begin{pmatrix} \langle g_1,f_1\rangle&\langle g_1,f_2\rangle&\cdots&\langle g_1,f_n\rangle\\\langle g_2,f_1\rangle&\cdots&\cdots&\langle g_2,f_n\rangle\\\vdots&\cdots&\cdots&\vdots\\ \end{pmatrix}.
$$
Then one has
$$
V^\dagger V=\begin{pmatrix}(\phi(f_1))^\dagger\\\vdots\\(\phi(f_n))^\dagger\end{pmatrix}\begin{pmatrix}\phi(f_1)&\cdots&\phi(f_n)\end{pmatrix}=\big( (\phi(f_i))^\dagger\phi(f_j)\big)_{i,j=1}^n\in\mathbb C^{n\times n},
$$
where
$$
(\phi(f_i))^\dagger\phi(f_j)=\langle\phi(f_i),\phi(f_j)\rangle_{\ell_2}=\sum_{k\in\mathbb N}\overline{\langle g_k,f_i\rangle}\langle g_k,f_j\rangle=\Big\langle\sum_{k\in\mathbb N}\langle g_k,f_i\rangle g_k,f_j\Big\rangle=\langle f_i,f_j\rangle,
$$
as is readily verified using the basis expansion in $\mathcal H$. In total, $G=V^\dagger V$.

This construction still works if one takes countably many vectors $(f_n)_{n\in\mathbb N}$ from $\mathcal H$; the resulting Gram matrix $G=(\langle f_i,f_j\rangle)_{i,j\in\mathbb N}$ is then of infinite size, although the concept of the Gram determinant becomes more involved in that case.
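The identity $G=V^\dagger V$ can be checked numerically. The following is a minimal sketch in $L^2[-1,1]$ (my choices here, not the original poster's): the orthonormal basis $(g_k)$ is taken to be the normalized Legendre polynomials, and $f_1(t)=t$, $f_2(t)=t^2$ are illustrative. Since these $f_i$ are polynomials of degree $\le 2$, truncating the "infinite" rows of $V$ at a small $K$ already reproduces the exact Gram matrix up to quadrature error.

```python
import numpy as np

# Two functions in L^2[-1, 1] (illustrative), on a fine quadrature grid.
x = np.linspace(-1.0, 1.0, 20001)
dx = x[1] - x[0]
f = [x, x**2]                                  # f1(t) = t, f2(t) = t^2

def inner(u, v):
    # Trapezoid rule for <u, v> = integral of conj(u) * v,
    # linear in the second argument (matching the answer's convention).
    w = np.conj(u) * v
    return np.sum((w[:-1] + w[1:]) / 2) * dx

# Gram matrix directly from inner products in L^2.
G = np.array([[inner(fi, fj) for fj in f] for fi in f])

# Orthonormal basis g_k = sqrt(k + 1/2) * P_k (normalized Legendre polynomials).
K = 6                                          # truncation of the "infinite" index
g = [np.sqrt(k + 0.5) * np.polynomial.legendre.Legendre.basis(k)(x)
     for k in range(K)]

# Truncated "infinity x n" matrix V with entries V[k, i] = <g_k, f_i>.
V = np.array([[inner(gk, fi) for fi in f] for gk in g])

# V^dagger V reproduces the Gram matrix.
print(np.max(np.abs(V.conj().T @ V - G)))      # tiny (quadrature error only)
```

Here the exact Gram matrix is $\operatorname{diag}(2/3,\,2/5)$, and the coefficient matrix $V$ recovers it through $V^\dagger V$; for non-polynomial $f_i$ one would instead see the discrepancy shrink as the truncation level $K$ grows.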
Footnote: as a mathematical physicist, I'm used to the inner product being linear in the second argument. Of course, all of this still holds if the inner product is defined to be linear in the first argument as usual in pure mathematics, by changing $\langle g_i,x\rangle$ to $\langle x,g_i\rangle$ etc.