Existence of a particular basis for a finite-dimensional subspace of functions


There is this question that I saw, but have no idea how to solve it completely:

Let $X$ be a set, and consider the $\mathbb{C}$-vector space $\operatorname{Maps}(X, \mathbb{C})$ of $\mathbb{C}$-valued functions on $X$. Let $V \subseteq \operatorname{Maps}(X, \mathbb{C})$ be any finite-dimensional subspace of dimension $n$. Show that there exist points $x_1, \dots, x_n$ in $X$ and a $\mathbb{C}$-basis $f_1, \dots, f_n$ of $V$ such that for all $i, j \in \{1, \dots, n\}$, one has

$$f_i(x_j)=\delta_{ij}=\begin{cases}1 & \text{if } i=j, \\ 0 & \text{if } i \neq j. \end{cases}$$

My thinking process: since $V$ has dimension $n$, I can find $n$ elements of $V$ that form a basis. Since $V$ consists of functions from $X$ to $\mathbb{C}$, these $n$ functions, labelled $g_1,\dots, g_n$, are linearly independent.

Now suppose I can find a set of $n$ points in $X$ (I have no idea how yet) and arrange the values $g_i(x_j)$ into an $n \times n$ matrix. If I can show that this matrix is invertible, then, since $V$ is a vector space and linear combinations of the $g_i$ stay in $V$, and the matrix can be reduced to the identity matrix (because it is invertible), I would get what I need. However, how do I prove the parts that are missing? Or is this even the correct approach?
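To make my plan concrete (assuming the points were somehow found): if the matrix $M$ with entries $M_{ij}=g_i(x_j)$ is invertible, I could set $f_i=\sum_j (M^{-1})_{ij}\, g_j$, and then
$$ f_i(x_k)=\sum_j (M^{-1})_{ij}\, g_j(x_k)=\sum_j (M^{-1})_{ij}\, M_{jk}=(M^{-1}M)_{ik}=\delta_{ik}, $$
so invertibility of the value matrix is exactly the step I am missing.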

Edit: formatting.

Best answer:

Assume that $\dim V=n$ and that $g_1,\ldots,g_n\in V$ are linearly independent.

Claim: There exist $x_1,\ldots,x_n\in X$ such that if $f,g\in V $ and $f(x_j)=g(x_j)$ for all $j$, then $f=g$ (proof at the end).

Using the claim, the $n\times n$ matrix $A=(g_k(x_j))$ is invertible (if it were not, there would be a nonzero linear combination $r_1g_1+\cdots+r_ng_n$ vanishing at every $x_j$; by the claim it would be the zero function, contradicting the linear independence of $g_1,\ldots,g_n$). Let $B$ be the inverse of $A$.

Now the equality $BA=I$ can be read as $$ \sum_\ell b_{k \ell}g_\ell(x_j)=\delta_{kj}=\delta_k(x_j), $$ where $\delta_k$ denotes a function that is $1$ at $x_k$ and $0$ at the other points $x_j$ (nothing is asserted about its values elsewhere on $X$). By the claim, $$ \delta_k=\sum_\ell b_{k \ell}g_\ell\in V. $$ The functions $\delta_1,\ldots,\delta_n$ are clearly linearly independent (evaluate a vanishing linear combination at each $x_j$), and $n$ linearly independent elements of the $n$-dimensional space $V$ form a basis.
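The construction above can be checked numerically. Here is a sketch with an example of my own choosing (not from the post): $X = \mathbb{R}$ and $V$ the polynomials of degree $< n$, with basis $g_\ell(x) = x^\ell$.

```python
import numpy as np

# Example (my own, not from the answer): X = R, V = polynomials of degree < n,
# basis g_l(x) = x^l for l = 0, ..., n-1.
n = 4
points = np.array([0.0, 1.0, 2.0, 3.0])    # x_1, ..., x_n: any distinct reals work here

# A[l, j] = g_l(x_j), the evaluation matrix of the answer (here a Vandermonde matrix,
# invertible because the points are distinct).
A = np.vander(points, n, increasing=True).T

B = np.linalg.inv(A)                        # B A = I

# delta_k = sum_l B[k, l] * g_l; its values at the chosen points are the rows of B @ A.
assert np.allclose(B @ A, np.eye(n))        # delta_k(x_j) = Kronecker delta

# delta_0 is a genuine polynomial in V: at a point not among x_1, ..., x_n it is
# generally neither 0 nor 1, illustrating the caveat "nothing elsewhere is asserted".
x = 0.5
print(sum(B[0, l] * x**l for l in range(n)))
```

For this particular $V$ the $\delta_k$ are the Lagrange interpolation basis for the nodes $x_1,\dots,x_n$.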


Proof of the claim: by linearity, we may prove this version of the claim:

There exist $x_1,\ldots,x_n\in X$ such that if $f\in V$ and $f(x_j)=0$ for all $j$, then $f=0$.

Assume that the negation of the claim holds: for every $x_1,\ldots,x_n\in X$ there exists $f\in V$ with $f(x_j)=0$ for all $j$ but $f\ne0$.

Since $V$ is nonzero, there exist $h_0\in V$ and $x_1\in X$ with $h_0(x_1)=1$ (take any nonzero function and rescale). Now apply the negation of the claim to $x_1,\ldots,x_1$: there exists a nonzero $h_1\in V$ with $h_1(x_1)=0$, and, after rescaling, a point $x_2\in X$ with $h_1(x_2)=1$. Next apply the negation to $x_1,x_2,\ldots,x_2$: there exists $h_2\in V$ with $h_2(x_1)=h_2(x_2)=0$ and, again after rescaling, $x_3\in X$ with $h_2(x_3)=1$. Continuing in the same fashion, we end up with $x_1,\ldots,x_{n+1}$ and $h_0,h_1,\ldots,h_n\in V$ such that, for each $k$, $$ h_k(x_1)=\cdots=h_k(x_k)=0,\qquad h_k(x_{k+1})=1. $$ The functions $h_0,\ldots,h_n$ are linearly independent: if $\sum_{k=0}^n\lambda_kh_k=0$, evaluating successively at $x_1,\ldots,x_{n+1}$ gives $\lambda_0=\cdots=\lambda_n=0$.
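The claim also has a constructive reading: scan points of $X$ and keep $x$ whenever it enlarges the rank of the evaluation matrix $(g_\ell(x_j))$; linear independence of the $g_\ell$ as functions guarantees rank $n$ is eventually reached. A minimal sketch (the function name `find_points` and the candidate set are my own, for illustration only):

```python
import numpy as np

def find_points(basis, candidates, n):
    """Greedily pick points at which evaluation is injective on span(basis)."""
    chosen, rows = [], []
    for x in candidates:
        row = [g(x) for g in basis]
        # Keep x only if its row of values is independent of the rows so far.
        if np.linalg.matrix_rank(np.array(rows + [row])) > len(rows):
            rows.append(row)
            chosen.append(x)
            if len(chosen) == n:
                break
    return chosen

# Example: V = span(1, x, x^2) on X = R, scanning 50 sample points.
basis = [lambda x: 1.0, lambda x: x, lambda x: x * x]
pts = find_points(basis, np.linspace(-1.0, 1.0, 50), 3)
print(pts)  # three points; the matrix (g_l(x_j)) built from them is invertible
```

This mirrors the proof: as long as fewer than $n$ points are chosen, some nonzero function in $V$ vanishes at all of them, and a point where it does not vanish strictly increases the rank.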

As $V$ is $n$-dimensional, we have obtained a contradiction, which proves the claim.