Linear algebra exam preparation


I have been practicing linear algebra and would appreciate clarifications on my attempts below; my exam is coming up.

  1. Let $A$ be a matrix over $\mathbb R$ of size $2\times 3$ that is not the zero matrix. Then $\text{rank}(A)=2$. [True or False]

False. For example, $A=\pmatrix{1 & 1 & 1 \\ 0 & 0 & 0}$ is nonzero but has $\text{rank}(A)=1$. $\blacksquare$
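As a quick sanity check of the counterexample (a sketch assuming SymPy is available):

```python
from sympy import Matrix

# Counterexample: a nonzero 2x3 matrix whose rank is 1, not 2.
A = Matrix([[1, 1, 1],
            [0, 0, 0]])

rank_A = A.rank()
print(rank_A)  # 1
```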

  2. Let $U_1, U_2$ be subspaces of $\mathbb R^3$ with $\dim U_1=\dim U_2=2$ and $U_1\neq U_2$. Then $U_1\cap U_2$ is a subspace of dimension $1$. [True or False]

I think it's true here, but I was unsure how to write the proof. Here is an attempt: since $U_1\neq U_2$ and both have dimension $2$, the sum $U_1+U_2$ strictly contains $U_1$, so $\dim(U_1+U_2)\ge 3$; as $U_1+U_2\subseteq\mathbb R^3$, in fact $\dim(U_1+U_2)=3$. By the dimension (Grassmann) formula, $$\dim(U_1\cap U_2)=\dim U_1+\dim U_2-\dim(U_1+U_2)=2+2-3=1.$$ Since an intersection of subspaces is always a subspace, the statement is true.
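The dimension count can be illustrated numerically for one concrete pair of planes (an illustrative choice of bases, assuming SymPy), using that $\dim(U_1+U_2)$ equals the rank of the stacked basis vectors:

```python
from sympy import Matrix

# Two distinct 2-dimensional subspaces of R^3 (illustrative bases).
U1 = Matrix([[1, 0, 0],
             [0, 1, 0]])   # rows span U1, the xy-plane
U2 = Matrix([[1, 0, 0],
             [0, 0, 1]])   # rows span U2, the xz-plane

dim_sum = Matrix.vstack(U1, U2).rank()              # dim(U1 + U2)
dim_intersection = U1.rank() + U2.rank() - dim_sum  # Grassmann formula
print(dim_intersection)  # 1
```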

  3. Let $V$ be the real vector space of all infinitely differentiable functions $f:\Bbb R\to\Bbb R$. Determine whether $$W=\left\{f\in V: (x^2+1)f''(x)-xf'(x)=e^xf(x),\ \forall x\in\mathbb R\right\}$$ is a subspace of $V$.

I think it is, and I plan to show two properties:

$i)$ Taking $f$ to be the zero function, we have $$(x^2+1)f''(x)-xf'(x)=e^xf(x),$$ since $f''(x)=f'(x)=f(x)=0$ for all $x$. Thus $0\in W$, so $W\neq\emptyset$.

$ii)$ For all $\alpha, \beta\in\mathbb R$ and $f_1,f_2\in W$ we have $$(x^2+1)f_1''(x)-xf'_1(x)=e^xf_1(x)$$ $$(x^2+1)f_2''(x)-xf'_2(x)=e^xf_2(x).$$ Multiplying the first equation by $\alpha$, the second by $\beta$, and adding, we obtain $(x^2+1)\Big(\alpha f_1''(x)+\beta f_2''(x)\Big)-x\Big(\alpha f'_1(x)+\beta f'_2(x)\Big)=e^x\Big(\alpha f_1(x)+\beta f_2(x)\Big)$, which means that $\alpha f_1+\beta f_2\in W$. Therefore, $W$ is a subspace of $V$.

  • I assumed that $V$ is a vector space over the field $K=\mathbb R$, but how can we be sure that this is the case? Can we read it off from $V=\left\{ f:\mathbb R...\right\}$?
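The linearity argument in $ii)$ can also be checked symbolically: the condition defining $W$ is $L(f)=0$ for the operator $L(f)=(x^2+1)f''-xf'-e^xf$, and one can verify $L(\alpha f_1+\beta f_2)=\alpha L(f_1)+\beta L(f_2)$ with undetermined functions (a sketch assuming SymPy):

```python
from sympy import symbols, Function, exp, simplify, expand

x, a, b = symbols('x a b')
f1, f2 = Function('f1')(x), Function('f2')(x)

# The condition defining W, written as L(f) = 0.
def L(f):
    return (x**2 + 1) * f.diff(x, 2) - x * f.diff(x) - exp(x) * f

# Linearity check: L(a*f1 + b*f2) - (a*L(f1) + b*L(f2)) should be identically 0.
difference = simplify(expand(L(a*f1 + b*f2) - (a*L(f1) + b*L(f2))))
print(difference)  # 0
```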
  4. Let $n\in\mathbb N$ and let $A\in M_n(\mathbb R)$ (the set of all $n\times n$ matrices with real entries) have the property that there exists $k\ge 2$ such that $A^k=0$ but $A^{k-1}\neq 0$. Prove that $I,A,\dots ,A^{k-1}$ are linearly independent.

I am not sure about this one. Suppose $$a_0I+a_1A+\cdots +a_{k-1}A^{k-1}=0.$$ Multiplying through by $A^{k-1}$ kills every term with a factor $A^j$, $j\ge 1$ (since $A^{j+k-1}=0$ for $j\ge 1$), leaving $a_0A^{k-1}=0$; since $A^{k-1}\neq 0$, this forces $a_0=0$. Repeating the same argument with $A^{k-2}, A^{k-3},\dots, A$, each time after setting the coefficients already known to be zero, we obtain $a_0=a_1=\dots = a_{k-1}=0$. $\blacksquare$
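The claim can be sanity-checked on a concrete nilpotent matrix (an illustrative $3\times 3$ Jordan block with $k=3$, assuming NumPy): the powers $I, A, \dots, A^{k-1}$ are linearly independent exactly when their vectorizations have full rank $k$.

```python
import numpy as np

# Illustrative nilpotent matrix: A^3 = 0 but A^2 != 0, so k = 3.
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [0, 0, 0]], dtype=float)
k = 3

powers = [np.linalg.matrix_power(A, j) for j in range(k)]  # I, A, A^2
assert np.allclose(np.linalg.matrix_power(A, k), 0)        # A^k = 0
assert not np.allclose(powers[k - 1], 0)                   # A^{k-1} != 0

# Vectorize each power; independence <=> the stacked vectors have rank k.
stacked = np.stack([P.flatten() for P in powers])
rank = np.linalg.matrix_rank(stacked)
print(rank)  # 3
```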